It seems, at any rate, a good subject to broach in my first post for Cliopatria. My joining this blog, at the very kind invitation of Ralph Luker and his fellow Cliopatriarchs, signifies that I have come to terms with being a graduate student who blogs. Feeling comfortable with that fact has not been easy, however, especially since, when the subject does come up, it is often in the form of cautionary tales about "blogging from the bottom" of the academic totem pole. These cautionary tales are usually a round-up of usual suspicions: hiring committees will wonder about the work ethic of academic bloggers; they will raise eyebrows at the political or professional views expressed by their job candidates online; they will wonder about the mixing of personal and professional life on blogs. Of course, blogging graduate students are not the only academic bloggers who wonder about how their presence online affects their professional prospects. The subject of how blogging affects tenure committees also comes around the horn every so often, but rather than representing a different issue, it addresses the same basic concern: What is the relationship between academic blogging and professional security? The question is most acute for graduate students because our horizons of professional possibility are the most open-ended.
One reason the jury remains out on that question is because the trial has only recently begun. Since blogging is a relatively recent phenomenon in academia, and because its profile in the mainstream and academic media is just beginning to rise, hard data on how blogging correlates to job hiring or tenure decisions are scarce. This means that discussions of the subject are rife with anecdotal evidence that warrants, at most, a certain agnosticism. As a recent profile of MLA bloggers (including Cliopatria's own Miriam Burstein) at Inside Higher Ed put it, "It's hard to draw too many conclusions about these blogs. They haven't been around that long (the oldest one in this group started in 2002). And it's hard to know what impact the blogs will have on these academics' careers (the oldest is 38 and none have tenure)."
It's easy to see, however, why the question continues to be of interest, even if it is hard to know what impact blogging really has on professional prospects. We do have hard data on the job market, and especially for historians, that data can be discouraging. So it would be entirely natural for history graduate students to conclude that, given the harrowing conditions of the job market, it is better to be safe than sorry. It is better not to do anything that might possibly jeopardize one's career. If the jury is still out, it's best not to disturb its deliberations by rapping on the jury room door.
But the fact that this tendency is a natural one is one of many instances in which agnosticism really serves as a thin mask for a full-fledged opinion. After all, if the jury really is out on whether blogging is professionally damaging, then why should the most reasonable position be to conclude that it probably is? Why can't agnosticism just as easily validate behaving as if the evidence might come out the other way? The problem here is not that graduate student bloggers are acting in the face of clear risk to their careers, but rather that graduate student bloggers are resolving to pursue a certain course under conditions of uncertainty.
As James Kloppenberg pointed out in his magisterial study of fin-de-siècle Progressive thinkers, Uncertain Victory, "Uncertainty can animate or disable. When certainty inhibits exploration, its loss can be liberating, but when conviction fortifies resolution, doubt can end in paralysis." In the case of blogging, agnosticism about professional implications might potentially liberate graduate students to explore new possibilities for intellectual discourse. Why, then, should uncertainty necessarily paralyze? Conversely, though, why resolve to blog in the absence of a clear conviction about its professional value?
I thought of such questions while reading Timothy Burke's recent essay on why he blogs. I find that many of my reasons for blogging are the same as his. But in a comment thread on another blog, one of Burke's colleagues asked him about the "power dynamics" of blogging for untenured professors -- dynamics that are equally relevant for graduate students. "Can they dare put out into the blogosphere what Tim (as a tenured professor) can put out there?" she asked. Burke's response was ...
... no, not unless they're unusually fearless. It's not because the content might offend someone, but because most academics still perceive blogging (if they perceive at all) as greasy kid's stuff, as something done by marginal scholars. ...

What graduate students who blog need to remember is that even if they later abandon their blog, it will not disappear. If their name is out there and associated with particular arguments, sentiments, claims, it can be found if someone really wishes to find it. Though I think it's pretty rare that someone does: I doubt if more than the smallest fraction of academics google their job candidates' names, for example.

Those sentiments aptly summarize some of the confusing reservations that I feel as a graduate student blogger. On the one hand, I shouldn't worry about the content of my blogging offending someone. On the other hand, I should remember that my content will be forever associated with my name. On the one hand, I shouldn't be afraid of job committees finding my blog. On the other hand, to blog requires me to be "unusually fearless." My resolution to blog often founders on these vaguely incompatible shoals of uncertainty.
Does my blogging, then, in the face of such uncertainty, really require an "unusually" large amount of fearlessness? It depends. The graduate student blogger is only unusually fearless if the most fearsome possibility imaginable is failure to secure a certain kind of academic job. And surely there are much greater things to fear which are far more usual. In comparison to many things, the fear of suffering the opprobrium of a tenure committee pales. Nonetheless, blogging as a graduate student or an untenured professor does require resolution unfortified by certainty. But it would be a sad thing indeed if this kind of fearlessness really is unusual in academia.
The kind of intellectual exploration that academic life is supposed to encourage often depends on venturing into the public sphere without the assurance of certain rewards or the guarantee of approbation. Perhaps, then, there is something to be said for graduate student blogging as an apprenticeship in learning to be animated by doubt, rather than disabled. It may seem that this kind of intellectual "fearlessness" -- the courage not to be paralyzed by doubt -- is of a different sort than the resolution required to blog despite agnosticism about professional gains. But I'm not sure those two kinds of resolution are unrelated. As Kloppenberg's book demonstrated, the decline of certainty in epistemology and metaphysics coincided historically with the rise of academic professionalization in philosophy. The gate-keeping procedures of modern university life are attempts to replace philosophical certainty with professional certification.
Certification is the rough approximation that we have now for the epistemological certainty that thinkers before the nineteenth century usually possessed. It is a way of allowing us to continue to think, to learn, to teach, to adjudicate, even without the assurance of reaching solidly certain conclusions about intellectual matters. Professionalization, in other words, is a kind of intellectual therapy that keeps the modern mind from being disabled by its doubts. What would happen, then, if we allowed ourselves to be disabled by professional uncertainty? What would then fortify our convictions and keep us thinking?
By pursuing this line of thought, I don't mean to exalt the blogging graduate student as the last action hero of academic life. On the contrary, to be an academic these days is, unfortunately, to be uncertain about what the virtues of intellectual heroism would look like. The hermeneutic of suspicion bequeathed to us by Kloppenberg's uncertain philosophers makes us equally suspicious of either triumphalism or dismissiveness when it comes to blogging or, for that matter, most other social practices. But given that we find ourselves in this doubt-full position, we must at least find practices that encourage us to keep moving despite our uncertainty -- either about our convictions or our careers. For me, at least, blogging has been a practice (like the best therapy) that is sometimes uncomfortable, but that slowly increases my range of intellectual motion and keeps me from being paralyzed by doubts. I don't claim to be unusually fearless by blogging; rather, what I'm trying to acquire is the usual quota of fearlessness (such as it is) required by contemporary academic life.
(Cross-posted at Mode for Caleb.)
Douthat doesn't go as far as Henry Adams; he doesn't refer to himself in the third person. But he might well have quoted Adams, his fellow Harvard alumnus, who made most of Douthat's points in 1918:
For generation after generation, Adamses and Brookses and Boylstons and Gorhams had gone to Harvard College, and although none of them, as far as known, had ever done any good there, or thought himself the better for it, custom, social ties, convenience, and above all, economy, kept each generation in the track. Any other education would have required a serious effort, but no one took Harvard College seriously. All went there because their friends went there, and the College was their ideal of social self-respect.

Now, no one really takes Adams' lack of seriousness about Harvard seriously either. Indeed, Adams spends his entire autobiography complaining about the ill fit between his nineteenth-century education and his twentieth-century experience, but in the process of doing so he does a pretty good job convincing his readers that he's a pretty educated guy, in all senses of the word. Harvard must not have been that bad for Adams. After all, he went back to work there.
Harvard College, as far as it educated at all, was a mild and liberal school, which sent young men into the world with all they needed to make respectable citizens, and something of what they wanted to make useful ones. (p. 50)
But back to Douthat. There has been more discussion of the article at Left2Right, Brad DeLong, and Matthew Yglesias. Douthat speaks back here and here.
Most of these discussions concern the covering laws that Douthat suggests about contemporary philosophy -- that the dearth of metaphysics and morality in philosophy departments has doomed the discipline to popular irrelevance. Several philosophers in the above discussions have bristled. Since this is a history blog, though, I might as well point out that historians make out even more poorly in Douthat's article, what with all their pointless essay questions, microhistory, and, of course, their postmodern sensibilities. (I might respond by invoking Henry Adams too: "In essence incoherent and immoral, history had either to be taught as such -- or falsified. Adams wanted to do neither." Ah, the eternal dilemma of the conscientious history teacher.)
In both the article and one of his blog responses to critics, Douthat takes historians and their disciplinary cousins to task for what he calls "the tendency of the humanities to become more scientistic in various ways over the last half-century -- via the dominance of Theory in the English and Literature departments; via the emphasis on primary research, material history, etc. in the History Departments; or more recently, via the rise of rational-choice theory in the realm of political science." This seems to me an odd way of thinking about what it means for the humanities to be "scientistic," and it is especially disorienting that the article dubs this baneful tendency as "postmodern." Emphasizing primary research and adopting rational-choice theory also seem to me horses of very different colors; I'm not sure I see what the tendency that Douthat is identifying is, unless it's the standard complaint that humanistic writing has become more science-like -- jargony and what not.
What I really wanted to point out, however, was Douthat's response to those (like me) who believe he underestimates the crucial role of the student in higher education. His response is that we overestimate that role. I'm inclined to agree (as Burke also seemed to suggest) that a balance has to be struck between administrative facilitation and student self-motivation in order for that mysterious cocktail called "education" to be shaken up. Douthat's complaint about those who place the onus on students reads like this:
I tend to think if you take a bunch of teenagers, however smart they may be, and drop them into a stress-ridden, hypercompetitive school in which the only academic guidance takes the form of a terrible, terrible Core Curriculum, most of them will take the path of least resistance, seek out easy classes and popular, potentially lucrative concentrations (hello, economics!), and generally fail to get the most of their four years. Is this a moral failure on the students' part, and therefore something that the administration and faculty shouldn't be concerned with? DeLong et al. seem to think so. Their attitude is apparently that if you didn't do a good job picking, without any kind of guidance (I don't know what the advising system was like in DeLong's era, but it's nearly nonexistent now), thirty-two classes out of the hundreds and hundreds of potential offerings that Harvard flings at you -- well, then tough luck, buddy. And good luck at the consulting firm.

It struck me while reading this how uncannily it sounds like a liberal view of government. We know from others of his writings that Douthat doesn't like what he calls "left-liberalism." But his impassioned critique of the university as a passive institution, and his defense of the betrayed student from the charge of "moral failure," sound an awful lot like he's a closet liberal. Try this experiment: go back through the paragraph above, and read it this way:
I tend to think if you take a bunch of [people], however smart they may be, and drop them into a stress-ridden, hypercompetitive [market] ... most of them will take the path of least resistance ... and generally fail to get the most of their [potential]. Is this a moral failure on the [people's] part, and therefore something that the administration and [government] shouldn't be concerned with? DeLong et al. seem to think so. Their attitude is apparently that if you didn't do a good job picking, without any kind of guidance ... [life options] out of the hundreds and hundreds of potential offerings that [the world] flings at you -- well, then tough luck, buddy. And good luck [on the unemployment line].

Okay, it's a highly selective edit, but it raises this question: Why are so many conservative critics willing to make structural arguments about the failures of "the system" when it comes to higher education, while they sneer at the insistence of liberal critics that "failure" in the school of hard knocks is not knock-down evidence of "moral failure" on the part of people who need help now? Are the rules that apply to universities different from those that apply to all other social institutions?
(Cross-posted at Mode for Caleb.)
"An introverted history major at Brown University is romantically involved with another history major. His hobby is guitar. Through writing songs to her about history, he is able to express his feelings about their relationship. Once she hears his songs, she is able to respond to his feelings. Their love is resolved successfully. The songs are:

Baby we need a little common sense like Thomas Paine
Baby I need a Declaration of Independence
Baby you're like The Founding Fathers they had slaves
Baby we're having a Civil War
Baby this is Guadacanal
Baby this is DDAY
Baby this is Viet Nam
Baby this is Iraq
Baby we need a Bill of Rights"

Casting suggestions? Alternate endings?
One of the things I suggested was this: "There seems to have been an enormous rush to judgment that the evidence provided by Holly Jackson, the Brandeis graduate student aforementioned, is straightforward proof about Kelley-Hawkins' race." That was a sloppy sentence, because it implied that I think Jackson might be wrong about Kelley-Hawkins' whiteness. McLemee, who generously linked to my post from his main page, responded thus:
As for CM's idea that there is a "rush to judgment," that certainly crossed my mind. But Holly Jackson actually makes a strong case for EDKH as white, while nobody has any solid evidence that she was black -- and when you read the scholarship, you notice people bending themselves out of shape to find some reason to think that she was. (I could have said more about that in the piece, including one case that verged on outright dishonesty, but that seemed like overdoing it.)
I agree completely that Jackson's evidence on the matter demolishes the traditional story about Kelley-Hawkins, since the traditional story is now apparently supported by no evidence at all. But if you look closer at my original statement on the "rush to judgment," what I wanted to question was not Jackson's particular evidence about Kelley-Hawkins, but rather the general presumption that certain kinds of historical evidence about race are inherently more decisive than others.
The idea that Kelley-Hawkins was black seems to have rested largely on the ambiguity of a photograph that appeared in one of her novels. Jackson's case for her whiteness rests, on the other hand, on family memories and documentary sources like the census. What I wanted to offer, however, was a cautionary note that documentary sources about race are not necessarily less ambiguous than pictures. For instance, I wondered in the post about how nineteenth-century enumerators recorded "color" or "race." If, in some cases, determining a person's "race" was left to the enumerator's discretion, then a "W" in the census is not necessarily less ambiguous than a picture. It could simply record what a contemporary saw when he looked at Kelley-Hawkins. Does the census always offer us more information about race than meets the eye, or simply the same "information" provided from a contemporary perspective and packaged in documentary form? Such, at least, is a question I thought worth asking about Jackson's evidence. (An article by Martha Hodes in the February 2003 American Historical Review raised these questions about another Massachusetts family in the nineteenth century.)
At Coffee Grounds, Evan Roberts replied with an extremely helpful post about census procedures, which includes links to instructions that were given to enumerators. This information supplemented comments that Julie Meloni made on my original post, drawing on her own experiences using the census for genealogical research.
I highly recommend Evan's post, and I entirely agree with this sentence: "What I think is significant is that Emma Dunham Kelley-Hawkins and her ancestors are always described as white. That is firmer evidence of being white, whatever 'being' and 'white' mean." What I wanted to say in my post was that we should continue to add that final clause to whatever we say about race in the nineteenth century. This does not mean that we cannot make judgments, on the basis of contemporary descriptions, about the "race" of historical actors. But it means we must always keep in mind how ambiguous and arbitrary such descriptions about race could be and still are. And we must be careful not to assume uncritically that certain kinds of evidence -- like the census -- somehow offer glosses of what "being" and "white" meant that are more determinate than other kinds of evidence -- like photographs. (Incidentally, Coffee Grounds also has two recent posts on the distinction between "quantitative" and "qualitative" sources that might be relevant here. I'm suggesting that even "quantitative" sources like the census were "qualitative.")
That's why I also suggested that the scholarship that has been done under the assumption that Kelley-Hawkins was not white should not simply be thrown out with the bathwater as a bunch of political misdirection. By presuming that Kelley-Hawkins was a black author creating aggressively white characters, that scholarship had to wrestle with what "being" and "white" meant. Without having read any of the scholarship that McLemee profiles, I think its insight that race can be malleable and strategically deployed is a valuable one. Although in this case scholars turn out to have been barking up the wrong tree, the bending themselves out of shape that they had to do is the kind of exercise that now makes us more limber as historians of race. In that limited sense, the underlying assumptions and theoretical underpinnings of that scholarship were not worthless. And it would be a shame if the conclusion that people drew from Kelley-Hawkins' case was that any scholarship on the cultural construction of "whiteness" and "blackness" is simply the product of postmodern mumbling and political correctness.
It is especially difficult to make them cohabit in the classroom.
Discussion-based history classes are usually organized around historical texts -- novels, autobiographies, slave narratives, and so on. We assign such texts to students partly as primary sources. Their existence tells historians something about the times in which they were produced. Yet we also want students to approach these texts like literary scholars, to think about how texts work. The difficulty, for teachers and students alike, is to approach texts in both ways at the same time.
Consider a syllabus favorite like Frederick Douglass' Narrative. On the one hand, the text serves well as a window onto the experience of enslavement in antebellum Maryland. On the other hand, the Narrative is clearly not mere reportage. Douglass is reporting events that actually happened, but he is also engaged in particular rhetorical projects, which are shaped by still other events in his life and other texts he has read. For instance, Douglass foregrounds gruesome examples of slave women being whipped partly because he knows that antislavery readers expect such examples as part of the genre. Douglass also addresses the Narrative to particular defenses of slavery being offered in the North, interjecting at several points that if slaves sometimes seem contented, they are only pretending to be so for their own safety. His examples are selected and presented not just as episodes in a memoir, but as evidence for an argument.
Yet many students are more comfortable thinking of a text like the Narrative as a report rather than as a rhetorical project. How, then, does a teacher help students analyze the rhetorical and argumentative structure of the text without undermining its value as a piece of reportage?
Often the surest way of helping students to read a text as rhetoric is to present it to them as fictional or false. If you posit some disconnection between actual events and a text, it is easier for students to address the question of how the text "works."
Suppose, for example, you are teaching another syllabus favorite: Olaudah Equiano's narrative. In an earlier comment thread, Timothy Burke and Jonathan Dresner had a brief exchange about Vincent Carretta's hypothesis that Equiano was not born in Africa, as his autobiography suggests, but in North America. This hypothesis is still hotly debated by scholars (I've been reading Adam Hochschild's Bury the Chains, which includes a thoughtful appendix on the debate). But from a pedagogical standpoint, Carretta's hypothesis is useful because it unsettles students' expectations about how the text came to be and where Equiano came from. Once we are open to the possibility that Equiano was not born in Africa, it becomes easier to think about how his representations of Africa work. What conventions of abolitionist literature do they follow? How do they reflect Equiano's views as a Christian? How do they address particular arguments being circulated in the Atlantic World about the "savagery" of native Africans? Raise a question about how the text came to exist, and students eagerly discuss how the text works.
For similar reasons, it is easier to ask students about how proslavery texts work than it is to ask how antislavery texts work, because students are (hopefully!) constitutionally skeptical about the former but inclined to trust the latter.
For example, Catherine Clinton's new biography of Harriet Tubman quotes from a Philadelphian, John Bell Robinson, who published a fierce attack on Tubman after she brought her parents to the North in 1857. His "invective became even more lethal when he launched into a diatribe about [Tubman's] removal of her aging parents from a slave state. Robinson's reasoning was that of a quintessential proslavery apologist: 'Now there are no old people of any color more caressed and better taken care of than the old worn-out slaves of the South ...'" (p. 143).
Here students are likely to have no problem seeing that Robinson's text is doing certain kinds of rhetorical "work." At the very least, Robinson's claims are unlikely to be taken as simple reportage about the treatment of elderly slaves, especially once students learn that Tubman's parents were already free when Tubman brought them North. So Robinson has his facts wrong in more than one way. But then Clinton goes on to point out that "it suited both proslavery and abolitionist camps to portray Harriet's parents as an elderly enslaved couple. One side claimed their dependence upon some fictive master's goodwill, while the other painted the harsh cruelties of whips and chains if they did not escape" (p. 144). Even though Tubman's father had been manumitted in 1840 and her mother had been free since 1855, abolitionists sometimes folded their story into Tubman's other heroic rescues of enslaved family members.
Clinton's point would probably help students see how abolitionist texts "worked." But the lesson learned may come at a high cost. For it would now be easy for students to wonder: "If Robinson was lying and had his facts wrong ... did abolitionists also have their facts wrong?" The realization that Robinson had a rhetorical argument to make helps students call into question his facts. But once you point out that abolitionists also had a rhetorical argument to make, students might wonder whether their facts were wrong too. That's certainly not necessarily bad, but it can be if students conclude from this discussion that abolitionists were "as wrong" as Robinson was -- and wrong in the same ways.
What I'm getting at here are old and familiar problems -- about the relationship between authors and audiences, rhetoric and reality, texts and facts. But I'm encountering these problems for the first time from the perspective of a teacher. And I'm worried about the potential pitfalls in the pedagogical methods I've been describing -- using the Carretta hypothesis, for instance, to discuss the rhetorical structure of Equiano's narrative, or pointing out that abolitionists and proslavery apologists alike overlooked the freedom of Tubman's parents because that fact did not serve their arguments.
My worry is that students will learn to associate the idea of "rhetoric" with dissemblance. The strategies I've outlined might reinforce a preexisting sense that rhetoric can be equated with bias, which has an almost universally negative connotation as antithetical to truthfulness.
I remember facing a similar pedagogical challenge when I worked as a tutor in symbolic logic. Any Introduction to Logic course begins by drawing a basic distinction between the validity of an argument and its soundness. An argument is formally valid if the premises entail the conclusion. But a sound argument is a valid argument whose premises are also true. I often found that students had a difficult time understanding the distinction between validity and soundness. The easiest way to help was to present an argument that was valid but clearly unsound. For example ...
If the moon is made of green cheese, then two plus two equals four.
The moon is made of green cheese.
Therefore, two plus two equals four.
Clearly, if the premises to this argument are true, then the conclusion is also true. But in this case, also clearly, the second premise is false. (It throws students for another loop to inform them that the first premise is true, but that's another issue ...) The argument is valid but unsound. Usually such examples help students distinguish between validity and soundness, but inevitably some students will start to think of valid arguments as always unsound. That is, they will associate validity with moons of green cheese.
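For readers who like to see the distinction worked out mechanically, here is a minimal sketch in Python (purely illustrative; the function and variable names are my own, not standard logic software). It checks the green-cheese argument's validity by brute-force truth table, then checks soundness by testing the premises against the actual facts:

```python
from itertools import product

def entails(premises, conclusion):
    """True if every truth assignment making all premises true
    also makes the conclusion true (i.e., the form is valid)."""
    for g, m in product([True, False], repeat=2):
        if all(p(g, m) for p in premises) and not conclusion(g, m):
            return False
    return True

# G = "the moon is made of green cheese"; M = "two plus two equals four"
p1 = lambda g, m: (not g) or m   # "If G then M" (material conditional)
p2 = lambda g, m: g              # "G"
c  = lambda g, m: m              # "M"

print(entails([p1, p2], c))      # True: the form (modus ponens) is valid

# Soundness additionally requires the premises to be true in fact.
# G is actually false, so premise 2 fails: valid but unsound.
G, M = False, (2 + 2 == 4)
premises_actually_true = ((not G) or M) and G
print(premises_actually_true)    # False
```

The brute-force check makes the two notions visibly separate: validity is a property of the form alone (quantifying over all possible truth assignments), while soundness adds a factual check against one particular assignment, the actual world.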
The analogy isn't exact, but the pedagogical problems with teaching texts are similar. You can show students how Equiano's arguments worked by calling into question whether he was born in Africa. But then you risk encouraging them to associate rhetoric with falsehood. And that would be to fail in your original objective, which was to show how even a report that gets its facts right is structured according to certain rhetorical and narrative conventions.
I talk about this as if it is merely a pedagogical problem, but of course it isn't. There are thorny issues of textual representation and rhetoric here that befuddle all historians and literary scholars. But as a beginning teacher, I'm discovering for the first time how especially thorny these problems can be in the classroom. And although I think one goal of education is to model informed and thoughtful befuddlement, confusion does not always signify an appreciation of complexity. Advice from non-beginning teachers (or swifter beginners) would be very much appreciated.
(Cross-posted at Mode for Caleb.)
One of the feature essays is by Albert J. Raboteau, Professor of Religion at Princeton University, author of Slave Religion: The "Invisible Institution" in the Antebellum South, and an Orthodox Christian. There is also an excellent piece by Gary B. Nash, emeritus Professor of History at UCLA, which uses three biographical sketches to show that the evangelical Christianity of the Great Awakening often empowered "ordinary people" to think and speak for themselves. Nash points to Richard Allen (one of the founding ministers of the African Methodist Episcopal Church), Jarena Lee (an African American woman who became an exhorter in Allen's church), and Lorenzo Dow (a poor but popular Methodist circuit rider without a formal education) as examples of what Nathan Hatch has described as the "democratization of American Christianity."
I was especially struck by Nash's discussion of the relationship between Jarena Lee and Richard Allen, because it contrasts with a similar relationship I've just been reading about. Lee's autobiography is published in Sisters of the Spirit, a collection that also includes the autobiography of Julia Foote, another African American woman who became conscious of a calling to preach after converting to Methodism. But Foote's experience with the elders of her AME church was very different from Lee's.
According to Nash, Richard Allen was initially reluctant to invite Jarena Lee into his pulpit. But after her repeated demonstrations of powerful preaching, he eventually relented. At the end of the 1830s, however, Julia Foote was cast out of the First AMEZ church in Boston, where the minister was African American abolitionist Jehiel C. Beman. Like Lee, Foote had ecstatic visions. She claimed to have heard the voice of God addressing her. But her charismatic sermons and her claims that she had been sanctified displeased her husband and the men of the church. In her autobiography, Foote recalled Beman forbidding her to preach her "holiness stuff" to the flock, despite the requests of some "elder sisters" to give her a hearing.
So Foote began preaching in small house meetings to disaffected members of the congregation. Needless to say, this did not make Beman happier, and according to Foote, he excommunicated her when she refused to stop preaching. Last Thursday, while reading about Foote and her visions in the beautiful new African American history reading room at the Enoch Pratt Free Library, I was moved by her description of Beman's skepticism. After one night of particularly vivid visions, Foote wrote,
... my minister, Jehial C. Beman, came to see me. He looked very coldly upon me and said: 'I guess you will find out your mistake before you are many months older.' He was a scholar, and a fine speaker; and the sneering, indifferent way in which he addressed me, said most plainly: 'You don't know anything.'
After reading Nash's article and reflecting back on Foote's story, I had two thoughts. They are not necessarily related to each other, and they are not directed at Nash's piece in particular:
The first thought: the democratization of American Christianity had important limits. On the one hand, Jehiel C. Beman himself could easily serve as another example for Nash's article. The son of a Connecticut slave who was freed after fighting in the American Revolution, Beman was a shoemaker who became a prominent pastor, a founding member of the American Anti-Slavery Society, and a leader of African American voluntary associations in various parts of New England. But Beman's faith, while serving as an important agent of his own empowerment, did not necessarily entail Foote's empowerment. This would come as no surprise, of course -- to Nash or to Hatch, whose book charts a retreat by many American Methodists from their most democratic impulses. But I think it's important to include the stories of Footes and Bemans alongside the stories of Lees and Allens, if only to emphasize that the democratization spurred by Christianity was incomplete and fitful. It is also important to stress that even the "ordinary people" empowered by evangelical Christianity did not thereby find themselves in agreement on the scope of that empowerment.
A second thought: As the special issue of the Boston Review demonstrates, there are currently many efforts afoot to reclaim the democratic (small "d" or big "D") legacies of American evangelicalism. And pointing to figures like Foote serves these projects well. Yet I sense that many on the Left would like to applaud Foote's empowerment while pooh-poohing her charismatic faith rather than endorsing her visions. In this sense, though, many scholars find themselves in the position of Beman. While disagreeing with his patriarchal repression of Foote's voice, they echo his "sneering, indifferent" condescension toward her "mistake." But if this is intended as a strategy for wooing evangelical Christians toward the Left, it seems to me like one that is doomed to fail. It does no good for democrats to say to "The Believers" that their history is progressive and democratic, if at the same time they say, "You don't know anything." It could be, of course, that many Democrats do not really mean to talk to "The Believers" at all. Perhaps many are mainly talking to each other about "The Believers," as if to say, "Those people sure don't know anything, but at least they're ordinary." I trust I don't need to point out why this will seem as cold and condescending to the Footes of the twenty-first century as Beman's reaction seemed to Julia Foote herself.
I think it is important for historians to tell the stories of people like Dow, Allen, Beman, Foote, and Lee. Many people who identify America as a "Christian nation" tell the history of evangelicalism as if its telos were the radical social conservatism of people like Tom DeLay. And articles like Nash's are useful for proving that Christianity in this nation has often been more open and democratic than the Christianity of its latter-day adherents. But ultimately, these historical ripostes to conservative Christians have a limited usefulness. First, they require historians of Christianity to follow Christian propagandists in ironing out the wrinkles in their favored visions of the past. As I've argued before, the major problem with lobbyists like David Barton of Wallbuilders is that their historical arguments are selective and oversimplified: they wave a copy of a Thanksgiving Day sermon and declare that the nation has Christian foundations. It's tempting to do the same thing from the other side: to wave a copy of Julia Foote's autobiography and declare that American Christianity has democratic foundations. But the more important contribution for historians to make is to point out the complications in both attempts to make the past straightforwardly "usable" in the present.
Second, and more importantly, offering dueling histories of American Christianity often keeps discussions in the past. And although (as an historian, of course!) I am all for discussions about the past, we must be careful not to put off indefinitely the hard conversations between "The Believers" and "The Non-believers" that need to be taking place in the present. At some point, those who think that Julia Foote is a democratic exemplar of Christian empowerment are actually going to have to listen and talk to the "ordinary people" who believe in God as fervently as she did. I do think that in many cases these conversations are beginning to get underway, and I wish them long life.
All of the reviewers make clear that Reynolds wants to resuscitate Brown's reputation and to show that his violence was not maniacal or insane, but intelligible, radically egalitarian, and perhaps even necessary. Following the Transcendentalists' own celebration of Brown, Reynolds apparently portrays Brown as a hero. (In the best review I've read, David Blight reports that Reynolds casts Brown anachronistically as a "good terrorist.") But every hero needs a foil. And for the reviewers, as perhaps for Reynolds too, that part is furnished by Northern white abolitionists like William Lloyd Garrison, who are portrayed as though they were lily-livered sissies and passive pacifists until Brown came along to steel their nerves.
Gopnik, for instance, calls William Lloyd Garrison "the white Martin Luther King, Jr.," but he adds a "but." "But Garrison, like Dr. King, was a pacifist, and, right up to the moment when the war broke out, he had no really practical plan for ending slavery, aside from 'separation' (i.e., the decoupling of the North from the South) and moral suasion." While Gopnik tars "moral suasion" with the usual brush--it was not "really practical"--Hitchens goes farther. After noting that Reynolds goes to great lengths to rationalize Brown's violent methods, Hitchens glibly says that the "superfluity" of such apologies is "easily demonstrated. Not only had the slaveholders perpetrated the preponderance of atrocities, and with impunity at that, but they had begun to boast that northerners and New Englanders were congenitally soft."
Hitchens seems to agree that non-Brown abolitionists were "soft"; he relishes Reynolds' comparison between Oliver Cromwell and Brown, and he refers to moderate antislavery Northerners as "invertebrate Lincolnians." Ehrenreich echoes the spineless pacifists theme, writing that "antislavery activists ... were often pacifists and usually the victims of their political opponents -- a relationship symbolized by a South Carolina congressman's crippling beating of the abolitionist Charles Sumner on the floor of the United States Senate. With his guns and pikes, Brown reversed the equation -- stiffening the backbones of Northern abolitionists, terrifying the white South." Even Gopnik, who goes relatively easy on the Garrisonians, cannot resist saying, "Where Garrison, though utterly passionate and courageous in his denunciations, was a thorough man of the North, with lawyerly-journalistic gifts of argument and irony, Brown was a man of romantic feeling." And, referring to an 1835 incident in which a violent crowd of rioters attempted to lynch the radical editor, Gopnik avers that "even Garrison, a man of unexampled courage, could not face down a mob in Boston but had to be saved by the police."
Umm ... he was facing a lynch mob that had managed to tie a noose around him. Is Gopnik really prepared to say that Garrison was less courageous than Brown because the police rescued him from his attackers? (Actually, that's not even entirely accurate: Garrison was lifted to safety by a couple of burly rioters who took pity on him. And he was driven away to safety by an unidentified black hackney-driver, who used his whip to keep the crowd at bay. The police assisted in Garrison's rescue only grudgingly, if at all, and when Garrison was brought for protection to City Hall, he was told that he could not stay there because his presence made the building unsafe.)
There was probably never a day in Garrison's adult life when there was not a bounty on his head somewhere in the South. But was he somehow less courageous than Brown because, unlike the Old Man, he was unwilling to lop off the heads of Southerners?
When historians compare radical reformers, it is certainly appropriate to ask about the practicability of their different methods and even to judge the consistency of their convictions, not because historians are the best judges of character, but because making those judgments can help reveal what their bedrock convictions were. But there seems to be something more going on in these comparisons between Brown and Garrison. What seems to be driving the resuscitation of Brown's reputation is not just an historical judgment but an ethical judgment about his superior courage and radicalism. Read between the lines and you'll find the essentially ahistorical insinuation that principled pacifists are really cowards; that those who deal death to liberty's enemies are more radical than those who would rather die than kill; that meekness is weakness; that the vision of lions lying down with lambs is a pleasing fantasy invented by lambs; that, by a process of elimination, people turn to pacifism when they don't have "practical" plans for making society more just. Pacifists, to paraphrase Ehrenreich, are seen as "victims." Only the violent are thought of as valiant.
Clearly these statements are moving outside the realm of purely historical analysis and into the realm of ethics. I'm not necessarily uncomfortable with that movement; I don't believe people who tell me they can study history without allowing their thoughts to at least drift in the direction of ethical reflection. But so long as we are headed in that direction, allow me to point out the irony of book reviewers consigning "moral suasion" to the dustbin of history and dead pacifists to the ship of fools. What are historians and writers, after all, if they are not persons who believe that the word is more powerful than the sword? And if they do not believe this, then why are they in a byline instead of on a frontline?
I realize that question might not be entirely fair. Not everyone who believes that violence is more radical than moral suasion is thereby obligated to take up arms and rid the world of its wrongs. But such reviewers are essentially castigating people like Garrison for failing to live up to their convictions, to stiffen their spines, to get their hands dirty or bloody. So forgive me if I can't avoid poking a bit at the inconsistency of "invertebrate Hitchensians." (One could point out the same thing about the Transcendentalist scriveners who were most responsible for Brown's apotheosis. Was Emerson really more courageous than Garrison simply because his words celebrated antislavery violence?)
But that's not the main point I want to make here. Skeptical as I am of the ethical claim that violence is always more radical than nonviolence, I am even more concerned that this view is historically suspect, for at least two reasons.
First, nonviolence was not merely an instrumental strategy for many radical abolitionists; it was integral to their most radical ideologies. If we view their pacifism as nothing more than a strategy or personal trait, then it is easier to portray that pacifism as a sign of whimsy or weakness. But in fact, for many Garrisonians, a commitment to "nonresistance" was much more than a mere strategy, and certainly more than a simple sign of courage or its lack. It was at the core of their critique of slavery, government, and much else. According to nonresistants, any exercise of violence was an unjust usurpation of God's authority, an immoral abuse of power. From their perspective, that was a large reason why slavery was wrong--it assigned to the master violent power that did not belong to him or her. For many Garrisonians, then, their renunciation of violence was of a piece with their renunciation of slavery. To call their pacifism a mere lack of spine ignores how it shaped their posture towards slavery and other violent abuses of power--like the treatment of Native Americans, the hawkish expansionism that sparked the Mexican War, and unequal marriages.
I could generalize this point to other theorists of nonviolence like Gandhi or Martin Luther King, Jr. For both men, nonviolence was not simply a strategy or a practical plan. Both argued that direct nonviolent action was more expedient than violence, but this was not their only defense of pacifism. Rather, the commitments that informed their pacifism also informed their views of the state, of the human person, of justice; to remove the pacifism would not just be to make a change of "plan," it would force thinkers like Gandhi and King to rethink their entire philosophies. It's also simply false in their cases, as in the case of the Garrisonians, to suggest that nonviolence put a brake on their radicalism. Some of their contemporaries certainly did suggest that--think John Brown or Malcolm X--but they were not necessarily right. Progressives today usually praise King in his later years for moving in more radical directions in his thinking about poverty and the war in Vietnam, but they often forget that this trajectory was an outgrowth of his philosophy of nonviolence. His radicalism, like Garrison's, did not view pacifism as a mere tool in the reformer's hand, but as part of the hand itself.
Second: not only is nonviolence often integral to radical programs; violence is often integral to conservative or reactionary worldviews. It may seem as though John Brown's belief that slaves and abolitionists needed to rise up in holy war against the South could only have radical and egalitarian overtones. But that very belief was also integral to the arguments of those who opposed racial equality and emancipation. In an article in the Journal of American History that recently won the ABC-CLIO Award from the OAH, my friend Francois Furstenberg has argued persuasively that the definition of "freedom" as "resistance" to oppression might actually have served to legitimate personal slavery, since it allowed defenders of the system to claim that slaves who did not resist their enslavement were somehow "choosing" their plight autonomously. Implicitly, I think, calling Garrisonians or Lincolnians "spineless" can potentially point in a similar direction, since it suggests that those who do not, like Brown, put their swords where their words are must not "want" freedom as much.
There are also gendered overtones to the idea that Brown was more genuinely radical than Garrison, since violent resistance was defined throughout the antebellum period (as it probably still is for many people today) as a "masculine" virtue, an act in which men prove that they are manly men. The word "sissy" itself carries that overtone, and to imply, even indirectly, that Garrison was a sissy also comes across as a derogatory accusation of effeminacy. The connected implication is that women are incapable of proving their mettle the way that John Brown could. Hitchens' review opens with a paragraph that suggests I'm not making this up. He relates the story of Lincoln's telling Harriet Beecher Stowe that she was the woman who started the Civil War. Says Hitchens, "That fondly related anecdote [about Stowe] illustrates the persistent tendency to Parson Weemsishness in our culture. It was not all the tear-jerking sentiment of Uncle Tom's Cabin that catalyzed the War Between the States. It was, rather, the blood-spilling intransigence of John Brown, field-tested on the pitiless Kansas prairies and later deployed at Harpers Ferry." Women novelists become "tear-jerkers" and sentimentalists, on this view, and thus incapable of catalyzing social change. Rather, it's the "field-tested" violence of John Brown and his manly men that gets the credit for emancipation.
In sum, while it certainly is appropriate for historians to compare and contrast Brown and Garrison, and to weigh the relative radicalism of their approaches to emancipation, it is historically misleading to suggest that their positions on violence are foolproof indicators of their radical commitments. I'm looking forward to reading Reynolds because I think that Brown's reputation is in need of some resuscitating and subtle revising. But why is it that reputation-revivals in history must so often be zero-sum games, so that one person's stock has to fall for another's to rise? In this case, I've suggested, it's unfair to praise Brown's radicalism at the expense of Garrison--at least if one is doing so by suggesting that Garrison's pacifism was nothing more than a lack of courage or clear thinking. It certainly is true that nonviolence sometimes is a sign of cowardice, but so is violence. It's always startling to me that despite the fact that most people accept detailed taxonomies of different kinds of violence, which range along a spectrum from justified and heroic violence to illicit abuse, very few of us have similarly well developed taxonomies of different kinds of pacifism, which can also range from the heroic to the thoughtless. I have suggested that a simple dichotomization of radicalism that places "fight" on the one hand and "flight" on the other does violence to history. I also think it does violence to our moral intuitions, but I don't need to make that argument to prove that, historically, (a) nonviolence is often integral to radicalism and that (b) violence is often integral to conservatism.
(Cross-posted at Mode for Caleb.)
Of course, to have access to this resource, you have to be somewhat savvy, because there is not yet a portal page on Google's site for searching books. If you don't already know it, you can tap the vast resources of Google Print in one of at least two ways:
(1) When searching at Google, begin your search string with the word "book" or "books" and then enter your query as usual. If Google Print has book pages that match your query, you should see about two or three "book results" listed above your search. (Example.) You can either click on the individual results or on the headline that sends you to all of your book results. (Example.)
From there you can search within particular books (check the sidebar of an individual result page), look at the index and table of contents for a book, and even scroll through about two or three pages around your result page. Once you are within Google Print, you can also "search all books" by using the form entry box located either at the top of the page or at the bottom. (Hat-tip: Search Engine Watch.)
(2) Another way to get into Google Print is to use this link and then enter your search at the top of the page. (Hat-tip: NT Gateway.)
The scholarly possibilities here are staggering. Google Print makes it possible, for instance, to search for published books that cite a certain book or article--something that was difficult to do before without access to some kind of citation-tracking database. Most of all, Google Print makes it possible to see whether there are books that mention a particular name or word, even in passing--something that was nearly impossible to do before.
For instance, if I want to see books that mention the Kentucky abolitionist "Cassius M. Clay," I just do this and get 76 hits. In the "real" world, as they say, I would have had to determine that those 76 books were relevant to Clay, find them on the shelf, and then hope that the book's author or editor had listed "Clay" in the index. All of this was possible before, of course, for academic journals and other kinds of periodical literature. And it was even possible in digital collections of historical books, like the Making of America site or the Samuel May Anti-Slavery Collection at Cornell. But with Google Print, the digital keyword revolution has truly arrived, and the end is not in sight.
What should we make of this revolution, and how revolutionary is it? In the latest issue of Perspectives, there's an article by Carlo Ginzburg considering that question. (There are also two fantastic articles on history blogging by my fellow Cliopatriarchs, Ralph Luker and Manan Ahmed.) Ginzburg argues persuasively that keyword searching in library catalogs is good for scholarship, primarily because "the computer multiplies the possibilities that an unforeseen fact will take us by surprise." (In the above search, for instance, I was surprised to see Clay mentioned in The Education of Henry Adams as one of the morose young man's diplomatic "masters." It turns out that Clay is listed in the index of my printed copy of Education, but I don't know that I would have looked there intentionally for a mention of Clay.)
Of course, that capacity for surprise is not limitless, because we must have some reason for entering in the keywords that we do, and usually our intuitions here are guided by our prior research or the work of others. But it is significant that keyword searches allow us to navigate through texts largely without the mediation of editors, authors, and publishers.
On the other hand, the excitement of surprise can be misleading to a researcher. The temptation when doing keyword searches is always to think that your results are more representative than they are. (This is something I've mused about before.) If I look in the printed index to a book and see one page listed for Clay out of 450 or 500, I can make a rough and ready judgment about how important he is in the context of that book. But when I look at a Google results page, I depend on Google's relevancy algorithms to make that determination for me, and it's easy to forget that when I'm looking at a long list of hits. (I can still tell in Google Print how many times a word appears in a book, and how many pages the book has, but the linearity and ephemerality of a results list can be seductive. It doesn't have the same weight in your hand that the actual book does, and perhaps, subconsciously, that actual, physical extension of the book in space helps our brains make determinations about proportionality and significance.) For all the virtues of keyword searching, then, this revolution warrants some careful reflection.
You can find such reflection in a recent article by David Bell in The New Republic on "The Bookless Future." (Full disclosure: Professor Bell is the incoming Director of Graduate Studies in the Johns Hopkins history department, where I am pursuing said graduate studies.) Unfortunately, and perhaps ironically, "The Bookless Future" is only available online to subscribers. But I found the full text by using Hopkins' institutional subscription to Lexis Nexis and strongly recommend it if you can find a copy.
The bulk of the article (if, following my musings above, it is not a category mistake to talk about the "bulk" of hypertext) wonders about the future of electronic books, and it canvasses several kinds of technology, currently in development, that will hopefully make electronic books easier to read. I think Bell is right that the only thing missing is a vehicle for text that is as optimal for reading as a printed book. The technology to scan full-page images of books and make them searchable is clearly already upon us; it won't be too much longer, I predict, before you can pay a fee and pull a book from Google Print onto your PDA or some other electronic device.
But Bell also expresses warranted concern about the deleterious effects these changes might have on the practice of reading.
The very nature of the computer presents a different problem. If physical discomfort discourages the reading of [online] texts sequentially, from start to finish, computers make it spectacularly easy to move through texts in other ways--in particular, by searching for particular pieces of information. Reading in this strategic, targeted manner can feel empowering. Instead of surrendering to the organizing logic of the book you are reading, you can approach it with your own questions and glean precisely what you want from it. You are the master, not some dead author. And this is precisely where the greatest dangers lie, because when reading, you should not be the master. Information is not knowledge; searching is not reading; and surrendering to the organizing logic of a book is, after all, the way one learns.
If my own experience is any guide, "search-driven" reading can make for depressingly sloppy scholarship. Recently, I decided to examine the way in which the radical eighteenth-century thinker d'Holbach discussed warfare. I could have read his book Universal Morality in the rare-book room of my university library, but I decided instead to download a copy (it took about two minutes). And then, faced with a text hundreds of pages long, instead of reading from start to finish, I searched for the words "war" and "peace." I found a great many juicy quotations, which I conveniently cut and pasted directly into my notes. But at the end, I had very little idea of why d'Holbach had written his book in the first place. If I had had to read the physical book, I could still have skimmed, cut, and pasted, but I would have been forced to confront the text as a whole at some basic level. The computer encouraged me to read in exactly the wrong way, leaving me with little but a series of disembodied passages.
This has often been my troubling experience as well: Henry Adams makes a great quip about Clay, for instance--as a teacher, Clay had "no equal though possibly some rivals." But having previously submitted myself to the organizing logic of Adams' book by reading it cover to cover, I know better than to take Adams' quips at face value. (Sure enough, according to an editor's footnote, Adams referred to Clay in private as a "noisy jackass.") I wonder, though, whether I'm as careful with books that I haven't read. The keyword revolution at least means that I need to be especially careful--I need to balance the subversive virtues of keyword search (the "surprise" of which Ginzburg speaks) with the virtues of "surrendering to the organizing logic of a book."
All of this got me wondering, though, about whether the dangers of "strategic, targeted" reading are really that new. After all, the printed index compiled by an author or editor presents the reader with the same potential for targeted reading, and it is the rare researcher who does not rely heavily on these indexes to quickly jump to parts of a book that are relevant to his or her research. (Here are three papers online that allude to the similarity between online and offline indexes.)
The index, like the codex, predates the printed book. According to Guglielmo Cavallo and Roger Chartier in their edited collection, The History of Reading in the West,
Even beyond its immediate derivation from the manuscript, the book--both before and after Gutenberg--and the manuscript were similar objects composed of sheets folded and gathered into quires and assembled within one binding or cover. It is thus hardly surprising that all the systems of reference that have somewhat hastily been credited to printing existed well before its invention. One of these was the use of signatures and catchwords to help assemble the pages in the right order. Other signalling devices aided reading: folios, columns, or lines might be numbered; the page could be divided up more visibly by the use of devices such as ornamented initials, rubrics and marginal letters; an analytical (rather than a simple spatial) relationship between the text and its glosses could be set up; different characters or different colours of ink could be used to distinguish between text and commentary. Thanks to its organization in quires and to its clear divisions, the codex, whether manuscript or printed, was easy to index. Concordances, alphabetical tables and systematic indexes were common practice even in the age of the manuscript, and it was in monastic scriptoria and stationers' workshops that these modes for the organization of written material were invented. Printers picked them up later. (p. 23)
And programmers picked them up even later. It would be an interesting research question to see (and maybe a medieval historian can correct me if this has already been done) whether the invention of the index in the age of manuscript provoked the same kinds of anxieties we feel today about targeted access to texts. One of the contributors to the Cavallo and Chartier volume, Jacqueline Hamesse, suggests that scholastic modes of reading were shaped in part by these innovations. Unlike monastic readers, scholastics could jump from page to page and cross-reference works without the same kind of intensive, devotional reading:
"Here we enter into a new world that suggests modern reading habits. After the pioneering labours of the Cistercians to organize the content of a manuscript, other aids appeared and flourished: the table of contents, the concept index, concordances of terms, alphabetically arranged analytical tables, summaries and abridgements. Even the great twelfth-century summae were abridged: they were admittedly easier to handle when reduced to a single volume. The abridgements were a pale reflection of the originals, however.
The rise of this new literary genre inevitably meant that reading was no longer direct: now a compiler served as an intermediary, and reading was filtered by selection. Reference to the book changed. Its contents were no longer studied for themselves with the aim of acquiring a certain wisdom, as Hugh of Saint Victor had recommended. Henceforth knowledge was primary, and it took precedence over everything else, even when it was fragmentary. Meditation gave way to utility in a profound shift of emphasis that completely changed the impact of reading.
Certain scholars are quite aware of the important role of these working tools for learning in the Middle Ages, but others have failed to grasp their influence among intellectuals. As any fourteenth-century inventory will show, florilegia, concordances and tables abounded, not only in the libraries of the religious Orders, but also in college and university libraries. Such compilations often replaced consultation and, a fortiori, direct reading of authors' works, and even though they constitute a second-tier literature, their sizeable role in the intellectual preparation of medieval men cannot be denied. Today we have such different methods for acquiring culture that it is difficult for us to comprehend that even the great writers of the age of scholasticism made use of these handy tools for easy access to documentation that was indispensable to their work. The large number of manuscripts that have come down to us bear witness to the use and dissemination of such compilations. (p. 110)
Of course, electronic keyword searching takes concordances to another level. But perhaps this is a good thing. The etymological roots of "concordance" are, after all, entangled with the roots of "concord," and it is sometimes good to introduce discordance into our readings of texts. If Ginzburg is right, then we have a real advantage over our scholastic forebears; unlike them, we don't have to rely on the compilations of other scholars, who might use indexes as a way to assert too much control over the text. But if Bell is right, then we also have a greater responsibility to handle that advantage with care, and to prevent our liberty from becoming license.
You can be the judge of whether I've done that here, because (in a burst of self-referentiality) I found the quotes from the Cavallo and Chartier book by using Google Print, and I've never read the whole thing. When bloggers advise readers to "read the whole thing," do they really mean it? And do we ever really follow that advice?
(Cross-posted at Mode for Caleb.)