Cliopatria: A Group Blog
Aaron Bady; Chris Bray; Brett Holman; Jonathan Jarrett; Robert KC Johnson; Rachel Leow; Ralph E. Luker; Scott McLemee; Claire B. Potter; Jonathan T. Reynolds
The culminating irony is that the comfort they seek is largely illusory. Whatever Roosevelt did or did not know, the fact remains that Japan deliberately planned and executed a surprise attack, and it was their attack that plunged the two nations into open war. The attack was a tactical masterpiece, but a strategic blunder of amazing proportions in uniting the United States against Japan.
A group of activists has occupied the offices of University of Hawai'i System Interim President John McLain for several days now, demanding that the UH system back out of an agreement to establish a military research program and undertake substantial research projects. As much of a throwback as this is, we live in a new age, where protests have websites and live video feeds and, of course, blogs. The symbolic nature of protest actions is greatly amplified by these technologies, though it does raise the question of effectiveness: can anyone think of examples where office-occupation protests produced real results?
The protestors, though they have a variety of positions related to war and research, have backed away from demands that the programs be cancelled, and now propose that the approval process be made more open and rigorous. In other words, they are making procedural demands which they believe will enhance their push for substantive changes, but which are not, actually, directly related. Sounds a lot like a certain congressional issue to me.
Then there's the actual substance of the protests. The main components of the protest, as I read it, have to do with secrecy, with environmental degradation, and with the conflict between military and educational goals. Military research is not all bad [via The 50th Star], but then it's not all good, either. But statements like
"Research that facilitates military aims is the same as creation of the weapon itself. ... The person who makes the plutonium or pulls the trigger are equally culpable."
bother me because they extend the idea of moral complicity so far that only constant, active resistance to legitimate governments would be acceptable. One could limit the argument specifically to government contracts, but even then, the military funds some research which has fundamental roots and broad applications: why should a researcher turn down money to do research which they'd do anyway?
You could argue that historians don't have to worry much about this, but we do, for two reasons. First, we are institutional citizens, and both the ethical position and financial status of our universities matters to us. Second, military history.
Through Inside Higher Ed, I found this post: Loving the Liars, by Ryan Claycomb, a professor at West Virginia University. He and I and countless other profs are dealing with the influx of finals and term papers this time of year, and we're also dealing with the ancient bugbear of plagiarism.
Ryan writes of one of his early experiences with confronting a young fellow who had lifted his final paper entirely from the Washington Post.
When I figured it out I was so angry--furious--that all afternoon long I was literally seeing red around the edges of my vision, my face was flushed, I couldn't sleep that night.
When I confronted him, long after I had gotten my emotional response in check, he wept like a child... and still, I thought, "Man, is seeing him cry making up for the anger I felt? That would make me a horrible person." But I just couldn't figure out why it had made me just so mad--it had completely ruined my day. It ruined his, too, but HE did something to deserve it.
You know, we are often reminded by our students how much power we have over them, but we really do give so much back to them--we lay our hearts in their little fingers every time we assign a paper, and have them broken dozens of tiny ways, and mended in another dozen.
For God's sake, we didn't go into this for the money. We went into it because we love it--we love the material, and at our best we love them--maybe not individually, but collectively. And sometimes, just like all the people we love do, they betray us, in little ways and big ways.
My point is, the moment I stop feeling just a little betrayed by my students is a scary one for me. Maybe that's not a bad thing for many people, but for me, and I suspect for others, too, it's a moment I dread, because then it might become just a job, and I never wanted just a job.
In the hypercritical field we're in, it's really very hard to talk about something so unrigorous as love--for the books we read; for the time we spend in front of the classroom; for the stupid little crushes we get on students with bright ideas, and with potential; for the silly idealism of it all. It's important for me to remember, right now especially, as papers pile up, as cribbed papers slide across my desk, as identical wrong answers appear on consecutive quizzes. And I want to tell my students, yes, dammit, it makes me mad. It makes me mad because unrequited love always makes us mad.
(The bold emphases are mine.) I read this last night, and wanted to call Ryan up on the phone and cry "My brother! You get it!" Claycomb nails exactly why I still get so upset when students plagiarize and cheat. Even after a dozen years and close to ten thousand students, I still find myself deeply invested in the work they do and the possibility of witnessing their growth and transformation.
If I can trust my teaching evaluations (not the ones at the uncontrolled online sites, but the in-class ones), I'm a pretty good professor. And I don't doubt for a second that there is a direct correlation between my love for my students and my success as an instructor. I teach four classes spread over seven different sections. This fall I've got three sections of ancient history, which means I give the same lecture three times a week. No matter how passionate I am about the subject, I will get bored and frustrated if I focus on the fact that I'm delivering the same information thrice weekly. What turns me on is the certainty that each class is hearing this information for the very first time.
When I lecture, I pace back and forth, making eye contact. I try to focus on my students as people (it becomes easier as their faces become more familiar). It may be my 326th time lecturing about Constantine, but it's the first time -- and probably the only time -- that Maria, or Soon, or Armen, or Mike, or Cynthia will hear this information. That thought excites me every time, week after week, semester after semester. Thus the only way I can be even remotely effective as a teacher is to personalize the relationship I have with my students. I'm focused less on my own delivery and more on my students' reception, and if I didn't have intense feelings about my students I wouldn't be able to do that.
Beneath Ryan's post, there's a rather snarky comment from a Mike Lee, who teaches Electromagnetics at Kent State. Mike writes:
We are educators, not parents. We have classes, not families. We are paid to educate, not sit and decide what students "deserve". Maybe you never "just wanted a job", but the students who are in your classes and the university that hired you did not have that in their contract. You do not get to add your morality and emotional responses to the job.
Perhaps this is the difference between someone who teaches electromagnetics and those of us who teach English (as Ryan does) or history! No, we are not parents. But I hate the word "educator", and never use it. It should only be employed by administrators and union hacks, not by real teachers. I can't quite articulate it properly, but the word "educator" is a "distancing" word -- there's a cold, clinical tone to it that seems utterly at odds with what it is that we're supposed to be doing.
I'll be honest: the teaching style that works for me is modeled on seduction. Not the sexual seduction of individual students, but the emotional and intellectual seduction of a group. I walk into my first class meeting of the semester confident that a great many of my students don't care about the subject they've signed up for. They just want their grade and their units and they want to do as little work as possible. My job is not to convey information -- that's what textbooks are for! My job is to seduce the students into taking a genuine interest. I want to arouse passion, not for the teacher but for the subject. I want them to become fascinated with what they once considered dull; I want them to be turned on by what they once considered deathly and alienating. In order to do this work and do it well, I need of course to care about history itself -- but I've got to, just like Ryan, care with great intensity about my students.
We use the same word "cheating" to describe both adultery and plagiarism. Frankly, Ryan's post reminds me that we are right to do so. My students and I are in relationship together. Like all relationships, it has rules, both spoken and unspoken. It has expectations and hopes. It has its little compromises and little bargains (I'll let them out early in return for a good overall class score on a test). It also has its little betrayals. Like so many relationships, especially in high school and college, it is sometimes one-sided. Sometimes (not so often now that I'm older), students get crushes on me that I can't and won't reciprocate. Far more often, I try and try to motivate a student to get the work done, take an interest in the course -- and I fail. I don't fall to pieces when this happens, but I do feel a little twinge, even now, of real disappointment. And sometimes, the great betrayal comes: I'm cheated on in the form of plagiarism.
Like Ryan Claycomb, I still respond to plagiarism the same way a betrayed spouse responds to infidelity. Disbelief, followed by rage, followed by a nagging sense that it might all have been my fault! I ask the same sorts of questions: Was I unclear about what the boundaries were? Should I give the student another chance? I have no problem giving failing grades to those who cheat, just as I have no problem justifying terminating a relationship based upon adultery. But the fact that I can move to a swift and just punishment does not mean that I am not personally affected by the betrayals.
I am in love with one woman. But I fall in love with my students, collectively, over and over and over again, semester after semester after semester. I grow older and older each year, and they, for the most part, stay forever youthful. Where once I was but five years older than my average student, today I am old enough to be their father. The nature of the love I feel has changed a bit over time, the way all relationships change and grow as we age and mature. I'm more patient; I'm a heck of a lot less insecure. But I still love them, just as I love Clio whom I serve, and the day I take that intense emotional urgency out of my teaching is the day I will cease to be a useful professor.
One of the things I suggested was this: "There seems to have been an enormous rush to judgment that the evidence provided by Holly Jackson, the Brandeis graduate student aforementioned, is straightforward proof about Kelley-Hawkins' race." That was a sloppy sentence, because it implied that I think Jackson might be wrong about Kelley-Hawkins' whiteness. McLemee, who generously linked to my post from his main page, responded thus:
As for CM's idea that there is a "rush to judgment," that certainly crossed my mind. But Holly Jackson actually makes a strong case for EDKH as white, while nobody has any solid evidence that she was black -- and when you read the scholarship, you notice people bending themselves out of shape to find some reason to think that she was. (I could have said more about that in the piece, including one case that verged on outright dishonesty, but that seemed like overdoing it.)
I agree completely that Jackson's evidence on the matter demolishes the traditional story about Kelley-Hawkins, since the traditional story is now apparently supported by no evidence at all. But if you look closer at my original statement on the "rush to judgment," what I wanted to question was not Jackson's particular evidence about Kelley-Hawkins, but rather the general presumption that certain kinds of historical evidence about race are inherently more decisive than others.
The idea that Kelley-Hawkins was black seems to have rested largely on the ambiguity of a photograph that appeared in one of her novels. Jackson's case for her whiteness rests, on the other hand, on family memories and documentary sources like the census. What I wanted to offer, however, was a cautionary warning that documentary sources about race are not necessarily less ambiguous than pictures. For instance, I wondered in the post about how nineteenth-century enumerators recorded "color" or "race." If, in some cases, determining a person's "race" was left to the enumerator's discretion, then a "W" in the census is not necessarily less ambiguous than a picture. It could simply record what a contemporary saw when he looked at Kelley-Hawkins. Does the census always offer us more information about race than meets the eye, or simply the same "information" provided from a contemporary perspective and packaged in documentary form? Such, at least, is a question I thought worth asking about Jackson's evidence. (An article by Martha Hodes in the February 2003 American Historical Review raised these questions about another Massachusetts family in the nineteenth century.)
At Coffee Grounds, Evan Roberts replied with an extremely helpful post about census procedures, which includes links to instructions that were given to enumerators. This information supplemented comments that Julie Meloni made on my original post, drawing on her own experiences using the census for genealogical research.
I highly recommend Evan's post, and I entirely agree with this sentence: "What I think is significant is that Emma Dunham Kelley-Hawkins and her ancestors are always described as white. That is firmer evidence of being white, whatever 'being' and 'white' mean." What I wanted to say in my post was that we should continue to add that final clause to whatever we say about race in the nineteenth century. This does not mean that we cannot make judgments, on the basis of contemporary descriptions, about the "race" of historical actors. But it means we must always keep in mind how ambiguous and arbitrary such descriptions about race could be and still are. And we must be careful not to assume uncritically that certain kinds of evidence -- like the census -- somehow offer glosses of what "being" and "white" meant that are more determinate than other kinds of evidence -- like photographs. (Incidentally, Coffee Grounds also has two recent posts on the distinction between "quantitative" and "qualitative" sources that might be relevant here. I'm suggesting that even "quantitative" sources like the census were "qualitative.")
That's why I also suggested that the scholarship that has been done under the assumption that Kelley-Hawkins was not white should not simply be thrown out with the bathwater as a bunch of political misdirection. By presuming that Kelley-Hawkins was a black author creating aggressively white characters, that scholarship had to wrestle with what "being" and "white" meant. Without having read any of the scholarship that McLemee profiles, I think its insight that race can be malleable and strategically deployed is a valuable one. Although in this case scholars turn out to have been barking up the wrong tree, the bending themselves out of shape that they had to do is the kind of exercise that now makes us more limber as historians of race. In that limited sense, the underlying assumptions and theoretical underpinnings of that scholarship were not worthless. And it would be a shame if the conclusion that people drew from Kelley-Hawkins' case was that any scholarship on the cultural construction of "whiteness" and "blackness" is simply the product of postmodern mumbling and political correctness.
One of the feature essays is by Albert J. Raboteau, Professor of Religion at Princeton University, author of Slave Religion: The "Invisible Institution" in the Antebellum South, and an Orthodox Christian. There is also an excellent piece by Gary B. Nash, emeritus Professor of History at UCLA, which uses three biographical sketches to show that the evangelical Christianity of the Great Awakening often empowered "ordinary people" to think and speak for themselves. Nash points to Richard Allen (one of the founding ministers of the African Methodist Episcopal Church), Jarena Lee (an African American woman who became an exhorter in Allen's church), and Lorenzo Dow (a poor but popular Methodist circuit rider without a formal education) as examples of what Nathan Hatch has described as the "democratization of American Christianity."
I was especially struck by Nash's discussion of the relationship between Jarena Lee and Richard Allen, because it contrasts with a similar relationship I've just been reading about. Lee's autobiography is published in Sisters of the Spirit, a collection that also includes the autobiography of Julia Foote, another African American woman who became conscious of a calling to preach after converting to Methodism. But Foote's experience with the elders of her AME church was very different from Lee's.
According to Nash, Richard Allen was initially reluctant to invite Jarena Lee into his pulpit. But after her repeated demonstrations of powerful preaching, he eventually relented. At the end of the 1830s, however, Julia Foote was cast out of the First AMEZ church in Boston, where the minister was African American abolitionist Jehiel C. Beman. Like Lee, Foote had ecstatic visions. She claimed to have heard the voice of God addressing her. But her charismatic sermons and her claims that she had been sanctified displeased her husband and the men of the church. In her autobiography, Foote recalled Beman forbidding her to preach her "holiness stuff" to the flock, despite the requests of some "elder sisters" to give her a hearing.
So Foote began preaching in small house meetings to disaffected members of the congregation. Needless to say, this did not make Beman happier, and according to Foote, he excommunicated her when she refused to stop preaching. Last Thursday, while reading about Foote and her visions in the beautiful new African American history reading room at the Enoch Pratt Free Library, I was moved by her description of Beman's skepticism. After one night of particularly vivid visions, Foote wrote,
... my minister, Jehial C. Beman, came to see me. He looked very coldly upon me and said: 'I guess you will find out your mistake before you are many months older.' He was a scholar, and a fine speaker; and the sneering, indifferent way in which he addressed me, said most plainly: 'You don't know anything.'
After reading Nash's article and reflecting back on Foote's story, I had two thoughts. They are not necessarily related to each other, and they are not directed at Nash's piece in particular:
The first thought: the democratization of American Christianity had important limits. On the one hand, Jehiel C. Beman himself could easily serve as another example for Nash's article. The son of a Connecticut slave who was freed after fighting in the American Revolution, Beman was a shoemaker who became a prominent pastor, a founding member of the American Anti-Slavery Society, and a leader of African American voluntary associations in various parts of New England. But Beman's faith, while serving as an important agent of his own empowerment, did not necessarily entail Foote's empowerment. This would come as no surprise, of course -- to Nash or to Hatch, whose book charts a retreat by many American Methodists from their most democratic impulses. But I think it's important to include the stories of Footes and Bemans alongside the stories of Lees and Allens, if only to emphasize that the democratization spurred by Christianity was incomplete and fitful. It is also important to stress that even the "ordinary people" empowered by evangelical Christianity did not thereby find themselves in agreement on the scope of that empowerment.
A second thought: As the special issue of the Boston Review demonstrates, there are currently many efforts afoot to reclaim the democratic (small "d" or big "D") legacies of American evangelicalism. And pointing to figures like Foote serves these projects well. Yet I sense that many on the Left would like to applaud Foote's empowerment while simultaneously pooh-poohing her charismatic faith or endorsing her visions. In this sense, though, many scholars find themselves in the position of Beman. While disagreeing with his patriarchal repression of Foote's voice, they echo his "sneering, indifferent" condescension toward her "mistake." But if this is intended as a strategy for wooing evangelical Christians toward the Left, it seems to me like one that is doomed to fail. It does no good for democrats to say to "The Believers" that their history is progressive and democratic, if at the same time they say, "You don't know anything." It could be, of course, that many Democrats do not really mean to talk to "The Believers" at all. Perhaps many are mainly talking to each other about "The Believers," as if to say, "Those people sure don't know anything, but at least they're ordinary." I trust I don't need to point out why this will seem as cold and condescending to the Footes of the twenty-first century as Beman's reaction seemed to Julia Foote herself.
I think it is important for historians to tell the stories of people like Dow, Allen, Beman, Foote, and Lee. Many people who identify America as a "Christian nation" tell the history of evangelicalism as if its telos were the radical social conservatism of people like Tom DeLay. And articles like Nash's are useful for proving that Christianity in this nation has often been more open and democratic than the Christianity of its latter-day adherents. But ultimately, these historical ripostes to conservative Christians have a limited usefulness. First, they require historians of Christianity to follow Christian propagandists in ironing out the wrinkles in their favored visions of the past. As I've argued before, the major problem with lobbyists like David Barton of Wallbuilders is that their historical arguments are selective and oversimplified: they wave a copy of a Thanksgiving Day sermon and declare that the nation has Christian foundations. It's tempting to do the same thing from the other side: to wave a copy of Julia Foote's autobiography and declare that American Christianity has democratic foundations. But the more important contribution for historians to make is to point out the complications in both attempts to make the past straightforwardly "usable" in the present.
Second, and more importantly, offering duelling histories of American Christianity often keeps discussions in the past. And although (as an historian, of course!) I am all for discussions about the past, we must be careful not to put off indefinitely the hard conversations between "The Believers" and "The Non-believers" that need to be taking place in the present. At some point, those who think that Julia Foote is a democratic exemplar of Christian empowerment are actually going to have to listen and talk to the "ordinary people" who believe in God as fervently as she did. I do think that in many cases these conversations are beginning to get underway, and I wish them long life.
Of course, to have access to this resource, you have to be somewhat savvy, because there is not yet a portal page on Google's site for searching books. If you don't already know it, you can tap the vast resources of Google Print in one of at least two ways:
(1) When searching at Google, begin your search string with the word "book" or "books" and then enter your query as usual. If Google Print has book pages that match your query, you should see about two or three "book results" listed above your search. (Example.) You can either click on the individual results or on the headline that sends you to all of your book results. (Example.)
From there you can search within particular books (check the sidebar of an individual result page), look at the index and table of contents for a book, and even scroll through about two or three pages around your result page. Once you are within Google Print, you can also "search all books" by using the form entry box located either at the top of the page or at the bottom. (Hat-tip: Search Engine Watch.)
(2) Another way to get into Google Print is to use this link and then enter your search at the top of the page. (Hat-tip: NT Gateway.)
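For readers who want to script the kind of books-scoped search described above, the same effect can be expressed as a URL. The sketch below is a minimal illustration, assuming Google's `tbm=bks` query parameter (which restricts a standard web search to book results in the current interface); it is not part of the 2005-era Google Print pages this post describes.

```python
# Build a books-scoped Google search URL (a sketch, assuming the
# "tbm=bks" parameter restricts results to book matches).
from urllib.parse import urlencode

def book_search_url(query: str) -> str:
    """Return a Google search URL restricted to book results."""
    params = {"q": query, "tbm": "bks"}  # tbm=bks is the assumed book-search flag
    return "https://www.google.com/search?" + urlencode(params)

# Quoting the phrase keeps it as an exact-match search.
print(book_search_url('"Cassius M. Clay"'))
```

Opening the printed URL in a browser should land on a page of book hits, much like prefixing a query with the word "books" does in the interface described above.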
The scholarly possibilities here are staggering. Google Print makes it possible, for instance, to search for published books that cite a certain book or article--a task that was difficult before without access to some kind of citation-tracking database. Most of all, Google Print makes it possible to see whether there are books that mention a particular name or word, even in passing--something that was nearly impossible to do before.
For instance, if I want to see books that mention the Kentucky abolitionist "Cassius M. Clay," I just do this and get 76 hits. In the "real" world, as they say, I would have had to determine that those 76 books were relevant to Clay, find them on the shelf, and then hope that the book's author or editor had listed "Clay" in the index. All of this was possible before, of course, for academic journals and other kinds of periodical literature. And it was even possible in digital collections of historical books, like the Making of America site or the Samuel May Anti-Slavery Collection at Cornell. But with Google Print, the digital keyword revolution has truly arrived, and the end is not in sight.
What should we make of this revolution, and how revolutionary is it? In the latest issue of Perspectives, there's an article by Carlo Ginzburg considering that question. (There are also two fantastic articles on history blogging by my fellow Cliopatriarchs, Ralph Luker and Manan Ahmed.) Ginzburg argues persuasively that keyword searching in library catalogs is good for scholarship, primarily because "the computer multiplies the possibilities that an unforeseen fact will take us by surprise." (In the above search, for instance, I was surprised to see Clay mentioned in The Education of Henry Adams as one of the morose young man's diplomatic "masters." It turns out that Clay is listed in the index of my printed copy of Education, but I don't know that I would have looked there intentionally for a mention of Clay.)
Of course, that capacity for surprise is not limitless, because we must have some reason for entering in the keywords that we do, and usually our intuitions here are guided by our prior research or the work of others. But it is significant that keyword searches allow us to navigate through texts largely without the mediation of editors, authors, and publishers.
On the other hand, the excitement of surprise can be misleading to a researcher. The temptation when doing keyword searches is always to think that your results are more representative than they are. (This is something I've mused about before.) If I look in the printed index to a book and see one page listed for Clay out of 450 or 500, I can make a rough and ready judgment about how important he is in the context of that book. But when I look at a Google results page, I depend on Google's relevancy algorithms to make that determination for me, and it's easy to forget that when I'm looking at a long list of hits. (I can still tell in Google Print how many times a word appears in a book, and how many pages the book has, but the linearity and ephemerality of a results list can be seductive. It doesn't have the same weight in your hand that the actual book does, and perhaps, subconsciously, that actual, physical extension of the book in space helps our brains make determinations about proportionality and significance.) For all the virtues of keyword searching, then, this revolution warrants some careful reflection.
You can find such reflection in a recent article by David Bell in The New Republic on "The Bookless Future." (Full disclosure: Professor Bell is the incoming Director of Graduate Studies in the Johns Hopkins history department, where I am pursuing said graduate studies.) Unfortunately, and perhaps ironically, "The Bookless Future" is only available online to subscribers. But I found the full text by using Hopkins' institutional subscription to Lexis Nexis and strongly recommend it if you can find a copy.
The bulk of the article (if, following my musings above, it is not a category mistake to talk about the "bulk" of hypertext) wonders about the future of electronic books, and it canvasses several kinds of technology, currently in development, that will hopefully make electronic books easier to read. I think Bell is right that the only thing missing is a vehicle for text that is as optimal for reading as a printed book. The technology to scan full-page images of books and make them searchable is clearly already upon us; it won't be too much longer, I predict, before you can pay a fee and pull a book from Google Print onto your PDA or some other electronic device.
But Bell also expresses warranted concern about the deleterious effects these changes might have on the practice of reading.
The very nature of the computer presents a different problem. If physical discomfort discourages the reading of [online] texts sequentially, from start to finish, computers make it spectacularly easy to move through texts in other ways--in particular, by searching for particular pieces of information. Reading in this strategic, targeted manner can feel empowering. Instead of surrendering to the organizing logic of the book you are reading, you can approach it with your own questions and glean precisely what you want from it. You are the master, not some dead author. And this is precisely where the greatest dangers lie, because when reading, you should not be the master. Information is not knowledge; searching is not reading; and surrendering to the organizing logic of a book is, after all, the way one learns.
If my own experience is any guide, "search-driven" reading can make for depressingly sloppy scholarship. Recently, I decided to examine the way in which the radical eighteenth-century thinker d'Holbach discussed warfare. I could have read his book Universal Morality in the rare-book room of my university library, but I decided instead to download a copy (it took about two minutes). And then, faced with a text hundreds of pages long, instead of reading from start to finish, I searched for the words "war" and "peace." I found a great many juicy quotations, which I conveniently cut and pasted directly into my notes. But at the end, I had very little idea of why d'Holbach had written his book in the first place. If I had had to read the physical book, I could still have skimmed, cut, and pasted, but I would have been forced to confront the text as a whole at some basic level. The computer encouraged me to read in exactly the wrong way, leaving me with little but a series of disembodied passages.
This has often been my troubling experience as well: Henry Adams makes a great quip about Clay, for instance--as a teacher, Clay had "no equal though possibly some rivals." But having previously submitted myself to the organizing logic of Adams' book by reading it cover to cover, I know better than to take Adams' quips at face value. (Sure enough, according to an editor's footnote, Adams referred to Clay in private as a "noisy jackass.") I wonder, though, whether I'm as careful with books that I haven't read. The keyword revolution at least means that I need to be especially careful--I need to balance the subversive virtues of keyword search (the "surprise" of which Ginzburg speaks) with the virtues of "surrendering to the organizing logic of a book."
All of this got me wondering, though, about whether the dangers of "strategic, targeted" reading are really that new. After all, the printed index compiled by an author or editor presents the reader with the same potential for targeted reading, and it is the rare researcher who does not rely heavily on these indexes to quickly jump to parts of a book that are relevant to his or her research. (Here are three papers online that allude to the similarity between online and offline indexes.)
The index, like the codex, predates the printed book. According to Guglielmo Cavallo and Roger Chartier in their edited collection, The History of Reading in the West,
Even beyond its immediate derivation from the manuscript, the book--both before and after Gutenberg--and the manuscript were similar objects composed of sheets folded and gathered into quires and assembled within one binding or cover. It is thus hardly surprising that all the systems of reference that have somewhat hastily been credited to printing existed well before its invention. One of these was the use of signatures and catchwords to help assemble the pages in the right order. Other signalling devices aided reading: folios, columns, or lines might be numbered; the page could be divided up more visibly by the use of devices such as ornamented initials, rubrics and marginal letters; an analytical (rather than a simple spatial) relationship between the text and its glosses could be set up; different characters or different colours of ink could be used to distinguish between text and commentary. Thanks to its organization in quires and to its clear divisions, the codex, whether manuscript or printed, was easy to index. Concordances, alphabetical tables and systematic indexes were common practice even in the age of the manuscript, and it was in monastic scriptoria and stationers' workshops that these modes for the organization of written material were invented. Printers picked them up later. (p. 23)
And programmers picked them up even later. It would be an interesting research question to see (and maybe a medieval historian can correct me if this has already been done) whether the invention of the index in the age of manuscript provoked the same kinds of anxieties we feel today about targeted access to texts. One of the contributors to the Cavallo and Chartier volume, Jacqueline Hamesse, suggests that scholastic modes of reading were shaped in part by these innovations. Unlike monastic readers, scholastics could jump from page to page and cross-reference works without the same kind of intensive, devotional reading:
Here we enter into a new world that suggests modern reading habits. After the pioneering labours of the Cistercians to organize the content of a manuscript, other aids appeared and flourished: the table of contents, the concept index, concordances of terms, alphabetically arranged analytical tables, summaries and abridgements. Even the great twelfth-century summae were abridged: they were admittedly easier to handle when reduced to a single volume. The abridgements were a pale reflection of the originals, however.
The rise of this new literary genre inevitably meant that reading was no longer direct: now a compiler served as an intermediary, and reading was filtered by selection. Reference to the book changed. Its contents were no longer studied for themselves with the aim of acquiring a certain wisdom, as Hugh of Saint Victor had recommended. Henceforth knowledge was primary, and it took precedence over everything else, even when it was fragmentary. Meditation gave way to utility in a profound shift of emphasis that completely changed the impact of reading.
Certain scholars are quite aware of the important role of these working tools for learning in the Middle Ages, but others have failed to grasp their influence among intellectuals. As any fourteenth-century inventory will show, florilegia, concordances and tables abounded, not only in the libraries of the religious Orders, but also in college and university libraries. Such compilations often replaced consultation and, a fortiori, direct reading of authors' works, and even though they constitute a second-tier literature, their sizeable role in the intellectual preparation of medieval men cannot be denied. Today we have such different methods for acquiring culture that it is difficult for us to comprehend that even the great writers of the age of scholasticism made use of these handy tools for easy access to documentation that was indispensable to their work. The large number of manuscripts that have come down to us bear witness to the use and dissemination of such compilations. (p. 110)
Of course, electronic keyword searching takes concordances to another level. But perhaps this is a good thing. The etymological roots of "concordance" are, after all, entangled with the roots of "concord," and it is sometimes good to introduce discordance into our readings of texts. If Ginzburg is right, then we have a real advantage over our scholastic forebears; unlike them, we don't have to rely on the compilations of other scholars, who might use indexes as a way to assert too much control over the text. But if Bell is right, then we also have a greater responsibility to handle that advantage with care, and to prevent our liberty from becoming license.
You can be the judge of whether I've done that here, because (in a burst of self-referentiality) I found the quotes from the Cavallo and Chartier book by using Google Print, and I've never read the whole thing. When bloggers advise readers to "read the whole thing," do they really mean it? And do we ever really follow that advice?
(Cross-posted at Mode for Caleb.)
I wasn't aware, though, that Lakoff thinks you can frame almost any liberal/conservative divide by using the Nurturant Parent versus Strict Father metaphor. Did you know, for instance, that theological debates also boil down to this simple dichotomy? In an online forum at Lakoff's Rockridge Institute, he has recently argued that ...
the difference between conservative and progressive Christianity is whether God is seen as a strict father or nurturant parent.
The strict father God is punitive: Follow His commandments and you go to heaven. Disobey and you go to hell. Since you’re all sinners, He’ll give you a second chance. His son has suffered so much he has built up enough moral credit to pay for the sins of everybody. If you accept Jesus as your savior, He’ll wipe the slate clean as if you’ve been born again; but this time you’d better get it right or else. Do what your church says and you’ll go to heaven; disobey and you’ll go to hell.
The nurturant God offers Grace, which is metaphorical nurturance. To get grace, you have to be close to God; you can’t earn Grace; it’s given freely and unconditionally; it must be accepted actively; it fills and nourishes you, protects you, heals you, makes you a moral person. Moral Politics is the link between theology and politics. Conservative theology and politics are both structured around strict father morality, just as progressive theology and politics are both structured around nurturant parent.
What I found is that conservative Christians understand their theology and its relation to politics but that progressive Christians have trouble articulating theirs.
The fuller exposition is in Chapter 14 of Moral Politics, available here. (In the chapter, Lakoff does admit that this view about what makes conservative Christians think conservatively is a "guess," and he begs our indulgence for his "oversimplification" of Christian theology, "which will of necessity sound like the text of a comic book called, 'Christ for Beginners.'")
Elsewhere in the Rockridge Forums, there is a response to Lakoff from "a historian's perspective." Dean Grodzins, the author of what will long be the definitive biography of the Transcendentalist minister Theodore Parker, argues that while Lakoff's posited link between theology and politics does not always hold up historically, it often does hold. In fact, Grodzins suggests that Lakoff's idea of organizing theology around the poles of "Strict Father" and "Nurturant Parent" metaphors, instead of around the poles of liberalism and orthodoxy, might make sense of more religious history in America. For instance, although eighteenth-century New England Calvinists all agreed on basic doctrinal creeds, they developed different views of God the Father that stressed either his Strict or Nurturant nature, and by the nineteenth century, those divergent metaphors led to actual splits in American churches that ramified in the political sphere. Progressive Christians who preferred the model of God as a Nurturant Parent flocked into antebellum movements to reform education and abolish slavery, while those who preferred the Strict Father view tended to favor the conservative theology and politics of proslavery advocates.
Grodzins makes his case by pointing to the historical convergence of the "Nurturant Parent" theology worked out by Protestants like Horace Bushnell and the "Nurturant Parent" politics of reformers like Horace Mann, Elizabeth Palmer Peabody, and Samuel Gridley Howe. But he doesn't chart other points on the map of antebellum reform that would complicate Lakoff's attempt to connect the dots between progressive theology and politics. For instance, it was entirely possible for some antebellum reformers to see God as a Nurturant Parent but to see the state as authoritarian and thus ungodly. A small but vocal group of radical abolitionists thought that all human government was sinful precisely because a state could not be nurturant in the way that God was. (See Lewis Perry's classic book on these Christian anarchists.) These were people, in other words, who had a Nurturant Parent view of God and a political posture that Lakoff and Grodzins would probably label "progressive," but who also failed to see the state as a Nurturant Parent.
If it is possible to be theologically progressive, opposed to conservative politics, and socially reformist without making the metaphorical link between God as Nurturant Parent and the state as Nurturant Parent, then I'm having a hard time seeing what kind of explanatory power Lakoff's metaphors can offer us, either historically or politically.
(Cross-posted at Mode for Caleb.)
History can present itself as the universal memory of humanity. But there is no universal memory. All collective memory is supported by a group that is limited in space and time. One cannot collect into one tableau the totality of past events except by detaching them from the groups that guard memory ... history is interested above all in the differences [between societies], and makes abstractions of the similarities for which there exists no memory ...
[crappy translation is my own]
To bring immediacy to historical elements, social groups must deploy rituals and symbols that take history from the classroom into the public sphere and give it emotional import. Consequently, meaningful history, capable of being memorialized, is precious, circumscribed by spatial and temporal boundaries (especially those of a nation).
Some events are capable of being imagined even though they do not belong to an unbroken memorial tradition (like the Trojan War for the early modern English readership). Nevertheless, the notion that what is taught in Western Civ courses will probably never find any meaning outside of the classroom weighs heavily on those who teach it.
Hugo Schwyzer's recent complaint that the first half of Western Civ tends to be a quick-step march from Sumer to the Bastille presses the point further (to be fair, we modernists should be able to pick up the story at Aquinas). However, I don't worry that something important will be lost as the span of pre-modernity is stretched beyond recognition. Rather, I worry that even the most judicious speedy lecturing produces lacunae that betray the founding supposition of Western Civ courses: an ongoing tradition that becomes recognizable as the West.
Jonathan Reynolds: "I just find it funny that 'Western Civ' starts in Mesopotamia..."
Jonathan Dresner: "And ends there, perhaps?"
Funny, but also telling. One can easily lead from the proto-agriculture of Jericho to Babylon, but how does one get from the Euphrates to the Nile? Does the continuity die after Hammurabi? Or do we merely sow seeds that will become plants that will be cooked up by Classical civilization? The endpoints are not the only thing in question.
If Western Civ were closer to the history of ideas, it would describe the intimacy between one system of thought and the one that 'superseded' it, how past discursive fields remained adjacent to the present, not separate but ready to break through, exerting an ongoing relationship between past and present fields. Bounding from one civilization to the next, the potentially minute fissures become chasms. The continued modernization of Western Civ -- dividing it between one very long ancient, medieval, and sometimes early modern half and a modern half; the inclusion of modern interests with the pre-modern; opting to exclude the early eras -- threatens to undercut the possibility of continuity (or even contiguity).
[Crossposted to Rhine River]
I was reminded of Knutson this morning reading the front page of today's New York Post. It turns out that the husband of GOP Senate nominee Jeanine Pirro originated a covert campaign to pressure her out of the Senate race, with a face-saving solution of standing for state AG. Albert Pirro, whose conviction for income tax problems had previously derailed a Pirro statewide bid, appears to have conspired with NY Senate majority leader Joe Bruno to begin a public pressure campaign to get Pirro to drop her bid. Bruno is concerned that Pirro will be a drag on the ticket and might cost the Repubs control of the state Senate.
Pirro hasn't been the greatest of candidates--her campaign got off to a bad start when, in the midst of a fiery denunciation of Hillary Clinton in her announcement speech, she misplaced a page and paused for 40 seconds or so. But it's hard to see how another nominee could do better, or how Pirro could possibly be elected AG after getting into the race under these circumstances.
There are some national ramifications, potentially, to this story. The GOP has kept control of the state Senate in NY only through gerrymandering. If the Dems ever recaptured it (they'd need to pick up four seats), they could promptly redraw the lines and create a near-unassailable Dem majority--and, at the same time, they could similarly push through a DeLay-like redistricting of the state's US House districts, which would probably net the Dems three or four seats.
I'm sure that J. Pirro has a good sense today, though, of how Knutson felt a half-century ago.
In William R. Hutchinson, "Strong Objections," HNN, 5 December, Josiah Strong (1847-1916) returns from the dead to accuse Samuel P. Huntington of plagiarism.
Jenny Turner, "As Astonishing as Elvis," London Review of Books, 1 December, won't be loved by Ayn Rand's admirers, but it tells you what you need to know about her and the Rand cult.
The claim of Steven Levitt and John Donohue, the authors of Freakonomics, that abortion rights account for the decline of crime rates in the United States over the last three decades is challenged by economists at the Federal Reserve Bank of Boston. See: Christopher Foote and Christopher Goetz, "Testing Economic Hypotheses with State-Level Data: A Comment on Donohue and Levitt (2001)," Federal Reserve Bank Working Papers, 22 November. The findings of Foote and Goetz are summarized in "Oops-onomics," The Economist, 1 December. Thanks to Dale Light of Light Seeking Light for the tip. [more ...]
Michael Gurian, "Disappearing Act," Washington Post, 4 December, seeks to understand why young women are doing well in our "industrial classrooms" while young men are not. It's a tendency that was first noticed over 25 years ago in the United States. Although Gurian doesn't point it out, African American students exemplify in extreme the phenomenon that affects the whole population. See: Scott Jaschik, "The Missing Black Men," Inside Higher Ed, 5 December.
Our name, with its allusions, is found in James Joyce's Finnegans Wake. As with much else in Finnegans Wake, however, I'm not sure what it is doing there.
Our name vaguely recalls the memory of Cleopatra, her beauty, her mystery, and her contingent power. More directly, it invokes the name of Clio, one of the nine muses in Greek mythology. Clio the Proclaimer was the muse of history, who was credited with bringing the Phoenician alphabet to Greece. She is often depicted in western art with a scroll and a small library of books. In his work for the Spectator, Joseph Addison, who perfected the essay and pioneered the novel as English literary forms, used her name as a pseudonym. The Latinate patria would refer to one's place of origin, a father's home or a native land. We speak from and of history as our place of beginnings, in which we act, through which we move, and to which we owe some allegiance. As a word of both Greek and Latin roots, to say nothing of the Egyptian allusion, Cliopatria is also a barbaric hybrid. It suggests the plurality of our origins and degrees of alienation. We are not obliged to agree with, only to listen carefully and respectfully to, each other.
We've gotten better at that last part of it. Cliopatria didn't even have a face until The Cliopatria Awards forced the issue. Happy 2nd birthday, ol' girl!
Training for war, I spent an afternoon in an army classroom listening to presentations on improvised explosive devices and the insurgents who plant them. Droning through one of the inevitable PowerPoint presentations, a sergeant first class read directly from the slide in front of us: The insurgency, he read, will probably die down after we capture Saddam Hussein. Except that the class was taught this October, a couple of years after that former dictator had been dragged out of his spider hole. The sergeant stopped for the briefest moment, mumbled that the slides were a little out of date, and went right on reading.
In the current issue of Foreign Affairs, the former Secretary of Defense Melvin Laird describes the war in Iraq as an effort to "preserve modern culture, Western democracy, the global economy, and all else that is threatened by the spread of barbarism in the name of religion." Those would be some pretty big stakes. But from where I sit, the operation of the institutional machinery behind the war has all the markings of a halfhearted hobby, a project undertaken and promptly regretted but not yet possible to quit. Something significant has been lost between the declaring of war and the waging of it. The slides are all out of date, and no one can quite be bothered to rewrite them.
Preparing for a bitter and knotty counterinsurgency against an enemy that mixes with the civilian population and strikes mostly with hidden bombs, we ran the familiar battle drills on moving through the woods in a wedge formation, reacting to contact and flanking enemy bunkers under the cover of suppressing fire. What we didn’t get was even a single class on language – even to learn a very few useful phrases in Arabic – or on the principles and practice of counterinsurgency. We trained to fight the Wehrmacht.
If we trained for the wrong enemy, we also trained for the wrong battlefield. Camp Shelby, Mississippi, where my battalion trained, is thickly vegetated and brutally humid – and the principal place the army is using to prepare National Guard and reserve troops for combat in the cities and deserts of Iraq. I was constantly reminded of the late David Hackworth's discussion of his training for combat in Vietnam, which took place in a mock Vietnamese village in the snow of the Pacific Northwest.
These tensions between idea and action – between the thing needed and the thing chosen – have also been all over the newspaper for quite some time. I remember reading in September that the U.S. and Iraqi militaries were sweeping the Iraqi village of Tal Afar clear of insurgents for the second time in a year. The operation was, of course, a total success. By October I was reading about suicide bombings in Tal Afar. We are, as the military axiom has it, mistaking motion for action.
That's a choice I hope we won't continue to make. The current choice is not, as it is so often represented, between staying the course or quitting; the choice is between quitting or raising the fight to the level of its rhetoric. Staying the course is just a slower and more carefully veiled way of giving up, a retreat on longer terms. Stern talk is as cheap as any other form of talk.
Assuming the validity of the goal, we would have defeated the insurgency in Iraq with steps that we have apparently never begun to take in earnest. One would have been to declare a national emergency in language skills, rapidly building a training infrastructure to expand our pool of highly proficient Arabic speakers in the military and the diplomatic corps. Another would have been to quickly develop and sustain an intellectually disciplined counterinsurgency doctrine that showed up every day in the training of ordinary soldiers and their leaders. Did we mean to do all of this, or any of it, or did we not?
Sitting these days on an army forward operating base in Kuwait, I have weekly access to hip-hop nights and spa days; I have ice cream at every meal, and Burger King in the mini-mall. Couples pair off in the movie room, blinking at the light and untangling their bodies as the movies end. And we all wait for the next move, in a setting that feels more like high school than a war. To frame this effort as critical to our national well-being while simultaneously allowing it to shamble along lethargic and undefined is to suggest that we never really meant what we said about the meaning of our curiously desultory war in the first place.
Er, no, not that kind of reproduction--academic reproduction. The most recent issue of the Journal of the Historical Society includes an article by Brendan McConville, "Early America in a New Century: Decline, Disorder, and the State of Early American History," which argues against current claims that early American history isn't sufficiently political or military. McConville concedes that the approach to politics and military history is different from that of a generation ago, but that nevertheless, "there may be more professional scholars studying military-related issues today than ever before" (467). "But you don't write about military-related issues," says the puzzled Gentle Reader. "You write about historical novels." There is that, yes. But I'm mostly interested in two of McConville's theories about shifting trends in historiography.
The first is that the students of some Eminent Historians have failed to give their parents scholarly grandchildren; as McConville notes, "[t]he students of Bernard Bailyn [...] have not gone on to replicate themselves by means of large graduate student empires, despite their prolific publishing careers" (463). Similarly, McConville estimates that "no more than twelve or so" of Gordon Wood's own graduate students are "active scholars" (463). He goes on to note that, by contrast, the numerous offspring of historians like Jack Greene have gone forth and multiplied. In other words, the shift towards cultural studies in early American history has partly to do with academic forces that cannot be reduced to simple political trends (or, as McConville tartly puts it, "Jack Greene as leftist cultural studies advocate--I know of at least ninety-two people who will get a laugh out of that").
It would be interesting to see a study of academic "descent" in English literature. For example, we might note the sudden emergence of doctoral dissertations on British periodicals during the 20s, 30s, and 40s. Who directed those dissertations? What careers did those students have? (One of them, Leslie Marchand, went on to write the still-standard biography of Byron.) And what students did they, in turn, produce?
Of course, it's rather difficult to "reproduce" yourself if you aren't at a doctoral institution. While McConville argues that "[t]he expansion of universities beyond the East Coast" (465) has reshaped the face of the profession, his insight could be pushed further: the black hole that is the job market, along with the oversupply of Ph.D.s, means that many students will spend their careers at liberal arts colleges, comprehensives, and other relatively teaching-centric schools. ("Like SUNY Brockport?" inquires my reader. Why, yes, now that you mention it.) These students will still publish, but few of them will see graduate students who intend to go on for doctoral work, and even fewer will direct a doctoral dissertation themselves. And adjuncts will almost certainly not be training anyone at all. It will be interesting to see how current academic career paths intensify some trends and disperse others.
Dad the Emeritus Historian of Graeco-Roman Egypt, who called while I was typing, thinks that McConville is exaggerating here. [X-posted at The Little Professor.]
As Ralph notes below (scooped again, damn!), Another Damned Medievalist has Carnivalesque XI up, and it's a veritable festival of Western Civilization! Not much of the rest of the world, though. That's not a criticism of the carnival or its Mistresses: I went looking for things to submit and came up pretty dry, as well. It's just that there's not, as near as I can tell, a lot of ancient/medieval blogging (in English) outside of the traditional Western regions.
I was meeting with my Historiography students last week, in a slightly informal session (it's late in the semester and we've had a break in the rain, so we were meeting outdoors) and one of them, deep in the throes of pre-registration, asked me
If you're not going on to grad school, why do a thesis?
I stumbled around for a good answer, but didn't come up with much. I talked about how well-prepared these students are for this project, how satisfying it is to reach that level of expertise and to delve into real discovery, how interesting some of their proposals and some of our previous theses have been. But basically it boils down to the very intangible value of personal satisfaction in highly abstruse achievement. I feel like I missed something, but thinking about it off and on for a few days (in between justifying my professional existence; more on that later) I can't think of what it might be. So I throw it out there: The AHA recommends a capstone research course, our accrediting agency seems to think it's a great thing, it seems like a good idea to me (though I didn't do an undergraduate thesis, myself), it's required by the department (or a major history-related service project, as an alternative, though there's only been one taker so far). But why? What do you tell students who ask you that question?
At the risk of attracting scorn and brickbats, I would like to suggest that Vidal is one of our nation’s most significant historians, and that his historical writing deserves more intensive study by students of history.
In addition to Vidal's acerbic historical essays and book reviews, on subjects ranging from George Washington to the Amistad Mutiny to Richard Nixon, I would recommend study of his novels. Although I blush to confess how long it is since I read them, their impact remains strong on me. Vidal’s LINCOLN, for example, provides a well-researched and rounded picture of the character of Abraham Lincoln, his use of power, and his great political skill, and the ways in which he contributed to centralizing national power in Washington. His BURR is a wonderful portrait of Jacksonian America and the imperishability of American political trickery through journalistic scandal-mongering. In quite another way, JULIAN, a portrait of the late Roman Emperor Julian, is both witty and poignant in the ways it takes apart the Christian self-image of historical innocence amid Roman depravity, and shows how Christians pulled strings and not too gently maintained themselves in seats of power. I might mention in passing Vidal’s 1967 novel WASHINGTON, D.C. Although it is not such a rigorous historical study, it has a particular importance for me as the first mass-market work I ever saw to criticize the wartime removal and confinement of Japanese Americans.
I do not wish to leave people with the impression that I agree with all of Vidal’s historical judgments. The intense isolationism that he inherited from his Populist grandfather and hero, Oklahoma Senator T.P. Gore, and that continues to mark his view of the 20th century (Vidal has said that the first and only political organization of which he was ever a member was the America First Organization) tends to scant his view of reality. It is silly, in my view, to claim that Charles Lindbergh was a great national hero done in by the machinations of British propaganda. Still, even where I disagree, I find his ideas illuminate debate.
My first year at the University of Illinois when I was teaching the survey, I was called in by our most distinguished U.S. historian. I am sure he meant well, but he informed me that there were many problems with my teaching ... he thought my biggest problem was that I was confusing the students by discussing how different historians thought differently about issues. "This is the only history course that most of these students will ever take," he told me, "and they need to know the facts." I disagreed. If this is the only history course students ever take, it was all the more important that they know that historians disagree over what the facts are as well as over interpretations. I still believe that.
"This is the only history course that most of these students will ever take." That platitude, or should I say that attitude, bugs me, even though I often repeat it to myself when struggling to make decisions about what to include on a syllabus or in a lecture. I think that many history teachers take for granted what Burton's colleague did: that we have one shot at reaching the students in our classes before they throw history on the dustbin forever. And maybe that's why we agonize about how much to cover in a course. It is because we believe, at some basic level, that this course is the only chance we have to teach students the history of the United States, or the history of Western Civilization, or the history of (gulp) the world.
Set aside for the moment the question of how statistically sound the assumption is. It seems like common sense, but I'm sure that at specific institutions and in specific survey courses, the generalization probably is not as iron-clad as it sounds. Implicit in the assumption of Burton's colleague is another presumption--that our undergraduate history students are just less interested in taking history classes than other classes--that is probably even harder to substantiate statistically. Some rough-and-ready figures in this Perspectives article suggest that national average enrollment in undergraduate history courses has been increasing in recent years, even if only marginally.
But even if it could be statistically proven, in any given course, that the majority of our students will never take another history class, I think it helps our teaching very little to know this. For one thing, it bespeaks a certain fatalistic pessimism, a world-weary attitude that students don't care about history and won't care to know more. If history teachers think that about their students, it's bound to come across to those students in their teaching. And if we convey to our students a pessimism about their interest in history, our Cassandra-like prophecies about the ahistorical wasteland of their futures will likely become self-fulfilling. Why should they take other history courses if they can sense our fear that they won't? Shouldn't our goal as teachers be to inspire students to take more history courses, rather than to assume from the beginning that they will not?
To be sure, many students will not take other history courses, despite our best efforts to encourage them to do so. Certain majors require undergraduates to run through so many rigorous paces that they won't have time in their schedules for other history classes, even if they want to take them. Even so, the fact that a student may take only one history class does not mean that they will never have another encounter with history. Indeed, if we cannot ensure that students will seek out other history courses, we can and should be conveying to students that knowledge of the past is important enough for them to seek out in whatever way they can. Suppose it is the only history class they will ever take: that does not mean the assigned reading has to be the only history book they ever pick up.
Charming idealism, some might say. Perhaps. But let's imagine the worst-case scenario: that the majority of our students, despite our enthusiasm and encouragement, will never take another history class, never enter a museum, never read another book or article about history, never watch a history documentary, never see a historical film or read a historical novel. (The scenario is ludicrous, if you put it that way. But so-called "realism" often turns out, on closer inspection, to be less realistic than idealism.) Even if that dismal scenario were to come true, it would not settle a single question about how to teach a history course. It would establish that those questions were incredibly important, that they deserve our serious thought and careful attention. But the fact that a history course may be the only exposure to history a student has entails nothing whatsoever about what that course should include or how it should be taught.
That's not what Burton's anonymous colleague concluded, of course. For him, the likelihood that his students would never take another history course made it imperative to pass along "facts" and downplay scholarly disagreement. But those pedagogical choices are not at all implicit in the bare fact that this is "the only history class they will ever take." And to be fair, Burton's counter-argument is no more valid than his colleague's: it is not "all the more important" that students learn about disagreements between historians if this is the only history class they will ever take. It is either more important that they learn about differing interpretations, or it is more important that they learn "facts." Whether this will be a student's only history course does not help a teacher decide on the relative importance of those two pedagogical strategies.
It is common for people to confuse the value of a particular decision with the values they will use to make that decision. Tourists tell themselves that this is the only time they will visit Rome, so they have to see the Coliseum. But in fact, nothing about this being their only time in Rome makes the value of seeing the Coliseum appreciably higher. The tourist's sense that his time in a city is limited sharpens the importance of making considered, rather than casual, decisions about what to see. But it doesn't actually help him make choices about what to see. He believes that his sense of urgency is directly informing his decisions about what to include in an itinerary, but that's what we might call a "logical illusion." To give another example, suppose I am visiting a restaurant that I know I will not visit again. That makes me study the menu with special care, but it doesn't at all help me decide what to order. (Least of all does it mean that it would be more rational for me to order everything on the menu rather than only one thing. That would probably lessen the pleasurableness of my one visit.)
The truth is that when I am deciding what to order at the restaurant, or what to see in Rome, or what to include on a syllabus or in a lecture, my awareness that the decision is momentous does not actually help me make a decision. Retrospectively, I may think that it does, by telling myself that I saw the Coliseum because it was the most important thing to see, or ordered the caviar because it was the chef's specialty. But really what I'm doing is trying to reassure myself that my decision about what was most important to see, or eat, or teach was the right one. In reality, I made those decisions based on some other logic that may not even be perceptible to me. I had some way of ordering the values of different possible choices, but nothing about the fact that I could make a limited number of choices actually helped me order them. (If you disagree, consider that every choice we make occurs within a context of limited possible choices, since we are temporal and mortal beings. Does a knowledge of your ultimate end really help you decide what to have for breakfast?)
By my lights, at least, the historian's platitude that "this is the only history class our students will ever take" is nothing but a pedagogical red herring. It settles nothing in debates over what to teach or how to teach it. Those debates have to be settled by appeal to some other standard of adjudication, especially since (to make an obvious point that really makes this whole post superfluous) in a debate like the one between Burton and his colleague, either side can appeal to the fact that he only gets one chance with his students. Neither interlocutor gains the upper hand by pointing out a bare fact, if indeed it is one. I could just as easily win an argument over what to teach my students by pointing out that semesters come to an end. Well, yes, but how does that fact speak at all to the question of what to do with a semester?
(Cross-posted at Mode for Caleb.)