History People Are Talking About Archives 7-18-03 to 8-29-03

  • Michael Beschloss: Kissinger's Secret Papers and Tapes

  • We Have a Holocaust Museum, How About a Communism Museum?

  • Historian: New Movie Mangles the True Story of Zapata

  • Germans Confront the Truth About the Nazi Murderers in Their Own Families

  • Revisionism Run Amuck

  • Dump Western Civ?

  • Psychologists Plunge into Politics

  • Australia's History Wars

  • How the Civil Rights Revolution Freed Blacks, Gays and Women

  • 'Silent' Wartime Pope Was Anti-Nazi, Papers Reveal

  • In Early Reports, Size of Gathering Overshadowed King's Words

  • Vatican Rewrites History To Insist It Did Not Persecute Galileo

  • The National Park Service Rewrites the History of America's Civil War Battlefields--And Braces for Trouble

  • History and Myth

  • Why Are Students So Ignorant of History?

  • Were the Nazis Christian?

  • Was Canadian Explorer the Model for Coleridge's Rime of the Ancient Mariner?

  • Rock N Roll Is Part of History

  • Is the Movie The Magdalene Sisters Anti-Catholic?

  • The Nearly Invisible Founding Father

  • Meriwether Lewis Deserves Better

  • Genghis Khan Is Now Regarded as "Cool" in Mongolia

  • The Slug that Gave Consumers the Legal Right to Sue Manufacturers for Damages

  • Why the National Constitution Center Works

  • Newton Brings Winston Back to Earth

  • The Next Generation: The Millennials

  • The Kansas Museum Devoted to the History of Flight

  • The Best Book Ever Written About Washington, DC

  • Museum Features History of Contraceptives

  • Controversy About What to Do With Last Remaining Hut in British POW Camp

  • Donald Ritchie: The McCarthy Transcripts

  • The History of the Family: Changing Interpretations

  • A Scientific Explanation for the Trances of the Oracle of Delphi

  • The First Drink of Wine--Ever

  • Did Drake Explore the West Coast of the United States?

  • Gar Alperovitz: NYT's Kristof Is Wrong

  • Commodore Perry's 150th Anniversary: How the Japanese Remember His Visit

  • A New Theory Suggests the Inca Actually Did Possess a Written Language

  • Museum: How Belgium Brought Civilization to the Congo

  • Fresh Evidence that Caligula Really Was Crazy

  • Did the Beatles Kill the USSR?

  • Anti-Americanism's Long Roots

  • Rehashing the Case of the New Zealand Graduate Student Who Denied 6 Million Jews Died in the Holocaust

  • Douglas Brinkley: Those Anonymous Writers at the WPA Are No Longer Anonymous

  • Eric Rauchway: Why Kristof's Wrong About the Hiroshima Bomb

  • Holocaust Deniers Countered by Harvard History Project

  • Why Writing Systems Die

  • Was Hiroshima Really About Revenge?

  • Hiroshima: Blood on Our Hands?

  • Postage Stamps: Sanitized History

  • Lewis & Clark: Maya Lin's Monuments to Their Expedition

  • Walt Whitman: Quakerism, Homosexuality and the Sea

  • The Pyramids Were Not Built by Slaves

  • The Importance of Risk as a Concept in Writing History

  • The Richest Man of Color in America Who Wanted to Establish a Colony in Africa of Freed Blacks

  • Samuel Insull: The Ken Lay of His Day

  • What Was Germany Like During the Occupation Following WW II?

  • The Shifting View of Winston Churchill

  • China's History Textbooks: Slowly Admitting the Truth About the Korean War

  • UK's Hidden History of Immigration Revealed

  • Garry Wills: Hillary's Thin Memoir

  • Was Seabiscuit the Hero of the 1930s?

  • Sojourner Truth's Truths

  • David Greenberg: Does Magruder's Allegation Mean Much?

  • Niall Ferguson: In Defense of the British Empire

  • Pakistan Upset After Scholar Says the Koran Has Been Mistranslated

  • Real-Life Archaeology Is Getting More Dangerous Than in the Movies

  • Kathryn Weathersby: Did the Soviets Back North Korean Invasion Because of Acheson's Speech? No.

  • NPR: The Impact of the Korean War on American Race Relations

  • Daniel Kevles: Why We Wrote a Textbook Based on Science and Technology

  • Why Conservatives Are Denouncing Ann Coulter's Book

  • Ann Coulter's Dangerous Book

  • America's First Spaceman

  • Are TV Histories a Threat to the Study of History?

  • Was What Happened in Goliad, Texas a Massacre or an Execution?

  • Was Newton Driven Mad by Mercury Poisoning?

  • John Taylor: Why Was Truman Given a Pass and Nixon Never Was?

  • No One Knows How the Wright Brothers Plane Worked

  • James F. Brooks: Rewriting the History of Slavery

  • The Origin of the White House Memoir

  • Truman: Man for Our Times?

  • Truman, Prejudice and America

  • The Media's Double Standard: Comparing Coverage of Nixon's Anti-Semitism and Truman's

  • Should We Be Selling Our National Heritage?

  • We Are In Danger of Losing Our Historical Memory

    Michael Beschloss: Kissinger's Secret Papers and Tapes (posted 8-29-03)

    Michael Beschloss, in the course of a review of Henry Kissinger's latest book, Crisis, in Newsweek (August 11, 2003):

    As a memoirist, [Henry] Kissinger has enjoyed a formidable advantage. His books have been based on his papers and other materials. By his orders, these were secreted in the Library of Congress and were to be closed to outsiders until five years after his death. Although they were produced on government time and by government employees, Kissinger successfully argued that they were “private” and twice prevented the government’s National Archives from examining them to decide whether they were or not.

    Kissinger’s monopoly on this historical record has driven many scholars to distraction. Groups of lawyers, scholars, journalists and archivists have used pronunciamento, lawsuit and other crowbars in a usually vain effort to open Kissinger’s Library of Congress cache.

    In 2001, a quarter century after his departure from government, Kissinger volunteered to let the National Archives begin processing 10,000 pages of documents from his State Department years for ultimate release. This collection includes the telephone transcripts that form the basis for “Crisis”—once dubbed the “Dead Key Scrolls” by columnist William Safire because Kissinger’s aides made them using a “dead key” extension on his phone system.

    His Scrolls do not quite have the tantalizing aura of the Nixon tapes. Typed up by Kissinger’s staff for use in daily business, they lack the unpredictability and pungent language (“I don’t give a s—t about the lira!”) that bring Nixon—and the Kissinger of those years, when he appears—back to life. Still, since the revelation of Nixon’s secret taping system outraged the public, high U.S. officials have not systematically preserved their private conversations by taping them. Thus for the tumultuous years from the Watergate summer of 1973 through Jimmy Carter’s inauguration in 1977, we will probably get no more intimate source than Kissinger’s Scrolls.

    It is not hard to imagine why Kissinger chose the Yom Kippur War and the Vietnam collapse as the subjects for this book. Both are dramatic turning points and show Kissinger to excellent advantage. Truman’s Secretary of State Dean Acheson said that no man comes out of his own memorandum of conversation looking second best. The same may be said of “Crisis.” The leading man looms as a tower of sanity, cool and broad-minded, negotiating with wit, stamina and skill. He is surrounded by a cast of lesser characters ranging from the beleaguered Nixon, distracted by Watergate, to the last U.S. ambassador to Saigon, the emotional Graham Martin, who shows himself inclined to make himself into a human sacrifice as other Americans flee the North Vietnamese victors.

    Kissinger has defended his and Nixon’s decisions on Vietnam in earlier volumes. In this new one, he manages to convey the difficulty with which, as the Viet Cong pushed for final victory, he had to balance conflicting demands from other U.S. officials, angry conservatives, angrier South Vietnamese allies and an impatient Congress.

    Kissinger’s fellow Republicans have far more use today for Reagan’s “Why Not Victory?” strategy than for the Nixon-Kissinger detente with the Soviet Union. He may hope that they might reconsider after reading his you-are-there rendition of the Yom Kippur War, showing how his private collaborations with Soviet Ambassador Anatoly Dobrynin helped to prevent a superpower smash-up. Readers of this book should certainly recognize that Kissinger’s effective 1973 Mideast diplomacy was the forerunner of President Bush’s current efforts to broker an Arab-Israeli peace.

    More than anything else, “Crisis” recaptures the quality, now forgotten by many Americans, that made “Super K” in 1973 the most admired man in the country (according to the Gallup poll). Americans who worried about Nixon’s psychiatric balance during Watergate or Ford’s schooling to be president believed, through crisis after crisis, that they need not worry as long as Kissinger’s steady hand was on the tiller. As Kissinger faces the bar of history, he shows himself with this book shrewd enough to understand that whatever future critics may think about Cambodia, Christmas bombings or the ouster of the Chilean government, this quality will be one of the strongest arguments in his favor.

    We Have a Holocaust Museum, How About a Communism Museum? (posted 8-29-03)

    Radley Balko, writing for FoxNews (August 28, 2003):

    One of the most powerful museums in Washington, D.C., is the Holocaust Memorial Museum. It’s the one site I always recommend to people visiting the city, even though it takes a couple of days to shake off the malaise that settles in after you’ve seen it.

    It’s a fitting memorial that accurately documents and catalogues the horrors of the Holocaust, without much propagandizing. It allows history to stand on its own. The events as they happened are quite enough.

    It’s time we had a similar museum to memorialize the devastation wrought by communism.

    Adolf Hitler has become the embodiment of human evil, yet he wasn’t the biggest killer of the last century. He didn’t even come in second. He was third, behind two communists, Joseph Stalin and Mao Tse-Tung.

    According to the historian R.J. Rummel, Hitler’s Nazis killed about 21 million people between 1933 and 1945 (a figure that includes Roma gypsies, homosexuals, the handicapped, Poles, Russians, Jehovah’s Witnesses and Germans, as well as six million Jews). Stalin killed twice that many, and Mao killed just under 38 million. When you add in the murders attributable to Lenin, Pol Pot, Tito and the remaining communist dictators of Asia, Africa, Eastern Europe and Latin America, communism claimed more than 100 million lives. These estimates vary, but it’s generally accepted now among historians that communism took far more lives than Nazism.

    My aim here isn’t to minimize the atrocities of the Holocaust. My point is that communism also killed millions -- perhaps hundreds of millions -- this last century; it enslaved, and continues to enslave, billions more.

    And those are merely the costs we can estimate.

    Far more speculative and difficult to measure are the ways in which communism killed human potential. The last century was the most productive in human history: We cured diseases, went to the moon, improved the human condition in almost every way imaginable. Think of what the human race might have accomplished had billions of us not been imprisoned by communism but been free to explore, stretch and reach our potential through competition, innovation and creativity.

    There’s really no telling what we might have done.

    Unfortunately, nearly 14 years after the fall of the Berlin Wall, the embers of communism haven’t yet flickered out. Anti-communists cannot invoke the Holocaust survivors’ cry of “Never Again.” They can’t even cry, “Not Right Now, At This Moment.”

    Right now, North Korea’s communist regime is imposing a famine on its own people, with resulting deaths estimated in the millions. Communist regimes continue to hold captive the people of China, Laos, Vietnam and Cuba. Human rights abuses abound in all five countries.

    Yet communism is rarely regarded with the same enmity we hold for Nazism. In fact, communism today is downright trendy.

    Most of us are justifiably revolted at the sight of a teenage kid wearing a T-shirt emblazoned with a swastika. But glimpse the same kid in a shirt featuring a sickle and hammer, or a portrait of Che Guevara, and many of us will find him quaint, perhaps idealistic -- at the very worst, naïve and misguided. In New York City, you can get tipsy at the KGB Bar, a chic spot featuring Soviet-era symbolism and paraphernalia. Imagine what might become of the entrepreneur who tried to open a nightspot themed with Nazi regalia.

    Historian: New Movie Mangles the True Story of Zapata (posted 8-29-03)

    From the Guardian (August 28, 2003):

    The sea of outsized sombreros, cartridge-filled bandoliers and scruffy peasants scurrying round the set of Zapata gives a thoroughly conventional first impression of Mexico's most expensive film yet. But it isn't long before hints appear of the dramatic makeover that Mexico's favourite moustachioed revolutionary hero is receiving. "Emiliano Zapata was not just a revolutionary political and military leader - he was a spiritual leader too." Writer, director and producer Alfonso Arau, on set in Cuautla, is holding forth on his reinvention of the Caudillo del Sur, the "Boss of the South". "My film is the story of a mythic hero, a predestined leader who passes through a series of tests that end with death that is his passage to eternal life."

    Zapata's iconic status has risen globally since the Chiapas insurgency, led by the pipe-smoking sub-comandante Marcos, erupted in 1994. One of the great leaders of Mexico's 1910 revolution, Zapata was idolised as the only revolutionary who sought a wholesale transformation of society in the peasants' interests, before being tricked into an ambush and killed in 1919. His reputation helped make Zapata a key part of the revolutionary myth that was built up in the 1930s and used by the Institutional Revolutionary party to legitimise its claim on power for decades.

    Arau's script includes a scene in which exploited, poverty-stricken Indians proclaim the baby Zapata as their saviour. In another, the full-grown guerrilla displays mysterious powers over his enemies' horses. A key moment has the warrior surrounded by fireflies, which then metamorphose into faithful followers.

    Such artistic liberties, the director insists, are in the name of a greater truth he discovered through quizzing spiritual healers in Zapata's old stamping ground in the central Mexican state of Morelos. It is also here that the movie is being filmed, mostly in a crumbling hacienda and abandoned sugar mill where the real historical figure looked after horses before the 1910 revolution started.

    "I found out that Zapata was a sacred warrior for his own people and that he was a shaman, a real shaman," says Arau. "Aside from the reality that we see, smell and touch, there are other parallel realities, and that's the one I am telling in this movie. I expect the historians are going to object."

    He's right; they do. "The idea that Zapata was a spiritual leader is a complete misconception," says Harvard history professor John Womack Jr. Womack's 40-year-old biography is still the standard reference book on the life of the mixed-race leader of Mexico's most radical revolutionary faction, which fought on when the ideals of "land for the peasants who work it" were betrayed.

    "Zapata was someone who was tough, reliable, trusted, practical and the logical person to choose as a leader," Womack says, adding that he also developed some very respectable skills as a guerrilla leader as the war went on. "The rest is fantasy."

    Germans Confront the Truth About the Nazi Murderers in Their Own Families (posted 8-28-03)

    From the Chronicle of Higher Education (August 28, 2003):

    A glance at the fall issue of "Holocaust and Genocide Studies": How family history can obscure the past

    Denial and silence about family members' involvement in the Holocaust are important but neglected parts of Germany's relationship with its past, says Katharina von Kellenbach, an associate professor of religious studies at St. Mary's College of Maryland. She explores the issue through her own family history.

    When she was a child, her family would not discuss the Holocaust, even though information about it was presented in the news, at school, and in church. She eventually discovered that an uncle, Alfred Ebner, had been accused of killing 20,000 Jews during the Second World War. Her family insisted that the charge was not true. "The momentary glimpse of a murderer in my family's midst was gradually erased by the weight of silence and anxiety, and by my need to maintain amicable family relations," she writes.

    While she lived in Germany, she says, she forgot or repressed the knowledge. It was only after immigrating to the United States and meeting Jewish survivors of the Holocaust and their children that she remembered and followed up on her family's past. "The historical record shows that Ebner was directly responsible for the implementation of Nazi extermination policies," she writes: "Yet he was never convicted, and as far as I know, he never regretted his actions."

    Even with the evidence in front of her, it was difficult for Ms. Kellenbach to believe and look into the charges, because her family had always spoken of her uncle as a victim of false allegations and postwar harassment, she says. "His status as a victim depended upon and sealed the erasure of the Holocaust," she writes, but it is the duty of younger generations "to resist these vanishing acts."

    The article is not online. Information about the journal is available at http://www3.oup.co.uk/holgen/current

    Revisionism Run Amuck (posted 8-28-03)

    Brad Cameron, writing for the National Association of Scholars (August 26, 2003):

    History professors who assign their students term papers on aspects of the Second World War, particularly the Nazi concentration camps, sometimes find themselves handed bibliographies that include articles from the journal of the Institute of Historical Review. It is highly unlikely that any of these students have found these articles in any college or university library. They find them on the internet and recognize no difference between them and others they find elsewhere.

    However, despite its innocuous title, the IHR is not a scholarly foundation, but a crank front organization providing apologetics for Nazi Germany and "research" denying the Holocaust. Early neo-Nazi internet sites often boldly proclaimed their purposes, even including swastikas. But the IHR is more discreet, reprinting articles by obscure imitators of David Irving. The site blandly presents itself as a gathering place for 'dissenting' scholars, with no aim beyond a search for truth. Anything more than a superficial look ought to reveal what it is really all about. Unfortunately, the internet is the classic venue of the superficial look, especially for students doing term papers.

    That the IHR boasts of its "revisionism" is unlikely to provide them with much warning, because many history students of recent decades are much more likely to have been provided with plenty of arguments about the merits of revisionism itself than they are to have been given much of a foundation in the history that is supposedly in need of revising. Even before the recent postmodernist wave, history graduate faculties were already loaded with future professors who had not so much learned history as historiography, learning "American history," for example, mainly as a review of the various conceptual schemes of Charles Beard, Frederick Jackson Turner, and other such grand explainers, while often remaining very poorly informed about anything that actually happened in the United States between 1776 and the last quarter of the twentieth century.

    During the 1960s, many of these students were taught far more about what was claimed by revisionists on the left, like William Appleman Williams and Gabriel Kolko, than they were about what presidents from Truman through to Reagan actually did, much less about the moves and countermoves of the leaders of the Soviet Union. Granted, some of their professors would also encourage them to revise these revisionists in turn. However, while this usually amounted to the rediscovery that the world of 1945 to 1980 was more like what it was commonly believed to be off university campuses than it was claimed to be on them, academia saw only a younger generation achieving its place in the sun by another ritual devouring of their elders.

    This permanent relativism eventually laid the groundwork for a revisionism that would put the most daringly wrongheaded of past years to shame. Revisionists of the 1960s tried to select documents that would support otherwise improbable explanations of which forces had most importantly shaped the behaviour of past historical figures. The revisionists of this era need few documents, new or old, since they treat all accounts of the past as mere 'narratives' to be mangled and dismembered on their feminist/post-colonial/anti-racist/gender-sensitive Procrustean bed.

    Even more alarming than the young academics who engage in this exercise are the academic administrators who watch over this solipsist nonsense with benign smiles. A surprising example can be found in the correspondence columns of the 11 August 2003 National Review. An earlier issue had reported, with understandable horror, that a Marine captain had presented a paper at the United States Naval Academy arguing that the Iwo Jima landing was a "racist" operation. The NR article about it drew a response from the academic dean and provost of the USNA. After some legitimate but irrelevant celebration of the Naval Academy in general, the dean declared:

    The Academy's history department conducts regular discussions of scholarly works-in-progress by military and civilian faculty. In this crucible, ideas are challenged, assumptions questioned, factual support assessed, and clarity enhanced. Such was the discussion of this junior officer's draft treatise . . . All present recognized the preliminary nature of the paper, and the young officer is greatly offended that someone not even present misused his draft research to bolster preconceived notions.

    This response recalls the kind of answers that the comic strip character "Dilbert" gets from his nincompoop boss when he dares to point out the obvious. The boss is not just scatterbrained; he keeps entirely missing the point at issue. Either the academic dean and provost of the USNA is being disingenuous, or he lives in the same fog as Dilbert's boss. No one knowing anything of the war in the Pacific would deny that there were elements of blanket anti-Japanese racial prejudice mixed in with the primarily justifiable motives with which the U. S. fought the war. But Iwo Jima was a battle, one of many that had to be fought to defeat Imperial Japan. Centering a paper on its "racist" aspects is comparable to studying the destructive effects of the Battle of the Atlantic on halibut stocks.

    This kind of research is not instructive, but clever: a display of the student's familiarity with fashionable preoccupations, not the historical events on which these are brought to bear. The arrival of this subjectivism in an officer training school is positively frightening. The Naval Academy dean clearly needs to warm up his crucible, and to have his own preconceived notions given a new bolstering. What could arouse his alarm? A reinterpretation of D-Day as an attempt to widen the market for Coca-Cola? A study of MacArthur and Nimitz as closet queens, engaged in homoerotic rivalry? Or would he rejoice in these exciting prospects? I think we should be told.

    Dump Western Civ? (posted 8-27-03)

    A post by Larry Schweikart on Richard Jensen's conservative list (August 24, 2003):

    Just this Friday, my history faculty voted unanimously, save moi who voted "nay," to eliminate a "Western Civilization" requirement in favor of a "global studies/world history" requirement. But wait . . . Not only did the faculty eliminate "Western Civ," but the new "global studies" course is "thematic," meaning that any "theme" covered over a 200-year period is an acceptable topic. Specifically, the faculty rebelled against "content," saying, in essence, students can't learn content anyway, and emphasized . . . "WAYS OF KNOWING." One interesting comment was that "everyone else" is moving in this direction and we didn't want to "lag behind." I noted that "leadership" is not lagging, it is leading. Does Mr. Summers agree with me? ("He wants to change the undergraduate curriculum so that students focus less on 'ways of knowing' and more on actual knowledge.")

    Psychologists Plunge into Politics (posted 8-27-03)

    John Ray, writing in frontpagemag.com (August 27, 2003):

    Like most college and university teachers in the social sciences and humanities, academic psychologists are overwhelmingly Leftist in their orientation. So it will be no surprise to hear that at least since the 1950's psychologists have been doing their best to find psychological maladjustment in conservatives. To anyone with a knowledge of history the results have been quite absurd, but psychologists rarely seem to know much about history so that has not disturbed them.

    I spent 20 years from 1970 to 1990 getting over 200 articles published in the academic journals of the social sciences which subjected the various politically relevant theories of psychologists to empirical test. The only test that psychologists normally give to their theories is to seek the opinions of their students on a variety of issues and present THAT as evidence about how the world works. My consistent strategy was to do the same sort of test among random samples of people in the community at large. I found that people in the community at large are not nearly as accommodating to the theories of psychologists as psychology students are!

    My non-conformist behaviour in actually doing a serious test of these theories won me no kudos, however. I appear to have had far more articles on political psychology published in the academic journals than anyone else and so would therefore -- by conventional academic criteria -- normally be considered the No. 1 world expert on the subject, but in fact my writings have always been comprehensively ignored. My findings did not produce the RIGHT CONCLUSIONS, you see. In fact my findings showed the theories concerned to be wrong in almost every respect.

    So it was no surprise to me at all to read the latest effort in the long line of attempts by psychologists to discredit conservatives. The article "Political Conservatism as Motivated Social Cognition" was published recently by John Jost and his collaborators at Berkeley in The Psychological Bulletin -- one of the premier journals of academic psychology. The "powers that be" at Berkeley were so pleased with this article that they put out a press release that was designed to publicize the findings of the article as widely as possible.

    The result was great derision from conservative political commentators. The study was so obviously one-eyed that it was very easy to deride. Their claim that Stalin was Right-wing, for instance, must be some high-point of twisting the evidence. If the most prominent Communist of the 20th century was Right-wing, who on earth would be Left-wing? Black might as well be white. Here is what Jost and his crew actually said:

    "There are also cases of left-wing ideologues who, once they are in power, steadfastly resist change, allegedly in the name of egalitarianism, such as Stalin or Khrushchev or Castro (see J. Martin, Scully, & Levitt, 1990). It is reasonable to suggest that some of these historical figures may be considered politically conservative."

    It is hard to know where to start in commenting on this breathtaking statement. To say that the instigator of huge (and disastrous) changes in almost everything in Russian life resisted change is incomprehensible. And to call Communists of that era conservatives is equally perverse. One has to say that "conservative" obviously has a pretty strange meaning in the ivory towers of Berkeley. In their world even Stalin can be blamed on conservatism.

    Apparently as an attempted explanation of their perverse definitions, they go on to say that the worldwide legion of Communist tyrants that they allude to are not typical of Leftists. The fact that Communists at their height controlled nearly half the world is not apparently enough to get them counted as typical Leftists.

    Australia's History Wars (posted 8-25-03)

    Robert Manne, professor of politics at La Trobe University, writing for smh.com.au (August 25, 2003):

    Within 30 years of the British arrival in Tasmania, the near-extinction of the indigenous people had occurred. Ever since the 1830s, civilised opinion has regarded Tasmania as the site of one of the greatest tragedies in the history of British colonialism. At least in Australia, this view is presently under challenge. Late last year Keith Windschuttle published The Fabrication of Aboriginal History. It claimed that in the story of the empire, Tasmania was probably the place where "the least indigenous blood of all was deliberately shed".

    Windschuttle claimed that in Tasmania only 118 Aborigines had been killed, a little over half the number of British settlers who had died violent deaths at Aboriginal hands. Such clashes arose, he claimed, not because, as all previous historians had believed, the Aborigines were defending their lands from intruders, but because of the pleasure these savage people took in the act of murder and because they had come to covet British "consumer goods".

    Windschuttle attributed the large number of Aboriginal deaths, almost entirely, to introduced diseases, to the brutal disregard of Aboriginal men for their women, whom they wantonly sold into prostitution, and the maladaptation to their environment of a people so primitive that their survival for 35,000 years could rationally be explained only by a rather extended period of good luck.

    The most unsettling aspect of the publication of Fabrication was the enthusiasm with which it was greeted by the right, including by the Prime Minister, who awarded Windschuttle a Centenary Medal for services to history. Geoffrey Blainey described Fabrication as "one of the most important and devastating books written on Australian history in recent years". There was clearly something about the song Windschuttle was singing that was both familiar and appealing to certain ears.

    Following the reception of Fabrication two things seemed clear to me. If Windschuttle's interpretation of the dispossession came to be widely accepted, then all prospect for reconciliation - that is to say for a history which indigenous and non-indigenous Australians might share - was dead. And if the flaws in Windschuttle's interpretation were ever to be understood, it could only be through the publication of a non-polemical, scholarly book, written by those who knew, through their different expertise, that what Windschuttle had produced was not a genuine history, but plausible, counterfeit coin. Whitewash, which I edited and which was launched on Saturday at the Melbourne Writers' Festival, by Malcolm Fraser, Patrick Dodson and Henry Reynolds, is the result of these two thoughts.

    In order to demonstrate the falsity of Fabrication, let me consider what contributors to Whitewash show about just one of Windschuttle's most famous claims, namely that in Tasmania only 118 Aborigines were deliberately killed.

    The first problem with this figure is that the scholar on whom Windschuttle is almost entirely reliant, Brian Plomley, made it absolutely clear that he believed no even remotely reliable total of Aboriginal killings could ever be reached. The reason, according to Plomley, was simple. The written record had one great "defect": "it was concerned only with attacks by Aborigines on British settlers" and not with British attacks on Aborigines. Plomley was aware, in particular, that an unknowable number of Aborigines had been killed by stock-keepers, sealers, timber-cutters and escaped convicts, who had no reason to report their killings and good reason not to do so.

    Windschuttle apparently believes that in Tasmania a death unreported or unrecorded is a death which did not occur. By use of a similar methodology, it would be possible to prove that virtually no sexual abuse of children occurred in Western societies before the 1970s.

    There is a second reason why Windschuttle's figure cannot be taken seriously. Assume, for the sake of argument, that every time a settler shot an Aborigine some record was made. Still no remotely accurate figure of Aboriginal killings could be produced. The reason is straightforward. As Henry Reynolds points out, in violent encounters between the British and the Aborigines, while some Aborigines died on the spot, others merely suffered wounds. There is obviously no way of knowing now the ratio of wounded to killed. It is, of course, quite certain that a sizeable proportion of the wounded subsequently died. No settler would ever have known.

    There is a third reason for rejecting Windschuttle's pseudo-precision about the 118 dead. If anyone claims to be able to arrive at an exact number of Aboriginal killings, at the very least it can be asked of them that they have examined all the published and unpublished sources which exist. Windschuttle has not, even remotely, done this work. According to James Boyce, of the 30 books on Van Diemen's Land published between 1803 and 1834, Windschuttle has consulted at most five, more likely three. Moreover he has examined almost none of the unpublished diaries or collections of letters available to scholars where records of killings or attitudes to Aborigines are likely to be found. Given his claim to omniscience, this failure to do the basic research is, quite simply, scandalous.

    There is, moreover, a systemic bias in his work which distorts his calculations. As Phillip Tardiff shows in the case of the killings at Risdon Cove, and as Ian McFarlane shows in regard to the massacre at Cape Grim, where there are disagreements between witnesses to Aboriginal deaths, Windschuttle invariably accepts the witnesses who supply the lower figure, even where their evidence is less plausible, or where they have far greater motive to lie.

    By now I hope it is clear that Windschuttle's claim - about Tasmania as the place where, in the history of British colonialism, the least indigenous blood was shed - is fatally flawed. Yet what must be stressed is that this is only one of a dozen or more issues of equal importance which are exposed by the writers assembled in Whitewash.

    How the Civil Rights Revolution Freed Blacks, Gays and Women (posted 8-24-03)

    Paul Berman, writing in the NYT (August 23, 2003):

    At the 1963 March on Washington for Jobs and Freedom, the Rev. Dr. Martin Luther King Jr. proclaimed his dream of a future in which everyone, no matter who, would be able to cry: "Free at last! Free at last! Thank God almighty, we are free at last!"

    Among King's followers, no one doubted what kind of freedom he had in mind. It was freedom from racist laws and the social customs of Jim Crow segregation — a legal and social freedom for millions of black Americans who, in the past, had never experienced anything of the sort.

    But then something odd and unpredicted took place. Once the issue of freedom had been successfully raised, all kinds of people, black and nonblack, began to cast an inquisitive eye on laws and customs that had nothing to do with racism or Jim Crow, and began to identify other sorts of oppressions. The subsequent outpouring of fresh understandings and novel insights about rights and freedom — for gays, women, the disabled and more — has dominated American thinking for 40 years now, which is quite a long time. And among those many fresh understandings and insights has come a new retrospective understanding of the 1963 march itself.

    The chief organizer of the Washington march was Bayard Rustin, who was duly applauded at the time for the efficiency of his labors. And yet, Rustin, who died in 1987 at 75, played a much larger role in organizing both the march and the wider civil rights movement than most people ever suspected at the time. Two new books, one a biography and the other a collection of Rustin's papers, now turn a brighter light on his life and his role, explaining how his homosexuality forced his contribution to be obscured.

    Rustin was a Quaker and a pacifist, Gandhi style — which meant being a fighter, though without violence. By 1963, Rustin had been fighting a long time. In the 1940's, he worked as an assistant to A. Philip Randolph, the Harlem-based labor leader, and helped organize a campaign against racial discrimination in war-related jobs — a successful campaign, all in all, that played a role in integrating the armed forces, eventually. But for all his success at Randolph's side, Rustin did not glide from triumph to triumph.

    In World War II he refused the draft, and in his pacifist zeal, refused even to accept a conscientious-objector status. Instead, he spent 28 months in federal prisons, a gruesome experience — though even there, unstoppable, he campaigned to end racial segregation in the prison dining hall. Released from jail, he went to work for a tiny pacifist organization in New York, which sent him around the country during the late 1940's and early 50's to agitate against war, against nuclear arms, against European imperialism in Africa and, always, against Jim Crow. He was arrested frequently. He was beaten. He served 28 days on a North Carolina chain gang.

    But then, in December 1955, the black citizens of Montgomery, Ala., led by Dr. King, mounted a boycott of the city buses in protest against the Jim Crow seating rules. King was brilliant, and yet he was only 26 and did not always know what to do. But Bayard Rustin did. He went to Montgomery, met King, visited his home — and was dismayed to discover guns in the living room. Rustin spoke. King listened. Rustin was less than popular among some of the other leaders of the Montgomery boycott — was it his phony British accent? his monarchical air? — but King and the others dutifully put away their guns and agreed to be arrested in a Gandhian spirit of nonviolence and spiritual superiority, which was Rustin's advice, exactly.

    Rustin proposed a strategy of reaching out to black churches elsewhere in the South, to broaden the boycott's base of support. And he offered King a larger coalition still, which was organized by Randolph and the handful of New York pacifists. They called on friends and allies in the labor movement, on good-hearted politicians, on singers and actors and on wealthy liberals with money to donate. And so, the young King, from his pulpit in Alabama, found himself soon enough at the head of a fledgling national coalition. And the coalition grew until, by 1963, with 250,000 people attending the March on Washington for Jobs and Freedom, it proved to be a national power.

    King was the leader of that gigantic coalition. But Rustin was the principal strategist.

    'Silent' Wartime Pope Was Anti-Nazi, Papers Reveal (posted 8-24-03)

    Newly discovered US diplomatic documents, including a confidential memo written by the future Pope Pius XII, indicate that despite the pontiff's failure to publicly confront Adolf Hitler, he privately came to believe that compromise with the Nazi regime was "out of the question".

    A year before Cardinal Eugenio Pacelli - the future Pius XII (1939-58) - cautioned against compromise in a 1938 memo to president Franklin Roosevelt, a US diplomat reported that Pacelli had described Hitler as "an untrustworthy scoundrel" and "a fundamentally wicked person". The findings by Catholic historian Charles Gallagher, of St Louis University, are to be published in the September 1 issue of the Jesuit magazine America. They are certain to renew debate over Pius's attitude to Nazism and his public silence in the face of Hitler's Final Solution, which resulted in the extermination of 6 million Jews.

    The discovery of the two documents at the John F. Kennedy Library and in an archive at Harvard University may be the first written evidence of Pius's antipathy towards Hitler and Nazism.

    "We've always known that Pius XII disliked Hitler and probably always thought he was an evil man," said Michael Phayer, author of The Catholic Church and the Holocaust and professor emeritus of history at Marquette University. "But we never had that in words before. We now know he's already formed his opinions clear back in 1937, while he was still Vatican secretary of state."

    Mr Gallagher agreed, saying in an interview: "It gives a record of Pacelli having made a moral determination about Hitler."

    Even the pontiff's critics are intrigued by the revelations. Rabbi Marvin Hier, founder and dean of the Simon Wiesenthal Centre in Los Angeles, called the documents "very interesting" but all the more "puzzling" when held up against Pius's failure to publicly condemn the Holocaust.

    "When he has an opportunity to practise what he preached that Nazism is so terrible... he ducked and wouldn't do it," Mr Heir said.

    Mr Gallagher said it was too early to say whether the documents would help to rehabilitate Pius's reputation. He said that in Rome in 1938, Cardinal Pacelli had assured Joseph Kennedy, father of the future president who was US ambassador to Britain at the time, that any political compromise with the Third Reich was "out of the question". Mr Phayer said that Cardinal Pacelli was undoubtedly speaking from his own disillusionment after the Nazis failed to live up to a concordat he had negotiated in 1933 - the year Hitler came to power. In the concordat, the church accepted the dissolution of all Catholic political groups. In return, Germany was to have allowed the Vatican tight control over German bishops and purely religious matters. The concordat has been blamed for silencing the church as moves began against Jews.

    In Early Reports, Size of Gathering Overshadowed King's Words (posted 8-24-03)

    Theo Lippman, Jr., writing in the Atlanta Journal and Constitution (August 24, 2003):

    As the Washington correspondent for The Atlanta Constitution in August 1963, I wrote a 32-paragraph story on the historic March on Washington and the speeches that day.

    My report did not mention what were destined to become some of the most famous spoken words in history: "I Have a Dream."

    I had plenty of company. In his book "The Dream: Martin Luther King Jr. and the Speech that Inspired a Nation" (Ecco. $23.95), Drew D. Hansen observes that many journalists focused on the extraordinary spectacle of the march itself, rather than on the speeches. Hansen notes that Norman Mailer, for instance, covering the event for Esquire, wrote that King's speech and the others were anticlimactic to the march.

    That so many people could peaceably assemble and make a dignified, intelligent, persuasive claim for redress of their extreme grievances seemed unlikely to many. District of Columbia police assumed that rioting or clashes with counter-demonstrators were possible and planned accordingly.

    But that didn't happen. So couldn't it be said that the peaceful event, particularly King's speech, justified the risk and led to change? Yes and no. Public opinion propelled in part by the march led to landmark civil rights laws in 1964, 1965 and 1968. But as Hansen puts it, "Between 1963 and 1968, few people spent substantial time talking or thinking about what King had said at the march." Immortality built slowly over the coming decades.

    I recently reviewed the Aug. 29, 1963, Constitution, Atlanta Journal, Washington Post, New York Times, Baltimore Sun and the Associated Press and United Press International stories used by many newspapers. Few of them quoted from the "dream sequence," except the Times, which had a page one "news analysis" lauding and quoting King's speech and a lead story that devoted seven paragraphs to King's dreams.

    The Constitution certainly covered the event in full. In addition to my story, Eugene Patterson wrote an "editor's analysis" for page one; Political Editor Reg Murphy wrote a page one story about President Kennedy's and congressional leaders' reaction to the march; and Publisher Ralph McGill devoted his daily page one column to the event, which he watched on television. Only Patterson among us touched on the dream sequence, quoting a brief fragment (but he wrote more fully about that part of the speech in his daily editorial page column the following day).

    The Atlanta Journal, like other afternoon papers, was unable because of deadlines to cover the march and speeches in full for editions of Aug. 28. Journal Washington Correspondent Margaret Shannon did a long page one story describing the scene, focused on Georgia marchers. The paper also had a page one UPI dispatch, quoting speakers, including King -- but not the "dream" part.

    In his book, Hansen examines the speech as historian as well as rhetorician. He does a good job, but to get the true temper of the times, a lot more detail is needed.

    For that there is "Reporting Civil Rights: American Journalism 1941-1963" and "Reporting Civil Rights: American Journalism 1963-1971." Edited by Clayborne Carson, David J. Garrow, Bill Kovach, Carol Polsgrove. (Library of America. 996 and 986 pages, respectively. $40 each.) They were published this year to commemorate the 40th anniversary of the March on Washington.

    Both volumes offer a disheartening look at a country in which otherwise decent white men and women ignored, accepted, enforced or participated in physical, economic and emotional violence directed at blacks.

    Vatican Rewrites History To Insist It Did Not Persecute Galileo (posted 8-23-03)

    Peter Popham, writing for the Independent (London) (August 23, 2003):

    THE BELIEF that the Catholic church persecuted Galileo Galilei for pointing out that the Earth goes round the Sun was quite wrong, the new secretary of the Vatican's Doctrinal Congregation, Archbishop Angelo Amato, has claimed.

    Citing a letter recently discovered in the Vatican's archive, Archbishop Amato, who heads the body formerly known as the Holy Office - or the Inquisition - said it proved the church had treated Galileo very well.

    The letter, sent by the Commissioner of the Holy Office to Cardinal Francesco Barberini in 1633, expressed the Pope's concern that the trial of the scientist accused of heresy be concluded quickly as his health was poor. Archbishop Amato told the Italian weekly La Famiglia Cristiana that the letter proved that the church's attitude to the great astronomer was benign. The idea, he said, that "Galileo was incarcerated and even tortured so he would abjure" was no more than a legend, "transmitted by a false iconography", the cleric insisted.

    He claimed Galileo was accorded every civility while residing at the Inquisition's pleasure: "His room was the apartment of the attorney - one of the highest officials of the Inquisition - where he was assisted by his own servant ... During the rest of his stay in Rome he was the guest of the Florentine ambassador at the Villa Medici."

    At worst, the Archbishop said, Galileo's reception was mixed. "When, in 1610, Galileo published Sidereus Nuncius, which upheld the centrality of the Sun in the universe, he received the applause both of the great astronomer Johannes Kepler and the Jesuit Clavius, author of the Gregorian calendar. He even had great success among the Roman cardinals ... They all wanted to see the sky through his famous telescope," he said.

    Archbishop Amato's remarks are the latest attempt in a long-running campaign by the Vatican to re-cast the church not as a persecutor but as a relaxed friend of modern science. Attempts by Pope John Paul II to mend bridges with science go back to the start of his papacy, 25 years ago in October.

    On 10 November 1979, at an audience marking the 100th anniversary of the birth of Albert Einstein, the Pope asked theologians, scholars and historians to study the Galileo case more deeply. The commission set up to do this reported its conclusions in 1992. Responding, the Pope admitted errors committed by the Inquisition which condemned the astronomer. "Allow us to deplore certain mental attitudes ... derived from the lack of perception of the legitimate autonomy of science," he said.

    Endorsement of the truth of science has been as persistent a theme of John Paul II's papacy as his doctrinal conservatism, Vatican observers say. In a speech to the Pontifical Academy of Science in 1996, he came close to endorsing the theory of evolution. "New knowledge has led to the recognition of the theory of evolution as more than a hypothesis," he said, adding that the idea of natural selection had "been progressively accepted by researchers, after a series of discoveries in various fields of knowledge"....

    See also: "1633 Letter Resolves the Legend About the Galileo Case, Says Vatican Aide."

    The National Park Service Rewrites the History of America's Civil War Battlefields--And Braces for Trouble (posted 8-22-03)

    Delia M. Rios, writing for the Newhouse News Service (August 17, 2003):

    Along Antietam Creek, witness to the Civil War's bloodiest day, battlefield Superintendent John Howard watches for signs that the struggle over how we remember the war is making its way to these quiet fields.

    In nearby Washington, D.C., where wagonloads of Union wounded passed President Lincoln in the street that September of 1862, Dwight Pitcaithley keeps a vigil of his own from the downtown office where he presides as chief historian of the National Park Service.

    But at Antietam -- as at Shiloh, Richmond, Manassas or Chickamauga and Chattanooga -- the visiting public largely shows quiet acceptance as a century-old tradition falls away and the Park Service confronts the still volatile subject of slavery as the core cause of a war that compelled 620,000 men to their deaths.

    "No outrage, no protests, no controversy," Pitcaithley repeats with each recitation of changes at the battlefield sites.

    Even at South Carolina's Fort Sumter -- where the war's first shots were fired on April 12, 1861, and where members of the park staff, their superintendent says, sit "on the point of the sword" -- slavery has entered the storyline without incident.

    It wasn't the battle the Park Service had braced for.

    The arguments continue

    The cause of the Civil War is one of the most enduring arguments of American history. Academics began reaching a consensus half a century ago that the political, economic and social divisions of the 1860s inevitably led back to "the peculiar institution," as slavery was called.

    But there remains deep attachment to the idea -- historians flatly call it myth -- that the South, overwhelmed by Northern might, waged a heroic contest for states' rights in what's known as "The Lost Cause."

    From the 1890s to the 1990s, a tacit agreement on the battlefields held firm: There would be no talk of what had set brother against brother, only recounting of the military strategies and tactics played out in places like Devil's Den at Gettysburg and Chattanooga's Missionary Ridge.

    Veterans of the Blue and Gray first forged this contract, allowing generations to come together in peace. But the battlefield superintendents concluded in 1998 that the price was too high, and Congress concurred. Keeping silent, as Gettysburg Superintendent John Latschar put it, meant that "surely these men would have struggled and died in vain."

    Tens of millions of dollars is being spent to bring the battlefields in line by the war's 150th anniversary.

    First-person accounts

    What may have quelled controversy are the unequivocal voices of the past. An army of first-person accounts and original documents has been deployed in new exhibits and interpretive materials, allowing the drama's actors to explain themselves. They speak plainly to the present, breaching the long silence on the single most contentious question of the war.

    At Corinth, Miss., for instance, next summer's visitors to the southern gateway of the Shiloh battlefield will view Mississippi's declaration of secession in its entirety. "Our position is thoroughly identified with the institution of slavery," the second paragraph reads, ". . . and a blow at slavery is a blow at commerce and civilization."

    The change has required deft handling.

    In Richmond, Va., the old capital of the Confederacy, the superintendent ran exhibit language by city fathers "to make sure every word is what we meant," Pitcaithley said. To allow for public comment, planned displays at Fort Sumter were put on temporary full-color vinyl panels before installation. Copies of interpretive text were sent to South Carolina officials and university historians.

    Fort Sumter Superintendent John Tucker, realizing there were disputes over approach among his own staff, assigned a military historian and a social historian, Carlin Timmons, to the project, letting them "fight it out," as Timmons said. One disagreement concerned a Harper's magazine illustration of a slave scarred from whipping: Would it imply that all slaves were whipped? Would African-American visitors cringe?

    To be sure, the tales of "who shot whom, where" -- as rangers call them -- remain at the center of battlefield interpretation. But among all the stories now being told, those may be the least complex.

    Gettysburg's "Civilian Wall of Faces" now features local black people, including Basil Biggs, a veterinarian whose home was a stop on the Underground Railroad. A photograph at Fort Sumter of a Confederate soldier and his armed body servant, a slave, acknowledges the war's "difficult realities, limited choices and questionable allegiances," Timmons said.

    At Corinth, local tradition holds that the county was decidedly pro-secession. But it actually sent pro-Union candidates to the secession convention.

    Church elders held slaves

    Evidence surfaced that in Sharpsburg, just outside the Antietam battlefield, local church elders held slaves -- against the dictates of their religion. Howard arranged to share the information privately with current leaders of the Church of the Brethren "so they wouldn't be surprised."

    Certainly, there is strong opposition to this reinterpretation, even in the absence of a public backlash. The recent outcry in Richmond against a statue of Lincoln, seated with his son Tad, is a reminder of the emotion surrounding the war's memory.

    Allen Sullivant, chairman of the Heritage Defense Committee of the Sons of Confederate Veterans, objects to what he views as a partisan assault on Confederates as "the bad guy in the situation." Jerry Russell, national chair of the Civil War Round Table Associates, is adamant that the battlefields, which draw some 11 million visitors a year, should remain solely about the battles and the men. He and Pitcaithley -- who are friendly -- have publicly crossed swords on the issue, as Russell put it. But he realizes "we will never have a meeting of the minds."

    History and Myth (posted 8-22-03)

    Noel Malcolm, writing in the Sunday Telegraph (August 17, 2003):

    IF YOU LOOK up Marina Warner's website (as she suggests at the start of this book), you will find it proclaiming, in large letters on the homepage: "Marina Warner: Novelist & Mythographer". You may feel that there is something rather fancy and self-conscious about the use of such a recondite term. Pedants might object that a mythographer just writes down myths, whereas Ms Warner writes about mythology (and, indeed, about mythography). But, pedantry apart, the word "mythographer" does capture the special nature of what Marina Warner has been doing, in a whole series of fascinating books, for more than a quarter of a century.

    She has written about the cult of the Virgin Mary; about the image of Joan of Arc; about religious iconology, medieval monsters, Victorian paintings and Hollywood films; about fairy-tales and folklore and their resonances in Western literature. You might describe her as a cultural historian, except that historical explanation is not really her stock-in-trade. You could call her a literary critic, except that distinguishing the good from the bad is not her primary concern. "Critic, historian and novelist" says the jacket blurb; "Novelist and mythographer" says the website, and the website gets it right.

    The myths that have captured her interest go back to ancient Greece (and beyond), and are still selling books and films today; her range of subject-matters is accordingly immense. Her latest book, a collection of occasional pieces written over the last 31 years, includes treatments of such things as shape-changing in ancient mythology, animals in Aesop and La Fontaine, incest taboos, saintly women in the Middle Ages, magic in Shakespeare, the myth of Don Juan, the horrors of Struwwelpeter, the humour of Lewis Carroll, the art of Angela Carter, and the kitsch of Madonna.

    With questionable judgment, Marina Warner has also reprinted here a handful of magazine articles written at the start of her career, including a profile of Mrs Thatcher and a piece of reportage from the Vietnam war. These have little to do with the rest of the book, and suggest a "let's bung it all in" approach to its compilation. But they are not without their incidental delights, such as this marvellous account of her meeting with Mrs Thatcher: "She made me feel less like a frightened pupil than an anxious parent who has come with a complaint from her child, but, confronted with the imperturbable composure of the headmistress, cannot bring herself to broach the subject and leaves feeling foolish and ill at ease."

    Marina Warner is a deft writer, alert to nuances and equipped with a rich verbal palette of her own. She travels within hailing distance of much fashionable theory (Parisian postmodernism, American "new historicism", and various sorts of feminism), but always charts her own course, steering well clear of jargon. Large-scale theorising never takes over her writing; she is constantly offering the reader something specific to think about, whether it be the special role of "fumigations, perfumes and vapours" in Shakespeare's Cymbeline, or the touching reversal of image and reality in the making of King Kong. ("Off screen, the Kong figure stood at about eighteen inches tall, and was covered in nothing more ferocious than rabbit fur.")

    And yet, and yet . . . For all the incidental riches and manifold fascinations of this book, there is something about reading hundreds of pages of Marina Warner that sets one's critical faculties on edge. Everything is grist to her mill, because everything, it seems, is just another reworking of the same body of material - that great body of myths which has existed since time immemorial, and of which particular works of art or literature are just occasional instances.

    The watchword of this approach is "only connect". But while some of the connections presented here involve real sources and cultural influences, others may amount only to pleasing resemblances occurring in the mind of Ms Warner - and her approach makes it difficult, sometimes, to tell these two cases apart. When she writes, for example, that the Twin Towers of the World Trade Center "may echo the original model for the dollar symbol", is she really claiming that their architect (who merely characterised his design as "the gateway to a large city") was aware of the complex iconological history of the dollar sign, which goes back, apparently, to the Pillars of Hercules portrayed on early Spanish coins?

    If that is her claim, she produces no evidence for it. If it is not, then "echo" here means no more than "remind me of", and the mythology of Hercules has nothing to do with the former skyscape of Manhattan.

    Where works of literature are concerned, this sort of unhistorical pattern-making is easier to engage in; modern theorists have taught us to think that texts can easily contain "meanings" which the author may not have intended. But there is still something limiting about the range of meanings in which Ms Warner specialises - her repertoire of myths. This approach may work well for The Tempest but can hardly get to grips with Shakespeare's Sonnets; it is ideal for Frankenstein and Dracula, but gives little entree to Sense and Sensibility and Middlemarch. Do these non-qualifying works have less to tell us about the human condition?

    Even where the myths themselves are concerned, there is something worryingly homogenising about Marina Warner's treatment of them. Her myths are timeless archetypes: for example, the "combat" myth which she finds in Beowulf, St Michael and the Dragon, and the exploits of Harry Potter. She contrasts this with another archetype, the "transformation" myth (Beauty and the Beast, etc), in which evil, instead of meeting with destruction, is transformed into good.

    What gets lost here is any sense of how deeply different the mental world of ancient mythology was. In what it is no longer politically correct to describe as "primitive" thought, the dichotomies of good and evil did not operate as they do in Tolkien or Spielberg: when the hero killed the beast, he put on its skin, and became more beast-like himself. This was not transformation as opposed to combat: it was combat with a different meaning, one which the Warner approach will always tend to obscure.

    Why Are Students So Ignorant of History? (posted 8-22-03)

    Darryl McGrath, writing in the Buffalo News (August 17, 2003):

    One student was unable to define either the Berlin Wall or the Soviet Union on a recent quiz. Another thought that Hiroshima was the bombing that made the United States enter World War II. A third thought the line "look away, look away" came not from the Confederate national anthem "Dixie" but from "The Star-Spangled Banner."

    Almost none of the students had ever heard of the Eastern Bloc countries, and all of them struggled to list three times in the past 30 years that the United States had entered into an armed conflict. Most of them couldn't name the two presidents who saw the United States through World War II.

    All of these are examples of how my students -- college juniors and seniors -- did on quizzes in the advanced journalism course I taught this summer at the University at Albany.

    There were no trick questions. There was nothing on these quizzes that should have stumped anyone who has read a newspaper, a news magazine or the cover of People magazine once a week -- hey, try once a year -- for the last five years. But every semester, I get more than a few answers of the kind for which the term "doozie" was coined.

    All of us have seen those real-life history essays from hell written by schoolchildren that float around on the Internet as someone's idea of a joke. You know, the ones in which fifth-graders hopelessly mangle events and misunderstand words and write something like this: "Queen Elizabeth the First was the queen of England a long time ago. She was known as the Virginian Queen because she never had any kids."

    As a journalist who has written about urban school districts, I actually don't find it so cute when children become the butt of somebody's joke about the inadequacies of public education. I find it far less funny when those same children are ready to enter the working world 10 years later, and they're still at the same level of historical illiteracy.

    What are we doing wrong with the way we teach history? More specifically, why do college students have so little understanding of the 20th century?

    One problem might be the textbooks, suggests literacy advocate Andrew Carroll, 33, who came to his love of history after college.

    "In my old high school textbook, there was a full page on the Tariff of 1816, and only a sentence on D-Day," Carroll said. "Unless you want to impress someone at a cocktail party, the Tariff of 1816 doesn't have as much significance as D-Day."

    Carroll is the co-founder and executive director of the American Poetry & Literacy Project in Washington, D.C., which distributes free books of poetry in public venues such as train stations. As the editor of two collections of letters written by ordinary Americans, Carroll -- who recalls hating history in high school and college -- has discovered that there are better ways than rote memorization to make history come alive for students.

    Based on his own archiving projects for his books, Carroll suggests encouraging students to devise and execute history projects of their own. Students who hate classroom lectures might come alive as they analyze documents, oversee an oral history assignment or become the curators of a classroom collection of artifacts.

    I'd happily settle for my students demonstrating a grasp of the past 60 years, but the 1990s are about as far back as most can go. For many of them, Vietnam might as well be the Peloponnesian War. Other critical markers of the past century -- the civil rights movement, World War II, the Great Depression -- are fodder for a college-level version of the Queen Elizabeth essay.

    Were the Nazis Christian? (posted 8-22-03)

    Christopher Shea, writing in the Boston Globe (August 17, 2003):

    RECENT DECADES HAVE seen endless interpretive battles over the Nazis. The Holocaust was an evil genius's long-prepared scheme, or an improvised response to developments during World War II -- one of several possible "Final Solutions." German soldiers were only willing to commit genocide after participating in brutal warfare on the Eastern front, or they were eager killers from the start. Churches resisted the Third Reich, or they legitimated it.

    Until now, though, one piece of conventional wisdom has gone unchallenged: that the Nazis disliked Christianity. The standard view has been that while Hitler and his deputies may have feigned respect for religion during their ascent to power, they essentially believed, with Nietzsche, in the "death of God." They were as anti-Christian as the Soviets, but with a pagan twist: Some of them hoped to turn mutant versions of dormant Germanic and Norse legends into a state religion. In the place of the cross, think Wagner and Wotan, swords and horned helmets.

    Richard Steigmann-Gall, an assistant professor of history at Kent State in Ohio, thinks otherwise. In his new book, The Holy Reich (Cambridge), he argues that many Nazis and their followers were sincere Christian believers. Nazism was the opposite of atheistic: It was a "singularly horrific attempt to preserve God against secular society." Indeed, "the battles waged against Germany's enemies constituted a war in the name of Christianity." The modern tendency to paint Hitler and his allies as anti-Christian "kooks," he explains in an interview, is just another way to put an artificial distance between them and us and thereby to avoid the toughest questions about our own susceptibility to evil.

    There were a handful of self-styled pagans in the Nazi regime, notably Heinrich Himmler and Alfred Rosenberg. But "many other Nazis thought their religious views were ridiculous," Steigmann-Gall says. "Hitler didn't hesitate to mock their ideas behind their back." Additionally, the 1939 book "Hitler Speaks," in which the Führer was quoted as saying that his future plans included "stamping out Christianity in Germany, root and branch," is now widely viewed as a fraud.

    Though Hitler did view Roman Catholicism as a threat to German nationalism, Steigmann-Gall points out, his hope until the late 1930s was to unite Protestants under one state church. Plenty were willing to go along, but dissenters, including Martin Niemöller and his "Pastors' Emergency League," fended off the plan. Imprisoned for his efforts, Niemöller was lauded as a hero after the war.

    Steigmann-Gall emphasizes that Niemöller and his peers were far more concerned with preserving their churches' autonomy than with opposing the regime's ideology. In fact, Niemöller voiced vicious anti-Semitic sentiments of his own. Moreover, Steigmann-Gall argues, historians have failed to come to grips with the tight interweaving of Protestantism and German identity. In the 1920s, one of Hitler's intellectual mentors, Dietrich Eckart, talked up parallels between Christianity and muscular nationalism: "In Christ, the embodiment of all manliness, we find all that we need." In 1933, after the Nazis assumed power, ministers argued from the pulpit that it was fitting that this social revolution had come on the 450th anniversary of Martin Luther's birth.

    Is Steigmann-Gall's argument fair? Several critics have pointed out that the conception of Christianity held by most National Socialists was far from a conventional one. As Jack Fischel, a historian at Millersville University, argued in The Weekly Standard last month, "By eliminating the Old Testament from the biblical canon, reinventing Jesus as an Aryan, and depicting the struggles of Christ as the archetype of the eternal battle between the Aryan and the Semite . . . the Nazis altered fundamental Christian doctrine."

    John S. Conway, author of "The Nazi Persecution of the Churches" (1968), agrees: "The kind of Christianity they thought they believed in was so diluted of orthodoxy that it was just a mishmash which even the most liberal Protestant would find difficult to swallow."

    Yet many did swallow it, Steigmann-Gall counters. To say Nazis weren't Christians because their views were a mishmash "is too convenient," he says. "It doesn't explain Nazi conceptions of Christianity. It explains away Nazi conceptions of Christianity."

    Was Canadian Explorer the Model for Coleridge's Rime of the Ancient Mariner? (posted 8-21-03)

    Randy Boswell, writing for CanWest Interactive (August 19, 2003):

    An acclaimed Canadian writer's new book about Samuel Hearne -- the intrepid 18th-century explorer who blazed a trail to the far reaches of northern Canada -- offers a tantalizing theory sure to cause a stir among the world's literary historians.

    Calgary author Ken McGoogan claims it was Hearne and his haunting experiences in the Canadian wilderness that inspired one of English literature's most memorable characters and greatest poems: Samuel Taylor Coleridge's epic ballad The Rime of the Ancient Mariner.

    The revelation is at the centre of Mr. McGoogan's biography Ancient Mariner: The Amazing Adventures of Samuel Hearne, the Sailor Who Walked to the Arctic Ocean, to be published this fall by Harper Collins. While acknowledging that Coleridge's 1798 masterpiece was given shape by a host of mythic stories of adventure and tragedy, Mr. McGoogan says the real individual who finally triggered the poem was Hearne -- an uncommonly educated fur trader who wrote eloquently about the psychological baggage he carried back to Britain from his Arctic odyssey.

    In Coleridge's poem, a grizzled sailor is driven by some strange compulsion to tell and retell a tragic tale from his days on the high seas. The old man recalls how his cruel killing of an innocent albatross once brought a terrible curse upon his ship, leaving all crew but himself dead. The angry ghosts of his mates then conspire with the ocean, the wind and a host of mysterious forces to punish the ancient mariner -- condemning him to a life spent wandering the countryside and perpetually reliving the horrific ordeal unleashed by his own foul deed.

    Scholars have long debated whether some man known to Coleridge had served as a model for his famous mariner. Fletcher Christian, who led the mutiny of The Bounty in 1789, has emerged as a popular choice. And William Wales -- Hearne's best friend and a former math teacher of Coleridge's who had sailed on one of Capt. James Cook's voyages -- has also been identified as a possible inspiration for the poem.

    But Mr. McGoogan writes that "anybody seeking the original mariner should look to Samuel Hearne."

    Hearne became the first European to reach the Arctic Ocean by land when he and his Indian guide, Matonabbee, trekked 2,000 kilometres in 1771 from a trading post at present-day Churchill, Man., to the mouth of the Coppermine River.

    Near the end of the journey, a heartsick Hearne witnessed the unprovoked massacre of a party of Inuit by a group of Dene hunters who had joined his expedition.

    Mr. McGoogan, who previously wrote an award-winning book about the life of Arctic explorer John Rae, says his theory that Hearne was Coleridge's muse for The Rime was sparked by a conversation he had with a friend about his latest biographical project.

    "Call it coincidence or synchronicity," Mr. McGoogan told the Citizen, "but he had recently read a biography of Samuel Taylor Coleridge that mentioned Hearne."

    In fact, Mr. McGoogan learned that Coleridge had possessed a 1796 edition of Hearne's Journey and made extensive notes on the margins of its pages. Coleridge had also written once that "Hearne's deeply interesting anecdotes" about native people's belief in curses and other such phenomena underlay the theme of his poem The Three Graves.

    "Investigating further," Mr. McGoogan said in the interview, "I learned that scholars had sorted out much about the development of The Rime, but had failed to resolve one key mystery: they had argued fiercely but could not agree on who had inspired the Old Navigator, the original ancient mariner figure, a compulsive storyteller haunted by events from his past."

    With some digging, Mr. McGoogan says he connected Hearne with Coleridge in 1791 -- ironically through their mutual association with the ex-sailor Wales, now a math teacher and one of the other possible models for the mariner character.

    "Comparing dates, I realized that Hearne had returned to London and renewed his friendship with Wales at precisely the time Coleridge was a leading student" at the school where Wales worked.

    Wales, Mr. McGoogan writes in the book, was well known as a simple, affable fellow and "not an outsider who puts one in mind of the Ancient Mariner."

    But Hearne, "a loner who settled into London in the late 1780s, bears an uncanny psychological resemblance" to Coleridge's bewitched sailor.

    "Haunted by some harrowing ordeal, an Old Navigator compulsively tells his tale to anyone who will listen," Mr. McGoogan writes. "Consider Samuel Hearne in 1791. A born storyteller, a sometime navigator haunted by horrific memories and guilt feelings ... Hearne's friendship with Wales, Coleridge's published citation, the distinctive psychology of the mariner figure -- all the evidence points in the same direction."

    Rock N Roll Is Part of History (posted 8-21-03)

    Jonathan Yardley, writing in the Washington Post (August 2003):

    Precisely how rock-and-roll got its name probably never will be definitively answered, but there can be no doubt that it entered popular usage thanks to a disc jockey named Alan Freed, a "wild, greedy and dangerous man" who was, in the mid-1950s, "the dominant nighttime personality on radio in New York City." Almost exactly half a century ago he changed the name of his show to "Rock 'n' Roll Dance Party" and began to plug the music of black rhythm-and-blues performers as well as the young whites who began to copy and reinterpret their work.

    The rest is history -- not a blip on the pop-cultural radar screen but a development of major importance in 20th-century American, and eventually world, history. Thus we now have, in Oxford University Press's ongoing series called "Pivotal Moments in American History," Glenn C. Altschuler's account of rock-and-roll's formative years, the decade immediately following the Second World War. Its three predecessors in the series cover the Supreme Court's Brown v. Board of Education, the stock market crash of 1929 and the battle of Antietam, which is to say the editors -- the distinguished historians David Hackett Fischer and James M. McPherson -- put rock-and-roll in rarefied company.

    They are right to do so. "How Rock 'n' Roll Changed America," as Altschuler's subtitle puts it, really should be phrased as a question rather than a declarative statement because the exact nature of its influence is not easily pinned down, but it surely ranks with the movies and television among the most important developments in 20th-century America. Inasmuch as that was the century in which pop culture shoved high culture aside and became (to borrow a pop-cultural slogan) the heartbeat of America, it must be viewed in a far larger context than historians traditionally have been willing to accord such matters.

    The 20th century wasn't just FDR and Ike and JFK. It was also Fats Domino, Chuck Berry and Elvis Presley. Like it or not, the "Great Man" theory of history has to be revised to grant admission to people -- women as well as men -- whose accomplishments and influence fit less comfortably into definitions of greatness shaped by the doings of statesmen and generals. If a strong case can be made (and it can) that the most important American of the 20th century was not any of the aforementioned presidents but Walter Elias Disney, then by the same token the Founding Fathers (along with a few Mothers) of rock-and-roll must also be given their place on history's stage.

    Is the Movie The Magdalene Sisters Anti-Catholic? (posted 8-21-03)

    Phil Kloer, writing in the Atlanta Journal and Constitution (August 16, 2003):

    "If anything, the reality was worse," says Frances Finnegan, the Irish historian who literally wrote the book about the Magdalene Asylums, a gulag-like series of laundries that turned thousands of young Irish-Catholic women into slave labor for more than a century.

    She's talking about the new film "The Magdalene Sisters," which opened Friday in Atlanta and which has been causing a stir for almost a year, ever since it won the top prize at the Venice International Film Festival and was quickly condemned by the Vatican newspaper L'Osservatore Romano as an "angry and raucous provocation."

    The movie is based in historical fact but uses fictional characters to tell the brutal story. Written and directed by Peter Mullan, a fallen-away Catholic, "The Magdalene Sisters" is a searing indictment of Ireland and the Catholic Church in the 1960s and has stirred passions among Catholics worldwide.

    William Donohue, president of the U.S.-based Catholic League, calls the film "anti-Catholic propaganda."

    "This is a game that can be played with any demographic group and with any institution," Donohue says. "Just gather the dirty laundry, pack it tightly, and present it as if it were reality."

    But the Sisters of Mercy of the Americas, whose sisters in Ireland operated some of the homes, issued a statement that the reformatories represent "a time in the history of the Catholic Church and religious orders of which we are not proud" and said they were praying for victims of the system.

    And the Survivors Network of Those Abused by Priests, a Catholic lay group, has praised the movie....

    In the film, Mullan shows us three young Catholic women in Ireland in 1964. Margaret is raped at a relative's wedding and becomes pregnant; Rose has an illegitimate child who is taken away from her; and Bernadette, an orphan, is just pretty and flirty enough that the nuns are convinced she's bound to get into trouble.

    All three are packed off to an industrial laundry where they are actually prisoners, even though they have not been convicted of anything. Attempts to escape are punished severely, friendship and most talking are forbidden, and there is no contact at all with the outside world. For long hours, 364 days a year (they get Christmas off), they scrub laundry by hand in huge pots of steaming water, for no pay.

    Although the young women are composite characters, their plight reflects what happened to thousands of young women over many years.

    "I've met hundreds of women from these places," says Mullan, the writer-director. "Everybody has a different color to her story. Some believe it was more physically violent than in the film. Others believe it wasn't as violent, it was more psychological. But they say it captures the essence."

    'Just locked up'

    Mullan took some of the incidents in the movie --- including an almost unwatchable sequence in which all of the girls are forced to stand at attention, stark naked, in the communal shower while two nuns mock them --- from a British TV documentary in which former Magdalenes testified about their experiences. The backgrounds of the three main characters in the movie are all drawn from women in the documentary.

    "I didn't see anything godly, anything Christly," says Phyllis Valentine, who served in a laundry in the '40s. "All I saw was a bunch of bullies."

    The Magdalene system was named for Mary Magdalene, Christ's follower in the New Testament who for many years was considered a reformed prostitute. (Biblical scholars now say this was actually based on confusion between Magdalene and another New Testament figure named Mary, and that Magdalene was never a prostitute.)

    Originally set up in the middle of the 19th century and dotted throughout Ireland, the homes were designed to reform prostitutes and return them to society with usable skills. But slowly, says Finnegan, they changed into homes for unwed mothers and then prisons for any girl or young woman society deemed to be sexually outside the pale.

    "The Irish people were aware of them,


    The Nearly Invisible Founding Father (posted 8-21-03)

    Scott Brooks, writing in USA Today (August 15, 2003):

    You can be sure that Jacob Broom existed. His signature is right there, on the U.S. Constitution, to prove it.

    Otherwise, the man's a ghost, his likeness forgotten by history.

    The faceless Founding Father stands cast in bronze among the giants of the country's birth at Philadelphia's new National Constitution Center. Jacob Broom, but not Jacob Broom. A forgery.

    Historian Ray Smock knew he would be challenged when the center hired him to consult on the construction of 42 life-size statues of delegates to the 1787 Constitutional Convention. Based on Smock's homework, sculptors knew that Benjamin Franklin had a size-44 waist, that James Madison was the shortest founding father at 5-feet-4, and that George Washington "had a little stomach on him, but not too much."

    But Jacob Broom? "He seems to be the invisible man," Smock says.

    Broom, who represented Delaware at the convention, attended every session during debates over the Constitution, yet notes from the debates make scarcely a mention of him. There are no known portraits of Broom, nor has anyone uncovered a diary or memoir.

    The problem has frustrated artists before. In Howard Chandler Christy's famous 1940 painting, Scene at the Signing of the Constitution, which hangs above the great staircase in the U.S. Capitol, Broom's face is blocked out by another delegate's raised arm.

    Smock, however, didn't have the luxury of an obstructing appendage. This was 3-D. The task, then, was to create a look that would befit a man of Broom's era.

    Broom had to look average. Artists from StudioEIS in Brooklyn built him to stand 5-feet-8, the average height for the time. He wears a pigtail, the most common hairstyle among the delegates. His face bears the undistinguished look of a white man of English ancestry.

    "If he'd been particularly short or tall, I think something would have shown up," Smock says. "The record on him seems so average."

    Meriwether Lewis Deserves Better (posted 8-19-03)

    Kathryn Moore, writing in the Journal of the West (Spring 2003):

    It is a sad irony that perched above the Missouri River in Sioux City, Iowa, is the soaring obelisk to expedition member Charles Floyd while the leader of the Corps of Discovery [Meriwether Lewis] rests under a monument that many observers note strongly resembles a filtered cigarette. Perhaps, as part of the bicentennial activities of the expedition, the National Park Service should move Lewis’s body from the remote location at Grinder’s Stand to the more visible and visited Park Service site at the Gateway Arch in St. Louis. After all, Meriwether Lewis gave years of service and his life to open that gateway to the west.

    Genghis Khan Is Now Regarded as "Cool" in Mongolia (posted 8-19-03)

    Rupert Wingfield-Hayes, writing for the BBC News (August 12, 2003):

    At the opening ceremony for the annual Nadaam festival in the Mongolian capital Ulan Bator, the star of the show comes not from today but from 700 years ago. Bursting on stage in the middle of the national stadium is the unmistakable figure of Genghis Khan, the Mongol warrior who built a vast 13th Century empire.

    In flowing fur robes the great Khan seats himself on a giant golden throne. The crowd roars. On the field his minions bow before him.

    These rather over-the-top theatrics are part of a growing cult built around the image of Genghis Khan.

    Under Mongolia's former Communist rulers, the mere mention of his name was outlawed. Now there is no escaping him.

    An hour from the capital, the annual Nadaam horse races see hundreds of racers gallop across the open steppe kicking up huge clouds of dust. The racers are all children, but the competition is still fierce. Ask any of them who they most admire and the answer comes firing back:

    "Genghis Khan, he was a great Mongolian!" says 13-year-old Batsukh.

    In Ulan Bator there is now a Chinggis Khan Brewery [Chinggis is a closer transliteration of the Mongolian than Genghis], and even a Chinggis Khan night club, packed with young Mongolians knocking back Martinis and glasses of cold beer.

    The Great Khan is not only ubiquitous, he's cool.

    But why are Mongolians so hung up on the blood-thirsty absolute monarch who died more than 700 years ago?

    Sumati, one of Mongolia's most acerbic social critics, suggested one answer: "Mongols at that time were much more powerful as a nation comparing to what they are now. There are 2.5 million Mongols struggling here in a country with a poor economy, poor infrastructure, with strong neighbours.

    "But really there is little knowledge about that time, just more emotional than based on some facts, when people relate to the name of Genghis Khan".

    The Slug that Gave Consumers the Legal Right to Sue Manufacturers for Damages (posted 8-19-03)

    Stephanie Todd, writing for BBC News (August 18, 2003):

    A dead slug found at the bottom of a drink in a Paisley café went on to shape the legal rights of consumers not just in Scotland but all around the world.

    The case began when shop assistant May Donoghue and a friend met for an ice-cream on a Saturday evening at Frankie Minghella's "Tally café" in Wellmeadow Place.

    The friend ordered a "pear and ice" for herself and paid for a ginger beer float for Mrs Donoghue.

    After consuming most of her treat Mrs Donoghue was horrified to discover a partially decomposed slug as she poured out the remains of her drink from its brown frosted bottle.

    She suffered from shock, was later treated for gastro-enteritis, and decided to take action against the café owner.

    But Mr Minghella insisted that as Mrs Donoghue had not bought the drink herself, he did not owe her a "duty of care," and therefore she had no grounds on which to base her complaint.

    His only legal responsibility lay in providing that duty of care to the actual purchaser, not the consumer.

    In a move without legal precedent, Mrs Donoghue decided to sue the manufacturer of the ginger beer, Paisley soft drink maker David Stevenson.

    The case lasted four years as her lawyer, William Leechman, claimed that the slug must have crawled into the bottle while it was in storage, before it was filled.

    His argument centred on the fact that Stevenson had a "duty of care" to those consuming his product, even without a direct contract.

    The case went all the way to the House of Lords, the highest court in the land, before Mrs Donoghue finally won her battle in 1932.

    Lord Atkin, who ruled in the shop worker's favour, summed up the crucial question in the case as "the rule that you are to love your neighbour becomes, in law, you must not injure your neighbour.

    "You must take reasonable care to avoid acts or omissions which you can reasonably foresee would be likely to injure your neighbour."

    He defined a neighbour as "persons so closely and directly affected by my act that I ought reasonably to have them in contemplation as being so affected when I am directing my mind to the acts or omissions which are called in question".

    Mrs Donoghue was awarded £200 in compensation - the equivalent of £7,400 today....

    Millions of damages actions around the world now regularly begin with Lord Atkin's ruling in the Paisley slug case and its 75th anniversary this August is a landmark cherished by lawyers far and wide.

    Why the National Constitution Center Works (posted 8-18-03)

    Adam Cohen, writing in the NYT (August 18, 2003):

    This city's new Constitution museum does not sugarcoat how the Founding Fathers resolved the "awful dilemma" of slavery. One display panel reminds visitors that Georgia and South Carolina threatened to leave the Union if they could not keep importing slaves. Other states were anti-slavery and the compromise, enshrined in Article I, Section 9 of the Constitution, was that the slave trade could continue until 1808, and be taxed at no more than $10 per slave.

    That stunning dispatch is typical of the National Constitution Center, which opened this summer, not far from the Liberty Bell. A museum dedicated to the Constitution sounds either hopelessly geeky (what's on exhibit, legal treatises?) or propagandistic, in the old Soviet style. But the reason this one works, in addition to all the cool technology, is that it presents constitutional history as it was: a long, hard struggle against those who wanted to keep blacks enslaved or segregated, women in their place and outsiders outside.

    It is, in the end, an inspiring place, because it tells a largely triumphal story of rights recognized and new groups woven into the fabric of the nation. But for anyone paying attention to the Bush administration's judicial nominations, this is a sobering time to visit. The White House seems intent on packing the courts with judges who want to reverse — or at least retard — many of the gains recorded here. The museum has a cafe with computers, and visitors are encouraged to e-mail their representatives in Congress. If museumgoers appreciate the constitutional rights they have gained over time, they should ask their senators to vote no on the worst of the Bush nominees.

    As they walk through the museum's chronological display, many visitors are caught off guard at what they see in the early years. Interactive screens ask questions: Are you male? White? Do you own property? They then tell you whether you would have been allowed to vote in different periods. People learn that before the 17th Amendment, United States senators were elected by state legislatures, not by the general populace. I watched as an alarmed woman asked her companion if he was aware of the Chinese Exclusion Act, the infamous 1882 law that kept out immigrants partly to protect America's racial purity.

    But for all the bad news dutifully recorded, there is also a series of extraordinary, interlaced stories of Americans moving forward as the Constitution itself moved forward, by amendment or judicial interpretation. Blacks go from Dred Scott, the 1857 Supreme Court case holding they could not be citizens (illustrated with a pair of shackles), to the Bakke decision upholding affirmative action. Women go from being denied the vote to getting a female Supreme Court justice, Sandra Day O'Connor, in 1981. Some of the greatest victories vindicated rights it's hard to believe we were ever without. In 1964, the Supreme Court ruled that the equal protection clause required "one man, one vote," and barred Alabama from cramming 600,000 people into one legislative district while another had just 15,417.

    Newton Brings Winston Back to Earth (posted 8-16-03)

    John Ezard, writing for the Guardian (London) (August 15, 2003):

    To the rest of the world Sir Isaac Newton - not Sir Winston Churchill - is the Greatest Briton in history.

    The scientist best known for discovering the law of gravity triumphed over the war leader by more than four percentage points in a BBC World poll announced yesterday.

    The poll was conducted in the same way as the domestic vote in the BBC2 Great Britons series in November, in which more than a third of the 1.2 million Britons taking part chose Sir Winston, often regarded as the greatest Englishman since Nelson for his role as prime minister in the second world war years.

    Newton reached only sixth place then, but in yesterday's international poll of 10,000 people the 17th-century mathematician was said to have been chosen by more than a fifth of voters, 21.4% against Churchill's 17%.

    The other main difference between the polls was that the 19th-century railway, bridge and ship engineer Isambard Kingdom Brunel came seventh with world viewers.

    In November's poll of home viewers he was second to Churchill. Both votes followed the transmission of a series of programmes profiling 10 candidates.

    The historian Tristram Hunt, who presented the Great Britons profile of Newton, said he was delighted that his choice had won. "I think it is a proper reflection of his genius that a global audience has voted him the Greatest Briton.

    "Indeed, it was Newton's advances in physics - his understanding of gravity and planetary motion - that have sent satellites into space and allowed the series to be beamed round the globe. The world has now repaid the favour."

    Narendhra Morar, commissioning editor for BBC World of Great Britons, said: "I think one of the reasons for Newton's victory was that the poll was conducted online and he would appeal to younger, computer-savvy voters.

    "It's fascinating that our viewers chose a different greatest Briton to the original series, although Churchill still had a strong following and actually came first among BBC World's expatriate viewers."

    The Next Generation: The Millennials (posted 8-14-03)

    Don O'briant, writing in the Atlanta Journal and Constitution (August 11, 2003):

    They grew up plugged into the Internet, always cooked their popcorn in a microwave oven and never watched television without a remote control. They probably never played Pac-Man, and they think Kansas, Chicago, America and Alabama are places, not musical groups.

    Meet the millennials, the generation that has taken over college campuses in droves as they head for class in the coming weeks.

    Unlike some of their predecessors, they have no interest in sit-ins, anti-war marches or rebellions against authority. The post-1982 kids --- sometimes known as "echo boomers," "Generation Y" or "Generation Next" --- respect their parents, want clear rules set for themselves and are determined to be successful.

    At least that's what authors Neil Howe and William Strauss learned in researching their books, "Millennials Go to College" and "Millennials Rising."

    "They are optimistic, team-oriented and they closely resemble the 'Greatest Generation' that fought World War II," said Howe, a Yale-educated historian who is considered one of the nation's leading experts on generations and historical cycles. "They are planners and goal-setters."

    That's apparently true of many millennials entering metro Atlanta colleges. At an orientation program at Georgia State University last week, John Hardin, 18, announced that he was getting a degree in criminal justice. "I've had my career planned out for as long as I can remember," he said....

    "I believe in a lot of things my parents do," said Josh Evans, 18, a summer intern at CNN who is attending the University of North Carolina this fall. They're both politically liberal, for example. "But I've been able to expand my values, too. Because my parents are from the South and moved to New York, I have this sense of New York smarts and Southern hospitality."

    They may share their parents' values, Howe said, but they don't necessarily want to emulate their behavior. He cites studies showing that rates of tobacco and alcohol use, violent crime, out-of-wedlock pregnancies and suicides are way down among today's teenagers.

    This trend is part of a cycle, Howe said. Like the millennials, the "Greatest Generation" (born 1901-1924) followed the notorious "Lost Generation" (1883-1900), in which drug and alcohol abuse was rampant. The "Silent Generation" (1925-1942) grew up as the children of war and depression.

    Baby boomers (1943-1960) rebelled against the conformity of the Silent Generation. The "Gen-Xers" (1961-1981) were criticized as slackers and grew up in a culture of rising divorce, parental neglect and a "reality bites" economy.

    Changes occur when a generation realizes that it's no longer the generation, Howe said. "The boomers realized that when MTV and hip-hop hit the scene. Xers are beginning to realize that now. The future belongs to the millennials."

    The Kansas Museum Devoted to the History of Flight (posted 8-14-03)

    From CNN (August 13, 2003):

    A moon rock. The only Soviet Vostok spacecraft in the Western world. The Apollo 13 command module, Odyssey. An SR-71 Blackbird spy plane.

    These are among hundreds of items -- from a piece of Dentyne gum flown on the Apollo-Soyuz mission to the gloves Neil Armstrong and Buzz Aldrin used to touch the moon -- at the Kansas Cosmosphere and Space Center.

    It's a world-class museum, a repository for important artifacts from the nation's space program, and an affiliate of the Smithsonian Institution -- and it's located in a town of 40,000 on the Kansas plains, an hour's drive from Wichita.

    The Cosmosphere traces its roots to a local planetarium, built in the 1960s. A space expert who'd once worked there, Max Ary, happened to be serving on a Smithsonian committee to find homes for artifacts released after the Apollo program ended, when the planetarium board asked whether he had any ideas for a museum.

    Under Ary's direction, the Kansas Cosmosphere was launched in 1980, and that local planetarium was transformed into a nationally recognized space museum.

    In addition to viewing objects on display in the Hall of Space Museum, guests can take in a planetarium show, an IMAX movie, and a trip to Dr. Goddard's Lab, where an instructor performs experiments and provides an introduction to rocket science designed for all ages. A T-shirt in the gift shop even says: "Actually, I am a rocket scientist."

    The Cosmosphere also restores spacecraft, and visitors can catch a glimpse of that work being conducted in the Hall of Space Museum.

    Space exploration began, as the museum's exhibits explain it, with Germany's rocket program during World War II. Displays include a set of German production documents along with a V-2 rocket -- the first long-range ballistic missile to be used in combat, launched by the Nazis in 1942 -- that was restored at the Cosmosphere.

    "We try to wrap every piece in some kind of historical, sociological context," said Jeff Ollenburger, president and chief executive of the Cosmosphere.

    In the Cold War gallery, the museum even has sections from the wall that separated East and West Germany at the towns of Boeckwitz (East) and Zicherie (West). There are also examples of early U.S. and Soviet spacecraft, including Vostok and Mercury spacecraft.

    The Odyssey is displayed so that visitors can walk around it and appreciate the damage done to the heat shield of the Apollo 13 command module. The spacecraft carried the mission's three astronauts safely to Earth after an oxygen tank exploded while en route to the moon. The incident was depicted in the 1995 movie "Apollo 13," for which the Cosmosphere helped build props. A video of the film runs continuously as part of the exhibit.

    The Best Book Ever Written About Washington, DC (posted 8-14-03)

    Jonathan Yardley, writing in the Washington Post (August 13, 2003):

    The best"Washington novel" isn't a novel at all. Published six decades ago, Margaret Leech's"Reveille in Washington: 1860-1865" is what academic historians condescendingly call"popular history," written with the novelist's eye for character and telling detail as well as the novelist's command of narrative. The story of the District of Columbia during the Civil War,"Reveille in Washington" is still authoritative as history and is something of a masterpiece of storytelling.

    What matters most for this discussion is that "Reveille in Washington" portrays this city more vividly, accurately and thoroughly than any other book. Period. Its portrait of Washington transcends time. We no longer dump our sewage into the swamp that is now the Mall and we don't ride a horse-drawn streetcar from the Capitol to the White House, but the essential character of the city has not changed from that day to this. Leech got it exactly right.

    The competition, to be sure, isn't much. The number of words that have been written about this city is surely beyond human calculation, but the number of good ones is small. In his preface to the 1991 paperback edition of "Reveille in Washington," James M. McPherson takes note of David Brinkley's "Washington Goes to War," a look at the capital during World War II that is engaging reading but surprisingly narrow in scope. If asked to recommend a good book about the city I always cite the Federal Writers' Project's "Washington, D.C.: A Guide to the Nation's Capital," even though it has been little revised since its original publication in 1942 and is, in any event, lamentably out of print.

    Museum Features History of Contraceptives (posted 8-14-03)

    From ABCNewsline (August 13, 2003):

    Many people think of preventing pregnancy as a modern-day concern dating from the pill's introduction in the 1960s, but a Canadian museum is exploring how crocodile dung, lemons and weasel testicles were among the contraceptives used thousands of years ago.

    Some of the more than 600 condoms, sponges, cervical caps and other devices on display at Toronto's History of Contraception Museum, the world's biggest collection of contraceptive devices, were completely ineffective, while others were simply dangerous or lethal.

    But a few proved to be somewhat effective and showed the sheer inventiveness of people, said Petra Goodhead, the museum's communications coordinator.

    The desire to avoid pregnancy dates back thousands of years, as shown by the reference in the Bible's Book of Genesis to Onan's use of coitus interruptus.

    One of the first-ever written prescriptions for a contraceptive device is a 1550 BC papyrus sheet from Egypt on display, which describes a tampon made of seed wool moistened with ground acacia, dates and honey.

    Despite its primitiveness, the tampon worked in part because acacia ferments into lactic acid, an ingredient in today's spermicides.

    Three thousand years ago in India and Egypt, dung from animals thought to possess mystical powers, such as crocodiles and elephants, was inserted into the woman's vagina prior to intercourse to prevent pregnancy.

    While the smell may have crushed the mood, the dung actually acted as a crude blocking agent and its high acidity was thought to provide some spermicidal reaction, the museum noted.

    Superstitions also played a major role in trying to ward off pregnancy in the Middle Ages, but many were ineffective.

    Women strapped amulets containing mule's earwax, weasel's testicles and a bone taken from the right side of a totally black cat to body parts to avoid pregnancy.

    "If one takes the two testicles of a weasel and wraps them up, binding them to the thigh of a woman who wears also a weasel bone on her person, she will no longer conceive," reads one passage among the museum's 11 display cases, which include authentic contraceptive items.

    Controversy About What to Do With Last Remaining Hut in British POW Camp (posted 8-14-03)

    Simon de Bruxelles, writing for theTimes (London) (August 14, 2003):

    PLANS to turn the last remaining hut at a former prisoner of war camp for captured German officers into a tourist attraction have been attacked by a council leader.

    Jeff Jones, leader of the Labour-run Bridgend County Council in South Wales, says that the hut should be demolished and 24 paintings by the PoWs that are stored there should be sold to the highest bidder.

    Mr Jones was compared to the television character Basil Fawlty yesterday after describing the paintings of nude women as ugly and claiming that the artists must have been members of the "SS painting-by-numbers brigade".

    The wall paintings were saved ten years ago with the help of a £100,000 grant from CADW, the Welsh equivalent of English Heritage.

    The site of the prison camp, at Island Farm in Ewenny, near Bridgend, is scheduled for redevelopment. Campaigners including the Bridgend Civic Trust want Hut 9, the only survivor, to be turned into a museum. But Mr Jones said that neither the hut nor the paintings were worth saving.

    The camp was home to 2,000 German officers between 1943 and 1948. In March 1945 Hut 9 was the scene of the largest breakout of prisoners when 67 men escaped through a tunnel. Most were recaptured within a week.

    Some historians believe that the paintings were created as a diversion to distract the attention of British guards and give prisoners an excuse for being in the hut.

    The most explicit picture, a reclining figure in frilly underwear, was directly above the trapdoor entrance to the escape tunnel.

    Mr Jones said that during their long incarceration the prisoners must have forgotten what a pretty woman looked like. "Looking at the German women they painted, they are so ugly I would have stayed in the camp. I wouldn't have bothered to escape.

    "If people think this is art, then God help us. The funny thing is that the women's bodies are well drawn but the faces are dreadful...these paintings are poor. Anyone could do them."

    Alun Cairns, a Tory member of the Welsh Assembly backing the museum plans, said: "Mr Jones's Basil Fawlty-style outburst shows he is an ignoramus."

    Natalie Murphy, of Bridgend's Civic Trust, said: "Mr Jones's comments are a load of rubbish. The paintings are of great historical value."

    Donald Ritchie: The McCarthy Transcripts (posted 8-14-03)

    Donald Ritchie, associate historian of the U.S. Senate Historical Office, writing in the OAH Newsletter (August 2003):

    In May 2003, when the U.S. Senate released all of the previously closed anticommunist hearings that Joseph McCarthy conducted a half century ago, the event served as a national history lesson. The story spread across the media, and editorial cartoonists drew a natural connection between McCarthy's crusade and the current tension between national security and civil liberties. For historians, the release meant, at long last, access to the largest body of unexamined McCarthy materials other than his own senatorial papers still under seal at Marquette University....

    Reading the transcripts over more than two years left definite impressions of their substance. Convinced that subversion and espionage were rampant in the federal government, Senator McCarthy ascribed policies with which he disagreed to either stupidity or sabotage. He tended not to call an agency's top officials to explain these policies, but worked from the bottom up, starting with lower-level employees. With little hard evidence, he expected to drag confessions out of reluctant witnesses, or to get them to perjure themselves. If a witness took the Fifth Amendment, he interpreted it as an admission of guilt. After a closed hearing adjourned, the chairman would advise witnesses that they were free to talk to the waiting reporters if they chose, but that he would not reveal their names publicly. Most witnesses, shaken by the experience, fled without meeting the press. Senator McCarthy would then step into the hallway and deliver his version of the testimony. Somehow the names of the witnesses regularly made their way into print despite his assurances. A review of reports in the New York Times and the Chicago Tribune--one skeptical and the other supportive of McCarthy's claims--reconstructed what he told reporters. His accounts appear grossly exaggerated when compared to the transcripts.

    How the senator chose which witnesses to take into public session became clearer as the hearings progressed. Those who willingly confessed past politics and named names, and those who took the Fifth Amendment, were more likely to appear at a later public hearing than those who defended themselves rationally and articulately. The testimony of the composer Aaron Copland and of the archivist Sherrod East offers models of the type that McCarthy did not want to confront publicly.

    Revelations of the orchestration of the hearings and the patterns of culling witnesses dominated news coverage of the hearings' release. The New York Times ran an editorial on "Auditioning for Senator McCarthy." Headlines ranged from "McCarthy Rigged His Showdowns" in the New York Post to "Tales from a Redbaiter's '50s Fishing Expedition," in the Washington Post. Radio and television news pursued similar themes....

    McCarthy's defenders cite the VENONA intercepts as evidence that the senator "was on to something." The problem with this defense is that very few of those in VENONA came under McCarthy's scrutiny. The term "McCarthyism" so broadly covers all of the investigations of the 1940s and 1950s that it has melded McCarthy's investigations into those conducted by the House Un-American Activities Committee and the Senate Internal Security Subcommittee. McCarthy's detractors blame him for investigating Hollywood, which he did not. His supporters praise him for investigating Alger Hiss, Julius Rosenberg, and others whom he did not. The Permanent Subcommittee on Investigations' release of these executive sessions finally clarifies the differences between McCarthy and McCarthyism.

    The History of the Family: Changing Interpretations (posted 8-14-03)

    Joan Acocella, writing in the New Yorker (August 11, 2003):

    A good deal of our intellectual life in the past half century has been ruled by the following pattern: First, a French person, with great brilliance and little regard for standards of evidence, promulgates a theory overturning dearly held beliefs. Second, many academics, especially the young, seize on the theory and run with it, in the process loading it with far more emotional and political freight than the French thinker—who, after all, was just “doing theory”—had in mind. Meanwhile, other scholars indignantly reaffirm the pre-revisionist view, and everyone calls for more research, to decide the question. In the third stage, the research is produced, and it confuses everybody, because it is too particular, too respectful of variation and complexity, to support either the nice old theory or the naughty new one.

    Recent histories of the family have followed this itinerary. For years, people believed that the family had been the basis of society from ancient times and that only modern liberal ideas had shaken it. Then, in 1960, the French social historian Philippe Ariès produced a book, “Centuries of Childhood,” arguing that for most of history what we call the family—above all, what we call childhood—did not exist, and that only modern liberal ideas created it. According to Ariès, medieval children joined the world of adults from the moment they were weaned, and no one made a fuss about how little and cute they were. In the seventeenth century, however, this system was overthrown. Children gradually came to be seen as creatures of a different order from adults: innocent, fragile, temptable, and therefore in need of molding. Thus came compulsory schooling, which isolated them from the world. At home, they were again sequestered, in what became the increasingly small, self-involved unit of the nuclear family.

    Ariès did not approve of this development. The pictures of the premodern family suggested by his book—father drinking beer with the neighbors, mother at the spinning wheel, grandmother stoking the fire, one child hanging on her skirts, another peeing in a corner or chasing a pig, maidservants flirting with apprentices—are full of Bruegelesque life and variety, tumble and zest. The “discovery of childhood,” Ariès says, deprived the child of all that and “inflicted on him the birch, the prison cell—in a word, the punishments usually reserved for convicts.” At the same time, children became the objects of “obsessive love,” together with incessant demands for conformity to a family ideal. The Europeans became the Portnoys.

    Ariès’s book was translated into English in 1962, and by the seventies many scholars were writing about the “discovery of childhood,” without sharing their French colleague’s regrets about it. Amid the politics of the period, any discovery of childhood looked like a good thing. Edward Shorter, in his “Making of the Modern Family” (1975), deplored the “ghastly slaughter of the innocents” that in his view constituted child care in the old days. Lawrence Stone, the author of “The Family, Sex, and Marriage in England, 1500-1800” (1977), claimed that the conditions of premodern childhood “created an adult world of emotional cripples, whose primary responses to others were at best a calculating indifference.” These men placed the turning point in the eighteenth century rather than the seventeenth, but they, too, felt that the European family eventually underwent a “revolution in sentiment.”

    In opposition to this so-called sentimentalist school, a “continuity” school reared its aggrieved head. According to Steven Ozment, a continuity theorist, no revolution in sentiment ever occurred, or needed to; there had been a culture of childhood ever since there were documentable children. And so, while Shorter’s premodern children were being slaughtered and Stone’s crippled, Ozment’s—as described in his book “Ancestors” (2001)—were “playing marbles, shooting dice, jumping rope, flying kites, dragging tame birds and reluctant beetles about on strings.” Many people, including the disputants, called for more research. “There is little doubt,” Stone wrote in 1974, “that in ten years time the picture will be much clearer, and that all of us who have ventured to advance theories about this subject will have been proved wrong.”

    They were, sort of. The dispute did generate a vast amount of research, which is now being synthesized in Yale’s “The History of the European Family.” The first volume, “Family Life in Early Modern Times, 1500-1789” ($35), came out in 2001. The second, “Family Life in the Long Nineteenth Century, 1789-1913” ($35), was published this past spring. The final volume, on the twentieth century, is due in January. This series is a huge enterprise, with essays by twenty-nine scholars summarizing recent findings on marriage, divorce, child-rearing, kinship groups, and on and on. But it’s not as though the picture has become clearer. Many of the theory-wielders—Ariès above all—had been casual about standards of evidence. The researchers who came after them were much more responsible. They didn’t rely heavily on personal documents such as letters and diaries, because such evidence tells us only about the small minority of people (mostly moneyed, mostly male) who, in this stretch of time, could write. They didn’t look at paintings, because paintings are made by artists, who have their own notions. They concentrated on more objective, and hence more representative, sources: wills, inventories, tax rolls. But, as Ozment has pointed out, the more representative a source, the more shallow and impersonal the information it conveys. You have your choice, he said: “deep interpretation” or “deep sourcing.” The Yale books, from top to bottom, are deep sourcing, and therefore, while the editors, David I. Kertzer and Marzio Barbagli, claim that their findings support the continuity hypothesis, that is true only because the thing the writers report again and again—variation, difference—is bound to yield some evidence for continuity, as well as for everything else.

    Click here to return to top of page.

    A Scientific Explanation for the Trances of the Oracle of Delphi (posted 8-14-03)

    John R. Hale, Jelle Zeilinga de Boer, Jeffrey P. Chanton and Henry A. Spiller, writing in Scientific American (August 2003):

    The oracle of Delphi functioned in a specific place, the adyton, or "no entry" area of the [Apollo] temple's core, and through a specific person, the Pythia, who was chosen to speak, as a possessed medium, for Apollo, the god of prophecy. Extraordinarily for misogynist Greece, the Pythia was a woman. And unlike most Greek priests and priestesses, the Pythia did not inherit her office through noble family connections. Although the Pythia had to be from Delphi, she could be old or young, rich or poor, well educated or illiterate. She went through a long and intense period of conditioning, supported by a sisterhood of Delphic women who tended the eternal sacred fire in the temple.

    Tradition attributed the prophetic inspiration of the powerful oracle to geologic phenomena: a chasm in the earth, a vapor that rose from it, and a spring. Roughly a century ago scholars rejected this explanation when archaeologists digging at the site could find no chasm and detect no gases. The ancient testimony, however, is widespread, and it comes from a variety of sources: historians such as Pliny and Diodorus, philosophers such as Plato, the poets Aeschylus and Cicero, the geographer Strabo, the travel writer Pausanias, and even a priest of Apollo who served at Delphi, the famous essayist and biographer Plutarch....

    Generations of scholars accepted these accounts. Then, in about 1900, a young English classicist named Adolphe Paul Oppé visited excavations being carried out by French archaeologists at Delphi. He failed to see any chasm or to hear reports of any gases, and he published an influential article in which he made three critical claims. First, no chasm or gaseous emission had ever existed in the temple at Delphi. Second, even if it had, no natural gas could produce a state resembling spiritual possession. Third, Plutarch's account of a Pythia who had a violent frenzy and died shortly afterward was inconsistent with the customary description of a Pythia sitting on the tripod and chanting her prophecies. Oppé concluded that all the ancient testimony could be explained away. ...

    [The authors go on to explain that in the 1980s faults were discovered in the area of the temple.]

    As careful geologic research and reasoning solved riddle after riddle, we were still left with the question of what gases might have emerged. De Boer learned that geologists working in the Gulf of Mexico had analyzed gases that bubbled up along submerged faults. They had found that active faults in this area of bituminous limestone were producing light hydrocarbon gases such as methane and ethane. Could the same have been true at Delphi?

    To find out, we asked for permission to take samples of spring water from Delphi, along with samples of the travertine rock laid down by ancient springs. We hoped to discover in this porous rock traces of the gases that were brought to the surface in earlier times. At this point, Chanton, who is a chemist, joined the team. In the travertine samples collected by de Boer and Hale, he found methane and ethane, the latter a decomposition product of ethylene. Chanton then visited Greece to collect water samples from springs in and around the oracle site. Analysis of the water from the Kerna spring in the sanctuary itself revealed the presence of methane, ethane and ethylene. Because ethylene has a sweet odor, the presence of this gas seemed to lend support to Plutarch's description of a gas that smelled like expensive perfume.

    To help interpret the possible effects of such gases on human subjects in a confined space such as the adyton, Spiller, a toxicologist, joined the project. His work with "huffers"--teenage drug users who get high on the fumes from substances such as glue and paint thinner, most of which contain light hydrocarbon gases--had shown a number of parallels with the behavior reported for the trance state of the Pythia.

    Spiller uncovered even more parallels in the reports of experiments on the anesthetic properties of ethylene carried out more than half a century ago by the pioneering American anesthesiologist Isabella Herb. She had found that a 20 percent mixture of ethylene produced unconsciousness but that lower concentrations induced a trance state. In most cases, the trance was benign: the patient remained conscious, was able to sit up and to respond to questions, experienced out-of-body feelings and euphoria, and had amnesia after being taken off the gas. But occasionally Herb would see violent reactions, the patient uttering wild, incoherent cries and thrashing about. Had a patient vomited during such a frenzy and aspirated some of the vomit into the lungs, pneumonia and death would inevitably have followed. Thus, according to Spiller's analysis, inhaling ethylene could account for all the various descriptions of the pneuma at Delphi--its sweet odor and its variable effects on human subjects, including even the potential for death.

    Click here to return to top of page.

    The First Drink of Wine--Ever (posted 8-14-03)

    Richard Monastersky, writing in the Chronicle of Higher Education (August 14, 2003):

    The priceless samples rest inside a recycled cardboard box in the corner of the laboratory. Patrick E. McGovern reaches in and pulls out several plastic bags, each holding a sliver of dull orange pottery, baked some 9,000 years ago by people living in China's Henan province.

    To the uninitiated, the shards look like any others from the ancient world. But in his lab, at the University of Pennsylvania's Museum of Archaeology and Anthropology, Mr. McGovern has detected unusual compounds within the clay that tell of a revolution in the making. The pottery contains the oldest documented evidence of an alcoholic drink, one most likely made by fermenting grapes with rice and honey. In other words, the earliest known experiments with wine.

    "We feel it's a very solid case for some sort of mixed fermented beverage," says Mr. McGovern, a senior research scientist at the museum.

    Elsewhere in the lab, he has shards from the Caucasus Mountains of Georgia that point to winemaking in that region more than 8,000 years ago. "In both areas," he says, "people were experimenting with different kinds of natural products to develop fermentation systems and see which things worked best. I think it could have been a very exciting time for producing different kinds of fermented beverages."

    Mr. McGovern, who might be called an oeno-archaeologist, practices a subspecialty with just a handful of colleagues around the globe. Although he fell into the subject by accident 15 years ago, the work takes full advantage of his roving academic past, in which he wandered from chemistry to neuroscience to archaeology, with a brief stop in between at a German vineyard.

    This fall Princeton University Press will publish Ancient Wine: The Search for the Origins of Viniculture, which details Mr. McGovern's sleuthing through the ages for clues to how people began fermenting grapes. In the book, he provides the first reports on both the Chinese and the Georgian finds, which together document that people started their love affair with wine at least as far back as the beginning of the Neolithic era, near the dawn of agriculture and before the emergence of the first cities. "The history of civilization, in many ways," he writes, "is the history of wine."

    Click here to return to top of page.

    Did Drake Explore the West Coast of the United States? (posted 8-12-03)

    Dinitia Smith, writing in the NYT (August 9, 2003):

    On the evening of Sept. 26, 1580, the vessel Golden Hind sailed into England's Plymouth Harbor commanded by the privateer Francis Drake. The ship's ballast had been thrown overboard and replaced by 26 tons of silver and hundreds of pounds of gold that Drake had plundered from Spanish galleons on the Pacific Coast of South America.

    According to the folklore of southwestern England, a voice from the ship called out to fishermen tending their nets, "Is the queen still alive?" referring to Drake's patron, Elizabeth I. It had been three years since Drake had set out on his journey. He had gone 40,000 miles, the first English sea captain to circumnavigate the globe. And now he was returning home a glorious hero.

    After Drake landed, there were celebrations. He was eventually knighted by Queen Elizabeth. But a veil of secrecy descended over his journey. The Spanish ambassador, Bernardino de Mendoza, wrote to King Philip II of Spain that Drake's men had been warned "on pain of death" not to divulge details of their trip, and Drake's logs and charts were impounded. Among the many mysteries for contemporary scholars is a gap in Drake's account of his whereabouts from April 1579, when he left the Pacific Coast of Mexico, to November when he landed in the East Indies.

    Now an independent Canadian scholar, Samuel Bawlf, says he thinks he has solved the mystery. Mr. Bawlf, the author of "The Secret Voyage of Sir Francis Drake," newly published by Walker & Company, says Drake had actually been on a secret mission for Elizabeth to find the Pacific entrance to the Northwest Passage, the coveted trade route that would link Europe to the treasures of the Orient. And in doing so, Mr. Bawlf says, he had sailed much farther north than anyone had ever dreamed, to Alaska, 200 years before the first recorded European explorers, including Capt. James Cook.

    If the thesis is true -- and there are skeptics and supporters who are debating the question -- "it would make Drake the greatest explorer in history, until that time at least," Mr. Bawlf said in an interview. He is a former minister of conservation for British Columbia in charge of archaeological and historical sites, and he lives on Salt Spring Island, British Columbia.

    Mr. Bawlf contends that Drake, a proud, boastful man, was obliged to keep his discoveries secret as a result of a coverup ordered by Elizabeth.

    The year before Drake's voyage, a sea captain named Martin Frobisher had discovered what he believed was the Atlantic entrance to the Northwest Passage, west of Greenland. And now, Mr. Bawlf says, Drake believed he had discovered its western entrance, then known by the mythical name Strait of Anian, at the site of the present day Chatham Strait, which runs north through the Alaskan panhandle at 56 degrees north latitude. Mr. Bawlf contends that Elizabeth wanted to hide Drake's discoveries from her Spanish rival, Philip, to protect the valuable trade route. Over the next 15 years there were several follow-up attempts to reach the North Pacific and Drake's entrance, but they were unsuccessful during Elizabeth's reign.

    For the past five years Mr. Bawlf, 59, has engaged in a perhaps quixotic mission to prove his point, combing old maps in the British Library, the Huntington Library in California and other institutions for evidence encrypted in them of the true story of Drake's great journey.

    Click here to return to top of page.

    Gar Alperovitz: NYT's Kristof Is Wrong (posted 8-12-03)

    Gar Alperovitz, Letter to the Editor of the NYT (August 11, 2003):

    Nicholas D. Kristof (column, Aug. 5) writes that modern Japanese scholarship "has bolstered" the "U.S. moral position" in the atomic bombings of Hiroshima and Nagasaki. But this is true only if one asks the wrong question and ignores the critical issue.

    Of course, the Japanese military wanted to continue to fight; no one disputes this. Would it have been allowed to do so, however, once the impact of the Red Army's entrance into the war had been digested — and had assurances been given to Emperor Hirohito (as was done in August 1945)?

    Not in the judgment of American intelligence as early as April 1945 — which is why the United States arranged for Russian agreement to enter the war in early August (until the atomic bomb gave it an alternative). And not in the judgment of other modern Japanese historians.

    Not only was the Red Army likely to chew up what remained of Japan's best divisions in Manchuria (thereby hitting the military where it counted); an attack by the Soviet Union was seen as a mortal political threat to the entire imperial system.

    The judgment of the vast majority of top American military leaders was that the bombings were unnecessary — including (among many others) Generals Eisenhower, MacArthur, LeMay and Arnold, and Adm. William D. Leahy, chairman of the Joint Chiefs of Staff and chief of staff to the president.

    Click here to return to top of page.

    Admiral Perry's 150th Anniversary: How the Japanese Remember His Visit (posted 8-12-03)

    Norimitsu Onishi, writing in the NYT (August 12, 2003):

    As events this summer commemorating the 150th anniversary of [Admiral] Perry's arrival have made clear, he and his black ships still have a profound resonance. Even more than Gen. Douglas MacArthur, who led the American occupation of Japan after World War II, Perry is perhaps the most widely known foreign historic figure in Japan — which might come as a surprise in the United States.

    "Perry? He was an explorer, wasn't he? That's all I know," said Leslie Fields, 41, a software engineer from San Diego, who works at the American naval base, about 12 miles south of Yokohama. "I have no idea what this parade is about."

    Americans might also be surprised by the lack of emphasis here on the Pearl Harbor attack. Recent editorials here hardly mentioned it in a review of the major events of the last 150 years. One of the most widely used government-endorsed junior high school textbooks devotes three pages to Perry, but only three lines to Pearl Harbor.

    If relations with the United States began for the Japanese with Perry, they began for Americans with Pearl Harbor, said Kenichi Matsumoto, a professor of the history of Japanese thought at Reitaku University in nearby Tokyo.

    "For the United States, Pearl Harbor was a traumatic experience, but the Japanese don't fully understand its significance," he said. "On the other hand, Americans don't want to dwell on Perry's visit to Japan because it doesn't fit well with America's version of history. This gap in perception is very large."

    Pearl Harbor does not dovetail with Japan's emphasis on its own suffering in World War II. That focus makes it easier to underplay its aggressions against the United States and other Asian nations.

    In the United States, historians say, Perry has sunk into obscurity partly because he conjures up an imperial image that makes Americans uncomfortable. When Perry came here, America was in an expansionist mood, moved by Manifest Destiny to export Christianity, civilization and commerce. Historians agree, though, that President Millard Fillmore sent Perry to Japan largely because America needed oil — though back then it was the oil from whales found off the Japanese coast. It was also competing against Britain for trade in China and needed Japan as a base. Perry arrived here with four ships mounting more than 60 guns and nearly 1,000 men, carrying a list of demands from Fillmore.

    The Japanese were overwhelmed by Perry's firepower. When he returned the next year, the Japanese yielded and signed a so-called treaty of amity and commerce. The treaty thrust Japan — which until then had banned travel abroad on punishment of death — onto the world stage.

    To this day, the difference in perspectives on the beginning of American and Japanese relations colors each society's understanding of the other, historians say. The perceptions remain in what Shu Kishida, a professor at Wako University in Tokyo who specializes in applying psychoanalysis to history, calls "a people's subconscious memory."

    To Americans, Japan is the sneaky country behind Pearl Harbor, an image that re-emerged during trade friction in the 1980's. To Japan, the United States is an insensitive brute.

    "Japan was saying, `No,' " Mr. Kishida said of Perry's demands, "but was forced to open up its ports, like a woman who was raped." That impression has lingered, he added.

    But most Japanese regard Perry's arrival as the basis of present friendly ties with the United States, said Hiroshi Sato, 45, who teaches history at Tsukuda Junior High School in Tokyo. In his class, he dedicates three to four hours to Perry's visit.

    Click here to return to top of page.

    A New Theory Suggests the Inca Actually Did Possess a Written Language (posted 8-12-03)

    John Noble Wilford, writing in the NYT (August 12, 2003):

    Of all the major Bronze Age civilizations, only the Inca of South America appeared to lack a written language, an exception embarrassing to anthropologists who habitually include writing as a defining attribute of a vibrant, complex culture deserving to be ranked a civilization.

    The Inca left ample evidence of the other attributes: monumental architecture, technology, urbanization and political and social structures to mobilize people and resources. Mesopotamia, Egypt, China and the Maya of Mexico and Central America had all these and writing too.

    The only possible Incan example of encoding and recording information could have been cryptic knotted strings known as khipu.

    The knots are unlike anything sailors or Eagle Scouts tie. In the conventional view of scholars, most khipu (or quipu, in the Hispanic spelling) were arranged as knotted strings hanging from horizontal cords in such a way as to represent numbers for bookkeeping and census purposes. The khipu were presumably textile abacuses, hardly written documents.

    But a more searching analysis of some 450 of the 600 surviving khipu has called into question this interpretation. Although they were probably mainly accounting tools, a growing number of researchers now think that some khipu were nonnumerical and may have been an early form of writing.

    If deciphered, the knotted-string devices could perhaps reveal narratives of the Inca Empire, the most extensive in America in its glory days before the Spanish conquest in 1532.

    If khipu is indeed the medium of a writing system, Dr. Gary Urton of Harvard says, this is entirely different from any of the known ancient scripts, beginning with the cuneiform of Mesopotamia more than 5,000 years ago. The khipu did not record information in graphic signs for words, but rather in a kind of three-dimensional binary code similar to the language of today's computers.

    Dr. Urton, an anthropologist and a MacArthur fellow, suggests that the Inca manipulated strings and knots to convey certain meanings. By an accumulation of binary choices, khipu makers encoded and stored information in a shared system of record keeping that could be read throughout the Inca domain.

    In his book "Signs of the Inka Khipu," being published next month by the University of Texas Press, Dr. Urton said he had for the first time identified the constituent khipu elements. The knots appeared to be arranged in coded sequences analogous, he said, to "the process of writing binary number (1/0) coded programs for computers."

    When someone types e-mail messages, they exist inside the computer in the form of eight-digit sequences of 1's and 0's. The binary coded message is sent to another computer, which translates it back into the more familiar script typed by the sender. The Inca information, Dr. Urton said, appeared to be coded in seven-bit sequences.

    Each sequence could have been a name, an identity or an activity. With the possible variations afforded by string colors and weaves, Dr. Urton estimated, the khipu makers could have had at their command more than 1,500 separate units of information. By comparison, the Sumerians worked with fewer than 1,500 cuneiform signs, and Egyptian hieroglyphs numbered under 800.
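
    The arithmetic behind those figures is easy to check. Below is a minimal Python sketch of the combinatorics, assuming seven binary choices per knot sequence and treating cord color as a simple multiplier; the feature names are hypothetical placeholders, since the article does not enumerate the specific features Dr. Urton identified.

        from itertools import product

        # Seven binary features per knot sequence. The labels below are
        # illustrative stand-ins, not the features Urton actually names.
        FEATURES = ["material", "spin", "ply", "attachment",
                    "knot_direction", "knot_class", "cord_face"]

        # Every combination of seven binary choices is one "seven-bit
        # sequence" in the article's sense: 2**7 = 128 of them.
        base_sequences = list(product((0, 1), repeat=len(FEATURES)))
        assert len(base_sequences) == 128

        # Treating cord color as a multiplier on that base inventory, a
        # dozen conventional colors already pushes the total past 1,500,
        # the order of magnitude the article reports.
        for n_colors in (8, 12, 24):
            print(f"{n_colors} colors -> {128 * n_colors} distinct units")

    With twelve color values the inventory reaches 1,536 units, one plausible route to the article's "more than 1,500" figure; the comparison with the Sumerian and Egyptian sign counts then follows directly.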

    Click here to return to top of page.

    Museum: How Belgium Brought Civilization to the Congo (posted 8-11-03)

    Craig Winneker, writing in the Washington Post (August 9, 2003):

    I recently revisited Belgium's Royal Museum for Central Africa for the first time since I went there on a class trip more than 25 years ago, when my family lived here for a time. Turns out I hadn't missed much.

    Housed in a splendid Louis XIV-style palace on the outskirts of Brussels, the museum has proclaimed the glories of Belgian rule in the Congo for more than a century. And during that time, its impressive collection has remained largely unchanged. Imposing gilded statues depict Belgium's influence in Central Africa. "Belgium brings civilization to the Congo," reads the inscription on one, showing a priest ministering to an adoring Pygmy tribesman. Others illustrate the "security" and "well-being" that were brought to the natives by their colonial masters.

    I did notice one new feature, though: a small posterboard sign that appeared earlier this year, with no fanfare, in one of the more controversial exhibits. The "Gallery of Remembrance" is a shrine to Belgians who died while serving in Central Africa. Its walls are painted with the names of some 1,500 fallen military officers, bureaucrats, traders and pioneers. A bronze plaque salutes the martyred leaders of an anti-slavery campaign. But now, the new sign offers a one-paragraph addendum in French, Dutch and English:

    "The attentive visitor," it reads, "will not fail to notice that, at the time, no need was felt to question the Belgian presence in Central Africa. There was no mention of the Congolese victims, for instance. The viewpoint is exclusively European and concentrates on a few historical episodes. The underlying reality of colonial events was completely ignored. The memorial to the campaigns against slavery, unveiled in 1959, is a rather late example of mythmaking."

    For the first time since it was founded in 1897 by King Leopold II, the museum is finally getting ready to recognize that "underlying reality" and "mythmaking." It may take an unusually "attentive visitor" to find it, but the temporary plaque is a small step in a two-year project aimed at transforming the institution to acknowledge the brutal realities of Belgium's colonial history that have come to light in recent years. It is the first public measure the country is taking to make its citizens aware of the atrocities committed a century ago.

    These long overdue plans have stirred controversy. Some Belgians argue that changing the museum too much will obliterate what they see as their country's honorable role in improving life in Central Africa. Others say the institution, considered one of the foremost of its kind in the world, should be preserved as it is -- as the embodiment of a particular kind of worldview during an important period of history. I have a more postmodern view: The best thing to do with this museum might be to display it inside another museum.

    Click here to return to top of page.

    Fresh Evidence that Caligula Really Was Crazy (posted 8-8-03)

    John Hooper, writing in the Guardian (August 8, 2003):

    British and American archaeologists digging in the Roman Forum said yesterday they had uncovered evidence to suggest that the emperor Caligula really was a self-deifying megalomaniac, and not the misunderstood, if eccentric, ruler that modern scholars have striven to create.

    For several decades historians have been lifting their eyebrows at the Latin authors' portrait of Caligula as a madman who came to believe he was a god.

    But Darius Arya of the American Institute for Roman Culture said a 35-day dig by young archaeologists from Oxford and Stanford universities had reinstated a key element in the traditional account.

    "We have the proof that the guy really was nuts," said Dr Arya as he sat in the shade of a clump of trees a few metres from the excavation. Suspicious of the very unanimity of the ancient sources, modern scholars have suggested they could have been politically biased.

    They have argued, for example, that Caligula's renowned plan to make his horse a consul was really a joke that his subjects failed to comprehend. And, for many years, they have taken a sceptical view of a claim, by Suetonius, that he incorporated one of Rome's most important temples into his own palace.

    Writing about 70 years after Caligula's assassination, Suetonius recorded that the emperor "built out a part of the palace as far as the Forum, and making the temple of Castor and Pollux its vestibule, he often took his place between the divine brethren, and exhibited himself there to be worshipped."

    "This was so outrageous - an act of such impiety, such hubris - that a lot of historians have had great difficulty in believing it," said archaeologist Andrew Wilson, the leader of the Oxford University team.

    Earlier digs in the area showed that a street had run between the two buildings in both the 1st and 3rd centuries AD, that is, both before and after Caligula's reign.

    This gave rise to a theory that the emperor had merely built a bridge between them, even though another ancient source provided an explanation for the apparent contradiction: that the original street was re-established when Caligula's successor, Claudius, destroyed his blasphemous extension.

    Standing in the broiling sun of a Roman August afternoon, Dr Wilson said yesterday that the latest excavations had uncovered no trace of a bridge, but they had found more and more evidence of structures within the site of Caligula's palace that ran at an identical angle to others abutting the site of the temple of Castor and Pollux.

    The dig had also revealed sewerage lines running at the same angle. "The Caligulan foundations imply walls that seem to be projected across the line of the street as far as the temple," Dr Wilson said.

    Click here to return to top of page.

    Did the Beatles Kill the USSR? (posted 8-8-03)

    Mikhail Safonov, senior researcher at the Institute of Russian History at St Petersburg, writing in the Guardian (August 8, 2003):

    [John] Lennon ... murdered the Soviet Union.

    He did not live to see its collapse, and could not have predicted that the Beatles would cultivate a generation of freedom-loving people throughout this country that covers one-sixth of the Earth. But without that love of freedom, the fall of totalitarianism would have been impossible, however bankrupt economically the communist regime may have been.

    I first heard of the group in 1965. An article about some unknown "Beatles" was published in the journal Krokodil. The name grated on the ear, perhaps due to its phonetic content, associated in my mind with whipped cream (vzbeetiye slivki) and biscuits (beeskvit).

    The article described how a BBC announcer had told the world that Ringo Starr had had his tonsils removed - but had pronounced tonsils so indistinctly that listeners thought the drummer had had his toenails removed, and how the Liverpool postal service was having to work overtime due to the number of letters requesting the toenails in question.

    The first song I heard was on Leningrad radio. It was A Hard Day's Night. I didn't like it - it seemed monotonous, and I doubted if it was worth all those "toenail" requests. Then a collection of songs was released in the German Democratic Republic, taken from the first album. It was impossible not to listen when all anyone was talking about was the Beatles. The music came to us from an unknown, incomprehensible world, and it bewitched us. In his 1930s novel, The Master and Margarita, Mikhail Bulgakov says that love fell upon the heroes like a mugger with a knife from a side street. Something similar happened to the souls of our "teenagers" (a word we learned thanks to the Beatles).

    In the Soviet Union, the Beatles were proscribed. In the early days, infatuation with the Beatles implied an unconscious oppositional stance, more curious than serious, and not at all threatening to the foundations of a socialist society. For instance, during an astronomy lesson, my schoolmate had to give a talk about a planet. Having recited everything that he had copied from a journal, he made his own addition: "And now the latest discovery of four English astronomers - George Harrison, Ringo Starr (and the two others) - the orbit of such and such planet is approaching the Earth, and in the near future, there may well be a collision." The physics teacher barely knew more than we did about the planets. So she listened to this talk of "a possible collision" unsuspecting. She had not heard of these "astronomers". She hadn't even heard of the Beatles.

    My classmates formulated their love for the Beatles in the following manner: "I would have learnt English in its entirety, exclusively from the things that Lennon spoke about." This was a paraphrase of the words of Mayakovsky inscribed on a stand in the literature classroom: "I would have learnt Russian in its entirety, exclusively from the things that Lenin spoke about." In the 1960s, you could not be imprisoned for changing the name of Lenin to that of Lennon, but trouble awaited anyone who blasphemed against the name of the immortal leader: problems dished out by the Komsomol (Communist Union of Youth) could wreck your career. And so, bit by bit, we Lennon fans became ensnared in doubting the values that the system was trying to inculcate.

    To make the slogan about the English language come literally true would have been impossible, as we were learning in a class of 40 pupils and had just two hours of foreign language teaching per week. We wrote down the texts of the English songs using Russian letters. Many of us didn't understand their meanings, but sang them all the same. There was a fashion to have Beatles hairstyles. Young people, "hairies" as the old people called them, were stopped on the street and had their hair cut in police stations. I myself completed my schooling with a grade that qualified me for a silver medal. But, with my own Beatles-inspired haircut, I might not be awarded my medal - I needed a "state hairstyle", with my hair brushed back and washed in a sugar solution. After the leavers' evening, at which I was solemnly awarded my school-leaver's certificate, I was walking out of the Palace of Culture when I was seized by police officers and pushed into their pillbox - all because of my haircut. I said: "What are you doing? Do you want to spoil the best day of my life? I have just been awarded a medal and you push me into a pillbox." The policemen began to laugh at me. "A hairy hippy has been awarded a medal - what a laugh!"

    One of the Leningrad schools staged a show trial against the Beatles. A mock public prosecutor was appointed, and the proceedings were broadcast on the radio. The schoolchildren proclaimed themselves outraged by all that the Beatles had done. The verdict of the trial was that the Beatles were guilty of anti-social behaviour. All this reeked of 1937. But even in Stalin's time, show trials were not held for famous foreigners, who had become almost an integral part of the way of life of the Russian people.

    Yet the more the authorities fought the corrupting influence of the Beatles - or "Bugs" as they were nicknamed by the Soviet media (the word has negative connotations in Russian) - the more we resented this authority, and questioned the official ideology drummed into us from childhood. I remember a broadcast from a late 1960s concert of some high Komsomol event. Two artists in incredible wigs, with guitars in hand, walked around the stage back to back, hitting one another and making a dreadful cacophony with their instruments. They sang a parody of a Beatles tune: "We have been surrounded by women saying you are our idols, saying even from behind I look like a Beatle! Shake, shake! Here we don't play to the end, there we sing too much. Shake, shake!"

    Click here to return to top of page.

    Anti-Americanism's Long Roots (posted 8-8-03)

    Sarah Schmidt, writing for CanWest News Services (August 7, 2003):

    It seems 19th-century settlers could pick out the ugly American in a pack of polite Canadians.

    In letters unearthed by historian Ruth Ann Harris, these distinctly North American personas leap off the pages written by Irish immigrants communicating with family members back home.

    "The general attitude was that the Canadians were more civilized than were Americans," the Boston College professor writes in a paper to be presented tomorrow at the Reading the Emigrant Letter conference at Carleton University in Ottawa.

    "In 10 days travelling through the state of New York I came in contact with a good many characters of the Yankees and from New York to Canada shore I did not meet one that I could like," Henry Johnson wrote in his first letter to his wife in October 1848.

    Johnson left Ireland on June 29, 1848, in search of work in New York. After a short stint in the United States, he travelled to Hamilton in Canada West.

    "They have no regard for religion, children have very little respect for their parents, and they carry what they call the spirit of independence so far as even in their speech to defy God himself. The Canal boats carry on their trade on Sunday same as another day.

    "The people of Canada are quite different," Johnson continued.

    "They are all Scotch and North of Ireland people, homely and civil in this part. When I came into it first I felt almost as if I was getting home again. It is most decidedly a better place to rear a family in than the States if you wish them to have any regard to religion or any respect for their parents."

    Mary Gayer Anderson's disdain of all things American was even more pointed in her series of letters to her mother in Belfast.

    At 27, Anderson followed her husband, a land speculator, to North America in 1884. Along with their four children, the Andersons visited Ontario, Quebec and British Columbia, but lived in Illinois, Kansas, Tennessee and Alabama. Anderson had very different impressions of the two countries.

    "It is a wonderful place, but I am sure as we have heard, very wicked," she wrote to her mother in September 1884, of Chicago."One looks in vain for the face of a gentleman or lady. All mean, common and ruffianly looking."

    The sharpest criticism Anderson used against a contemporary was that she was "dreadfully American." Also, "the women here have the shrillest voices I ever heard," she wrote to her mother.

    Anderson didn't get the same vibe while travelling through Canada. "I have had a dose of Americanism since coming here, and it is nauseous. Canada was so different," she wrote in the fall of 1884 after travelling to Cobourg, Ont.

    The Carleton Centre for the History of Migration is hosting the three-day international conference, the first to focus on emigrant letters, which begins today.

    "The individual experience is always enlightening in ways you can't anticipate. Letters are very unpredictable," said historian Bruce Elliott, director of the centre.

    Click here to return to top of page.

    Rehashing the Case of the New Zealand Graduate Student Who Denied 6 Million Jews Died in the Holocaust (posted 8-7-03)

    David Cohen, writing in the National Business Review (August 8, 2003):

    Don't believe the guy who once said the problem with history is that there's no future in it. The dust has been disturbed yet again in the long-buried case of Joel Hayward and the University of Canterbury.

    Another scholar is seeking to still or confirm the question of whether Dr Hayward was the victim of an academic witch-hunt on account of the revisionist views he once took but has since repudiated on the subject of the Holocaust. Do we need this?

    Apparently so. Already the opening of this old casket has got a number of local media outlets loudly sneezing, such is their passion for unfettered historical investigation, as they claim to understand it.

    Their general position has found support this past fortnight, with a raft of press interviews given by Dr Hayward to mark his colleague's relitigation of the original scandal.

    The work, you'll recall, argued that far fewer than six million Jews, perhaps fewer than one million, perished in concentration camps during the time of Nazi rule across most of Europe.

    It speculated that the idea of gas chambers being used against the innocent during World War II might have been propaganda invented by the UK, the US and Jewish lobbyists in the thrall of Zionist forces. It postulated that Hitler could not be held personally responsible for the situation. And so on.

    As somebody who was involved in reporting on the situation at the time, I have no personal judgment to make on Dr Hayward, who has said he now wishes only to concentrate on his new career as a freelance scholar. But the latest surge of coverage isn't only about the acknowledged "mistakes" a 29-year-old student made a decade ago. It's about the way in which a confused local media irrigates the past.

    This has not been our finest hour. With only three lonely, impressive and, it has to be said, left-leaning exceptions - stand up, Anthony Hubbard of the Sunday Star-Times, New Zealand Herald columnist Diana Wichtel and the Listener's Philip Matthews - the tendency on the journalistic front has been far more toward what the American columnist George F Will once tactfully characterised as historical amnesia, fumigated by gassy notions of "tolerance" that cannot distinguish between an open mind and an empty mind when it comes to the critical world events of the past century.

    Peculiarly uninformed when it comes to the collective fate that befell European Jewry during World War II, the emptyheads have clucked on about little else other than the need to maintain the air of scepticism Dr Hayward attempted to cast over the topic in his ill-starred master's thesis.

    As an editorial in the New Zealand Herald put it, "All of history has to be open to constant reappraisal of events, their causes and consequences and the light they throw on the past and present."

    Much the same point was made, implicitly and somewhat less elegantly, in an impromptu apologia written by Diana McCurdy in the Dominion-Post, which included the fatuous warning of "a possible backlash against the Jewish community" if they or others were to ever again make known their displeasure at their history being so reappraised.

    On first blush the stance may appear high-minded, even impressively sophisticated; it could also be described as pernicious and kind of creepy.

    To be sure, history is about open-ended questions. So is journalism. As Mr Will asks, speaking on behalf of those who practise the other craft, what kind of student of history makes a career out of denying the reality of an almost contemporary event that has been recorded graphically, documented bureaucratically and described in vast detail by victims, bystanders and perpetrators?

    More than three years have passed since Canterbury was forced to ask itself the same question after first apologising to the country's tiny Jewish community for the unwarranted distress caused by its conferral of a master's degree for the 360-page dissertation The Fate of Jews in German Hands: An Historical Inquiry into the Development and Significance of Holocaust Revisionism.

    An independent inquiry convened by the university later found the work to be seriously flawed and its central conclusions unjustified. It acknowledged that the affair had been deeply embarrassing to one of the country's most respected institutions of higher learning, a university that has long had a special claim to know better than most on questions of bogus history.

    Click here to return to top of page.

    Douglas Brinkley: Those Anonymous Writers at the WPA Are No Longer Anonymous (posted 8-7-03)

    Douglas Brinkley, writing in the NYT (August 2, 2003):

    Writers are usually unabashed about claiming authorship for their work. So it's curious that many of the alumni of one of the most significant American literary projects of the 20th century were ashamed of it: the Federal Writers' Project, a program of President Franklin D. Roosevelt's Works Progress Administration.

    Created in 1935, in the heart of the Great Depression, the Writers' Project supported more than 6,600 writers, editors and researchers during its four years of federal financing. When the government funds expired, Congress let the program continue under state sponsorship until 1943. Although grateful for even subsistence wages in a time of economic despair, few participants deemed it a badge of honor to earn $20 to $25 a week from the government.

    But the Library of Congress takes a different view. With little fanfare, it has been unpacking boxes of extraordinary Writers' Project material over the last few years from warehouses and storage facilities. After an arduous vetting process, much of it is now available to the public.

    What is becoming clear, says Prof. Jerrold Hirsch of Truman State University, in Kirksville, Mo., is that the editors of the project believed that they could build a national culture on diversity. "They faced a great challenge coming out of the 1920's, where white supremacists, via WASP primacy and the K.K.K. and anti-immigration laws, held sway," Mr. Hirsch said. "In the Federal Writers' Project, ethnic minorities were celebrated for being turpentine workers or grape pickers or folk artists."

    John Cheever was one of the program's unenthusiastic participants. A child of proud Massachusetts Republicans who had called the W.P.A. short for "We Poke Along," he was ashamed of working as a "junior editor" at the program's Washington office. He once described his duties as fixing "the sentences written by some incredibly lazy bastards."

    Nonetheless, Cheever's experiences at the Writers' Project provided the material for many of the best scenes in his 1957 novel, "The Wapshot Chronicle."

    Cheever wasn't the only one who found inspiration at the Writers' Project. Others included Conrad Aiken, Nelson Algren, Saul Bellow, Arna Bontemps, Malcolm Cowley, Edward Dahlberg, Ralph Ellison, Zora Neale Hurston, Claude McKay, Kenneth Patchen, Philip Rahv, Kenneth Rexroth, Harold Rosenberg, Studs Terkel, Margaret Walker, Richard Wright and Frank Yerby.

    These federal employees produced what would become the renowned American Guide Series, comprising volumes for each of the 48 states that then existed, as well as Alaska. The Writers' Project also turned out many other regional, city and cultural guides, like Algren's "Galena, Illinois" and Wright's "Bibliography of Chicago Negroes." All in all, it published more than 275 books, 700 pamphlets and 340 "issuances" (articles, leaflets and radio scripts).

    Click here to return to top of page.

    Eric Rauchway: Why Kristof's Wrong About the Hiroshima Bomb (posted 8-6-03)

    Eric Rauchway, writing on MSNBC.com (August 6, 2003):

    Today is the anniversary of the bombing of Hiroshima, which I might not even have mentioned — didn’t we have a big enough fight on the 50th anniversary? Do we have to start ginning up a new one two years before the 60th?

    Nicholas Kristof cites “an emerging consensus” among historians: “We Americans have blood on our hands” because of Hiroshima. Borrowing from the President, he critiques “[r]evisionist historians like Gar Alperovitz,” who have shaped “this emerging consensus” that “Washington believed the bombing militarily unnecessary.”

    Kristof knows lots of things about lots of things I know nothing about, and I learn from his work, especially on contemporary Asia and Africa. But he doesn’t know “American scholarship” all that well. Even if you’re not an a-bomb expert — I’m not — you can say pretty quickly that Kristof has the wrong end of the stick.

    First of all, woe betide anyone who asserts there’s an “emerging consensus” on the atomic bombs of 1945; it’s one of those issues that reliably draws shouty people. And Alperovitz’s work in particular invariably polarizes the profession; reviews of his Atomic Diplomacy and The Decision to Use the Atomic Bomb include accusations of professional malfeasance as well as expressions of strong admiration. When historians get to scuffling over his work, well, the tweed flies.

    But second of all, and more importantly, if you went looking for common ground between historians identified with both the left and the right, you would find quite the opposite of what Kristof says.

    Walter LaFeber of Cornell says,
    Militarily, the Americans dropped the first bomb to end the war as quickly as possible and before perhaps a million casualties resulted from an invasion of Japan.
    America, Russia, and the Cold War, 7th ed. (1993), p. 25

    John Lewis Gaddis of Yale says,
    Having acquired this awesome weapon, the United States used it against Japan for a simple and straightforward reason: to achieve victory as quickly, as decisively, and as economically as possible.
    We Now Know (1997), p. 87

    And Barton J. Bernstein of Stanford said as early as 1974,
    The administration did not decide to use the bomb against Japan in order to stop Soviet entry into the Pacific war or to gain advantages over the Soviets in Eastern Europe or in the Pacific. Rather, the Truman administration acted upon the inherited [i.e., from FDR] assumption that the bomb was a legitimate weapon for combat use against an enemy.... The combat use of the bomb promised to speed the end of the war and thereby to save American lives.
    “The Quest for Security,” Journal of American History 60:4, p. 1014

    Indeed, Bernstein long ago indicated that what Kristof describes as the Alperovitz “consensus” is dodgy for a few simple reasons.

    1. Truman didn’t per se decide to use the bomb; he simply allowed the existing bomb program established by FDR to go ahead. (This, by the way, is why suggestions that Truman’s bigotry influenced the decision cannot be terribly serious. The bombing had nearly nothing to do with Truman personally.)

    2. Assertions that Truman, or nearly-Secretary-of-State James F. Byrnes, was thinking principally in terms of scaring the Soviets and not of defeating the Japanese come from other people’s (usually self-serving) recollections recorded well after the fact — after they were persuaded that the a-bomb was something terrible and different.

    3. The bomb’s developers didn’t think of it as a deterrent threat rather than a combat weapon. The notion that it belonged to a special category only emerged after people could see what instant, awful and lasting damage it could do.

    None of this general belief that the bombs were going to be used for military purposes touches the question of whether they were used in the best possible way — could they have been deployed against a more definitely military target? Could there have been a little longer delay before the second bomb? How many lives did avoiding an invasion really save? Wouldn’t an invasion have cost Japanese lives too — more Japanese lives, maybe, than the bomb? Working up to an invasion would probably have meant continued conventional bombing and continued blockades, generating more casualties even before the invasion itself occurred.

    Bernstein again, this time in 1998:

    In 1945, before Hiroshima and even afterward, Truman rightly believed that the use of the A-bomb on Japan would be warmly endorsed by Americans, that they never would have understood, much less approved, a decision not to use the weapon if it was available, and that no mainline American politician, who would have been likely to be President at the time, would have decided otherwise....
    He also seemed to believe that the use of the bomb, as Secretary Byrnes contended, might help him in dealing with the Soviets. But that hope was never a controlling reason but only a supplementary, and thus a confirming, reason to do what Roosevelt would probably also have done, what virtually all top-level presidential advisers seemed to endorse, and what only one adviser, Under Secretary Bard, who was on the fringes of decision-making, ever questioned before Hiroshima: dropping the bomb on Japan in order to speed a surrender.
    “Truman and the A-Bomb,” Journal of Military History, 62:3, p. 567

    As I say, if there were something like a professional historical consensus, this would be it. Nobody’s happy about the bomb — Truman wasn’t either — but you won’t find hordes of historians going around accusing the Truman administration of using the bombs without military reasons in the midst of what was, after all, a brutal war in which the bombing of civilians had already been established as awful, common practice.

    In fine: Whatever the shortcomings of my profession, Mr. Kristof — and historians can be maddening — consensus on Alperovitz isn’t one of them. Please build a straw man out of someone else.

    Click here to return to top of page.

    Holocaust Deniers Countered by Harvard History Project (posted 8-6-03)

    David Mehegan, writing in the Boston Globe (August 6, 2003):

    Witnesses age and die, and events such as the sinking of the Titanic or the attack on Pearl Harbor become more the stuff of books and movies than of real life, at least to young people. That's true even of the Holocaust, the subject of a growing flood of film and fiction. From directors George Stevens (''The Diary of Anne Frank,'' 1959) to Roberto Benigni (''Life Is Beautiful,'' 1998) and Roman Polanski (''The Pianist,'' 2002), the need for a palatable story can sometimes smooth away the jagged edges of history. But last week the Harvard Law School library began to put the raw, unvarnished evidence of the Nazi horror before anyone with a personal computer. On a new website called ''Nuremberg Trials Project: A Digital Document Collection'' (www.nuremberg.law.harvard.edu), the library is digitizing and posting 82,000 documents, totaling 650,000 pages, in its collection of 120,000 documents (more than 1 million pages) from the Nuremberg war crimes trials of 1946-49. Not all the documents in the files will be posted, only those used in the trials. When the project is done, anyone will be able virtually to attend the trials, hear the testimony, and examine the evidence.

    More important, to some historians: The trial transcripts and trove of supporting documents help yank the rug out from under the Holocaust deniers.

    ''It's an unbelievably constructive use of the Internet,'' says Deborah E. Lipstadt, professor of Jewish studies at Emory University and author of ''Denying the Holocaust: The Growing Assault on Truth and Memory.'' ''Before, these materials were not accessible. You could never browse them. To put material of this magnitude and importance on the Internet is a great step forward.''

    There were 13 trials at Nuremberg; the first, of such major leaders as Hermann Goering, was conducted by an international tribunal. The other 12, held by the United States with American civilian judges and American legal procedures, involved lesser defendants but no less heinous crimes. One was USA v. Karl Brandt, et al., the doctors' trial of 1946-47 with which the Harvard project begins. For the use of the numerous lawyers and judges, multiple photostats of original documents, translated documents, and transcripts of the trials were made. After the trials, the National Archives and several American law schools, including Yale, Columbia, and Harvard, received complete sets of the documents.

    Harvard got two sets, in 680 boxes, which were compared and in some cases combined for completeness, and for more than 50 years they were kept in metal file cabinets, available to scholars. They were difficult to use, however, because there was no index (the National Archives' set had an index, but scholars had to use sometimes-deteriorated microfilm). Of greater concern was that the paper of the 1940s was cheap, acidic, and decaying fast. "After half a century," says Harry Martin, Harvard's law librarian, "they became stiff and began to fade, and we said, 'We can't let people handle them any more because they are crumbling.'"

    By having the whole collection digitized by Harvard's Digital Imaging and Photography Group, Martin and his team of about eight library staffers would solve several problems. They would protect the originals by taking them out of use and storing them in a climate-controlled repository, and they would make the digitized versions available to all interested citizens, not just qualified historians.

    Why Writing Systems Die (posted 8-5-03)

    From UPI (August 4, 2003):

    A group of scholars has determined that old writing systems go out of use because of a stigma that they pick up as the civilizations associated with them die.

    The scholars also determined that specific writing forms are associated with the ruling class and religion of the societies they reflect.

    Changes in writing systems mirror changes in society, not because of technological advances, but because of feelings associated with the form of writing, said Stephen Houston of Brigham Young University. "This is a new take on communicative 'technologies' -- that they are completely saturated with cultural values and conditioned by history."

    The study is reported in the new issue of Comparative Studies in Society and History, published by Cambridge University Press.

    Houston, a Maya expert, was joined by Oxford Egyptologist John Baines and Johns Hopkins' Jerrold Cooper, who studies cuneiform.

    Was Hiroshima Really About Revenge? (posted 8-5-03)

    James Carroll, writing in the Boston Globe (August 5, 2003):

    "Having found the bomb, we have used it." These are words spoken by President Truman in a radio address to the American people on the evening of Aug. 9, the day a second bomb fell on Nagasaki. "We have used it against those who attacked us without warning at Pearl Harbor, against those who have starved and beaten American prisoners of war, against those who have abandoned all pretense of obeying international laws of warfare."

    President Truman, and others who justified the bomb, would rarely speak this way again - a direct articulation of revenge as a main motivation for the overwhelming destruction of the Japanese cities. In his radio remarks, Truman went on to add the other justifications: "We have used it in order to shorten the agony of war, in order to save the lives of thousands and thousands of young Americans. We shall continue to use it until we completely destroy Japan's power to make war. Only a Japanese surrender will stop us." But even the surrender, when it came, would prompt after-the-fact controversy, since, clinging to the emperor, it wasn't unconditional. If we accepted Japan's hedged surrender after the atomic bomb, why wouldn't we accept it before?

    Every justification offered for the use of the atomic bomb would be clouded by ambiguity except one - revenge. It was the first justification Truman offered, speaking the primal truth, and it was the only justification the American people needed by then. But soon enough, revenge would disappear from all official explanations, and even Truman's critics would rarely address it except obliquely. Much better to debate the necessity of the planned invasion of Japan.

    Americans do not like to acknowledge that a visceral lust for vengeance can be the main force behind national purpose, and that is why the Aug. 6 anniversary always arrives beclouded. In 1995, when the Smithsonian attempted to mount a retrospective exhibit observing the 50th anniversary of Hiroshima and Nagasaki, a mainstream consensus slapped down any effort to "reappraise the fundamental assumptions" of the bomb's use. President Clinton declined to second-guess Truman, and the Smithsonian exhibit was canceled. What terrifies Americans is the possibility that stated reasons are distant from, or even unrelated to, the real reasons for the nation's behavior. But Truman had it right the first time: to understand Aug. 6, 1945, you must return to Dec. 7, 1941, the score that had to be settled.

    Pearl Harbor resurfaced in the American memory on Sept. 11, 2001. Again and again, the Day of Infamy was invoked as the relevant precedent - the only other time the United States had suffered such a grievous blow. And just as before, there was never any doubt that the blow would be avenged. Moving quickly away from the unsatisfyingly abstract "war on terrorism" and then from the frustration of Osama bin Laden's escape in Afghanistan, President Bush took America to war against Iraq to satisfy that primordial need. And it worked. The United States of America clenched its fist the day the twin towers came down. Against Iraq, the United States finally threw a punch that landed. That is all that matters.

    The controversy over the Bush administration's misleading "justifications" for the war in Iraq is a reprise of the endless debate over "justifications" offered for the atomic bomb. Neither set of questions grips the American conscience. There is no "agonizing reappraisal of fundamental assumptions" in this country. When we want our revenge, we take it. And, even as the flimsy rationales with which we cloak it are stripped away, we fervently deny that vengeance, not justice, defines our purpose.

    Hiroshima: Blood on Our Hands? (posted 8-5-03)

    Nicholas Kristof, writing in the NYT (August 5, 2003):

    Tomorrow will mark the anniversary of one of the most morally contentious events of the 20th century, the atomic bombing of Hiroshima. And after 58 years, there's an emerging consensus: we Americans have blood on our hands.

    There has been a chorus here and abroad that the U.S. has little moral standing on the issue of weapons of mass destruction because we were the first to use the atomic bomb. As Nelson Mandela said of Americans in a speech on Jan. 31, "Because they decided to kill innocent people in Japan, who are still suffering from that, who are they now to pretend that they are the policeman of the world?"

    The traditional American position, that our intention in dropping the bombs on Hiroshima and then Nagasaki was to end the war early and save lives, has been poked full of holes. Revisionist historians like Gar Alperovitz argue persuasively that Washington believed the bombing militarily unnecessary (except to establish American primacy in the postwar order) because, as the U.S. Strategic Bombing Survey put it in 1946, "in all probability" Japan would have surrendered even without the atomic bombs.

    Yet this emerging consensus is, I think, profoundly mistaken.

    While American scholarship has undercut the U.S. moral position, Japanese historical research has bolstered it. The Japanese scholarship, by historians like Sadao Asada of Doshisha University in Kyoto, notes that Japanese wartime leaders who favored surrender saw their salvation in the atomic bombing. The Japanese military was steadfastly refusing to give up, so the peace faction seized upon the bombing as a new argument to force surrender.

    "We of the peace party were assisted by the atomic bomb in our endeavor to end the war," Koichi Kido, one of Emperor Hirohito's closest aides, said later.

    Wartime records and memoirs show that the emperor and some of his aides wanted to end the war by summer 1945. But they were vacillating and couldn't prevail over a military that was determined to keep going even if that meant, as a navy official urged at one meeting, "sacrificing 20 million Japanese lives."

    The atomic bombings broke this political stalemate and were thus described by Mitsumasa Yonai, the navy minister at the time, as a "gift from heaven."

    Postage Stamps: Sanitized History (posted 8-1-03)

    Rebecca Dana, writing in the Washington Post (August 4, 2003):

    PAGE through the collection of any comprehensive philatelist -- John Hotchner of suburban Falls Church, Va., for example -- and try to find a sad stamp. Just try.

    You'll find stamps that commemorate happy parts of sad stories, like the 13-cent Harriet Tubman from the Black Heritage series, and stamps that honor heroes who emerged from the rubble, like the image of firefighters at the World Trade Center. There are pictures of good firsts that stand for sad precedents, like the first black man to make it on a stamp (Booker T. Washington), and memorials to sad lasts and the ends of great men, like the 2-cent stamp issued after President Warren Harding's death.

    But most postage stamps resemble the ones on display at the National Postal Museum starting this week in a new exhibition, "Art of the Stamp." They have the same sunny spirit as the portrait of Ayn Rand, ever-positive, or the one of Ernest Hemingway, well before his suicide. Or they have the down-home goodness of the picture of a Boy Scout saluting, not litigating, and the one of Lou Gehrig with his game face, not wracked by disease. Displayed are not the actual stamps, but 101 original works of art the Postal Service commissioned to make into the stamps.

    "The history of the United States postage stamp is the history of America," says Pat Burke, head of the exhibits department for the Postal Museum. The first U.S. stamp was released in 1847, she says, and postal images have featured or connoted every major event that happened since -- as well as a prodigious number of lighthouses and hummingbirds along the way.

    There's no Japanese internment series here, though; no Joe Jackson seven-center or Lewinsky commemorative.

    "And there isn't a Watergate stamp either," Hotchner says.

    The exhibition is an anthology of positives seen through Norman Rockwell-style rose-colored spectacles. Rockwell's postal art, in particular, is rarely on display, says Ken Martin, a director of the American Philatelic Society. The show features two original works: the saluting Scout and a five-cent stamp marking a century of free local postage called City Mail Delivery.

    Lewis & Clark: Maya Lin's Monuments to Their Expedition (posted 8-1-03)

    Timothy Egan, writing in the NYT (August 4, 2003):

    The currents at the mouth of the Columbia River swallow ships, sailors and time itself in regular gulps. To leave a lasting mark here requires something that can withstand the forces of a river that drains an area the size of France where it collides with the big-fisted edge of the world's largest ocean.

    Maya Lin has walked this surf-cuffed rock at land's end, trying to find a way to use the Lewis and Clark bicentennial to look 200 years ahead. More than two decades after she created one of the nation's best-known pieces of public art, the black granite panels of the Vietnam Veterans Memorial in Washington, she is devising a lasting tribute to the most celebrated of American adventures.

    Nothing is left from Lewis and Clark's winter on the Pacific in 1805-6. For nearly four months, 31 men, a teenage girl and a baby hunkered down at continent's edge, on the other side of the river, in a fort no bigger than the average new American house.

    When Ms. Lin was first asked to commemorate this odyssey, she was puzzled, she said.

    "I said: `Lewis and Clark? That's the last thing I would ever think of doing something on,' " she said in an interview.

    But she said she was drawn into the many ironies — the layers of two centuries — that the story of the 8,000-mile journey evokes. Particularly here, where the United States runs out of continent, and where Lewis and Clark reached the ocean, people say they are prone to look backward and forward, to try to touch some piece of this history.

    "Lewis and Clark summon in us a moment in time," Ms. Lin said. "And looking 200 years ahead, I'm trying to be optimistic. But I am exceedingly saddened by what we have done to these rivers."

    Ms. Lin is constructing at least four memorials along the Columbia, marking places where cultures and rivers converged and nations forever changed course. Relying on the elements — stone, wind and water — her artwork, called the Confluence Project after the private nonprofit group that is financing it, is scheduled for completion in 2006, and will be designed to last 200 years, she said.

    "A big part of it is going to be about water, what these rivers mean," Ms. Lin said.

    Two centuries ago, it was monumental enough to send an expedition overland to view the Pacific. Now, swimmers, artists, historians and legions of the casually curious follow the same route and still discover something — about the land, themselves or the course of a young nation's history.

    Every generation reinterprets Lewis and Clark, the Corps of Volunteers for North Western Discovery, as Lewis called the expedition. And the Columbia River, holding the relics of prior ambitions, is where many of those thoughts are born. Franklin D. Roosevelt was chugging along beside this river in a train in 1920, wondering whether some use could be made of "all the water running down unchecked to the sea."

    Later, as president, Roosevelt ushered in the era of big dams, which buried Lewis and Clark campsites upriver and changed a way of life from the Continental Crest to the sea.

    But even with most of the river shackled, the pilgrims keep coming, especially to the last 200 miles of the route. In the desert of eastern Washington State, where the Snake River meets the Columbia, people drive past one of the biggest nuclear cleanup sites in the world and a tangle of industrial clutter to arrive at a little state park where the corps camped in October 1805. In 100-degree heat, they stare at the flat water covering the campsite, and they imagine.

    "The river is remarkably clear and crowded with salmon in maney places," Clark wrote. It was swift, carrying the corps to the ocean at a clip of 30 miles a day.

    Now the Columbia River is clouded, a sluggish reservoir. "People are surprised at how different everything is than how it was when Lewis and Clark camped here," said Ken Maki, a volunteer guide at the state park near Pasco, Wash.

    Walt Whitman: Quakerism, Homosexuality and the Sea (posted 8-1-03)

    A press release issued in conjunction with the debut of a new website (still in development) concerning Walt Whitman, http://www.generalpicture.com/log150/frameset.htm:

    After more than a decade of research, an independent scholar has solved what he terms Walt Whitman's "quaker paradox." The solution, he says, is a pink triangle involving religion, sexuality, and... whaling. Mitch Gould believes the new view of Whitman will gain ground by the 150th anniversary of the publication of Leaves of Grass, which falls on July 4, 2005.

    The "quaker paradox" has plagued scholars ever since Whitman's deathbed disclaimer. He said that he had considered joining Long Island Quakers as a youth, but "was never made to live inside a fence." Indeed, there is no record of Whitman ever belonging to a conventional Quaker meeting. But references to Quakerism in his poetry, his prose works (which include essays on Elias Hicks and George Fox), his personal conversations, and even his dress, are as numerous as they are profound. This trend begins with the 1855 avowal in "Song of Myself:" "I cock my hat as I please, indoors and out." (Note that as Whitman began to expurgate various editions of his book, by 1867, the word "cock" was replaced by "wear.") In his final years, he became more insistent about his religion. "I am a good deal of a Quaker," he told Hamlin Garland point-blank in 1888.

    The answer, according to Mitch Gould, is that Whitman did not join meetings of the first radical Quaker schism, known as Hicksite Quakerism, but he did participate in a much later and lesser-known schism from that group, called the Friends of Human Progress. As evidence, Gould points to John Buescher's recent discovery of two newspaper accounts unknown to Whitman scholars. In one, Whitman recounts his involvement in a table-tipping séance that summons the spirit of a drowned mariner. In the other, Whitman is trying to determine whether only effeminate men are suitable as mediums for channeling spirits--a source of concern for him, as the hirsute exponent of manly love.

    Ann Braude at Harvard Divinity School had already mapped a similar triangle during the rise of women's rights, featuring gender outlaws in bobbed hair and Bloomers, liberal religion, and spiritualism. Braude showed how a small band of radical Quakers largely instigated both the First Woman's Rights Convention at Seneca Falls and the craze for spirits in the "miracle year" of 1848. The Friends of Human Progress extended the most radical forms of Quaker free-thought to all participants, without requiring them to come "inside a fence" of membership, to use Whitman's term. When he covered the North Collins Progressive Friends Meeting for the Brooklyn Daily Times in 1858, he urged his readers to imagine whether "these heterogenous elements are destined to coalesce at some period, distant or near; it matters not, and form a grand Heresy in religion and morals which shall number in its ranks millions of souls?" It was about this time that Whitman penned "Mediums," a poem characterized by Harold Aspiz as a "lyric steeped in spiritualism," and also "Calamus 4," which portrays the wandering poet using Victorian flower-language to flirt with a swarm of living and dead friends buzzing around him like so many humble-bees.

    Braude pointed out that spiritualism's critical role in suffragism was written out of the official history of the women's movement by Elizabeth Cady Stanton and Susan B. Anthony, because of the intense embarrassment over spiritualism that characterized American society by the turn of the century. Gould reasons that the same reticence to publicly discuss spiritualism--even when Leaves of Grass first appeared--accounts for its absence in most of the contemporary reviews of the book. The exceptions to this rule, of course, are the spiritualist publications, which received the Leaves with great acclaim. A British spiritualist journal swore that Whitman had been "baptised into the true Jordan of Spiritualism."

    Building upon the work of Sherry Ceniza, Gould stresses Whitman's ardent friendships with spiritualist-suffragists such as Ellen O'Connor and Abby Price, and suggests that Leaves of Grass was written as a transparent appeal to the varied social concerns of these reformers. In the "Calamus" poems, Whitman unequivocally announced that he was founding a new institution of manly love. In his notebooks, he revealed, "My final aim: To concentrate around me the leaders of all reforms--transcendentalist, spiritualists, free soilers." Gould demonstrates that Rufus Griswold, the most forthcoming of all Whitman's critics, specifically identifies Whitman's poetry as sodomitical and complains that anyone who objects to this is labeled a "non-progressive conservative... destitute of the 'inner light.'" Delving into a forgotten 1855 burlesque entitled Lucy Boston, Gould further shows how some male spiritualist-suffragists were ridiculed not only as effeminate, but as sodomites as well.

    Gould's work also addresses the third leg of the triangle, beginning with an economic chart that locates both Moby Dick and Leaves of Grass at the very pinnacle of the whaling industry's importance. Long before Whitman was experimenting with the innovations in both form and content that would become his flagship poem, whaling voyages had stretched out to three or four years in duration. They contributed a substantial portion of Long Island's wealth, and employed an enormous workforce. In 1914, a prominent researcher in sexuality, Dr. Douglas C. McMurtrie, learned that a marked decline in homosexual experiences among sailors was due to "the passing of long voyages, such as used to be taken in the sailing ships, not touching land, in some instances, for six months at a time." This is consistent with literary hints dropped by Herman Melville and Charles Warren Stoddard, and with the "erotic diaries" of navy sailor Philip Van Buskirk. Drawing upon the recent findings of Joann Krieg, Caleb Crain, Hans Turley, and Lillian Faderman, Gould demonstrates many instances of Quaker tolerance for passionate same-gender relationships, dating back to Daniel Defoe's novel, Captain Singleton. The conclusion is that the Quaker doctrine of the "inner light" of individual conscience uniquely accommodated this maritime reality, and that Leaves of Grass, ultimately, gave voice to the sentiments of Quaker whalers in Whitman's birthplace near Sag Harbor.

    The Pyramids Were Not Built by Slaves (posted 8-1-03)

    Jonathan Shaw, writing in Harvard Magazine (July-August 2003):

    The pyramids and the Great Sphinx rise inexplicably from the desert at Giza, relics of a vanished culture. They dwarf the approaching sprawl of modern Cairo, a city of 16 million. The largest pyramid, built for the Pharaoh Khufu around 2530 B.C. and intended to last an eternity, was until early in the twentieth century the biggest building on the planet. To raise it, laborers moved into position six and a half million tons of stone—some in blocks as large as nine tons—with nothing but wood and rope. During the last 4,500 years, the pyramids have drawn every kind of admiration and interest, ranging in ancient times from religious worship to grave robbery, and, in the modern era, from New-Age claims for healing “pyramid power” to pseudoscientific searches by “fantastic archaeologists” seeking hidden chambers or signs of alien visitations to Earth. As feats of engineering or testaments to the decades-long labor of tens of thousands, they have awed even the most sober observers.

    The question of who labored to build them, and why, has long been part of their fascination. Rooted firmly in the popular imagination is the idea that the pyramids were built by slaves serving a merciless pharaoh. This notion of a vast slave class in Egypt originated in Judeo-Christian tradition and has been popularized by Hollywood productions like Cecil B. De Mille’s The Ten Commandments, in which a captive people labor in the scorching sun beneath the whips of pharaoh’s overseers. But graffiti from inside the Giza monuments themselves have long suggested something very different.

    Until recently, however, the fabulous art and gold treasures of pharaohs like Tutankhamen have overshadowed the efforts of scientific archaeologists to understand how human forces—perhaps all levels of Egyptian society—were mobilized to enable the construction of the pyramids. Now, drawing on diverse strands of evidence, from geological history to analysis of living arrangements, bread-making technology, and animal remains, Egyptologist Mark Lehner, an associate of Harvard’s Semitic Museum, is beginning to fashion an answer. He has found the city of the pyramid builders. They were not slaves.

    The Importance of Risk as a Concept in Writing History (posted 7-31-03)

    John Quiggin, writing in the Australian Financial Review (August 1, 2003):

    The past is inevitably viewed through the prism of the present and the imagined future. New concerns about the future necessitate a reassessment of the past and a rewriting of history in the light of contemporary preoccupations. This point is nicely illustrated by David Moss's When All Else Fails, in which he surveys two centuries of US history and presents the state as "the ultimate risk manager". It is unlikely that past theorists of the state would have taken this view, or that many previous historians would have presented the development of the limited liability corporation as part of the same historical movement as the New Deal.

    However, it is fast becoming a commonplace that "risk" is the central idea of the early 21st century. Just as happened with "globalisation" a decade ago, it is necessary to reassess the experience of the 19th and 20th centuries in the light of new ways of thinking about the present. Moss has fulfilled this task admirably.

    He starts with a nice primer on risk and the history of thinking about risk and uncertainty (an adequate exposition of the subtle differences between these concepts would require a book in itself). It is striking that the concept of probability, fundamental to any formal reasoning about risk and uncertainty, was not developed until the 16th century. Despite insightful contributions in the 1920s by John Maynard Keynes and leading Chicago economist Frank Knight, serious economic analysis of problems involving uncertainty did not occur until the latter half of the 20th century.

    Although probability estimates are now part of daily life, from sporting odds to weather forecasts, the relatively recent development of probability as a concept is an indication that it is not part of our natural mental equipment. As a result, reasoning about probability is typically heuristic in nature and characterised by a range of biases, many of which were catalogued by Amos Tversky and Daniel Kahneman. In particular, people like certain sorts of gambles (those with a lower risk of loss and a small chance of a big payoff, as in a lottery) and dislike others which, under certain consistency assumptions, they should rank similarly. Even more strikingly, the way in which a risky prospect is presented or "framed" may have a large effect on how it's perceived. As Moss shows, this must be taken into account in policy analysis and policy formulation.

    Moss distinguishes three phases of public risk management in the US. The first, "security for business", encompasses innovations such as limited liability and bankruptcy laws, introduced in the period before 1900. The second phase, "security for workers", includes Progressive initiatives such as workers compensation and the core programs of the New Deal, unemployment insurance and social security. The third phase, "security for all", is still under way and includes such diverse initiatives as consumer protection laws and public disaster relief.

    In many ways, the discussion of bankruptcy and limited liability laws is the most interesting section of the book. These institutions have been established for so long now that they seem like a natural part of the capitalist order of things. Yet, as Moss shows, before their introduction, they were vigorously opposed by defenders of the free market, who felt they undermined the principle of individual responsibility and promoted what is now called moral hazard.

    The Richest Man of Color in America Who Wanted to Establish a Colony in Africa of Freed Blacks (posted 7-31-03)

    Christopher Cox, writing in the Boston Herald (July 29, 2003):

    As civil war wracks Liberia, the West African nation's unique historical ties with the United States have been examined anew. Yet the odd story of a country founded by freed American slaves has largely ignored a crucial character: Massachusetts mariner Paul Cuffe, who made the first exodus to Africa to help his brethren "rise to be a people."

    Though now a figure known mainly to historians, 200 years ago Cuffe (pronounced CUFF-ee) was arguably the richest man of color in America, with a fleet of ships and a large farm in Westport.

    "He was very well-regarded in this area," said Mary Jean Blasdale, collections manager of the New Bedford Whaling Museum, which displays several Cuffe artifacts.

    The son of a former slave and his wife, a Wampanoag Indian, Cuffe (1759-1817) embodied the self-made man. Born on barren Cuttyhunk Island, he shipped out at age 14 aboard a whaler. By age 20, Cuffe was running British naval blockades in his own vessel.

    Cuffe would amass a fortune in shipping, then one of the most integrated professions in America. He built a fleet of six ships and traded as far away as Europe and the Caribbean. More than once, his all-black and American Indian crews created a stir when calling on ports in the American South.

    Slavery and racism motivated Cuffe to join the Quakers, a pacifist religious group at the forefront of the abolitionist cause.

    According to Richard Kugler, a Westport resident whose ancestors partnered with Cuffe on several ventures, Cuffe's Quaker friends suggested he resettle skilled, freed American blacks in Africa. At the time, English Quakers were looking to salvage the failing British colony of Sierra Leone. In Cuffe, the Quakers saw "a desirable accession."

    The venture appealed to Cuffe's religious values and to his business sense. Settlers could proselytize, train local tribesmen and create a black trading network that might undercut slaving.

    "It was the Christian elements tied in with trade and anti-slavery, all those good, early-19th century causes," said UMass-Boston history professor Julie Winch, an expert on antebellum America.

    In 1810, Cuffe made a 2-year voyage to Sierra Leone and England to scout locations and raise financial support. The War of 1812 put the venture on hold.

    British investment dried up after the conflict, but Cuffe forged on at his own expense, confident in his vision "to promote the improvement and civilization of Africa." Some historians say Cuffe also believed African-Americans could best be uplifted in Africa, away from slavery and restrictive laws.

    In December 1815, Cuffe and crew sailed from Westport aboard his 109-ton brig, Traveler, with 38 settlers recruited mainly from the Boston area. Cuffe saw the freemen as a vanguard; he planned to return annually with further colonists.

    The unusual mission caught the attention of white advocates of black emigration. On his return, Cuffe suggested other possible West African settlement sites to the American Colonization Society. But the motives of this all-white group were murky: Some members were abolitionists who believed former slaves deserved a refuge. Others were racists who hated intermixture or saw colonization as a way to get rid of troublemakers.

    Cuffe died in September 1817, without living to see his colony take hold. Within the ACS, slave owners and powerful Southern politicians, such as Henry Clay and John C. Calhoun, soon took control. To the outrage of other black leaders, ACS literature used Cuffe's image "as if he was willing to go all the way with them," Winch said.

    In 1820, the ACS sent its first shipload of colonists to Sierra Leone. After two years on a swampy island, survivors fled down the coast to present-day Monrovia, where they traded weapons and rum for a swath of coastline. The enclave grew as American slave states deported their free blacks.

    "If he could have lived another five, 10 years, (Cuffe) would have probably been a deeply disillusioned or very angry man," Winch said.

    In 1847, the ACS colony became the independent nation of Liberia. A tiny elite of Americo-Liberians ran the country until 1980, when indigenous resentment sparked a bloody coup. The continent's first black republic has since become a tragic place that Cuffe, the early champion of African nationalism, could never have imagined: an abject, impoverished sinkhole of human misery.

    Samuel Insull: The Ken Lay of His Day (posted 7-31-03)

    From the Washington Post (July 30, 2003):

    To history buffs, the headlines from the collapse of Kenneth L. Lay's Enron Corp. in 2001 are uncannily familiar. It has happened before.

    The executive was Samuel Insull; the company, Commonwealth Edison in Chicago, built by Insull into the Midwest's dominant utility; and the scandal was the collapse of utility holding companies in the 1930s Depression.

    The downfall of Insull would scarcely merit a historical footnote today, were it not for the law Congress passed in 1935 to effectively outlaw the kinds of corporate empires Insull and his peers created in the industry's formative years.

    That law, the Public Utility Holding Company Act, a milestone of President Franklin D. Roosevelt's New Deal, still stands. Known in the industry by its acronym, PUHCA -- pronounced "pooh-ka" -- the law dictated the creation of today's patchwork of hundreds of separate monopoly power utilities presiding over city or regional customer markets.

    It was Roosevelt's revenge on the holding companies he despised.

    At the center of the storm was Insull. He rose to power a century ago in an America raised on Horatio Alger stories of hard-working boys who made good, and his story outdid fiction.

    Like Ken Lay, he was born into a proud family of modest means and grew up filled with ambition. Insull's home was Britain, and his first real job was as an office boy to the London representative of a pioneer in the infant U.S. electric-power industry.

    The pioneer was Thomas Edison, and on the recommendation of Insull's London employer, Edison hired the young man as his secretary, at the inventor's New Jersey headquarters.

    Edison came to trust and rely increasingly on the hardworking, observant Insull, biographer Forrest McDonald said, and made him his financial manager. In 1892, Insull became president of the Chicago Edison Co. and turned the generating company into the prototype for the entire industry.

    He absorbed smaller rivals, creating a monopoly whose size allowed investments in ever-larger generating systems. He accepted regulation as the price of monopoly franchises. And he ceaselessly promoted electricity as the pathway to a modern America. "Early to bed and early to rise; work like Hades and advertise" was a favorite motto.

    Insull was revered as one of Chicago's leading benefactors. With his circular spectacles and white moustache, he reminded some of the dapper gentleman on the Monopoly game board.

    As Insull's empire expanded, he created a pyramid of public companies arrayed beneath a parent holding company. The companies passed funds, loans and stock shares among themselves, often out of shareholders' view. Utility profits, for instance, paid for losses on electric railways he controlled.

    When industrialist Cyrus S. Eaton acquired enough stock to threaten Insull's control of his empire, Insull responded by expanding the size and complexity of his holding-company network. And when he bought out Eaton's stake in 1930, he took on a mountain of debt that would sink his enterprise when the nation's economy collapsed in the 1930s.

    Thousands of his shareholders were wiped out and Insull became the symbol of corrupted wealth and power, said author Milton A. Chase. In 1932 and 1933, he was indicted on state and federal charges of embezzlement, fraud and other violations.

    Insull had gone to Europe, and, to get him back, Congress passed a law authorizing Insull's arrest in any country that permitted it by treaty. Seized in Turkey in 1934, he was returned to stand trial.

    As the lead witness in his own defense in the state case, the 74-year-old Insull won over the jury, historians recount. With his own fortune gone, he testified that he had not profited from his business dealings and that most of his salary had gone to charity. After hearing the judge's instruction that Insull could not be convicted for bad judgment or innocent mistakes, the jury acquitted him and his co-defendants. Other acquittals followed.

    He died on a Paris subway platform in 1938, his pockets empty except for a monogrammed handkerchief, three years after Congress enacted the law that could have borne his name.

    What Was Germany Like During the Occupation Following WW II? (posted 7-31-03)

    Amity Shlaes, writing in the London Financial Times (July 28, 2003):

    The electricity is still out. People are "bitter, disillusioned and hopeless". They express fury at the Allies, especially the English, whom they believe to be "sabotaging renewal". Many argue that things are worse than under the old dictatorship. On the streets, foreign correspondents interview barefoot orphans, who clamour for an American visa. Above all, there looms the profound hypocrisy of the occupation itself, and its "attempt to eradicate militarism by means of a military regime".

    These descriptions sound familiar. They come, however, not from liberated Iraq in the blistering summer of 2003 but from liberated Germany in the icy winter of 1946. Penned by the Swedish journalist Stig Dagerman, they and other reporting from the period are worth reviewing, if only as a reminder of how typical frustrations, uncertainties and defeats are to no-longer-new occupations.

    Dagerman travelled to Germany in the autumn of 1946 on behalf of Expressen, a Swedish newspaper. He knew that by now his readers were expecting to receive reports of recovery. After all, 18 months had passed since V-E day. That autumn, "the leaves were falling in Germany for the third time since Churchill's famous speech about the falling of the leaves". What he encountered instead was devastation, a people entering their fourth winter of bunker life, often without food or heat. He described a cellar dweller: "Someone wakens, if she has slept at all, freezing in a bed without blankets, and wades over the ankles in cold water to the stove and tries to coax some fire out of some sour branches from a bombed tree." Her frozen breakfast she cooks on a stove she herself has heaved from a crumbling ruin. Beneath it the body of the owner had lain for two years.

    Children in this Germany were sent out each morning, not to school but to steal. The black market flourished. Citizens who, in the Weimar Republic and under Hitler, had followed every law now trafficked in Chesterfield cigarettes and frozen potatoes. Political hope likewise was fading. Local elections were pending, but the very notion seemed somehow suspect or ridiculous. After all, as Dagerman writes, "if you are living at the edge of starvation, then your first interest is in fighting not for democracy but to distance yourself from the edge."

    Berlin held its first free elections on October 20, but that day, a Sunday, Dagerman recalled, "looked like all the other dead Sundays. There was not the slightest trace of enthusiasm or joy in the crowds of deathly voters." Nazi leaders were executed at Nuremberg the week before these elections, an event which was meant to lend a sort of reassuring finality to the dictatorship. But not all Germans were comforted; at a girls' high school in Wuppertal, pupils wore mourning; in Hamburg a sign was painted, "Shame Nuremberg". Anti-Americanism seemed to be growing. Ernie Pyle, the American war correspondent, noted a German tendency to destroy German tanks and guns rather than give them over to the occupiers.

    As for the occupying powers, they moved from mistake to mistake. Dagerman reported "clumsy dismantling operations", after which "the confiscated material was left to lie to rust in the autumn rains". Arrogant allies made a "practice of making five German families homeless to make space for one Allied family". There were even times when Allied officers mistook resistance heroes for foes.

    Roy Jenkins, the British politician and historian, later recalled the famous sacking by Sir John Barraclough, the British military governor, of Cologne's German mayor. Barraclough gave the mayor a dressing down for insubordination, as well as for having "failed in your duty to the people of Cologne". The mayor was Konrad Adenauer, who went on to become the Federal Republic's first chancellor.

    Dagerman flew north from Hamburg convinced that Germany was heading nowhere; to him it seemed forever "autumnal and icebound". He was not alone in his analysis. Victor Gollancz, the English publisher, wrote of Germany that "the values of the west are in danger".

    Iraq is not Germany. It is more violent. Saddam Hussein, its former leader, may still be alive. Gollancz's "values of the west" do not dominate Iraq. It is more like Hirohito's Japan. Still, a number of things in the German story are relevant to Iraq. The Adenauer infelicity, for example, recalls a US screw-up in May. US troops stormed the Baghdad Hunt Club only to discover that the 35 Iraqis they were handcuffing were the core of the US-funded Iraqi National Congress. The INC's leader, Ahmad Chalabi, was on the brink of being named to Iraq's interim governing council.

    The media are another issue. Dagerman was one of the most talented journalists of his generation. He was a better writer than the reporters who slavishly reproduced the Allied armies' line. But he happened to be wrong. He and a number of others mistook transition for permanence. This is what the US administration is now saying of some of the journalism coming from Iraq.

    Finally, there is the matter of timing. The post-war history of Germany tells us that it is too soon to draw conclusions about Iraq. As in Germany, it will be the later postwar months and years - the period when a new republic is attempted - that prove dispositive.

    The Shifting Views of Winston Churchill (posted 7-31-03)

    William Wallace, writing in the LA Times (July 27, 2003):

    The last two years have seen Churchill books cascade out of publishing houses on both sides of the Atlantic, with no sign of anyone turning off the tap. There are studies on Churchill and leadership. On Churchill and appeasement. On his wit, his wife, and his love affair with America. There are short biographies and thick ones that run to more than a thousand pages.

    All of which raises the question: How much can the market bear, even for the man who so stubbornly defended civilization at its darkest hour and saw off the Nazis?

    "Oh, I think he'll keep selling without any difficulty," says Andrew Roberts, the British historian and author of this year's "Hitler and Churchill: Secrets of Leadership" (Weidenfeld). "Churchill fanatics tend to be well off, they're proud of their Churchilliana, and like those who are fans of Lawrence of Arabia or Marilyn Monroe, they'll buy every single book. I've been in homes where they're piled up, floor to ceiling," he says. "Just books on Churchill."

    That enduring fascination with Churchill and World War II stems in part from the fact that the first draft of its history was written by Churchill himself. "He understood how much getting your own version of events on the record mattered," Cannadine says. Churchill wrote the history of that war in six volumes -- "History of the Second World War" -- "six extraordinary works in which he is the shining star," Cannadine says. "His account holds the field right up until the 1960s. It's amazing, actually. He fights the war and then tells everybody how they should feel about it" (so amazing that David Reynolds, a professor at Cambridge University, is now writing a book about how Churchill wrote those books).

    But the giddiness faded almost immediately after Churchill's death in 1965 at age 90. By then, his very Victorian rhetoric sounded jarring to a culture reveling in Beatlemania and the beginnings of Swinging London. "Until then it had been only hagiography," Best says. "We swallowed Churchill's version and the faults were not allowed to appear."

    The revision, when it hit with full force in the late 1980s and early 1990s, was harsh. In addition to attacks on Churchill by Holocaust revisionist David Irving, there was a more cogent attack that became known in Britain as the Tory nationalist critique. Led by Oxford professor John Charmley, it argued that Churchill's refusal to make a separate peace with Hitler in 1941 cost Britain its empire and influence.

    "The current phase of Churchill scholarship sees that as fantasy," says Best, whose book analyzing the pro- and anti-Churchill schools comes down on the favorable side. "Britain could not have lived as a vassal state on the periphery of the Nazi Empire. I think we've got it right at last."

    Sandys indignantly claims the revisionist school was nothing but a scam to sell more books. "People think, 'Ah, if I write a book with Churchill in the title it'll sell a few copies. And if I say something nasty about him it will -- shock, horror -- sell more.' " But Roberts says the burst of Churchill books in recent years sprang from the desire of such eminent scholars as Yale University's John Lukacs (who published "Churchill: Visionary, Statesman, Historian" last year) to bury the revisionist take.

    "We couldn't have left the debate where it was," Roberts says. "I now think we've put it back in its box."

    China's History Textbooks: Slowly Admitting the Truth About the Korean War (posted 7-31-03)

    Robert Marquand, writing in the Christian Science Monitor (July 26, 2003):

    "The central government has changed its attitude in certain respects toward North Korea," says a leading Chinese historian, who requested he not be identified. "For example, the government is allowing a more open discussion of the war, and of our relations with the North."

    Historical revisions, for example, are extremely sensitive in communist nations; they can indicate mistakes by revered leaders, or anger important allies. Only recently has China backed off its claim, taught to students for decades, that the bloody and inconclusive Korean War was brought about by a South Korean attack on the North. Second-year high school history texts in Beijing now read that "on June 25, [1950], the Korean Civil War broke out" - leaving even the causes of the war unexplained.

    Among Chinese intellectuals and scholars it is widely known, dating to the opening of the Soviet archives in Moscow, that the North attacked the South after extensive consultations with Joseph Stalin, and later with Mao Zedong. This represents a major change in Chinese understanding of the war.

    UK's Hidden History of Immigration Revealed (posted 7-30-03)

    Dominic Casciani, writing on the website of the BBC (July 30, 2003):

    It's strange to think that a photograph that captures so much suffering can be a rare historical treasure.

    This picture was taken by a Royal Navy officer and early photography enthusiast in 1869, shortly after his patrol ship had intercepted a slave vessel bound from west Africa to the Americas.

    Three decades after Britain itself abolished slavery, it captures the full horrors of what happened to those taken from Africa to man the plantations of the Americas.

    We don't know what happened to the people photographed, and this picture is one of a few known to exist anywhere in the world. But its hitherto hidden existence deep in the National Archives makes it a symbol for the UK's piecemeal documenting of its own history of immigration and minority communities.

    Now that history is neither hidden nor purely in the minds of specialist academics. Thanks to £2m of National Lottery funding, it is very much in the hands of the general public.

    Four years ago, the National Archives and a number of smaller organisations decided to meet an explosion in demand from minority communities wanting to research their own families and histories in the UK. The Moving Here website launched this week is the product.

    Garry Wills: Hillary's Thin Memoir (posted 7-30-03)

    Garry Wills, writing in the NY Review of Books (August 14, 2003):

    It was naive, I suppose, to think that Ms. Clinton would analyze what really worked or went wrong in the administration she was so much a part of. Her main effort was her main failure, the health plan. It would be very interesting to learn where and why that experiment failed. Was the strategy wrong (a complicated combination of public and private funding, as opposed to a single payer) or just the tactics (secret meetings with representatives of too many interests)? Was the staff ill chosen or ill directed (the Ira Magaziner problem)? Did opponents' money and maneuvers doom the plan, no matter what its content? These are all important matters, which she does not so much address as allude to. Her diagnosis of the failure is on the order of "The historical odds were against Bill." True, perhaps, but not very enlightening. Later we learn that "the defeat of our health care reform effort...may have happened in part because of a lack of give-and-take." Or else there was too much give-and-take, too many (six hundred) consultants, too great deference to the private sector. Whatever.

    In one of her few impolitic slips regarding people who are now her colleagues, Ms. Clinton blames part of her troubles on dumb congressmen, who did not even know the difference between Medicare and Medicaid: "Health care reform represented a steep learning curve for more than a few members of Congress." But Magaziner seemed to be counting on the dumbness factor. He thought he could install the new program by slipping it into a Budget Reconciliation Act, cutting off congressional debate—despite the fierce record Senator Robert Byrd had for excluding non-germane matters from Budget Reconciliation. Ms. Clinton attributes this slick approach to the fact that Magaziner "was not a Washington insider." But she and Mr. Clinton backed the Magaziner ploy.

    To her credit, Clinton does not make the excuses she might have. She does not blame her two-week absence from the hundred-day schedule, when crucial decisions on the plan were being made while she was detained at the bedside of her dying father. She does not sharply criticize the alarmist "Harry and Louise" ads paid for by insurance companies—she just prints the dialogue from a Harry-and-Louise skit she did with Mr. Clinton at the White House Correspondents Dinner. She does not mention the wildly erroneous but widely distributed attack on the plan run by Andrew Sullivan in The New Republic—an attack that launched the bizarre political career of its author, Betsy McCaughey.

    But neither does her book trace the longer-term result of the way she and her team went about health reform. The attack team assembled to defeat the plan was an innovative blend of activists, pollsters, pundits, lobbyists, and officeholders, which went on to produce the Contract with America and the Gingrich takeover of the House of Representatives. This was the real right-wing combination that threatened the Clinton presidency. It was to fight this that Ms. Clinton called Dick Morris to the White House and backed his "triangulation" effort at recovery. The keystone of this campaign was the transfer of welfare responsibility to the states, which lost Ms. Clinton the support of her old friends Marian and Peter Edelman, though it probably won Clinton his reelection. Her own defense is unabashedly political—"If he vetoed welfare reform a third time, Bill would be handing the Republicans a potential political windfall"—coupled with a salving gesture: "A Democratic administration was in place to implement it humanely."

    Was Seabiscuit the Hero of the 1930s? (posted 7-30-03)

    Allen Barra, writing in the Wall Street Journal (July 30, 2003):

    Adapted from Laura Hillenbrand's bestseller and with narration by historian David McCullough, the film version of "Seabiscuit" seems destined to become the official version of the rags-to-riches horse and his place in the history of the Depression. If you picked up a newspaper or magazine or switched on an entertainment program last weekend, you heard that Seabiscuit was "the most beloved athletic figure" or even "the most famous icon" in America in the late 1930s, surpassing (to cite just three names pulled out of last weekend's stories hooked to the film's release) Clark Gable, Lou Gehrig and Franklin D. Roosevelt in popularity.

    There might be something to those comparisons. In 1938, Gable's biggest film, "Gone With the Wind," was a year from release. Gehrig had an off-season and was in fact a year from retirement. As for FDR, well, racehorses have no political or ideological opposition.

    Seabiscuit certainly is, as the subtitle of Ms. Hillenbrand's book states, "An American Legend." But was he truly, as both movie and book seem to suggest, the real-life "Rocky" of the down-and-outers?

    First, there's the question of his pedigree; given his ancestors, the case could be made that Seabiscuit was as much an underachiever as an underdog. His granddad was Man o' War, probably the greatest racehorse of all. His father, Hard Tack, broke numerous speed records (though his temperament was so difficult that he was quickly put out to stud). His mom was Swing On, who, though she was not much of a racehorse herself, was descended from the legendary Whisk Broom II. Though he was a misfit, Seabiscuit's bloodlines were more like FDR's than Tom Joad's.

    Neither the movie (which glosses over the hardships and brutality of horse racing for both jockeys and mounts) nor, for that matter, Ms. Hillenbrand's book really comes to terms with the paradox of a Depression hero emerging from The Sport of Kings. (The film tosses a lump of sugar to the average man when Jeff Bridges as Charles Howard, Seabiscuit's owner, tells the track officials to "Open up the infield. You shouldn't have to be rich to enjoy this race.")

    The film takes as a given Seabiscuit's affinity with the downtrodden, their affinity with Franklin Delano Roosevelt as their savior, and, hence, Seabiscuit's validity as a symbol of the New Deal. (In case we miss the connection, a black-and-white photo of FDR is flashed on screen shortly after one of Mr. Bridges's Roosevelt-style speeches.) The more interesting possibility that Charles Howard, a progressive Republican and a bit of a good-natured huckster, wasn't above using a populist twang to sell tickets is never explored.

    It's a common fallacy for popular historians to confuse something that entertains people with something that actually touches their lives. Seabiscuit was certainly a hero during the Depression, but was he the hero of the Depression? Thumb through any number of books on the 1930s and on sports heroes from that decade, and you're likely to find much more on Jesse Owens and his spectacular victories at the 1936 Berlin Olympics than about any racehorse.

    Sojourner Truth's Truths (posted 7-29-03)

    Laura Miller, writing in Salon (July 29, 2003):

    We have no definitive text for the famous speech that Sojourner Truth made before a convention on women's rights in 1851 in Akron, Ohio. That's because her words were improvised and the speaker herself was illiterate. What we do have are a couple of versions, one set down years later by Frances Gage, an organizer of the convention, and some contemporaneous newspaper accounts that are a lot sketchier.

    But even in the unlikely event that Gage tweaked Truth's commonsensical eloquence a tad, we still lose out in the translation, for Sojourner Truth, a former slave unable to read or write, was one of the legendary orators of her time. Her words had the power to silence scoffers instantly and to coax forth floods of tears, but she had something else as well. "I do not recollect ever to have been conversant with any one who had more of that silent and subtle power which we call personal presence than this woman," wrote Harriet Beecher Stowe.

    Nearly 6 feet tall and thin as a rail, Truth cut a striking figure before she opened her mouth, but even so she often had to win over the crowd. The story of the Ohio conference as Gage tells it is a typical scenario. Some of the women assembled -- who had been browbeaten for a while by eminent gentlemen over their presumptuous demands for voting rights, considering that they couldn't even climb into a carriage unassisted! -- were dismayed to see her in the room at all. They didn't want women's rights to get muddled up with the even more explosive issue of abolition. When Truth stood to speak, these women vainly urged Gage to keep her quiet.

    But four or five sentences into the speech, they were cheering, and afterward they crowded around her in admiration. The speech is great in a way that no 19th century, and few 20th century, speeches dare to be -- simple, witty, direct, short. Truth heaps scorn on "that man over there" for calling women too frail to master their own lives. Hasn't she worked harder and suffered more brutality and heartbreak than "that little man in black" will ever know?

    A great deal of Truth's authority came from her own history. Born a slave in about 1797, she was sold five times before the state of New York, where she lived, abolished slavery in 1827. She was forever being separated from her family -- first her parents, and later one or another of her 13 children. In 1843, working as a servant in New York City, she experienced a religious calling, changed her name from Isabella Baumfree to Sojourner Truth and became a traveling preacher. She lived in utopian communities, exchanged ideas with progressive thinkers like Frederick Douglass and did the kind of work we'd call activism today -- advocating for abolition and, after the Emancipation Proclamation, helping former slaves start new lives -- until her death in 1883.

    Truth's "Ain't I a Woman" is speech at its most elemental: unplanned, passionate, springing to life in the mouth of its speaker. It is acutely personal because it relies so much on the very person of the woman who speaks it -- the undeniable, unavoidable facts of her life -- and because it was pointed (as surely her long finger was pointed) right at the people who chose to ignore those facts. It is for all time, but it is also of a particular and heated moment. (Truth was nothing if not quick on the uptake. When a man told her that her speeches didn't amount to a fleabite, she replied, "Maybe not, but, the Lord willing, I'll keep you scratching.")

    David Greenberg: Does Magruder's Allegation Mean Much? (posted 7-29-03)

    David Greenberg, writing in the NYT (July 29, 2003):

    Some Nixon experts are justifiably skeptical of Mr. Magruder's claims, which he did not state publicly for three decades. Other evidence suggests he may be right. But whether or not Mr. Magruder's recollections are accurate, the more interesting question is why we remain so fascinated with this detail about Watergate.

    Mr. Magruder's assertions have received attention partly because they dispel what has always seemed an anomaly: the longstanding presumption of Nixon's innocence in the break-in.

    For one thing, sanctioning the act would have been fully in character for Nixon. On several occasions, he told aides to break into the Brookings Institution, the liberal-leaning think tank in Washington that he thought possessed classified documents. Moreover, John Ehrlichman, a presidential adviser, suggested that Nixon also endorsed stealing the files of Lewis Fielding, the psychiatrist of Daniel Ellsberg, the former Defense Department official who had leaked the Pentagon Papers to the press. Indeed, Nixon himself wrote in his memoirs that although he couldn't recall if he'd been told about the Fielding break-in, "I cannot say that had I been informed of it beforehand, I would have automatically considered it unprecedented, unwarranted or unthinkable."

    Regarding the fateful Watergate caper, however, Nixon always insisted that he learned of it only through news reports. He held that he would never have supported it — because he thought the offices of the Democratic candidate, not the party, would yield more useful information.

    But that claim is undermined by Nixon's statement of June 20, 1972: "My God, the committee isn't worth bugging, in my opinion. That's my public line." As the journalist Ron Rosenbaum has noted, the wording implies that he had some private suspicion to the contrary.

    The case for Nixon's foreknowledge of the Watergate break-in remains, of course, circumstantial; it wouldn't convict him in court. Unless evidence stronger than Mr. Magruder's interview emerges, historians will likely remain agnostic on the matter.

    Although Watergate buffs may despair over being doomed to uncertainty, the embrace of agnosticism is not a defeat for historians. On the contrary, the sooner we accept that mysteries like Nixon's prior knowledge of the break-in aren't likely to be solved, the better off we'll be.

    After all, the break-in itself was one of many crimes of the Nixon administration. "Watergate" has come to refer not to that single operation but to a whole series of what John Mitchell, Nixon's former attorney general, famously called the "White House horrors": abuses of federal law-enforcement agencies to serve Nixon's private purposes, attempts to subvert democratic procedures, obstructions of justice at the highest levels.

    Niall Ferguson: In Defense of the British Empire (posted 7-28-03)

    Niall Ferguson in the course of an interview with Jerry Bowyer on TCS (June 16, 2003):

    TCS: Are we talking about whether an empire is a good thing or are we talking about whether the English speaking empire was, on balance, a good thing?

    FERGUSON: I think the latter. One of the key points I try to make in the book is that compared with the available alternatives in the 19th and the first half of the 20th century, the British empire was really clearly preferable. Whether you think in terms of the other 19th century empires from the Belgian empire to the Russian empire, or by the time you get to the 1930's to the kind of empires that Hitler and Hirohito wanted to create in the world. By comparison with those alternatives, people who were living under British rule in, say, India or in sub-Saharan Africa were pretty well off. And actually people living under British rule in sub-Saharan Africa were in many ways a good deal better off than they have been since the British went.

    Because having your own corrupt dictators rip you off and send the proceeds to Swiss bank accounts is really significantly worse than being run by Britain's quite enlightened and very non-corrupt administrators back in the 1940's and '50's. So, I think there is a clear distinction, in other words, between the way English speaking empires work and the way the alternatives have worked.

    TCS: I recently talked to Warren Zimmerman who's written about the American empire and about Theodore Roosevelt's legacy. According to Zimmerman, the Spanish empire and the French empire had a very deep streak of self-interest whereas the English-speaking empire, yes, there was self-interest, but there was also a streak of idealism that seems to have been missing from the other empires of world history.

    FERGUSON: I think that's right. And the language of liberty, which, of course, the Americans regard as uniquely their own, was being used before the American revolution by British empire builders. And I don't think it's true to say that empires like the British empire are built purely out of self-interest and purely in order to "exploit subject peoples."

    One of the really impressive things about what happens in 19th century British domains is the way that infrastructure is modernized. I mean the British bring a communications revolution to Africa and Asia by building railways and telegraphs, creating huge transoceanic cables for communications and steamship routes. It's a kind of modernizing project. And although that certainly has its profitable side from the point of view of the British, it would be absurd to pretend that it doesn't benefit the so-called "subject people." I mean, everybody gains from an empire of free trade and I think that's a crucial point. Britain's empire was the only 19th century empire that was committed to globalization and it wasn't just the British who benefited from that.

    TCS: All right, let's turn the issue around for a moment. This is a counter-intuitive stance. Your stance is that empire is good for the colonies. But I'm not so much worried about what that would do to Afghanis or Iraqis, I'm more concerned about what it would do to us.

    FERGUSON: Yeah, good point. One of the old stories that you used to read about was that Britain's decline as an economic power was a consequence of imperial overstretch, the British were stretched too far trying to run the world all the way from Canada to Singapore and finally the sheer cost of it brought them to their knees. And you could infer from that, oh dear, the United States could face the danger of overstretch.

    Well, I've tried to show in this book and I talked about it in my last book, "The Cash Nexus", as well, that this overstretch story actually doesn't make much sense. The British empire wasn't very expensive to run until finally the world wars came along and those really had quite different causes from the existence of an empire.

    When you look at the actual costs of running the empire in terms of defense expenditure, they were remarkably low, and of course the benefits from Britain's point of view of having free trade and access to the markets of around a quarter of the world, these were pretty considerable. One can see that in many ways the empire was self-financing. I think from the United States point of view it's an even easier calculation to make because the U.S. is so much richer than the British Isles ever were.

    I think if you just take the case of Iraq, which is clearly the thing uppermost in people's minds at the moment, the U.S. would gain immensely if it successfully transformed Iraq into a growing market economy with a stable form of government. First of all, it would no longer be a threat to the U.S. and her allies; and second of all, there would be economic benefits from trading with one of the world's great oil powers. It's a win/win story that I'm trying to put forward here. I don't think it was the empire that did Britain in. Quite the reverse. It was probably as much the loss of empire that did Britain in.

    Pakistan Upset After Scholar Says the Koran Has Been Mistranslated (posted 7-28-03)

    From WorldNetDaily.com (July 26, 2003):

    Pakistan has banned the latest issue of Newsweek magazine because of an article titled "Challenging the Quran," which features a scholar who contests the Islamic holy book's purported origin and claims it counts "white raisins" as a heavenly reward, not dark-eyed virgins.

    In a book sure to touch off a storm when it's published this fall, the German scholar says faulty transcription of the original Quranic text has resulted in errors of interpretation. Using the pseudonym Christoph Luxenberg, he questions the Islamic rule that women must cover themselves and asserts the Quran originally was a Christian document.

    "The article is insulting to the Quran," Pakistani information minister Sheikh Rashid Ahmed, told the Associated Press.

    Customs authorities have been notified to seize copies of the magazine, said Ahmed, according to the AP. The offending article is on page 40 of Newsweek International's July 28 edition.

    "The decision was taken to prevent religious violence and control law and order situation," he said.

    Ahmed insisted Pakistan has freedom of expression, but said the government expected the media to be careful about the religious sensitivities of Muslims, Agence France-Presse reported.

    Pakistan's military regime is threatened by militant Sunni and Shiite sects. As WorldNetDaily has reported, the Asian nation has a law requiring the death penalty for anyone who blasphemes the Quran or Islam's prophet, Muhammad.

    A Newsweek spokesman told the AP his magazine stands by its story.

    "This does happen in some places around the world with one of our 10 international editions," he said.

    The Newsweek article says Luxenberg's book is "likely to be the most far-reaching scholarly commentary on the Quran's early genesis, taking this infant discipline far into uncharted – and highly controversial – territory."

    Newsweek explained this is because Islamic orthodoxy considers the Quran to be the verbatim revelation of Allah, which has made critical study of the book off-limits in much of the Islamic world.

    Luxenberg, according to Newsweek, is a professor of Semitic languages at one of Germany's leading universities. He has chosen to remain anonymous because he fears a fatwa by enraged Islamic extremists.

    The scholar bases his main claims on the assertion the Quran's original language was not Arabic but a tongue closer to Aramaic, the language Jesus spoke.

    The copy of the Quran in circulation today is a mistranscription of the original text, he maintains, insisting Arabic did not become a written language until 150 years after Muhammad's death.

    The commandment for women to cover themselves is based on a misreading of the text, he contends, according to Newsweek. The verse calling women to "snap their scarves" over their bosoms becomes in Aramaic "snap their belts around their waists," says Luxenberg.

    An Aramaic reading of Sura 33, calling Muhammad "the seal of the prophets" – or the final and ultimate prophet of God – actually should read "witness of the prophets," he claims.

    The explosive implication is the Quran is merely a witness to the established Judeo-Christian texts.

    He argues the original Quran actually was a Christian liturgical document before Arabs turned Muhammad's teaching into a new religion long after his death.

    Newsweek notes that in 2001 a revisionist scholar was convicted by Egypt's Constitutional Court of "apostasy" for regarding the Quran as a document written by humans.

    Real-Life Archaeology Is Getting More Dangerous Than in the Movies (posted 7-28-03)

    From ABCNews.com (July 20, 2003):

    Indiana Jones fought savage tribesmen and murderous competitors to keep the Ark of the Covenant from the Nazis.

    In the recent Mummy movies, a pair of archaeologists braved musty Egyptian catacombs to save the world from an ancient evil.

    And in the upcoming Lara Croft Tomb Raider: Cradle of Life, Angelina Jolie's character skips from underwater temples to Central Africa for the coveted, mythical Pandora's Box.

    But those challenges are nothing compared to those faced by their real-life counterparts. While modern-day archaeologists rarely have to confront lava pits, animated stone statues or the undead, they increasingly contend with entire peoples, becoming the frontline troops in the clash of civilizations.

    As empires and superpowers fade, cultural, religious and nationalistic movements have been growing in strength — and they are looking to archaeology to give them the validation of history, said Philip Kohl, editor of the book Nationalism, Politics and the Practice of Archaeology.

    "It's part of a post-modern movement," said Kohl, who teaches anthropology at Wellesley College in Massachusetts. He associated the trend with the end of the age of empires and the Cold War, and the ethnic conflicts that followed.

    For example, in northern India this spring, archaeologists began digging at the ruins of a 16th-century mosque to see if a Hindu temple also existed in the spot.

    What they find in the ruins of the Babri Masjid could settle a dispute between Hindus and Muslims that began, depending on your point of view, a decade ago — or half-a-millennium ago — or even earlier.

    Hindus say the mosque was built by Muslims after they destroyed a temple to the god Ram, who Hindus say was born on the site. In 1992, a Hindu mob destroyed the mosque, setting off a round of religious violence in which 3,000 people died.

    If proof of a temple is found, Hindu militant groups could use it to justify the destruction of the ancient mosque, and try to persuade the courts to allow the construction of a new temple.

    If ethnic and religious conflict is at the root of politically motivated archaeology, it should be no surprise that archaeologists specializing in the Middle East have had the most experience with it.

    In an area where "time seems to be immaterial … there is no shortage of political interests," said Guillermo Alcazar, an archaeologist from the University of California, San Diego.

    He made note of Masada, a 2,000-year-old fortress on the edge of the Dead Sea, where hundreds of Jewish fighters made their last stand after the fall of Jerusalem, in A.D. 74. The site was largely ignored until about 50 years ago, when Israel began using it to swear in some of its soldiers, he said — as if to say they will never again suffer the same fate.

    When the Palestinians first established the framework of a state, Alcazar said, "One of the first things they did was to establish a department of antiquities and dig like crazy."

    The focus of most of these efforts is Jerusalem, a holy city for three of the world's major religions. Within Jerusalem is the Temple Mount, an area central to the religious narratives of Jews, Christians and Muslims.

    The Temple Mount is the holiest site in Judaism and the third-holiest in Islam. Christians believe Jesus Christ preached there, and many evangelicals believe the site is central to Jesus' return.

    Archaeologists working on the Temple Mount have been involved in a political tug of war as recently as last fall, when a group of Jewish archaeologists complained to the Islamic trust in charge that a bulge in a wall there posed a serious risk of collapse.

    Early this year, the Islamic trust began repairs — but the Jewish archaeologists had them stop because they said the repairs were making the problem worse.

    The same archaeologists have also been alleging that the Islamic trust is trying to erase any evidence that a Jewish temple stood on the site. The trust has dismissed both complaints as politically motivated.

    The dispute has gotten so heated that the Jewish archaeologists have hired private aircraft to take aerial pictures of the site every few weeks.

    During the last major archaeological dispute over the Temple Mount in 1996, Israeli archaeologists opened a small exit to a tunnel in the mount. But Palestinians claimed it was a nefarious plot to undermine the foundation of their holy places. Riots ensued.

    Kathryn Weathersby: Did the Soviets Back North Korean Invasion Because of Acheson's Speech? No. (posted 7-28-03)

    Kathryn Weathersby, in the course of an interview with NPR (July 2003):

    [I]n the spring, in late '49 and early '50, what was happening is, on the one hand, the U.S. had very severely demobilized after World War II. So our armed forces were very small, they were very weak, they were under-funded. They were not capable of very much, in a word.

    And so when it came time to make a strategic policy in a kind of deliberate, you know, let's think about it for a long time, create a lot of committees and decide together how we're going to approach the world, the U.S. government was very constrained by the limitation of its means. It's a little hard for us to remember now because our means have been so great ever since the Korean War. The Korean War very decisively ended this period and began an arms race, the arms race that continued.

    So the decision was reached in late '49 and formally adopted in a policy paper called NSC 48. [The Departments of] State, War, [and the] Navy, [made this] collective strategic policy statement that the U.S. would hold on to its control of Japan and the Philippines and everything to the east - in other words, all the islands that had been acquired from Japan at the end of World War II, Hawaii and the west coast.

    But it would not try to intervene to the west of that line, which would mean not try to intervene in Taiwan or Korea or anywhere else on the Asian mainland. The reason being not really so much that Taiwan was unimportant or that Korea was unimportant but simply that we didn't have the capacity to, and those two places were also seen as being less important than Japan, certainly.

    But also, then, the Philippines that we had held since the end of the 19th century. So it seemed like a really quite logical decision from that point of view. Well, that decision then is what underlay Secretary of State Dean Acheson's speech at the National Press Club in January 1950, where he laid out American determination as he saw it in Northeast Asia. In retrospect many people have said that Dean Acheson's speech gave the green light to the Soviets because Korea was excluded from the American defense perimeter. So the Soviets thought, well, okay, then we'll go ahead and get it. To be fair to the Secretary of State I would argue, and I've looked at this quite closely, that the timing of the Soviet decision suggests that it was not Dean Acheson's speech that made the difference, but rather that Stalin had changed his mind at the very beginning of January, probably because he found out about NSC 48 from Donald Maclean, the British spy who was in Washington.

    NPR: The Impact of the Korean War on American Race Relations (posted 7-28-03)

    NPR (July 2003):

    In dozens of interviews fifty years later, black and white veterans of Korea remember integration as generally smooth and peaceful. But there were certainly problems, especially when it came to the new experience of blacks commanding whites. Mark Hannah of Wichita, Kansas was assigned to lead a white combat unit, "They didn't want me to become their squad leader, so they said, 'Well, we'll just kill the nigger. We've never had a nigger tell us what to do and we're not going to start now.'" The commander offered Hannah a choice - stay and be his jeep driver or find another unit. Hannah transferred out.

    Enemy propaganda tried to exploit racial tensions among the U.S. troops. Static-filled radio broadcasts from the Chinese capital of Peking tried to stir up outrage among blacks about fighting for a segregated democracy. Combat veterans Curtis Morrow and Samuel King say the enemy would also drop leaflets into their foxholes.

    "We all saw those pictures of a black man being hung and a bunch of white faces eating popcorn and little kids jeering and laughing and in the caption beneath the picture they would have 'Why are you here? Why are you fighting us? Is this what you're fighting for?'" recalls Morrow.

    King says, "It was an embarrassment for us to have someone in a foreign country know how we were being treated. And we over here fighting these people to make it better for someone back home and we get back home and it's not going to be any better and we knew that, yet and still we had a job to do and we felt that we should do it."

    To fight the Cold War propaganda battle, the U.S. military made a proud display of integration success stories. This installment of Time for Defense contradicted one stereotype - that blacks were poor soldiers - while repeating another.

    Newsreel:
    "No story about the American Negro soldier would be complete without a spiritual. So let's return to the 96th for just a bit more, while some of the resting gunners sing 'I'm on the battlefield for my Lord.' "

    It would take decades for blacks to overcome blatant prejudice in the U.S. military. But President Truman's desegregation order was an historic opportunity for men like Bill Peterson.

    "I went from a high-school dropout to almost a graduate degree," explains Peterson. "I don't think I would have gotten that far in civilian life. I think that history will reflect that the military led the way and is still leading the way for integration and people of color - to include women - in leadership roles."

    The integrated military also meant that millions of whites went home with new knowledge about blacks. Charles Day was one of them.

    "I found out they're smart, remembers Day. "Some of them are smart as whips…As a kid I just thought well, 'They're maybe not as good as me.' That opinion changed drastically after I got to Korea - after a period."

    Interviewer: "Do you think it was a good thing that the army was integrated?"

    "Yes, it changed some opinions - like it did me."

    Historian Philip Klinkner says Korea would lay crucial groundwork for the growing civil rights movement in the U.S.

    "I think it showed African Americans as well as white Americans that integrated institutions could work - that a lot of the sort of intellectual and pragmatic arguments that were made for Jim Crow institutions really were shown to be myths, that American could move toward a more integrated society without some sort of crisis setting in."

    After Korea, the U.S. military became America's most integrated social institution, producing hundreds of black generals and offering education, job training and solid careers to African Americans. Still, many black veterans say their service in Korea has been overlooked. They are the forgotten soldiers of a forgotten war. More than three thousand African Americans died in Korea. Veteran Nathan Street remembers helping one wounded black soldier who, he says, fought for a country that at the time scarcely deserved his sacrifice.

    "He was hit in the head and the chest," recalls Street, "he was breathing heavy, like he was snoring, there was blood in his lungs. We got him down, carried him down the hill and I seen the medic later and asked how he was doing. He said he died. And I didn't know him. And I wish I could tell his family, we tried. And then I think, he gave his life for what? He probably couldn't vote where he came from. But…things like that haunt me."

    Daniel Kevles: Why We Wrote a Textbook Based on Science and Technology (posted 7-28-03)

    Daniel Kevles, writing in the Chronicle of Higher Education (July 28, 2003):

    It is a commonplace that the United States is a scientific and technological society. SciTech has been as essential to American development as, for example, migration and settlement, industrialization and reform, the movements for civil rights and women's rights. It has wrought enormous changes in American life since the early days of the Republic -- notably in national security, transportation, communications, work, manufacturing, consumer habits, leisure, and medical care. When, in 1799, George Washington contracted the respiratory infection that killed him, his doctors could do no more than bleed him, soak his feet, swathe his throat with flannel and ointments, and assist his breathing with steam. Now he would be treated with antibiotics, temporarily placed on a respirator, and most likely saved.

    Perhaps no group is as familiar with SciTech -- especially the Tech -- as today's college students. They behave like astonishing parallel processors, able effortlessly and simultaneously to watch TV, surf the Web, listen to music, and do their homework. At the same time -- witness recent laments -- they seem to know little about their historical roots and connection to the larger society. Putting the story of science and technology into the American narrative could help connect students to the past and encourage them to recognize history as relevant. Yet most contemporary college textbooks grant SciTech little more than perfunctory treatment. They give the impression that technological innovation just happened, without cause or reason. In so doing, they encourage students to think that SciTech is beyond control, that it condemns people to victimhood before some technical juggernaut. SciTech, of course, did not -- and does not -- just happen. Like, say, constitution making, it has been a product of human agency, whether it was Edison sweating to forge his electric-light and power system, physicists struggling to produce an atomic bomb, or biologists trying to devise a vaccine for polio. And, having arisen from human effort, it has been (and remains) within the reach of democratic regulation and decision.

    Convinced that neglecting SciTech does injustice to the narrative and shortchanges students, my colleagues and I decided to write Inventing America, a textbook that integrates science and technology organically into the story of American history. We found that the integration came naturally, precisely because the innovations and uses of SciTech exemplify familiar themes in the development of the United States....

    From the founding of the Republic, SciTech was encouraged by both state and federal governments for economic development and, in the federal case, for national defense as well. In the 19th century, the national interest in exploration and settlement led to the establishment of governmental surveys of the land and its resources. The Lewis and Clark expedition early in the century was followed in 1838-42 by the U.S. Navy's Wilkes Expedition, a flotilla of six small ships that sailed from Virginia around Cape Horn to the Fiji Islands, Hawaii, and on to Oregon and Washington, gathering some 160,000 zoological, ornithological, botanical, and ethnographic specimens. In the middle third of the century, some states mounted geological surveys to ascertain the location and extent of resources such as minerals and timber. After the Civil War, the federal government fostered a grand reconnaissance of the West to assess possible routes of the proposed transcontinental railroad.

    Governmental encouragement of SciTech also took the form of legal, administrative, and financial incentives advanced directly or indirectly to private enterprise. Before 1787, the states enacted patent laws to foster invention, and the federal Constitution authorized Congress to do the same. Not long after the War of 1812, the U.S. Army collaborated with gun manufacturers to pioneer the development of interchangeable parts, an innovation that spread to other manufacturers and helped facilitate America's first industrial revolution, which depended upon mechanical invention and steam engines....

    The more SciTech pervaded American life, however, the more people regarded it with ambivalence, especially as they grew more sensitive to environmental preservation and the protection of minority rights. Dams might generate the electric power needed for economic development in the West, but they also inundated raw nature and forced residents, often Native Americans, from their homes. Nuclear-power plants might have been hailed as providing electricity too cheap to meter; as the accidents at Three Mile Island and then Chernobyl made clear, they posed their own hazards.

    The chemical and plastics industries supplied people with advantages ranging from microwaveable dishes to lighter automobile components, but they also contributed to pollution and destruction of wildlife. Personal computers gave millions convenient access to an infinite world of information; they also afforded public and private agencies unprecedented opportunities for surveillance and the invasion of privacy. High-tech medicine extended health and life; it also assisted in depersonalizing medical care, vastly increasing its cost, and creating dire life-and-death choices. Americans have lived for more than half a century with the threat of nuclear Armageddon. Now their sense of security is bedeviled by fears of chemical and biological weapons.

    All those ambivalences highlight the fact that the direction of SciTech has never been more crucially a matter for public policy and democratic decision. The contemporary role of SciTech in maintaining the nation's security, economy, environment, health, and intellectual vitality is indisputable. Exploration of the forces that have shaped its impact on the United States can equip people to deal with similar forces at work now. Not to examine them historically is to impair understanding of our nation's past and to menace its future.

    Why Conservatives Are Denouncing Ann Coulter's Book (posted 7-28-03)

    Sam Tanenhaus, writing in Slate (July 26, 2003):

    Ann Coulter, the right wing's dial-900 girl—a rail-thin, chain-smoking, hard-drinking, big-eyed leggy blonde who winkingly serves up X-rated ideological smut on liberals—is at it again. "Whenever the nation is under attack, from within or without, liberals side with the enemy," Coulter writes—or sneers—in Treason, her follow-up effort to the best-selling Slander. Like its predecessor, Treason sits atop the best-seller charts, riding higher than one of Coulter's signature miniskirts.

    But this time around, it isn't the liberals who are up in arms; it's the conservatives. Coulter's slurring of Democrats—from Harry Truman (soft on communism) to Tom Daschle (soft on Iraq)—has set off a howling chorus on the right. David Horowitz, Andrew Sullivan, and Dorothy Rabinowitz, among others, have been sternly giving Coulter history lessons, dredging up (once more) the anti-Communist credentials of Cold War liberals like Truman, John F. Kennedy, Lyndon Johnson, and Hubert Humphrey.

    Horowitz et al. are right, of course. But why are they so worked up? And why reach back so far to single out a few "good" liberals? This just reinforces Coulter's argument that today's breed can be dismissed as a single lumpen mass. In other words, they agree with her. So, why the outrage? Here's a guess: Coulter's conservative critics fear that her legions of fans—and lots of others, too—see no appreciable difference between her ill-informed comic diatribes and their high-brow ultraserious ones, particularly since Coulter's previous performances were praised by some now on the attack.

    But this is yet another case where the dumb public is right. Coulter's shocking book is not shocking at all. Nor is it novel. It is merely the latest in a long line of name-calling, right-wing conspiracist tracts, a successor to Elizabeth Dilling's Red Network, Fred C. Schwarz's You Can Trust the Communists (To Be Communists), and—a personal favorite—John A. Stormer's None Dare Call It Treason. This last, which sold 2 million copies in 1964, "explained" how the U.S. military had consciously served "the long-range political advantage of the communist conspiracy" in World War II. You can laugh, but by the time the 25th-anniversary updated edition was published, it had sold 7 million copies and Stormer was holding weekly Bible meetings for Missouri state legislators.

    Coulter's cheerleading on behalf of Sen. Joseph McCarthy and "his brief fiery ride across the landscape," as she puts it, is what has her critics most exercised. Doesn't she understand, they ask, that McCarthy wasn't an anti-Communist at all but a dangerous outrider who harmed a noble cause by defaming and giving ammunition to the left? Again they're right—but only on rather drearily familiar grounds. Coulter is closer to the truth on the big question, McCarthy's actual place in the conservative pantheon. For many years he was precisely the GOP folk hero she says—a pivotal figure who invented the inside-the-Beltway insurgency that has been the party's staple for half a century now, currently embodied by flame-throwers like Tom DeLay.

    During McCarthy's peak years, he was a GOP heavyweight egged on by the likes of Senate leaders Robert Taft and William Knowland. In 1952, Dwight Eisenhower, the GOP presidential nominee, shared a platform with McCarthy even though McCarthy had smeared Ike's mentor, George Marshall, by calling him a Communist dupe. And as Coulter says, the people—a lot of them, anyway—loved him, too. More than 1 million signed a petition supporting him during the censure debate of 1954, and half the Republican senators (22 out of 44) voted against the measure. A year after McCarthy's death in 1957, Robert Welch, another conspiracy-monger, founded the John Birch Society to pick up the cudgel and continue the "fight for America." Today, Birchers are remembered as kooks (and were often dismissed as such at the time). But these "little old ladies in sneakers" got a big hug from the conservative movement. Ronald Reagan for one—though mistily depicted of late as the ideological heir of the Democratic "traitors" Truman and JFK—made his political debut stumping for Congressman John Rousselot, a top California Bircher, in 1962.

    And the McCarthy legacy lives on. Remember the attack ad used in the last election against Georgia Democrat Max Cleland—the one that spliced in videotape of Osama and Saddam? The McCarthyites used the same ruse to destroy Maryland Democrat Millard Tydings in 1950, only then it was a composite picture juxtaposing photos of Tydings and Earl Browder, the onetime leader of the American Communist Party.

    Of course, using dirty tricks isn't news in politics—and their use is not limited to the right. Nor, for that matter, is the cry of treason. Woodrow Wilson dusted off the Sedition Act in order to jail critics of World War I. Franklin D. Roosevelt ordered the indictments of more than two dozen isolationists in 1942 on the sham charge that they were Nazi agents. A judge threw the case out, but conservatives didn't forget.

    All Coulter has done is import this approach—the flat-out accusatory style of hardball politics—into the realm of serious political discourse, ignoring the preferred arts of indirection and innuendo. And that's why her critics are agitated.

    Ann Coulter's Dangerous Book (posted 7-28-03)

    Anne Applebaum, writing in the Washington Post (July 27, 2003):

    To anyone who ever tried to understand why the political left has played such a large role in American intellectual life, or why the term "anti-communist" ever became an insult, or why so many allegedly clear-thinking people feared Joe McCarthy more than Josef Stalin, Ann Coulter's new book will certainly prove thought-provoking. I should reveal here that I have spent a great deal of time -- perhaps the better part of the last 10 years -- writing about communism, Stalinism and the West's relationship to both. Yet about halfway through Treason, an extended rant on these subjects, I felt a strong urge to get up, throw the book across the room, and join up with whatever Leninist-Trotskyite-Marxist political parties still exist in America. Even the company of Maoist insurgents would be more intellectually invigorating than that of Ann Coulter. More to the point, whatever side this woman is on, I don't want to be on it.

    It isn't very difficult to explain why this book is so bad. A few quotes from the opening chapter will do it:

    "Liberals have a preternatural gift for striking a position on the side of treason."

    "Whenever the nation is under attack, from within or without, liberals side with the enemy."

    "Liberals attack their country and then go into diarrhea panic if anyone criticizes them."

    "Whether they are defending the Soviet Union or bleating for Saddam Hussein, liberals are always against America. They are either traitors or idiots, and on the matter of America's self-preservation, the difference is irrelevant. Fifty years of treason hasn't slowed them down."

    The rest of the book continues in that vein. Aside from becoming rather tedious after about page 10, Treason fails to explain a number of pertinent points. For example: Who's a liberal? And what is "the left"? Coulter appears to believe that these terms are synonyms, and further confuses both of them with the Democratic Party -- which has, she claims, been suffering from "pusillanimous psychosis" since World War II. But Scoop Jackson was a Democrat, Jeane Kirkpatrick was a Democrat, even Ronald Reagan started out as a Democrat (and this was after World War II). Robert F. Kennedy actually worked for Sen. McCarthy, as Coulter herself mentions but fails to elaborate upon. Half the members of the House Committee on Un-American Activities were Democrats. The Truman administration prosecuted Alger Hiss. Kennedy stood up to the Soviet Union in Berlin and Cuba.

    Indeed, there were members of the left who were active anti-communists. Coulter actually quotes George Orwell a couple of times without mentioning that he was a socialist, presumably because she doesn't know. She doesn't mention the anti-communists in the American trade union movement, presumably because she hasn't heard of them, either. Her cartoonish, childlike interpretation of history allows for no nuances -- nor can it help her explain the present. She is notably silent on the subject of Democrats who supported the war in Iraq, for example, making only two glancing references to Sen. Joe Lieberman, and implying that every single Democrat who voted in favor of the resolutions to invade Iraq did so for crude political reasons. But if you tar everyone with the same brush, how can you know, really, what actually happened?

    All of this, of course, might be funny if it were meant to be funny, but it doesn't seem to be. Coulter hasn't got an ironic or witty bone in her body. Her insults are crass and dull-witted, and her jokes fall flat. She has no sense of history and skips back and forth from the Truman administration to the Reagan administration, as if 40 years made no difference. She quotes liberally from newspaper cuttings, television interviews and other conservative diatribes, apparently having done no actual research at all. Worst of all, this is the kind of rhetoric that will allow everyone else to dismiss her as a crank, putting off real debate about these issues for another decade at least.

    And the more successful she becomes, the more damage she will do to her own cause. If her ravings become confused with the work of serious historians, it's possible that the serious reading public will wind up dismissing all of them. I noted, after finishing this book, that a number of prominent conservatives have dissociated themselves from it. With any luck, others will too. Coulter will, of course, start screaming that she's become the latest victim of the left's ongoing secret campaign against McCarthy, but at least that will prevent her from spoiling serious historical investigation into anything else.

    America's First Spaceman (posted 7-25-03)

    NPR profile of Capt. Iven Kincheloe, who took X-2 rocket plane to record height (July 25, 2003):

    His feat is tucked in the timeline of aviation history -- somewhere between Chuck Yeager ripping through the sound barrier in 1947 and John Glenn making his orbital flight in 1962. But in his day, Capt. Iven Kincheloe, who flew a rocket-powered plane to the edge of space one morning in 1956, was as much a star as those other two famous aviators. NPR's Bob Edwards reports.

    On Sept. 7, 1956, the day of his historic flight, Kincheloe was crammed into the cockpit of a Bell X-2. The experimental aircraft was dropped from a Boeing B-50 carrier plane at 29,500 feet. Then Kincheloe, who had flown more than 100 missions in the Korean War, took the X-2 nearly 100,000 feet higher -- to a record 126,200 feet. It was 26,000 feet higher than anyone had ever flown before.

    Dr. Raymond Puffer, an Air Force Flight Center historian, describes what Kincheloe would have seen at that altitude: "The sky turns dark, indigo blue. You can easily see the curvature of the earth. He could see from San Francisco down to Mexico... he was weightless for something like 56 seconds, but he didn't particularly notice it because he was so tightly strapped in..."

    The feat earned Kincheloe celebrity status. He appeared on the popular television game show I've Got a Secret, where the panel was supposed to guess what he did. Because he was so famous, it didn't take long.

    Kincheloe was chosen, along with two other pilots, to fly the next-generation test plane, the X-15. While waiting for it to be ready, he flew test missions with other planes. On July 26, 1958, Kincheloe was killed when the engine of his F-104 airplane failed soon after take-off. He managed to eject but was too close to the ground and parachuted into the flames.

    Neil Armstrong, a friend and fellow test pilot at Edwards Air Force Base in the 1950s, says Kincheloe probably would have been at the center of America's space program.

    "I know had he survived that he would be very much in the middle of whatever was going on subsequent to that point. He may very well have been selected for the astronaut program. He was certainly capable of doing that -- or he might have chosen to do something else. But in any case, he would be at the forefront out at the edge of the frontier and having a ball doing it."

    Are TV Histories a Threat to the Study of History? (posted 7-24-03)

    Magnus Linklater, writing in the London Times (July 23, 2003):

    I enjoyed Michael Wood's TV series In Search of Shakespeare. I liked the way he rushed excitedly up dark Elizabethan streets in Southwark or Stratford-upon-Avon, blew the dust off ancient manuscripts, poked about in old theatres and sat in on imaginary script conferences with the Bard. It was first-rate entertainment. But was it history?

    No, says Wood himself, because it lacks the argument and analysis that should form the basis of history teaching. Yes, say a growing number of history students.

    This, they claim, is the way the subject should be taught. Instead of lectures and impenetrable tomes dealing with the textual differences between first and second Quarto editions of Hamlet, or myth and identity in the later sonnets, they would much prefer "TV-style" talks to allow them easier "access" to the plays. They would dump the dons and opt for television alone.

    This, at any rate, is the dyspeptic view of a cross-section of academics, who have issued a report lamenting the state of their thinly attended lectures, and complaining about the unwholesome obsession of modern undergraduates with Hitler, to the exclusion of worthier topics such as the iconography of the medieval court.

    Their report cites poor command of English, ignorance of historical context and a shaky grasp of dates as typical of today's student generation. The very brightest remain as good as they ever were, but as for the rest, one Glasgow University lecturer describes them, in a Hobbesian phrase, as "uninterested, unmotivated or simply not very bright".

    We should take this view with a modest pinch of salt. Show me the academic dealing with a group of uninterested students, and I'll show you an academic who is not teaching very well. Television is a popularising medium, and it has done its job well, by bringing in extra punters. The number of history undergraduates is on the increase, thanks in part to the work of TV historians such as Simon Schama and David Starkey; it has happened just as some universities were wondering whether history, as a subject, was beginning to wither on the bough. Instead of complaining, the dons should be delighted. I bet there are a few maths or physical sciences departments that would give their eye teeth for the same exposure.

    The job of a good university teacher should be to build on that initial enthusiasm. I doubt if there is a fundamental difference between the children of previous generations, who cut their teeth on Our Island Story, with its jolly pictures of Bruce and the Spider or Francis Drake playing bowls on Plymouth Hoe, and those who tune in to the delightfully wacky Adam Hart-Davis telling us how the Romans built their walls. The trick is to translate that interest into something more substantial.

    There is, however, one new challenge that today's teachers face which was unknown to their predecessors. It is the incredible ease with which information can be accessed (the word itself slips easily into the transitive form). Once you had, at the very least, to find and read a book or two -- however superficially -- before you could submit your essay; the very process of doing so cudgelled the brain into a semblance of activity. Now you can download a precis from the internet in seconds, adapt it rapidly and serve it up as your completed work. Not only does this leapfrog the thinking process, it makes you disinclined to tackle the harder task of genuine inquiry. Determining where plagiarism ends and understanding begins is often the biggest task facing examiners.

    Was What Happened in Goliad, Texas a Massacre or an Execution? (posted 7-23-03)

    Simon Romero, writing in the NYT (July 19, 2003):

    In history books, the killing of more than 300 Texan rebels by Mexican troops here has long been known as the Goliad Massacre. But to many residents of Goliad, with its 18th-century Spanish fort and towering monument to the dead, that brutal episode in its history is still open to interpretation.

    At the heart of the dispute, largely between Anglos and Mexican-Americans, is the porous definition of who is a Texan and what is Texas history at a time when Hispanics are growing in number and influence.

    Some of Goliad's Mexican-American residents prefer "execution" to "massacre" in describing what happened here in 1836 because of Mexican law at the time, which was explicit in meting out de facto death sentences for foreigners taking up arms against the government.

    "For so long in Texas history classes it's been drilled into us that Mexicans were the demons and Anglos the enlightened heroes," said Emilio Vargas III, an assistant principal at the Goliad elementary school and a descendant of Canary Islanders who settled here in the 18th century. "On this point we're no longer going to accept it without a fight."

    Such talk has shaken Goliad, where the population of 2,000 is almost equally divided between Hispanics and Anglos, with a small black minority. The dispute has included the Roman Catholic Church, which owns the Presidio de la Bahia, the site of the killings 167 years ago, when American and European settlers were engaged in a war to pry Texas from Mexico.

    Responding to letters and protests from parishioners and residents in Goliad, the Diocese of Victoria two years ago stuck with the long-used interpretation of events and refused to describe the killings as an execution. The church has owned the Presidio, a fort that operates as a tourist site and includes a chapel, since 1853.

    "I'm aware of the sensitivity of the issue, but it's historically been called a massacre, and we don't feel qualified to change the name," Bishop David Fellhauer said.

    The bishop's view might have signaled the end of the dispute, but tempers have continued to flare around Goliad, with many residents refusing to accept the church's position.

    Benny Martinez, president of Goliad's chapter of the League of United Latin American Citizens, said that many Anglos "still hate Mexicans and using 'massacre' is a subtle way for them to express it." Mr. Martinez said he ruffled feathers at a meeting of the Daughters of the Republic of Texas in April when he said that the 1836 killings should be described as an execution.

    Bishop Fellhauer and Newton M. Warzecha, director of the Presidio de la Bahia, consulted historians when a group of residents from the General Zaragoza Society, a Latino rights organization, sought to change the fort's description of events.

    Few experts dispute the brutality of the killings: Mexican forces shot hundreds of Texans on river roads near the Presidio, burned their bodies and left the remains to vultures. Documents from the time show that even among high-ranking Mexican officers there was ambivalence over carrying out the orders from Gen. Antonio Lopez de Santa Anna to kill the Texans, who had surrendered after a battle....

    Some people here think it folly to dwell so much on the past.

    "No wonder our town is not growing," said Rajesh Bhakta, an immigrant from India and manager of the Antlers Inn on Goliad's outskirts. "Who wants to invest in a place with all this unseemly fighting over long-ago affairs?"

    Was Newton Driven Mad by Mercury Poisoning? (posted 7-23-03)

    Joe Schwarcz, writing in the Montreal Gazette (July 19, 2003):

    You know about the falling apple. You know about the prism and the rainbow. You may even have heard that some surveys declared him to be the second most influential person of all time, ranking right in between Mohammed and Jesus. But I suspect few of you realize that Isaac Newton had a passion for chemistry and that he spent about 30 years of his life among the flasks and beakers of his laboratory near Cambridge, in the pursuit of, well, nobody really knows.

    Newton kept extensive notes but never formally published anything about his chemical investigations, probably because at the time such activities were frowned upon. The royals feared that if an alchemist discovered an easy way to make gold, the country's monetary system would be destroyed. On this account, they had nothing to fear from Newton. His genius does not appear to have extended to chemistry.

    Simply stated, genius is seeing what everyone else sees and thinking what nobody else thinks. By this credo, or indeed by any other, Isaac Newton was a genius. He saw that apple fall, and surmised that the force which attracted it to the Earth was the same as that which held the moon in orbit around the Earth. He then went on to formulate the laws of motion and indirectly gave rise to the science of space travel. Newton's Third Law, that for every action there is an equal and opposite reaction, is confirmed each time a rocket is launched. His use of a prism to separate white light into the colours of the rainbow laid the foundations of the science of optics. His Mathematical Principles of Natural Philosophy, published in 1687, is one of the most important single works in the history of science. But there was another side to the great man.

    Isaac Newton, by all accounts, was not a pleasant fellow. As a youngster he had battled with his stepfather and even threatened to burn his house down. He hated the man passionately for forcing him to live with his grandmother, separating young Isaac from his mother. Later in life, Newton showed psychotic tendencies characterized by occasional withdrawal from human contact, fits of anger and periods of depression. His battles with other scientists over details of his theories were ferocious. Let's just say that Newton did not take criticism well....

    According to his notes, he began to dabble in alchemy around 1687, after reading the works of George Starkey, an American who wrote under the pseudonym of Eirenaeus Philalethes (peaceful lover of truth). Starkey was educated at Harvard where he was introduced both to alchemy and medicine and spent his life looking for the "universal remedy." This was some vaguely defined potion that could change substances from one form to another and cure disease.

    Starkey believed that the ancient Greeks and Romans had discovered the secret and encoded the procedure in mythology. He was particularly taken by the story of Vulcan, the husband of Venus, who caught his wife in a delicate situation with Mars. Vulcan didn't like this one bit and hung the two cavorting lovers in a fine metal net for all Olympians to see. In alchemy, Venus stood for copper, Mars for iron and Vulcan for fire. Somehow, George Starkey "interpreted" the mythological story and heated a mixture of copper, iron and stibnite (antimony sulphide) to create a beautiful purple copper-antimony alloy he called "the Net."

    It wasn't clear what "the Net" was supposed to do, but it clearly interested Newton because he left exact instructions for its production. Chances are that Newton believed "the Net" to be a key substance in alchemy, perhaps needed to turn metals like mercury into gold.

    This is where our account gets really interesting because Newton's notebooks describe heating mercury compounds and tasting concoctions brewed with the element. And it is during this period of Newton's life that he began to show mental disturbances and to suffer from sleeplessness and auditory hallucinations. These are all symptoms of mercury poisoning! Unfortunately, if Newton was indeed poisoned by mercury, it was for naught. As far as we can tell, his chemical exploits never amounted to anything significant.

    John Taylor: Why Was Truman Given a Pass and Nixon Never Was? (posted 7-22-03)

    John Taylor, executive director of the Nixon Center (July 22, 2003):

    What a difference a "D" makes.

    When a diary entry by Democrat Harry Truman was uncovered in early July containing statements some construed as anti-Semitic, historians and journalists lined up to show how his words were belied by his actions as a courageous supporter of the young state of Israel.

    When President Nixon's White House tapes were found to contain references that some construed as anti-Semitic, historians and journalists proclaimed him a notorious bigot in spite of his courageous actions in saving Israel from an attack by Egypt and Syria in October 1973.

    A July 17 Los Angeles Times article by Johanna Neuman offers a generous sampling of distinguished authorities granting absolution to the fabled plain speaker from Independence.

    Alonzo Hamby of Ohio University said Truman's comments about Jews were "an outburst, rather like some other outbursts that Truman was capable of from time to time. It's important to understand that Truman grew up in a small town and he absorbed the prevalent ethnic clichés." Readers, please be on the lookout for an historian who cuts Richard Nixon slack on racial and ethnic comments because he grew up in Yorba Linda and Whittier.

    Sara Bloomfield of the U.S. Holocaust Memorial Museum said, "Truman's sympathy for the plight of Jews was very apparent." Israel's late prime minister, Golda Meir, said that RN saved her country with the airlift he ordered during the 1973 Yom Kippur War. But few outside of friends and former aides have argued, as Bloomfield does about Truman, that President Nixon's actions were more important than his words.

    Biographer Robert Ferrell blames Truman's criticism of Jews on his declining poll numbers and various other pressures of office. Try arguing that RN was entitled to express frustration in private at American Jews because four out of five didn't vote for him in 1968.

    Reporter Neuman even refers to historians who offer, as further exculpatory evidence, the fact that Truman's partner and friend in the haberdashery business was a Jew. The next time a controversial tape segment is released, let's see if the LA Times points out that President Nixon's associates and friends included Henry Kissinger, Herb Stein, Ben Stein, Len Garment, Bruce Herschensohn, William Safire, and Murray Chotiner.

    Balancing a President's words against his actions; assessing him in the context of his times; weighing a leader's whole record and drawing nuanced conclusions about his or her legacy - that's what scholars are supposed to do. These writers' sensitivity toward President Truman is admirable. With a few exceptions, President Nixon is still waiting for his fair deal from the academic community.

    The easy answer is that most scholars are liberal. Yet Dwight D. Eisenhower and even Ronald Reagan have gotten a fairer shake than RN from historians. President Nixon himself believed it would take at least half a century for his legacy to be viewed objectively. He realized that residual passions over Alger Hiss and Vietnam would keep his contemporaries from making balanced judgments.

    To be fair, President Truman is also subject to harsh judgments. The sharpest criticism I've read of his diary was a separate article in the LA Times by Peter J. Kuznick, director of the Nuclear Studies Institute at American University. Dr. Kuznick doesn't think Truman should have used atomic bombs against Japan to end World War II, a preoccupation that evidently prompts a much darker interpretation of the Truman diary. He writes, "We'll never know if Truman's attitudes toward minorities - including his comment in 1911 to Bess that he hated 'Japs' - influenced his decision…" Dr. Kuznick's article is headlined, "We Can Learn A Lot From Truman the Bigot." Now that sounds like a Nixon scholar.


    No One Knows How the Wright Brothers' Plane Worked (posted 7-22-03)

    Meg Jones, writing in the Milwaukee Journal Sentinel (July 21, 2003):

    What better way to commemorate the 100th anniversary of the Wright brothers' first flight than by reproducing the famous plane Orville and Wilbur flew?

    After all, those who came up with the idea figured, how hard could it be? The plane is hanging in the Smithsonian, and the brothers probably left behind lots of records and notes explaining exactly how they built that first aircraft.

    Not so.

    While humans have sent men to the moon and our machines have reached the outer edges of the solar system, no one knows exactly how the Wrights got off the ground at the Kill Devil Hills sand dunes in North Carolina on Dec. 17, 1903.

    It's "the world's greatest detective story," said Ken Hyde, who organized construction of the 1903 Wright Flyer reproduction. The reproduction will appear at EAA AirVenture in Oshkosh later this month and is scheduled to fly this December at the same spot near Kitty Hawk, exactly 100 years to the minute after the original flight, to commemorate the brothers' achievement.

    A century ago, many were racing to be the first to fly a heavier-than-air plane. The Wright brothers were understandably paranoid about a competitor stealing their inventions.

    "You've got to remember, Orville and Wilbur were the first businessmen who had put in place protection from corporate espionage," said Hyde, executive director of the Wright Experience in Warrenton, Va. "I've often thought if they were alive today they would probably have the biggest paper shredder."

    The brothers scribbled out their design for the flyer's 12-horsepower engine, but then threw away their sketches. Wilbur Wright was supposed to write their memoirs, but he died in 1912 before doing so.

    That left the engineers charged with making an exact reproduction to piece together bits of information gleaned from the brothers' letters and photographs, surviving parts of their gliders and airplanes, and the recollections of those who knew them.


    James F. Brooks: Rewriting the History of Slavery (posted 7-22-03)

    Scott McLemee, writing about James F. Brooks's new book, Captives and Cousins: Slavery, Kinship, and Community in the Southwest Borderlands (University of North Carolina Press), in the Chronicle of Higher Education (May 16, 2003):

    Accounts of slavery in America tend to begin in 1619, with the first shipload of Africans sold in Virginia. "People think of it as something that mostly existed in the Black Belt," says Mr. Brooks, referring to the region of the Deep South where African slaves worked the land. "And people assume that it ended in 1865." But a different form of bondage emerged in the 1500s, when Spanish invaders encountered the indigenous people of North America. A "distinct slave system," as Mr. Brooks calls it -- similar to chattel slavery in some ways, but distinct in others -- grew out of ethnic conflicts and commercial exchanges in the region that came under Spanish influence. And it existed until well after the Civil War....

    It was, Mr. Brooks says, a French anthropologist's analysis of slavery in Africa that opened his eyes to how the culture and economy developed in the American Southwest. In The Anthropology of Slavery (1986) -- which Mr. Brooks read after the University of Chicago Press published it in English translation in 1991 -- Claude Meillassoux provides a neo-Marxist interpretation of slavery that challenges many of the assumptions that grew out of the experience of the Atlantic slave trade.

    The chattel slavery practiced in the American South defined the slave as, in principle, an object available for sale -- to be purchased as a source of labor. But the notion of the slave as commodity, said Mr. Meillassoux, worked only in a society characterized by advanced market relations. It didn't apply very well to cultures in which slaves tended to be prisoners taken in combat. Besides their labor, those slaves had symbolic value as proof of a tribe's power and honor.

    Or rather, the honor of its men. For, as Mr. Meillassoux's analysis suggested, the role of slavery in Africa was ultimately inseparable from the rules governing gender. Male authority was exercised over both enslaved captives and the tribe's "kin" (the women and children). An enslaved captive might even be transformed into kin, through marriage or adoption -- unlike the situation on the Southern plantations, where the line between master and slave was fixed and immutable.

    The interaction of violence, honor, and kinship in African slave systems struck Mr. Brooks as key to understanding the zone of contact between the Spanish and Native American groups in the Southwest. Well before the European invasion of America, the lives of indigenous peoples often included the practice of raiding, with members of one tribe enslaving the women and children of another. The importance of the captives went well beyond their ability to toil -- or even their value as status symbols. Slaves provided valuable information about the language and way of life of the tribe from which they had been kidnapped. The captors might ransom slaves back to their kin -- an exchange that could also serve as the occasion for other useful economic transactions, to the advantage of both groups. Or a slave might be fully assimilated into the captor tribe.

    "It sounds like 'soft' slavery when an enslaved person can become kin," says Mr. Brooks. "But it actually perpetuates the system. When a slave becomes kin, you lose that unit of prestige that comes with ownership. If you intend to remain a high-prestige person, that means you have to go get more slaves."...

    When the Spanish arrived, they were by no means shocked at the indigenous captivity system. On the contrary, it was the one thing about the New World that looked familiar. In Spain, Roman Catholics and Muslims alike had been seizing captives from one another for generations -- according to analogous codes of manly honor -- resulting in similar forms of economic exchange and cultural cross-pollination. Mr. Brooks recounts a story from 1541 involving an Indian slave whom the Spanish dubbed El Turco ("The Turk") because of his resemblance to Islamic captives they had known back home.

    "Indian slavery was prohibited again and again by the Spanish crown, which was quite sensitive about this compared to the English," says Mr. Brooks. While Catholic theologians remained undecided on whether or not Africans had souls, they had concluded that Indians did -- which made enslaving them a problem, at least in theory. "On the ground, of course, it could be accomplished in any number of ways," says Mr. Brooks. "The subterfuge to develop alternative forms of slavery, without calling it that, was very sophisticated."


    The Origin of the White House Memoir (posted 7-18-03)

    Walter Isaacson, writing in the New Yorker (July 14, 2003):

    The venerable tradition of the White House memoir goes back to President James Madison’s extraordinary young slave Paul Jennings. Jennings’s memoir, which he wrote after Daniel Webster bought his freedom, contained most of what we have come to expect from such books, including thumbnail sketches of important players: “Mr. Robert Smith was then Secretary of State, but as he and Mr. Madison could not agree, he was removed, and Colonel Monroe appointed to his place. Dr. Eustis was Secretary of War—rather a rough, blustering man; Mr. Gallatin, a tip-top man, was Secretary of the Treasury; and Mr. Hamilton of South Carolina, a pleasant gentleman, who thought Mr. Madison could do nothing wrong, and who always concurred in every thing he said, was Secretary of the Navy.”

    Jennings also provided accounts of the policy battles over the War of 1812 (“Colonel Monroe was always fierce for it”) and tried to debunk a few myths. “It has often been stated in print that when Mrs. Madison escaped from the White House, she cut out from the frame the large portrait of Washington . . . and carried it off,” Jennings wrote. “This is totally false. She had no time for doing it. It would have required a ladder to get it down.” (The Gilbert Stuart painting was actually saved by a doorkeeper and a gardener, he said.) Most important, Jennings’s memoir set the genre’s standard for Presidential praise. “Mr. Madison, I think, was one of the best men that ever lived,” he wrote. “I never saw him in a passion, and never knew him to strike a slave.”

    The modern era of such memoirs began with a gusher from the Administration of Franklin Delano Roosevelt, whose insistence that his aides display a “passion for anonymity” had only a temporary restraining effect. The first of the Roosevelt books, “After Seven Years,” by the brain-truster Raymond Moley, established the tone: praise for the patron that subtly shades into self-praise, inside accounts of policy struggles in which the author turns out to have been right, a dollop of historical commentary, some gossip that gently settles old scores, and a good index for colleagues who may not want to read the whole thing. Moley was sometimes deft enough to work many of these themes into a single sentence: “I was able to achieve almost the impossible—the maintenance of friendly relations with both Louis Howe and Sam Rosenman—and the rivalry of these two men was the single factor that might have disrupted the logical course of events.”

    Although most such memoirs ended up in the ash bin, a few helped elevate the genre. Roosevelt’s best speechwriter, Robert E. Sherwood, who had won three Pulitzer Prizes as a playwright, won another for a memoir cast as a character study, “Roosevelt and Hopkins.” In 1965, two of John Kennedy’s speechwriters produced similarly stately tomes. Theodore Sorensen’s “Kennedy” is notable for being unflinching, at least in parts. Its assessment of the failed Bay of Pigs invasion, for instance, begins by saying of Kennedy, “His own mistakes were many and serious,” and then proceeds to catalogue them. The eminent historian Arthur Schlesinger, Jr., has sometimes been labelled a hagiographer for the Camelot chords he struck, but “A Thousand Days” is an intricate and serious narrative biography with sweeping historical themes and incisive drypoint character sketches. His depiction of Secretary of State Dean Rusk is typical: “As he would talk on and on in his even, low voice, a Georgia drawl sounding distantly under the professional tones of a foundation executive, the world itself seemed to lose reality and dissolve into a montage of platitudes.”

    The Watergate scandal posed a new challenge to White House memoirists: how to deal with the character flaws that unravelled a Presidency. William Safire, a Nixon speechwriter, set out to produce a book that would be “sympathetic but not sycophantic,” and the result, “Before the Fall,” succeeds by peeling back the multiple layers of Nixon’s tortured personality and offering up candid (and amusing) portraits of Henry Kissinger and other members of the court. Safire defended much of Nixon’s record, but recognized that Nixon might be “the only genuinely tragic hero in our history, his ruination caused by the flaws in his own character.” In “White House Years,” Kissinger, too, was willing to explore the loneliness, paranoia, insecurity, and lack of generosity that infected Nixon and unsettled his tenure. He notices the little things about Nixon—“his pant legs as always a trifle short,” his look of defiance mixed with uncertainty—and concludes by ruminating about “what extraordinary vehicles destiny selects to accomplish its design.”

    It subsequently became part of the tradition for aides to try to establish their credibility and integrity, and make some headlines, by including a few denigrating revelations about their former patrons. James Fallows, a Carter speechwriter, wrote a magazine memoir before Carter even had a chance to run for reëlection, in which he described Carter as “passionless” and revealed that the President micromanaged the sign-up schedule for the White House tennis court. Even more damaging was Donald T. Regan’s revelation about “the most closely guarded domestic secret” of Ronald and Nancy Reagan: “Virtually every major move and decision the Reagans made during my time as White House Chief of Staff was cleared in advance with a woman in San Francisco who drew up horoscopes to make certain that the planets were in a favorable alignment.” The current Administration has already produced such a memoir, by a former speechwriter, David Frum, in “The Right Man,” which is generally as flattering as the title implies but contains a few discomforting little revelations about the tenor of the White House—Frum, who is Jewish, opens the book with the line “Missed you at Bible study” spoken by one aide to another—and about Bush’s own shortcomings: “He is impatient and quick to anger; sometimes glib, even dogmatic; often uncurious and as a result ill-informed; more conventional in his thinking than a leader probably should be.”

    Minor score-settling aside, though, previous White House memoirs tried to appear reflective, above the fray, and candid about mistakes that were made. This was true, certainly, of the only other best-selling memoir so far from a Clinton aide, George Stephanopoulos’s “All Too Human.” Stephanopoulos dealt with the scandals in an admirably honest manner, and revealed his own conflicting emotions in an anguished portrayal of the period. “I didn’t think I was a hypocrite, because my defense of Clinton against past bimbo eruptions had been predicated on my belief that he wouldn’t create new ones, but maybe I was complicit because when I worked for Clinton I had been willing to suspend my disbelief about some of his more suspect denials,” he writes. “For several years, I had served as his character witness. Now I felt like a dupe.”


    Truman: Man for Our Times? (posted 7-18-03)

    Peter J. Kuznick, associate professor of history and director of the Nuclear Studies Institute at American University, writing in the LA Times (July 18, 2003):

    Truman's racism and anti-Semitism may surprise many Americans because he has been sanctified in recent years by hagiographic biographers such as David McCullough and by Democrats and Republicans who admire his leadership during the Cold War. As the country has moved to the right politically, Truman, who toward the end of his presidency had the lowest approval ratings of any modern president, has risen to "near great" status.

    Perhaps the latest revelations will inspire a more critical appraisal of Truman and a more insightful understanding of how his personal limitations made him incapable of seizing key historical opportunities.

    As a product of the corrupt Tom Pendergast machine that ran Kansas City, Truman was largely shunned by his colleagues during his first term in the Senate. After barely winning reelection without Franklin D. Roosevelt's support, he was put on the ticket in 1944 by conservative party bosses intent on ousting the crusading liberal Henry Wallace from the vice presidency. They chose Truman not for his convictions or qualifications but because he was pliable, with few enemies.

    A Gallup poll released the week of the 1944 Democratic Convention asked Democratic voters whom they wanted as vice president. Sixty-five percent chose Wallace; 2% favored Truman. Roosevelt thought so little of his new vice president that during the 82 days before Roosevelt died, the two met just twice. Roosevelt did not even inform Truman that the U.S. was building an atomic bomb.

    Limited in vision, ill prepared and plagued by self-doubt, Truman stumbled through his early presidency, and his missteps would chart the course of future history. He quickly replaced the New Dealers and progressives with conservatives like Jimmy Byrnes, who encouraged Truman's confrontational stance toward Washington's wartime ally -- the Soviet Union -- at a time when the Cold War might have been averted.

    We'll never know if Truman's attitudes toward minorities -- including his comment in 1911 to Bess that he hated "Japs" -- influenced his decision to drop two atomic bombs at a point when the Japanese were already militarily devastated and seeking acceptable surrender terms. Truman understood that he was embarking on a course that could ultimately bring the extinction of mankind.

    Truman always insisted that he felt "no remorse" over that decision, about which, he commented, he "never lost a minute's sleep." Condoleezza Rice picked Truman as man of the last century in an interview with Time magazine, but he was no hero to most of his contemporaries. Those who subsequently orchestrated his historical revitalization have often used his refurbished image to justify a conservative political agenda.

    But as the United States rushes into a new era of global domination, military confrontation, international polarization, questionable alliances, spiraling defense budgets and increased reliance on nuclear weaponry, it behooves us to think anew about the wisdom of Truman's judgments and decisions.

    We should question whether his was the kind of presidential vision our own troubled times demand. And we should consider the dangers of placing unlimited power in the hands of extremely limited political leaders.


    Truman, Prejudice and America (posted 7-18-03)

    R. Emmett Tyrrell Jr., writing in the Washington Times (July 17, 2003):

    Richard Cohen, a columnist for The Washington Post, has been provoked by recent revelations about President Harry Truman to asseverate in all his comely humility that "It is ... a good thing that he [Truman] did not express his feelings to someone like me, because — had the Secret Service not been around — I would have decked him." Poor Harry, what did he say all those years ago that brings the he-man out of this otherwise exquisite moral conscience?

    History is the greatest of the humanities. To remind us of its consequentiality it leaves specimens of itself around for later generations to discover to their amazement and edification. The other day Mr. Cohen, along with millions of other Americans, discovered a specimen of the first half of the last century when the contents of a hitherto undiscovered diary from 1947 were made public by the National Archives. The 5,500-word diary was in the handwriting of President Truman. It had been found scrawled in the diary section of a book that had been gathering dust in the Truman Library for decades, and rightly so. The book's title is, alas, The Real Estate Board of New York Inc. Diary and Manual 1947. Not surprisingly, visiting historians dismissed it as an old reference book, devoid of much value to them in their reconstructions of Truman. They were thunderously wrong.

    The diary section can be read as a personal confession from the president to his mother or perhaps to a sympathetic friend seated with him at the end of the bar. In smoldering dudgeon, the 33rd president opined, "The Jews have no sense of proportion nor do they have any judgement [a popular spelling in the 1940s] on world affairs." He had been provoked by a call he had received from his Jewish former treasury secretary Henry Morgenthau. Morgenthau was seeking Truman's assistance on behalf of a group of Jewish refugees from Hitler's Europe. That irritated Truman. "The Jews, I find, are very, very selfish. They care not how many Estonians, Latvians, Finns, Poles, Yugoslavs or Greeks get murdered or mistreated as D[isplaced] P[ersons].... " And the president's rant went on to embrace other matters: "When the country went backward — and Republican in the election of 1946, this incident [another occasion when Morgenthau assisted Jewish refugees] loomed large on the D[isplaced] P[ersons] program."

    Americans have come to admire Harry Truman as a flinty defender of American interests. He was a doughty combatant and a very good president, at least in foreign affairs. Not surprisingly, we have forgotten just how much controversy his administration found itself in. He was apparently honest, but nearly a dozen members of his government were convicted of criminal behavior, including his appointments secretary.

    Truman was also a fiery partisan. Readers of his very informative memoirs will note that he rarely has a generous word for those who opposed him. His remark that "the country went backward" in 1946 is typical. In his memoirs he lumps isolationists such as Sen. Robert Taft in with Ku Kluxers and members of the far-right Silver Shirts — he was not being facetious. To him, most Republicans were reactionaries, and he was not any gentler toward those political parties on his left.

    The New York Sun has published specimens of his prejudices against other ethnic and religious groups, for instance, the Chinese and "Japs," whom he told his wife he "hate[d]." Blacks, "wops" and others came off no better in his private reflections. When I read these outbursts, I was at first startled, but then I thought back on the America of the early 20th century. Its citizens almost all had strong prejudices.

    An important thing to remember is that America has changed. Few people hold such strong beliefs today, even in private. What is more, Truman's generation began an effort to temper such prejudices and to extend tolerance to all.

    As for Truman, his public policies favored equal rights and statehood for Israel. Those policies were not easily implemented. History proceeds slowly.

    The irascible, bigoted Harry Truman who again stands revealed in this long-ignored specimen of history brings to mind another truth, one I have become keenly aware of since the political battles of the 1990s: political commitment breeds anger and animosity. The 33rd president was for all his faults a decent man, but like most politically committed people he came to dislike and distrust those who opposed him. In the 1940s, he could become very angry with Morgenthau for the former Treasury secretary's importunities on behalf of Jewish refugees. In reading Truman's memoirs, you will see he had an even more intense ire for Republicans. Naturally Republicans had the same view of him. Politics breeds contempt.

    In the 1990s, a president was caught in obvious ethical and legal violations. What saved him was the mutual contempt the political parties hold for each other. Hillary Rodham Clinton's memoirs are as angry toward Republicans as were Truman's. And you can be sure if Newt Gingrich ever quiets down long enough to produce a memoir he will match them both in spite.

    Harry Truman is not the only politician made angry by politics. A more intriguing point for me would be to know how many members of the political class enter politics free of anger. In my experience, the only one I knew who seemed to be free of anger was Ronald Reagan, for whom Richard Cohen has few charitable thoughts.


    The Media's Double Standard: Comparing Coverage of Nixon's Anti-Semitism and Truman's (posted 7-18-03)

    Jason Maoz, senior editor of the Jewish Press, writing in frontpagemag.com (July 17, 2003):

    Harry Truman reached out from the grave last week and exposed the media’s double standard when it comes to judging Democrats and Republicans. A librarian at the Truman Library in Independence, Missouri, discovered a 1947 diary of Truman’s that had been sitting unopened on a shelf for some four decades. The book contained the following edifying remarks:

    “The Jews, I find, are very, very selfish. They care not how many Estonians, Latvians, Finns, Poles, Yugoslavs or Greeks get murdered or mistreated as D[isplaced] P[ersons] as long as the Jews get special treatment. Yet when they have power, physical, financial or political, neither Hitler nor Stalin has anything on them for cruelty or mistreatment to the underdog.”

    The media’s response? Either a rush to “explain” the remarks in a positive light or relative indifference (playing down the entries about Jews while highlighting some other aspect of the diary). The New York Times, for example, headlined its July 11 story on the diary “Truman Wrote of ‘48 Offer to Eisenhower” — and didn’t get around to blandly mentioning Truman’s anti-Semitic comments until the sixth paragraph.

    The Times failed to give its readers the full flavor of Truman’s rant, reproducing only a partial quote from the diary and excising the president’s comparison of Jews — a mere two years after the Holocaust — with Hitler and Stalin.

    Just about every media account quoted so-called experts who strained to place Truman’s remarks in historical context and to differentiate between his words and deeds. Such fair-mindedness is, of course, noticeably lacking whenever the media rehash the anti-Semitic statements made by Richard Nixon, whose deeds vis-a-vis Israel trumped Truman’s — Nixon saved Israel from catastrophe during the Yom Kippur War while Truman, after granting recognition to Israel in 1948, refused to provide desperately needed arms to the new Jewish state as it fought for its life against invading Arab armies.

    Some of the aforementioned “experts” professed shock at the very idea that Harry Truman could have harbored dark thoughts toward Jews. Sara Bloomfield, director of the United States Holocaust Memorial Museum, reacted with a particularly appalling display of ignorance: “Wow!” she said. “It did surprise me because of what I know about Truman’s record.”

    Ms. Bloomfield obviously doesn’t know very much. At a Cabinet meeting in 1946, Truman complained bitterly about Jewish organizational leaders, remarking, “If Jesus Christ couldn’t satisfy them here on earth, how the hell am I supposed to? ... I have no use for them and I don’t care what happens to them.”

    On another occasion, referring to Jews who were pressing the case for a Jewish state, Truman snapped to some aides, “I’m not a New Yorker. All these people are pleading for a special interest. I’m an American.”

    Truman’s anti-Jewish tantrums were hardly limited to his inner circle: Ted Thackrey, editor of the New York Post and husband of the paper’s flamboyant publisher, Dorothy Schiff, recalled how stunned he and his wife were when they paid a call on Truman at the White House and broached the subject of Palestine. “Now, Thackrey,” Truman said, anger visibly rising, “if only the [expletive deleted] New York Jews would just shut their mouths and quit hollering.”

    In his book "Confessions of a White House Ghostwriter," James Humes, a speechwriter for five U.S. presidents, relates a little-known but highly revealing story that was told to him by the television producer David Susskind, who worked on a documentary with Truman several years after the latter left office.

    “Susskind,” writes Humes, “said that each morning...he would arrive at Truman’s house at Independence. He would wait on the porch on a cold February day while Mrs. Truman went to inform her husband of his arrival. After about the fourth morning, he asked the president on his walk why he was never asked inside.

    “You’re a Jew, David,” Truman replied, “and no Jew has ever been in the house.”

    According to Humes, Truman went on to explain to a stunned Susskind that the house was his wife’s: “Bess runs it, and there’s never been a Jew inside the house in her or her mother’s lifetime.”


    Should We Be Selling Our National Heritage? (posted 7-18-03)

    Bruce Craig, writing in the San Francisco Chronicle (July 16, 2003):

    Did you ever hanker to own a fragment of the nation's history, a hallowed bit of ground from Gettysburg, perhaps, or a paperweight made from Independence Hall bricks? Your wish may soon be granted if Golden Gate National Recreation Area's scheme to market bits of heritage under its control at Alcatraz expands to other units in the National Park Service.

    The financially strapped park recently began selling boxed chunks of the famous federal prison on Alcatraz Island. The sale of chips off the old cell blocks to susceptible tourists may make money, but it violates the spirit if not the letter of the NPS guidelines. It is an ill-conceived venture with undesirable consequences, and the NPS should terminate the program immediately.

    The prison complex known as "The Rock," which once housed notorious criminals such as Al Capone, George "Machine Gun" Kelly and Robert "Birdman" Stroud, came under the care of the NPS in 1972 and today is a part of the Golden Gate National Recreation Area, a unit of the park system.

    Hit by budget cuts, the GGNRA staff discovered that it did not have sufficient funds to properly renovate the historic structures on the island. Particularly taxing were the high costs involved in removing a large amount of so-called debris -- remnants from the historic structures under renovation, including a decaying cell block and an old guardhouse.

    On the grounds that the debris has no intrinsic "research value or other unique characteristics" and would never be placed in the park's museum, park staff came up with the idea of selling bits of the debris to visitors as "mementos." For a mere $4.95, anyone could buy a piece of "The Rock" and even feel good about the purchase! As the NPS brochure given to the buyers encouragingly explains, the purchase helps the NPS "accomplish its sustainability goals as well as provide support for preservation and rehabilitation of this National Historic Landmark."

    The purchase would contribute toward the costs of hauling away debris and help offset other preservation expenses. It sounds great -- an ideal marriage between hardheaded business sense and responsible protection of the historic heritage.

    In reality, though, the sale of cultural remnants and specimens is a dangerous deviation from time-honored NPS historic preservation policies and practices. According to NPS guidelines designed to frame park superintendents' decision making, "the sale of original objects, artifacts, or specimens of a historical, archaeological, paleontological, or biological nature is prohibited." Moreover, the guidelines clearly state that the sale of "any object or item that is fashioned from or incorporates parts" of various classes of historic resources and objects "is an offense against the ethical standards upon which the Service was founded."


    We Are In Danger of Losing Our Historical Memory (posted 7-18-03)

    Mona Charen, writing in the Washington Times (July 18, 2003):

    Bruce Cole, chairman of the National Endowment for the Humanities, enjoys a nice view of the Capitol dome from his office window. He is less satisfied with what he sees of America's common culture.

    Mr. Cole, who taught Renaissance art for many years before undertaking a stint in government, thinks the United States is in danger of losing its national identity through a loss of historical memory. If the words Yorktown, Bleeding Kansas, Reconstruction, Ellis Island, Marbury vs. Madison, "Remember the Maine," the Spirit of St. Louis, Midway, "I shall return," the Battle of the Bulge, the Hiss/Chambers case and "Ich bin ein Berliner" mean no more to most Americans than to the average Malaysian, what is it that makes us Americans?

    Part of what makes America unique is that nationality arises from this shared history and from shared values and beliefs. It is not possible to become a Frenchman, a Swiss or a Russian by moving to those countries and adopting their beliefs. Nationality there is too bound up in blood, ethnicity and land. But every immigrant who arrives in the United States can become an American by adopting our beliefs.

    There was a time when we had so much confidence in the superiority of our way of life that we aggressively taught our values to new immigrants and insisted they master the basics of American history, the English language and civics before being eligible for citizenship. Today, we're not even teaching history to our own schoolchildren. And we are in the grip of a truly frightening collective ignorance. As Mr. Cole warns, when you don't know your history, you're more inclined to believe the kooky versions of it served up by everyone from Oliver Stone to Michael Moore. That is why the National Endowment for the Humanities (NEH) is sponsoring a "We the People" initiative to improve the teaching of American history.

    If you doubt the scope of the problem, consider a poll commissioned by the American Council of Trustees and Alumni. David McCullough, author of "John Adams" and other wonderful works of history, said, "Anyone who doubts that we are raising a generation of young Americans who are historically illiterate needs only to read this truly alarming report."

    Administered only to seniors at the top 55 liberal arts colleges, the poll consisted of questions from a high-school-level exam (or what used to be high-school-level work). Eighty-one percent received a grade of D or F. Only one student got every question right.

    Thirty-five percent thought that "From each according to his ability, to each according to his needs" (the Marxist nostrum) was in the Constitution. Thirty-four percent didn't know. More than half thought Germany, Italy or Japan was a U.S. ally during World War II. Only 29 percent knew Reconstruction referred to post-Civil War political arrangements. Thirty percent believed the president could suspend the Bill of Rights in wartime. (They didn't ask how many knew what the Bill of Rights was.)

    Only 29 percent could correctly place the Gulf of Tonkin Resolution in the context of the war in Vietnam. Forty percent could not place the Civil War in the correct half-century. (Ken Burns, call your office.) Only 42 percent knew to whom the words "first in war, first in peace, first in the hearts of his countrymen" referred (George Washington). Fewer than one-quarter could identify James Madison as the "father of the Constitution," and only 22 percent recognized the words "Government of the people, by the people and for the people" as belonging to the Gettysburg Address. (Here's a question for buffs: Who was the keynote speaker at Gettysburg that day? Answer: Edward Everett.)

    Mr. Cole hopes the NEH grant to improve the teaching of American history will spur colleges to reinstate history requirements. Among the 55 leading schools in the survey, none requires a course in American history. In grades K-12, history has been replaced by "social studies," which is like replacing beef stew with Gatorade.

