History People Are Talking About Archives 8-29-03 to 10-23-03





  • Michael Beschloss: The One Question Voters Should Ask Themselves

  • The Hindu Holocaust

  • Cover-Up Alleged in Probe of USS Liberty

  • Digging Up the Dead to Settle Historical Debates: A Good Idea?

  • Mark von Hagen: NYT Should Give Up 1932 Pulitzer Prize for Reporting by Duranty

  • A New Theory of Minoan Decline

  • "The Reagans": The TV Movie

  • The Battle Between History and Social Studies

  • Martin Kramer: Dershowitz Vs. Finkelstein

  • Lewis and Clark Celebration Is Being Ruined by Anti-American Academics

  • Report: Unit Killed Hundreds in Vietnam

  • Cannibalism Has Ancient Roots

  • The Wright Brothers' Hyperbole

  • It's Time to Re-Evaluate FDR and Some Other Presidents

  • Faking Biblical History

  • The Mysterious Death of Subhash Chandra Bose

  • What's New in Historic Preservation

  • Columbus Beat Columbus to America

  • David Greenberg: Nixon's Anti-Semitism

  • Edward Said's Dishonest Career

  • Only Now Are We Finally Recognizing the Underrated Genre Of Literary Reportage

  • Descendants of the Confederates Who Settled in Brazil

  • Attitudes Toward Suicide Through History

  • L'Apres L'Empire?

  • Myth of Robin Hood Reconfigured to Appeal to Theater Audiences in the 16th Century?

  • Historians Pledge to Be Honest in New History of Indonesia

  • Journalists' Anecdotes from the JFK Assassination

  • Spiro Who?

  • In the 1980s African-American Studies Was All About Black Men and Women’s Studies Was All About White Women

  • Iain McCalman: Writing for the Popular Market

  • Arnold as History

  • Coulter Loves to Generalize

  • Is Israel a Pariah Nation?

  • Tom Palaima: NYT Reporter Chris Hedges Copied Hemingway ... Was It Plagiarism?

  • Did the Catholic Church Retard the Growth of Science?

  • Did Kennedy Plot to Kill Diem?

  • Ronald Reagan, Man of Letters

  • Mark Oppenheimer: The 60's Was About Style

  • Stanley Kutler: Henry Kissinger, Historian?

  • Thomas J. Curry: The Confusion Concerning Religion and the First Amendment

  • How Ecumenical Was Spain Under Islamic Rule?

  • Ford and Lindbergh, Anti-Semites Who Helped Hitler

  • Utah Textbook Writer Neglects Dark Side of State's History

  • Chester E. Finn, Jr.: States Are Failing to Educate Students in History

  • Californians Seek to Right an Old Wrong for 'Repatriated' Mexican Americans

  • Reagan and Thatcher: "Linked by the Lord"

  • Jewish? Africans Knew It All Along; DNA: Genes Support a Tribe's Belief

  • Did Kennedy's Many Illnesses Adversely Impact His Presidency?

  • In Defense of the CIA's Bill Colby

  • Poles Enraged By Memorial To Expelled Germans

  • The Myth of the Lazy Native

  • Italy Obsessed with the Killing of Aldo Moro

  • McGuire Gibson: We Are Losing the Cities of Ancient Sumer

  • Reagan's Letters: Supporters Say They Show His Serious Side

  • After Historic Flight, Wrights Went to Court

  • Native American History Needs the Native American's Perspective

  • Was Hitler Hypnotized After WWI? Did this Account for His Feeling of Destiny?

  • Remembering Japan's Occupation of Manchuria--and Its Slave Labor Camps

  • Has Italy Covered Up Its Fascist Past?

  • When Wall Street Was Bombed

  • Campus Watch, One Year Later

  • Scientists Can Be Eccentric in Every Way But One

  • Fred Vinson ... Was He a Better Chief Justice than People Think?

  • Hitler's Fan Mail to American Eugenics Leaders

  • Magruder Lied

  • Yoichi Funabashi: The Third Atomic Bomb

  • Too Soon to Include 9-11 in History Textbooks?

  • Scotland's Forgotten History: Connections to Slavery

  • Even Big Events Like 9-11 We Forget After Awhile

  • The Neglected Invasion of Italy in 1943

  • Gay History's Slow Beginnings

  • The Beatles Did More to Bring About the Demise of the USSR than Alexander Solzhenitsyn and Andrei Sakharov

  • Paul Gagnon: Students Need to Be Taught to Appreciate America's Democratic Heritage

  • H.W. Brands: It's Time to Take the Founding Fathers Down a Notch or Two

  • The Nuclear War that Almost Happened in 1983

  • When J. Paul Getty Helped Hitler

  • Preserving the Past in a Digital Era

  • You Try Writing a Biography of Arafat

  • Bernard Lewis's Ahistorical Approach to Islamic History

  • Christians Are Ignoring History

  • Gospel Music's Scottish Roots

  • Remembering the Slaughter of Thousands of Foreigners in Japan After the Earthquake of 1923

  • Students Are Ignorant of History

  • Ancient Greek Had Brain Surgery

  • The Colonial Origins of the Rwanda Genocide

  • The Stuff Museums Decline to Exhibit

  • How Textbook Publishers Are Dealing with 9-11 as History

  • Howard Meyer: Embracing the World Court

  • Michael Novak: Why He Loves Mel Gibson's Movie

  • Irving Kristol: What Is Neoconservatism?



    When Wall Street Was Bombed (posted 9-18-03)

    James Barron, writing in the NYT (Sept. 17, 2003):

    On the block where it happened, there were no "we will never forget" speeches, no candles or bronze plaques bolted to the wall that has never been repaired. All that was there yesterday was the noontime crowd, swirling by with lunch to be gulped, errands to be run and an afternoon of work waiting to be done. In other words, no one was paying much attention.

    That was pretty much what the noontime crowd was doing on Sept. 16, 1920 -- 83 years ago yesterday -- when a bomb exploded there. And that was why, after the dead had been taken to the morgue and the injured to hospitals on that Thursday afternoon, there were so many descriptions of the bomb-laden cart that had been parked beneath a window of the J. P. Morgan & Company bank headquarters at 23 Wall Street.

    In the aftermath, there were questions: What had the horse looked like? What had been painted on the cart? Some witnesses recalled the letters "D," "N" and "T," others the word "dynamite," others the word "DuPont." And what color was the smoke, anyway? Black, from dynamite? Yellow, from nitroglycerine? Blue, from some other explosive? Among witnesses who survived the devastating hail of metal and glass, there was no consensus.

    But the damage was clear. The fortresslike facade of the Morgan building was pocked with craters that remain deep enough to sink a palm into. The columns of what is now Federal Hall, across the street, were blackened. More than 30 people were killed and several hundred wounded, and the damage exceeded $2 million -- more than $18.4 million in 2003 dollars.

    "The number of victims, large though it was, cannot convey the extent of the inferno produced by the explosion, the worst of its kind in American history," Paul Avrich, a professor of history at Queens College, wrote in reviewing the case more than a decade ago.

    The investigators who once sniffed for clues have long since given way to historians. The police never charged anyone in the bombing, and it is a mostly forgotten moment in New York City history.

    "Nobody remembers," said Beverly Gage, whose book "The Wall Street Explosion: Capitalism, Terrorism and the 1920 Bombing of New York" is to be published next year by Oxford University Press.

    One reason is the speed with which the attack went from rating a banner headline to barely rating a footnote. "Wall Street's Wall Street," said Meg Ventrudo, the assistant director of the Museum of American Financial History. "Wall Street is more concerned with tomorrow's trades than yesterday's news."

    And as Ms. Gage noted, "The Morgan bank from the first was rather self-conscious about wanting to get the whole thing over with and forgotten because it wasn't terribly good for business."


    Campus Watch, One Year Later (posted 9-18-03)

    Daniel Pipes, writing in the NY Post (Sept. 18, 2003):

    "Intellectual thugs," huffed Rashid Khalidi, now of Columbia University. "Cyber-stalking," whined Juan Cole of the University of Michigan. "Crude McCarthyism," sniffed David Bartram of the University of Reading. "Totalitarian," thundered Jenine Abboushi of New York University.

    What so outrages these academic specialists on the Middle East? It's called Campus Watch (campus-watch.org), and it's a project I started a year ago today to "review and critique Middle East studies in North America, with an aim to improving them."

    Campus Watch provides peer review of a vital topic - think how many problems come out of the Middle East. Given the centrality of this region to current world politics, how the scholars fare is not a recondite matter but an issue of importance for our security and welfare.

    Trouble is, Middle East studies have become an intellectual Enron. Scholars of the Middle East are:

    • Incompetent: They consistently get the basics wrong. Militant Islam they portray as a democratizing force. Osama bin Laden and al Qaeda they dismiss as irrelevant. The Palestinian Authority they predict to be democratic. So wrong so consistently are the academics that government officials have largely stopped asking them for advice.
    • Adversarial: Many American scholars are hostile to U.S. national interests. Thus, the Middle East Studies Association (MESA) board has recommended that its members "not seek or accept" U.S. government-funded scholarships. That three specialists were recently indicted on terrorism charges caused no alarm among their colleagues.
    • Intolerant: The field is hobbled by political uniformity and an unwillingness to permit alternate viewpoints. In one infamous case at Berkeley, the section leader of a course on Palestinian poetics made this bias explicit in the course catalog ("Conservative thinkers are encouraged to seek other sections").
    • Apologetic: Specialists generally avoid subjects that reflect poorly on their region, such as repression in Saddam Hussein's Iraq, Muslim anti-Semitism and chattel slavery in Sudan. The MESA president recently discouraged studying what he called "terrorology." Specialists sometimes actively deceive, for example, by denying that jihad historically has meant offensive warfare.
    • Abusive: Specialists too often coerce students into regurgitating a party line and penalize freethinkers with lower grades.

    Campus Watch seeks to remedy these problems with a two-pronged approach: offer specialists an informed, serious and constructive critique; and alert university stakeholders - students, alumni, trustees, parents of students, regents, government funders - to the failings of Middle East studies.

    The professoriate responded to Campus Watch's launch last Sept. 18 with furious allegations of "McCarthyism" and worse. This intense reaction to our work suggested that it (however reluctantly) heard our message. With time, the hysteria has subsided, replaced by an apparent resignation to our continued review of their scholarship and actions.

    On its first anniversary, Campus Watch can claim to have had an impact. The U.S. House Subcommittee on Select Education held an unprecedented hearing on "questions of bias" in Middle Eastern and other area studies programs. At Columbia University, students, faculty and alumni have begun agitating against their institution's one-sided coverage of the Middle East. The University of Michigan shut down a Web site that disseminated the extreme Wahhabi version of Islam.

    The Campus Watch staff lectured at 48 educational institutions during the past academic year, offering a rare break from one-sided presentations of the Middle East. Unhappily, our presence sometimes so inflamed the opposition that bodyguards, metal detectors and (in one memorable instance) mounted police were required to ensure our right to speak. On the bright side, such furor prompted wide media coverage and useful debates about the Middle East and the need for diverse viewpoints.


    Scientists Can Be Eccentric in Every Way But One (posted 9-17-03)

    Robert Matthews, writing in the London Telegraph (Sept. 14, 2003):

    Discovering that some great historical figure had the scruples of a Mafia hit-man or the sexual morality of a rabbit is nothing new these days. While such revelations often ruin the reputation of run-of-the-mill celebrities, this is not always the case for great scientists, whose po-faced image often benefits from a whiff of scandal.

    Many physicists still delight in exchanging anecdotes about the late, great American Nobel prizewinner Richard Feynman, who enjoyed breaking into safes and frequenting topless bars. Madame Curie made tabloid headlines in 1911 over an affair with a fellow physicist, and was told by a member of the Nobel Prize committee not to collect her award for the discovery of radium (she turned up anyway). Erwin Schrödinger, one of the founders of quantum theory, did his best work between sessions with his mistress in a skiing lodge.

    There is only one form of behaviour that is still regarded as utterly beyond the pale in the scientific mind, and that is any form of flirtation with the occult. Even the likes of Sir Isaac Newton knew his reputation would take a severe beating if anyone learned of his fascination with matters spiritual and alchemical. In public, Newton insisted that he had no interest in putting forward the explanation of gravity, and focused purely on its mathematical description. Only centuries after his death did it emerge that Newton believed gravity to be a manifestation of God's all-pervading spirit.

    The same sentiments hold sway today. Professor Brian Josephson of Trinity College, Cambridge, is widely regarded as having "cracked up" after winning the 1973 physics Nobel at the precocious age of 33, simply because he refuses to dismiss evidence for paranormal phenomena.

    Clearly anyone who hopes to succeed in the world of science is best advised to keep their flaky ideas to themselves. Just how far some scientists have been prepared to go to avoid being labelled fruitcakes is made clear by a paper in the current issue of Physics World by Dr Jeff Hughes, a historian of science at the University of Manchester.


    Fred Vinson ... Was He a Better Chief Justice than People Think? (posted 9-17-03)

    Cameron McWhirter and Bill Rankin, writing in the Atlanta Journal-Constitution (Sept. 14, 2003):

    Fred Vinson, once the chief justice of the U.S. Supreme Court, hasn't received much respect since he dropped dead of a heart attack in his Washington apartment 50 years ago.

    Largely forgotten, the Kentucky Democrat has been labeled by the few court historians who mention him as an incompetent jurist and a Southern political hack reluctant to tamper with segregation. In fact, Vinson is noted more for his death than for his life, because his passing ushered in a new era for the high court.

    His demise at 63 on Sept. 8, 1953, brought glee to his enemies. Fellow Justice Felix Frankfurter told a law clerk that the chief justice's passing was "the first indication I have ever had that there is a God."

    Vinson doesn't even get much respect at his alma mater, Centre College in Danville, Ky. Members of Vinson's fraternity, Phi Delta Theta, carry his portrait, proudly called "Dead Fred," to every football game as a sort of creepy mascot.

    Vinson's reputation has been overshadowed by that of his successor, Chief Justice Earl Warren, credited with uniting a fractious court and transforming U.S. civil rights and privacy laws. Only months after Vinson's passing, the court under Warren ruled unanimously in the landmark case Brown v. Board of Education, signaling the end of educational segregation. Under Vinson, the court had been divided on the case.

    "The Age of Darkness was transformed into the Age of Progress," said Charles Ogletree, a Harvard University law professor who is writing a book about the Brown case.

    But a small group of legal historians has set out to revise Vinson's bad rep. A biography and several histories of Vinson's court have come out in the past two years, and more are set for publication next year.

    These scholars acknowledge that Vinson was not a great legal mind and admit that he was hesitant about abruptly ending segregation. But they argue that rulings by the Vinson court played a key role in unraveling prior court precedents buttressing segregation. Vinson paved the way, they argue, for the Brown ruling and other civil rights reforms.

    "Vinson is a largely forgotten figure who was never given the credit he was due," said Robert George, a Princeton University law professor and constitutional scholar. "He is someone who deserves to be remembered."

    Born in 1890 in Louisa, Ky., Vinson became a prominent politician in the state by the mid-1920s. He served in Congress from 1924 to 1929, then again from 1931 to 1938. Vinson was a quick study on tax law and budgets, a close ally of Harry Truman, then vice president, and a strong supporter of President Roosevelt's New Deal.

    In 1938, Roosevelt nominated him to the U.S. Court of Appeals for the D.C. Circuit, where he served until 1943. In 1943, Roosevelt put Vinson in charge of the Office of Economic Stabilization, which ran the country's wartime economy. In 1945, President Truman made him treasury secretary. His ability to take on various jobs for the Democratic administration earned him the moniker "Available Vinson." Truman, who regularly played cards with Vinson, called him "the man I depend on most."

    In 1946, Truman nominated Vinson to become the nation's 13th chief justice.

    "Vinson had a reputation, prior to coming to the court . . . of being able to bring people together," said Linda Gugin, co-author with James St. Clair of a biography of Vinson.

    But from the beginning, the contentious, intellectual and highly educated justices such as Frankfurter looked down on Vinson.

    "They did not have the respect for him that he would have needed," Gugin said. "They saw him as a crony of Truman."

    St. Clair said Vinson had a mind for politics, not for jurisprudence.

    "He had a favorite saying, 'Things go better when you don't get all hot and bothered,' " St. Clair said. "That worked well in Congress and the bureaucracy, but it didn't work at the court, obviously."


    Hitler's Fan Mail to American Eugenics Leaders (posted 9-17-03)

    Dan Vergano, writing in USA Today (Sept. 15, 2003):

    Hard as it might be to believe, Adolf Hitler wrote fan mail, finding time in the early 1930s to express his admiration of the American leaders of a vaguely scientific movement called eugenics.

    In a new book, War Against the Weak, investigative reporter Edwin Black makes the case that 20th-century American proponents of eugenics -- the belief that controlled breeding can improve humanity -- had substantive ties to the architects of Hitler's racial extermination machine.

    Black documents many links, such as the Hitler letters, between the American eugenicists and Nazi Germany prior to World War II, including how one prominent eugenicist's book, Madison Grant's The Passing of the Great Race, became Hitler's "bible."

    Eugenics came into vogue in the early 20th century. With a name coined in 1883 by British anthropologist Francis Galton, who hoped to see arranged marriages improve mankind, the movement eventually led to racist laws, such as those prohibiting miscegenation in many U.S. states and authorizing the sterilization of more than 60,000 mental and moral "defectives."

    "It's startling how much Hitler idealized American eugenics," Black says. His book required two years of research by dozens of volunteers who culled records from about 110 archives, diaries of eugenicists, case records of their victims and research reports on removing the unfit from humanity. The research builds on Black's best-selling book, IBM and the Holocaust, which examined Nazi use of data-processing technology to fill concentration camps.

    In War Against the Weak, Black lays bare the veins of collaboration between American eugenicists and Nazi scientists. The Rockefeller Foundation, the Carnegie Institution of Washington, and Cold Spring Harbor (N.Y.) Laboratory, a leading genetics research institute, provided financial support for genetic research and for travel by Nazi doctors. Research collaboration and reports on the Nazi efforts appeared in respected journals such as the Journal of the American Medical Association (JAMA). Black also describes:

    * Biologist Charles Davenport, head of the Eugenics Record Office based at Cold Spring Harbor Laboratory. He wrote eugenics textbooks widely used in universities and high schools and led drives for sterilization laws that eventually emerged in 33 states. He supported "racial hygiene" concepts.

    * The lauding of eugenics by prominent Americans, including Alexander Graham Bell and Woodrow Wilson.

    * The career of one Harvard-credentialed doctor, Edwin Katzen-Ellenbogen, an original member of the Eugenics Research Association created in 1913, who ended up as a physician prisoner and SS collaborator at the Buchenwald concentration camp.

    Black says the labs and foundations he contacted, such as Cold Spring Harbor, were open to examining their past and are committed to legitimate scientific work today.

    Science historian and geneticist Elof Carlson of the State University of New York, Stony Brook, argues that Black does not capture the scope of historical bigotry and global racism.

    The author of last year's The Unfit: A History of a Bad Idea, Carlson says that "liberals, left-wing ideologues, social reformers, people of good intentions, scholars, and totally innocent scientists all contributed to the eugenics movement" -- not just a few malevolent scientists. (Black does note that Planned Parenthood founder Margaret Sanger was "a bigot if not a racist" who associated with eugenicists.) "Evil movements try to pick legitimate science to bolster their fanaticism," Carlson adds.


    Magruder Lied (posted 9-15-03)

    John Taylor, executive director of the Nixon Center (Sept. 11, 2003):

    After Richard Nixon’s death in April 1994, his family and the friends responsible for his estate dared to hope that a better place in history could be secured by the same fragile loops of coated plastic that had strangled his Presidency. Historians concurred that the Nixon White House tapes, when cross-referenced with the documentary record of the Nixon years, would offer extraordinary insights into the dynamics of Presidential decision-making. The nearly half-century of partisan score-settling that has typified commentary about Mr. Nixon ever since the Alger Hiss case would finally give way to a flood of theses, dissertations, and biographies by students and scholars less possessed than their forebears by the ideological passions of the Cold War and Vietnam eras.

    We did not think it would happen overnight. We assumed that working journalists would first cull the tapes for profanity and racial and ethnic references by the President and his aides, all of them uttered during private conversations. At least that assumption proved correct. Yet we trusted that the tapes would eventually be used to illuminate his deft policy-making in Vietnam, foreign affairs, and domestic policy and also to provide new perspectives on the scandal that destroyed his Presidency.

    In retrospect, we proved to be especially naïve when it came to Watergate. Journalists and prosecutors had pushed hard for the release of the tapes during 1973-74 so we could see what they revealed about Watergate. What we never anticipated was that a generation later, journalists and scriptwriters would ignore the tapes when what they revealed about Watergate proved to be inconsistent with the conventional wisdom.

    For instance, in July PBS broadcast a documentary featuring a charge by former campaign aide Jeb Stuart Magruder that President Nixon had personally approved the Watergate break-in in a phone call on March 30, 1972. Since the President was in the White House that day, such a conversation would have been caught on tape. The tapes show that no such conversation took place. Mr. Magruder’s statement was contradicted by other evidence as well, including his own conflicting statements over the years. In their rush to promote and amplify Mr. Magruder’s explosive charge, the producers revealed none of the contradictory evidence.

    President Nixon would not have been surprised. Yet for a little while, we had dared hope it would be otherwise. The former President had long resisted the release of his tapes on the grounds that the National Archives had not fulfilled its court-mandated obligation to return to him tapes of personal and family conversations. Two weeks after his death, President Nixon’s son-in-law Edward Cox reached out to executors and attorneys for the Nixon estate. The accolades recently heaped on the late President by his eulogists and even by some in the media suggested that the era of harsh anti-Nixon commentary was over, Mr. Cox said, which meant that the expensive court battles should end as well. He said while the President had been right to fight to protect his and his family’s privacy, it was time for his executors to cut a deal.

    Mr. Cox’s suggestion was a relief to many on the late President’s battle-scarred legal team as well as to those of us working on his staff and at his library. It was tantalizing to think that an era was dawning when discerning scholars would patiently comb the files and tapes and write balanced accounts of the Nixon years. In July 1995, we reached an agreement with the National Archives setting a timetable for opening the thousands of hours of tape recordings. Eight years later, over half the tapes have been opened to scholars at the Nixon Project in College Park, Maryland. The archivists themselves control the pace of the openings. Their painstaking work is sometimes slowed by new declassification rules and other factors. The Nixon estate has not formally objected to the opening of a single second of tape. A few years ago we even agreed to permit the archivists to sell copies of the tapes to the public earlier than the July 1995 agreement had stipulated.


    Yet the reading room at College Park is not clogged with listeners. Officials say about five people a week come in to listen to the tapes. Even for dedicated students of Presidential decision-making, taped conversations are sometimes too much of a good thing. Listening to and transcribing tapes is expensive and laborious. All 4,000 hours of Nixon tapes would fill about 480 volumes of 500 pages each, and that’s without any annotations. Our best source for accurate, thoughtfully annotated transcripts of important taped conversations from the Kennedy, Johnson, and Nixon White Houses is the project underway at the University of Virginia’s Miller Center. Still, it will take experts many years to complete transcripts of relatively few selected conversations.

    Yet even when transcripts are available, journalists with an interest in Watergate tend to overlook them unless they bolster the conventional wisdom. Our first disappointment came in 1997 with press coverage of the first book containing extensive transcripts of the newly-released Watergate tapes, Abuse of Power by Stanley Kutler of the University of Wisconsin. Dr. Kutler published selected transcripts that actually confirm President Nixon’s own account of his actions during Watergate. In suggestive, sometimes misleading annotations, Dr. Kutler tried his best to explain away his transcripts’ exculpatory flavor. The transcripts themselves ultimately received little if any notice from reporters and reviewers in spite of the insights they offered into the state of mind of a President overseeing a war in Vietnam, peace negotiations in Paris, and a political campaign at home.

    To paraphrase Sen. Howard Baker’s famous question, the keys to understanding Watergate are what the President thought and when he thought it. Though critics ridiculed his assertion that he acquiesced in a limit on the Watergate investigation because of national security, the tapes show he was telling the truth. Some of the burglars had also worked on a team, called the Plumbers, that had investigated Daniel Ellsberg after he stole top-secret Vietnam files, the Pentagon Papers, and gave them to the newspapers. Mr. Nixon was dismayed to learn in the spring of 1973 that the team had performed a 1971 break-in at the office of Dr. Ellsberg’s psychiatrist, Lewis Fielding. But in June 1972, when the Watergate break-in occurred, he was still operating on the assumption that the Ellsberg investigation had been above board. He thought Dr. Ellsberg had put American fighting men at risk, and he considered his right to investigate him inviolable, as well as unrelated to Watergate. So he blithely approved his White House counsel John Dean’s plan to limit the investigation – only to revoke the order two weeks later after the FBI complained.


    The tapes for the rest of 1972 reveal that he thought the burglars should be accountable for Watergate but not for investigating Ellsberg – exactly the distinction he said he had kept in his mind all along. Again and again he counseled his aides to avoid a Watergate cover-up. On June 30, he said, “I think the best thing to do is cut your losses in such things, get the damn thing out.” On July 19, he said, “You know, I’d like to see this thing work out, but I’ve been through these. The worst thing a guy can do, the worst thing – there are two things and each is bad. One is to lie and the other one is to cover up.” On September 18, he said, “The cover-up is what hurts you, not the issue. It’s the cover-up that hurts.” On October 16, he told chief of staff H.R. Haldeman, “I just want to know whether [Appointments Secretary Dwight] Chapin or you guys were involved in Watergate….I don’t want anybody to lie about Watergate, do you know what I mean?…If we are, we’ve got to admit it, you know what I mean, because I have said it and I’m out on a limb.” As for his mentality about national security, when Haldeman reminded him on June 30 that the same crew had done earlier work for the White House, the President barked, “You mean in the Pentagon Papers? What the hell is the matter with that?”

    Yet upon the publication of these transcripts, no paper carried the headline, “Tapes Show Nixon Pressed Aides To Avoid Cover-up.” Instead, Dr. Kutler claimed that the tapes showed that Mr. Nixon had actually known about the Fielding break-in at the time it occurred. In fact, the tapes Dr. Kutler himself transcribed and published strongly support Mr. Nixon’s contention that he had not learned about it until the spring of 1973. These dates are vitally important – perhaps the most important in the whole Watergate saga. If in June 1972, the President had known the Plumbers had an earlier break-in under their belt, then his acquiescence in Mr. Dean’s suggestion to limit the investigation indeed seems questionable. But if he was not yet aware of the Fielding job, the tapes make abundantly clear that he was making a careful distinction between Watergate, which he considered wrong and fair game for prosecutors, and the Plumbers’ Ellsberg work, which he considered his legitimate purview as a wartime commander-in-chief. Perhaps that’s why so many of his critics persist in claiming or implying that he “must have known about Fielding” and so had to order the Watergate cover-up in order to cover up the White House role in the earlier burglary as well.

    Equally elusive has been any evidence that Mr. Nixon knew in advance about the June 1972 Watergate break-in -- until Mr. Magruder’s star turn on the July 30 PBS documentary, Watergate Plus 30: Shadow of History. Mr. Magruder said that during a meeting in Key Biscayne, Florida, with the late John Mitchell, then Mr. Nixon’s campaign manager, he had heard Mr. Nixon’s voice, coming over a telephone held by Mr. Mitchell and approving a plan by G. Gordon Liddy for a break-in at the Watergate. In making the charge, Mr. Magruder contradicted statements he had made in his 1974 memoir and in taped interviews with scholars in 1988 and 1990. When we asked the program’s publicist, Colby Kelly, about the discrepancy with the Magruder memoir, she wrote back that he had freely admitted the contradiction and “explained that it was written before he went to prison and he was hoping for a pardon.” Yet the interviews in which he also contradicted his new charge were given long after Mr. Nixon had lost his pardon power. Indeed, fingering the boss would have enhanced his chances for a pardon from subsequent Presidents. Asked about the contradiction in July, Mr. Magruder didn’t mention pardons but said that he had never been asked a direct question about Mr. Nixon’s involvement, which is also untrue.

    When a source appears this conflicted, changing his story and wrapping inconsistencies in more inconsistencies, responsible journalists back off. Mr. Magruder, a retired Presbyterian pastor, may still be seeking expiation. Such speculation increased in mid-August when he was arrested, booked, and jailed near his home in Columbus, Ohio after police said he was lying drunk on a sidewalk and refused an officer’s request to get up, a charge his attorney denies.

    Mr. Magruder’s arrest did not attract the same nationwide publicity as his accusation against President Nixon, which the PBS program’s promoters released in advance to selected reporters to bolster viewership. It is hard to avoid the impression that PBS and the show’s London-based producer, Carlton Productions, did not want to try too hard to test their source’s shaky memory. They did not report, for instance, that John Mitchell’s friend and aide, Fred LaRue, had attended the March 30, 1972 meeting during which Mr. Magruder now says he heard the President’s order. Mr. LaRue says that the telephone call never took place. Ms. Kelly, the publicist, did not respond to two e-mails asking if producers had reached out to Mr. LaRue, whose number is in the phone book. Mr. LaRue says he was never contacted.

    In response to a Nixon library statement noting that Mr. Magruder’s statement was also contradicted by the White House tapes, Ms. Kelly wrote, “I know the producer investigated this and felt that the issue was more complicated than your statement allows.” But when we asked her when the producer had consulted the White House records, she didn’t reply. National Archives records show that no one associated with PBS, Carlton, or the documentary had visited. In the script for the program, the producers failed to point out that White House tapes and logs make clear that Mr. Nixon said nothing all day about the Key Biscayne meeting and participated in no telephone calls with Key Biscayne or anyone in the meeting.

    A PBS spokesperson, Carrie Johnson, also declined to respond to questions about whether Mr. LaRue or the tapes had been consulted.

    Washington Post Watergate reporter Bob Woodward likes to call Mr. Nixon’s tapes “the gift that keeps on giving.” But it’s unfortunate that when the tapes could help President Nixon, his critics withhold the gift of the benefit of the doubt. Ironically, the Post lent its name to the PBS/Carlton production. During Watergate, the Post said that it always insisted on two sources before printing a Watergate accusation. The rule must no longer apply. Offered a chance to double-check Mr. Magruder’s charge, the Post’s documentary team excused its source’s obvious confusion, overlooked Mr. LaRue, and ignored the tapes. Was pinning the momentous burglary on President Nixon just too hard to resist? Whatever PBS’s motives, its program demonstrates that the real story of Watergate, the scandal sparked by our nation’s argument with itself over Vietnam, remains to be told. Whenever the true inquirers are ready to roll up their sleeves, the tapes are waiting.


    Yoichi Funabashi: The Third Atomic Bomb (posted 9-12-03)

    Yoichi Funabashi, writing in the International Herald Tribune (Aug. 5, 2003):

    I visited Tinian Island in the western Pacific, the base from which the B-29 bomber Enola Gay took off to drop the uranium-fueled atomic bomb Little Boy on Hiroshima and the plutonium-type Fat Man on Nagasaki. The components of the atomic bombs were shipped to Tinian from the U.S. mainland on the USS Indianapolis and reassembled locally. A single palm tree stands in each of the two rectangular pits where the atom bombs were loaded.

    "No matter how many times they plant palm trees, they die because of the aggregates in the soil. I heard they replace them with new ones each time," said a guide....

    Since the United States had revealed its method of attack to the Japanese at Hiroshima and Nagasaki, the U.S. military was looking at ways to stage a different kind of attack should a third atomic bomb become necessary. A third bombing was in fact planned at the time, according to "Beigun Shiryo: Genbaku Toka no Keii" (U.S. military documents: Details of atomic bombings) by Yoshishige Okuzumi and Yozo Kudo.

    Stanford University professor Barton Bernstein, a prominent scholar in the history of atomic bombs, writes in an essay titled "Eclipsed by Hiroshima and Nagasaki": "Had the surrender not arrived at the 14th and if the war had dragged on into the next week, (Harry) Truman would undoubtedly have used at least one more A-bomb on a city and probably even more cities or other targets. If such nuclear pummeling did not soon produce the desired surrender, and if Truman did not retreat to offer softer surrender terms, Marshall's loose plan for tactical nuclear usage with the Kyushu invasion might have looked attractive to the White House."

    The father of a former U.S. diplomat I know was later told that he had been scheduled to board a B-29 as a bombardier in case the United States decided to drop a third atomic bomb. The man, now 76, belonged to the 314th Bomb Wing on Guam. He flew 18 missions, dropping bombs on Yokohama, Mito, Omuta in Fukuoka Prefecture and other locations across Japan.

    When I had lunch with them in San Francisco a while ago, the father recalled his experiences and said he could smell the burning bodies even at an altitude of 50,000 feet (about 15,000 meters).

    He said he dropped bombs because he felt obliged to do so as an American serviceman, but he is not proud of what he did.

    Regardless of whether they used nuclear or conventional weapons, the attacks were indiscriminate.

    When we talked about the Enola Gay, the man asked, shaking his head in disbelief: "What made (Tibbets) come up with the name? What did his mother think?" Tibbets named the plane after his mother. He wrote her name on the left side of the plane's nose.

    The third bomb was not dropped.

    For nearly six decades since, humankind has barely managed to avoid a third atomic bombing. However, since the Sept. 11, 2001 terrorist attacks, the world has been seized by the fear that weapons of mass destruction may fall into the hands of terrorists. Now it appears the United States, more than anyone, is shuddering with fright.

    Seneca, Roman philosopher and tutor of Nero, said: "Power over life and death -- don't be proud of it. Whatever they fear from you, you'll be threatened with." Seneca died a tragic death.

    Halfway between the two atomic bomb pits stands a giant banyan tree casting a shade all around. The place was completely deserted and quiet.


    Too Soon to Include 9-11 in History Textbooks? (posted 9-12-03)

    S. Mitra Kalita, writing in the Washington Post (Sept. 11, 2003):

    According to some of the crisp, shiny textbooks distributed recently, the terrorist attacks of Sept. 11, 2001, are history -- and that does not sit well with some historians, who say it's too soon.

    For the first time since hijackers flew planes into the World Trade Center and the Pentagon, killing thousands, many Washington area students can read about it in new textbooks that include special sections or chapters devoted to terrorism on U.S. shores. Some school systems revamped social studies curricula to include the new texts, while others simply received the updated editions when they placed orders for the school year.

    But some historians say that two years after the event is too early to place it in any historical context, and some teachers are concerned that the language used to describe the aftermath of Sept. 11 is suffused with more patriotism than fact.

    "I was quite stunned that publishers are rushing to include this coverage," said Jerold M. Starr, director of the Pittsburgh-based Center for Social Studies Education. "They didn't used to. It took them many years to include Vietnam."

    A study in 1983, he said, concluded that the average U.S. history textbook devoted six paragraphs to the Vietnam War a decade after it ended.

    But publishers say students learn better if they can relate recent events to historic ones.

    In early September 2001, editors at McDougal Littell, a division of Houghton Mifflin, were about to send their line of history and government textbooks to press. Then the terrorists struck.

    "It was literally one of those 'stop the presses'-type moments," said Collin Earnst, a spokesman for Houghton Mifflin. "The editorial staff sat down and began discussing how they would cover this. What made it tricky was that you are writing a history textbook in real time."

    The publisher's solution was to insert a 14-page supplement into many of its textbooks, of which the most popular in Northern Virginia high schools is "The Americans."

    The latest edition of the book now includes a chapter titled "History in the Making: The War on Terrorism." It includes maps of the world and graphics about removing debris. Earnst said editors decided to focus on the "heroism of this time period. There's a focus on the firefighters, the police workers, so there's a focus on patriotism."

    In a U.S. history class she taught over the summer at South Lakes High School in Reston, teacher Jenny Lindner used the new version of "The Americans." She didn't get as far as Sept. 11, but she said she liked having the option.

    "It's imperative that they put 9/11 in the history textbooks because it is a new phase in American history," she said. She added that she never teaches just from the textbook. "We want them to think out of the box."

    In Prince William County, social studies departments received "The Americans," along with "The American Vision," published by Glencoe, a division of McGraw-Hill.

    In a seven-page section, "The American Vision" describes the country's mood after the attacks. "Everywhere across the nation, Americans put up flags to show their unity and resolve. . . . Cheerleaders in Virginia organized car washes to raise money for the Red Cross. . . . If the terrorists had hoped the attacks would divide Americans, they were wrong."


    Scotland's Forgotten History: Connections to Slavery (posted 9-12-03)

    Lorna Martin, writing in the Glasgow Herald (Sept. 11, 2003):

    In the People's Palace, there is a painting of one of Glasgow's most important historical characters. John Glassford, the eighteenth-century tobacco merchant, who has a city-centre street named after him, is depicted with his family.

    If you know where to look, there is a vague outline and shadow of another figure next to the hugely successful businessman's family. It is of Glassford's young black servant, who was painted out when it became politically incorrect to be remembered as a slave holder. It is a dramatic metaphor for the way Scotland has chosen to airbrush out an important part of its past.

    Now, however, historians are increasingly exploring Scotland's links with Africa and questioning why Scots tend to be more at ease thinking of ourselves as an oppressed nation rather than confronting the fact that we played a substantial and brutal part in the slave trade.

    Angus Calder, the historian and a contributor to a forthcoming BBC Radio Scotland series on the issue, says we easily overlook the black presence in Scottish history. "As you travel around Scottish history you are confronted paradoxically by the missing black man who was always there. Occasionally, it turns up in the record that, for instance, one of the first people arrested for wearing Highland 'garb' after it was proscribed following Culloden was Oronoce, a black servant of the Laird of Appin.

    "James IV employed black entertainers in his court. Black sailors worked ships in and out of the Clyde in the days when slave-grown Virginia tobacco was making Glasgow rich, and Glasgow must have acquired, like other British seaports, a permanent black population, though local historians have ignored it. Men successful in the New World brought blacks back with them as servants. Scots swarmed into the West Indies, dominating whole islands, including the most important, Jamaica, and the profits were vast."

    Calder believes that, with black genes in Scotland for almost two millennia, many of us will have black ancestors. He claims that intermarriage, more than racism, explains why so few of us are aware of the black heredity of many Scots....

    Recognising our black ancestry is one thing, but confronting our unsavoury involvement in the West Indian slave trade is quite another. It took one black man's appearance in 1778, at the Court of Session in Edinburgh, for many to face up to our involvement in the trade in humans.

    Joseph Knight, born around 1750, was taken by slave ship to Jamaica at the age of 10 or 11. He was sold to a Scottish planter called John Wedderburn, who brought him to Scotland as his personal servant in 1768. Knight was a determined character and two or three years later decided he had had enough of being another man's chattel. After five years of legal wrangling, Scotland's law lords decided that "the dominion assumed over the negro, under the law of Jamaica, being unjust, could not be supported in this country to any extent", and gave him his liberty.

    Despite its historical importance, the Knight case has long been forgotten except among the few chroniclers of black history in Britain. James Robertson, whose novel, Joseph Knight, was published earlier this year, believes the Scots need to re-examine our own truths, no matter how distasteful. "I don't think the truth has been deliberately suppressed but perhaps subconsciously it has because it goes so against the grain of what we often think Scottish history is about - about ourselves being oppressed.

    "It is clear that in eighteenth-century Scotland there were many Scots making good livings out of oppressing other people and I think that makes uncomfortable reading for us, but I think it is an important aspect of our history and it's high time it was brought into the open."


    Even Big Events Like 9-11 We Forget After Awhile (posted 9-12-03)

    Janet Albrechtsen, writing in the Australian (Sept. 10, 2003):

    [A] week before the second anniversary of the September 11 terrorist attacks one of Australia's most distinguished historians, Geoffrey Blainey, warned how quickly people forget.

    "The World Trade Centre was a theatre on an astonishing scale and it will enter into people's imagination for a long time to come. But even for large numbers of people who saw it on television within two hours of it happening, I would think that in 20 years it will rank as a minor memory," he told The Australian.

    As the memory fades, people may become complacent about terrorism, just as the West has grown complacent about an earlier evil -- communism. The warning followed a startling prediction Blainey made in his Quadrant address last week. He said socialism and communism would emerge from the ashes of history and stake out a claim in the West.

    "Just as capitalism for a time seemed unlikely to survive, now it is triumphant. I think the same process can easily happen again. While socialism or communism now seems completely out and done, the day will come when somebody will repackage it," he says. "They will try to explain away the past, they will find a method of explaining away why the Soviet Union was so tyrannical and why communist China was so tyrannical."

    Blainey predicts a 21st-century version of socialism or communism, one palatable to a Western society. "And a lot of the young will take it up and a lot of academics will take it up, and many poor people will rejoice in it." Surely not. Who can forget communism's repugnant inhumanities in the Soviet Union and in every place that embraced communism -- from China and North Korea to Latin America and Africa? The forced collectivisations. The hundred million slaughtered. The millions imprisoned. The barbarity of Vladimir Lenin and Joseph Stalin, who sold a utopian idea premised on equality and freedom from exploitation, only to deliver evil. The Great Purge of 1936-38. The Moscow Trials. On it went.

    Blainey's prediction is grounded in how people learn history. People forget, says Blainey, because facts lose their force when read in books.

    "If you have some experience yourself, that may rank 10 [out of 10] on the scales of learning," Blainey says. "Someone who has lived in a communist regime and has suffered from it, or their relatives have suffered from it, as long as they live they will never support it again."

    Reading about evil at a distance is much lower -- about two -- on Blainey's scale of learning because theory never delivers the same punch as practice. "Not having experienced it, having only read a little bit about it, they won't have a strong aversion to it," he predicts.

    Numerous markers support his prediction. As The Economist noted last year, communism may be dead or dying as a system of government, but as a set of ideas it is alive and kicking.

    Consider this. In a 1999 BBC poll, taken within weeks of the 10th anniversary of the Berlin Wall falling, Karl Marx was voted people's choice for greatest thinker of the millennium. Albert Einstein came second, Isaac Newton third and Charles Darwin fourth. Adam Smith is to liberal capitalism what Marx is to Marxism. Smith's political system has delivered the West unparalleled prosperity in the past 50 years. Yet an online search of Dymocks bookstores in Australia shows Marx the subject of more books on the shelves than Smith by a factor of five.

    As The Economist also noted, new books favourable to Marx appear at an astonishing rate, mostly from academics. There's Marx's Revenge by Meghnad Desai, an economics professor from the London School of Economics; Why Read Marx Today? by Jonathan Wolff, a professor from University College, London; and Marxism, Modernity and Postcolonial Studies, by an international team of contributors that, according to the blurb, "refuses to accept the inevitability of the so-called New World Order" and are concerned to correct the "neglect of Marxist analysis in postcolonial studies".

    A search of titles in the University of Sydney's library brings up twice as many on Marx as on Smith. Is this to warn students off Marxism and communism? Unlikely. More likely that prosperity and complacency have lulled us into forgetting evil.

    Blainey's is a sober assessment from someone devoted to teaching and explaining history. He admits that we do not necessarily learn from history.


    The Neglected Invasion of Italy in 1943 (posted 9-12-03)

    Robert Verkaik, writing in the London Independent (Sept. 8, 2003):

    On the morning of 9 September 1943 - three years after Hitler's all-conquering Panzer divisions had thrown the British Expeditionary Force out of France and back into the sea - Coldstream Guardsman Philip Gourd became one of the first British soldiers to regain a foothold on mainland Europe.

    Guardsman Gourd was part of a small reconnaissance unit set down on the southern Italian beaches of Salerno three hours ahead of the main Allied landings, at the time the biggest invasion in history.

    Without the Allies' success at Salerno, which cost the lives of 8,000 British and American servicemen, D-Day would have had to be postponed and the war extended beyond 1945. Tomorrow, Guardsman Gourd, 83, from Teignmouth, Devon, and dozens of his comrades will gather at a Salerno cemetery overlooking the landing beaches to mark the 60th anniversary of the battle.

    But no government minister will be there to take the veterans' salute.

    Next year it will be the turn of 3,000 surviving D-Day veterans to honour their fallen comrades on the beaches of Normandy.

    But the Secretary of State for Defence will not be in attendance and neither will a senior member of the Royal Family.

    For the soldiers of both campaigns, this is a bitter blow as they sense their place in history slipping away.

    Guardsman Gourd, who was later promoted to Sergeant, says the Government's position represents a "terrible insensitivity. We expected something so much better, something that would really recognise the sacrifice of all those who are not with us now."

    Most of the veterans of both these campaigns are in their 80s and the events will be their last chance to stand together at a major anniversary to honour their achievements.

    But the Government has signalled that the time has come to scale down its support for such commemorations.

    This approach contrasts sharply with that of the American and French governments. Washington will be sending high-profile delegations to both commemorations. President George Bush is expected to attend next year's D-Day event as part of his campaign for re-election, while President Jacques Chirac is to head the French celebrations of the liberation of France.

    The best the British can manage in Normandy next year will be a single "junior minister", accompanied by two military bands.

    In Salerno tomorrow, the veterans will be joined by a single bugler and perhaps the Rome-based British military attache and the ambassador in Italy.

    The veterans have been told that ministers are determined to give the events a much lower priority than they did the 40th and 50th anniversaries of the landings, and that funding has been reduced accordingly.

    Instead, the Ministry of Defence wants to spend the money on three other famous battles of the Second World War - the Battle of Britain, the battle of the Atlantic and El Alamein - which they say have received less public support in the past.

    Jeremy Lillies, the head of corporate affairs at the Royal British Legion, said: "This is the last time these elderly veterans will be able to go to an event to mark any major anniversary. It's a real pity that Salerno cannot be marked in a more significant way."

    The Imperial War Museum in London will mark the Salerno anniversary by hosting a reunion for veterans of the Italian campaign, who became known as the "D-Day dodgers" because it was perceived that they had been lucky to escape the invasion of France the following year.

    But the casualties suffered in the Italian campaign were proportionately just as heavy as on D-Day.

    Among the guests of honour at the museum today will be Lord Healey, the former deputy leader of the Labour party, Alan Whicker, from the Whicker's World television programme, and the military historian Sir Michael Howard.


    Gay History's Slow Beginnings (posted 9-9-03)

    Scott McLemee, writing in the Chronicle of Higher Education (Sept. 12, 2003):

    "Gay history has developed very slowly," says John D'Emilio, director of the program in gender and women's studies at the University of Illinois at Chicago. His point is debatable. Thirty years ago, the field barely existed, apart from studies of homosexuality in ancient Greece and Rome, and the occasional biography speculating on the private life of some famous person. So, arguably, the historical study of gay and lesbian people has grown at a remarkable rate over a fairly brief period. But it doesn't seem brief to Mr. D'Emilio, perhaps because it overlaps with his entire adult life.

    When asked what it was like to do research on gay history in the field's early days, he begins to laugh. A minute or two later he composes himself enough to describe what it was like to work on his dissertation at Columbia University in 1974. Its focus was the Mattachine Society -- a "homophile" organization, as it termed itself, that emerged in the 1950s to campaign for tolerance, mainly through lectures and publications aimed at professionals in law and medicine. Mr. D'Emilio recalls mentioning his topic to a senior faculty member. The man leaned against a four-drawer filing cabinet for support and said, in hushed tones, "Do you know what this will mean for your career?"

    "And I just wasn't going to engage him on that," says Mr. D'Emilio. "I yammered something like, 'Oh, well, it'll be fine. Nobody's ever written on this. ...'" It was clearly an uncomfortable discussion for both parties. "My presumption, at least until the early 1980s, was that I would never have an academic career," he says.

    He did finally land a tenure-track job, at the University of North Carolina at Greensboro, in 1983 -- the same year that the University of Chicago Press published a revised version of his dissertation as Sexual Politics, Sexual Communities: The Making of a Homosexual Minority in the United States, 1940-1970. He was an exception, though: Much of the early work on gay and lesbian history was written by independent scholars, because there simply wasn't support for it in academe.

    Scholars also faced the problem of finding historical sources. "We were trying to reconstruct something that, until the 1970s, was characterized by silence and invisibility," says Mr. D'Emilio. The agenda for his early research was strongly influenced by the fact that the Mattachine Society, as a public organization, had left a record he could actually locate and study.

    Other historians found traces of gay-rights activism in Germany in the late 19th and early 20th centuries, and rediscovered the work of Edward Carpenter, a British writer of the Victorian era who proclaimed the dignity of "the third sex." But aside from those rare signs of early militance, the history of gay and lesbian identity seemed to be a blank slate, especially in the United States.


    The Beatles Did More to Bring About the Demise of the USSR than Alexander Solzhenitsyn and Andrei Sakharov (posted 9-9-03)

    Mikhail Safonov, senior researcher at the Institute of Russian History in St Petersburg, writing in the Guardian in conjunction with the debut of a new A & E documentary that raises the tantalizing possibility that the Beatles helped bring about the demise of the USSR (Sept. 8, 2003):

    During a chess match between Anatoly Karpov and Garry Kasparov in the 1980s, the two grandmasters were each asked to name their favourite composer. The orthodox communist Karpov replied: "Alexandra Pakhmutova, Laureate of the Lenin Komsomol award". The freethinking Kasparov answered: "John Lennon."
    A few years ago, Russian television screened a film on Mark Chapman, the man who assassinated John Lennon in 1980. Chapman thought Lennon preached one message but lived by a quite different set of commandments. Lennon therefore was a liar and a cheat. He must die. The name of Chapman has become linked with that of Lennon, as other murderers are connected to their victims: Brutus and Caesar, Charlotte Corday and Jean-Paul Marat. Paradoxically, Lennon himself can be linked with the name of the Soviet Union in just the same manner. It was Lennon who murdered the Soviet Union.

    He did not live to see its collapse, and could not have predicted that the Beatles would cultivate a generation of freedom-loving people throughout this country that covers one-sixth of the Earth. But without that love of freedom, the fall of totalitarianism would have been impossible, however bankrupt economically the communist regime may have been.

    I first heard of the group in 1965. An article about some unknown "Beatles" was published in the journal Krokodil. The name grated on the ear, perhaps due to its phonetic content, associated in my mind with whipped cream (vzbeetiye slivki) and biscuits (beeskvit).

    The article described how a BBC announcer had told the world that Ringo Starr had had his tonsils removed - but had pronounced tonsils so indistinctly that listeners thought the drummer had had his toenails removed, and how the Liverpool postal service was having to work overtime due to the number of letters requesting the toenails in question.

    The first song I heard was on Leningrad radio. It was A Hard Day's Night. I didn't like it - it seemed monotonous, and I doubted if it was worth all those "toenail" requests. Then a collection of songs was released in the German Democratic Republic, taken from the first album. It was impossible not to listen when all anyone was talking about was the Beatles. The music came to us from an unknown, incomprehensible world, and it bewitched us.

    In his 1930s novel, The Master and Margarita, Mikhail Bulgakov says that love fell upon the heroes like a mugger with a knife from a side street. Something similar happened to the souls of our "teenagers" (a word we learned thanks to the Beatles).

    In the Soviet Union, the Beatles were proscribed. In the early days, infatuation with the Beatles implied an unconscious oppositional stance, more curious than serious, and not at all threatening to the foundations of a socialist society. For instance, during an astronomy lesson, my schoolmate had to give a talk about a planet. Having recited everything that he had copied from a journal, he made his own addition: "And now the latest discovery of four English astronomers - George Harrison, Ringo Starr (and the two others) - the orbit of such and such planet is approaching the Earth, and in the near future, there may well be a collision." The physics teacher barely knew more than we did about the planets. So she listened to this talk of "a possible collision" unsuspecting. She had not heard of these "astronomers". She hadn't even heard of the Beatles....


    One of the Leningrad schools staged a show trial against the Beatles. A mock public prosecutor was appointed, and the proceedings were broadcast on the radio. The schoolchildren proclaimed themselves outraged by all that the Beatles had done. The verdict of the trial was that the Beatles were guilty of anti-social behaviour. All this reeked of 1937. But even in Stalin's time, show trials were not held for famous foreigners, who had become almost an integral part of the way of life of the Russian people.

    Yet the more the authorities fought the corrupting influence of the Beatles - or "Bugs" as they were nicknamed by the Soviet media (the word has negative connotations in Russian) - the more we resented this authority, and questioned the official ideology drummed into us from childhood. I remember a broadcast from a late 1960s concert of some high Komsomol event. Two artists in incredible wigs, with guitars in hand, walked around the stage back to back, hitting one another and making a dreadful cacophony with their instruments. They sang a parody of a Beatles tune: "We have been surrounded by women saying you are our idols, saying even from behind I look like a Beatle! Shake, shake! Here we don't play to the end, there we sing too much. Shake, shake!"...

    Beatlemania washed away the foundations of Soviet society because a person brought up with the world of the Beatles, with its images and message of love and non-violence, was an individual with internal freedom. Although the Beatles barely sang about politics (our country was directly mentioned only once in their repertoire, in Back in the USSR), one could argue that the Beatles did more for the destruction of totalitarianism than the Nobel prizewinners Alexander Solzhenitsyn and Andrei Sakharov.


    Paul Gagnon: Students Need to Be Taught to Appreciate America's Democratic Heritage (posted 9-9-03)

    A press release from the Albert Shanker Institute (Sept. 9, 2003):

    The typical American high school student has neither an understanding of nor appreciation for the basic democratic principles that make the United States different from most other nations. This is the conclusion of several polls and student assessments over the past few years. Now a new study suggests why – many schools aren’t teaching history and civics in a comprehensive fashion.

    The provocative nationwide report, authored for the Albert Shanker Institute by noted historian and educator Paul Gagnon, finds that most states need to overhaul their academic standards if students are to learn – and understand – the history, politics, geography, and economics indispensable to committed, thoughtful citizens.

    Currently, only 24 states and the District of Columbia have standards documents that include, fully or partly, the specific study topics needed to make an adequate civic core of learning, the report finds. Yet even in these cases, essential topics are scattered and lost in an overwhelming mass of material. Some standards cite a laundry list of topics and ideas that teachers must try to cram into the school day. Others provide only vague guidance about what is to be taught, while posing broad, sweeping themes and questions. The result is that standards are not even “coverable” in the time schools have, much less teachable in imaginative, memorable ways.

    Gagnon gives credit to the 48 states (plus the District of Columbia and the Department of Defense schools) for taking on the difficult, contentious task of developing standards on the content that students should master in history, civics and the social studies. But the quality of their citizenship education won’t improve, he concludes, until states require common cores of historical and political learning for all students.

    “Since September 11 we sense a new eagerness among students to better understand their country. But, for students to effectively learn the important lessons in civics studies, states must prioritize content,” says Sandra Feldman, president of the American Federation of Teachers and of the Shanker Institute. “All of our kids need knowledge of the most important individuals, ideas, and events behind our democracy’s struggle to survive and flourish.”

    “Some may challenge the idea of a required common core of civics and history,” says Gagnon, “but the answer goes to the heart of democracy: all citizens, whatever their origin or status, have the right to a common body of learning that gives them the power to talk to each other as equals about their society’s priorities and the political choices it faces.”

    The report was partly motivated by the growing number of indicators pointing to a troubling lack of student understanding of politics and history at all grade levels.

    The 2001 National Assessment of Educational Progress (NAEP) found that only 17 percent of eighth graders scored at proficient or advanced levels in United States history. Fewer than half knew that the Supreme Court could decide a law’s constitutionality. At the same time, only 11 percent of twelfth graders scored at the proficient or advanced level. Only a third knew what the Progressive Era was and many were unsure whom we fought in World War II. Recently, 81 percent of college seniors at 55 leading colleges and universities scored an F or D when quizzed on American history.

    The report recommends that states give veteran teachers and scholars a larger role in advising them on what should be studied. Narrowing standards to an essential “core” would help their colleagues select and teach the most important topics within the available instructional time for the subject, the report says. Dr. Gagnon offers one “model of a civic core” that states and their expert advisors might consider. He also provides ideas on how schools’ limited instructional time could be allocated to teach a civic core that is rich in both breadth and depth, and still leave time for local choices.

    Once revised, standards should more clearly define the content that contributes to students’ civic competencies; and states should work with schools and universities to train new and existing teachers to meet these revamped standards.


    H.W. Brands: It's Time to Take the Founding Fathers Down a Notch or Two (posted 9-5-03)

    Sarah Cohen, writing in the Atlantic (August 7, 2003):

    Patriotism is thriving in America today, and its many symbols abound—flags, stars-and-stripes bumper stickers, and freedom fries are all going strong, and so are the reputations of the Founding Fathers. As global insecurity and economic uncertainty become ways of life and leaders appear increasingly tarnished by the compromises of politics, it's comforting to think about the successes of Washington, Adams, Jefferson, and their compatriots, and encouraging to know that we are continuing their experiment. Confidence in the genius of the Founders and the conviction that their blueprint for our nation is infallible can lend a patriot a rare sense of security, even in troubled times.

    And yet this security can be dangerous. In "Founders Chic" (September Atlantic), the historian H. W. Brands offers a reality check to a Founders-obsessed nation. From the newspapers of the Founders' own time, Brands points to some typically hostile opinions on the part of their contemporaries: Washington is said to have a "cold hermaphrodite faculty" that is responsible for his false reputation as a man of "prudence, moderation and impartiality"; John Adams is mocked for his "sesquipedality of belly"; and Thomas Jefferson is called "a mean-spirited, low-lived fellow, the son of a half-breed Indian squaw, sired by a Virginia mulatto father." Not only were the Founders anything but deified in their own time, they were also held responsible by later generations for some of the young nation's most severe problems—and the questions they left unresolved did have serious ramifications, most notably the contradictions over slavery that eventually led to the Civil War. Through all this, the Founders have emerged as heroes, particularly in times demanding national unity; they have served as symbolic anchors of nationhood during the post-Civil War Reconstruction period, the World Wars, and again today.

    Although Brands admires the Founders, he argues that their most remarkable quality was their boldness in the face of great risk and uncertainty—the very quality that excessive reverence for previous generations stifles. Through their impressive feat of creating a structure of stability for their political descendants, the Founders created a leadership class with a genuine respect for the status quo, and bequeathed to these new leaders a complicated set of problems, both ideological and practical. Today's leadership class, Brands suggests, would do well to take a page out of the Founders' book and apply all their ingenuity to the nation's needs, reworking the Constitution when necessary to address the issues of the day. He argues that the confidence to do this would be a more important inheritance from the Founders than the particulars of the Constitution, a document produced by a small group of men during three months in 1787. It would also, paradoxically, provide a new opportunity for the kind of leadership for which the Founders have been remembered so reverently.


    The Nuclear War that Almost Happened in 1983 (posted 9-5-03)

    Scott Shane, writing in the Baltimore Sun (August 31, 2003):

    Twenty years ago this fall, as the Orioles triumphed in the World Series, baby boomers flocked to The Big Chill and radios played Michael Jackson's Thriller, the superpowers drifted obliviously to the brink of nuclear war.

    That is the disturbing conclusion of a number of historians who have studied the bellicose rhetoric and mutual incomprehension of the United States and the Soviet Union, which then had more than 20,000 nuclear warheads between them. With the possible exception of the Cuban Missile Crisis in 1962, they say, the autumn of 1983 might have been the most dangerous moment in the history of the Cold War.

    Ailing Soviet leader Yuri V. Andropov, urged on by Soviet hard-liners and acutely aware that his country was losing the military technology race, had become increasingly worried that the Americans might be planning a nuclear first strike. President Ronald Reagan's rhetoric about the "evil empire" and U.S. military exercises poured fuel on Soviet war paranoia.

    A series of mishaps and misunderstandings made conceivable a catastrophe that would have dwarfed today's worst fears of terrorists wielding weapons of mass destruction. That it did not happen is in part thanks to a KGB turncoat who alerted the West to Soviet fears and a Russian duty officer who did not panic when an archaic satellite reported that U.S. missiles were on the way.

    "In retrospect, it was a pretty terrifying time," says Benjamin B. Fischer, a 30-year CIA veteran who now works on the agency's historical staff. "We're lucky it ended the way it did."

    Fischer, who has published a seminal paper on the period through the CIA's Center for the Study of Intelligence, says the danger resulted when wildly exaggerated fears of a U.S. surprise attack took hold in "the geriatric ward" of the aging Soviet leadership. But a contributing factor was the reluctance of Reagan and his advisers to believe that Russian fears were genuine, he says.

    "We don't do Pearl Harbors. So we couldn't believe they really thought we were capable of a first strike," Fischer says.

    John Prados, a Cold War historian and senior analyst at the National Security Archive in Washington, calls the fall of 1983 "a moment of high danger, in some respects more dangerous than the Cuban Missile Crisis. Both sides were more heavily armed. Both sides were more hostile."

    Yet, by comparison with the Cuban crisis, what historians call the "Soviet war scare" of 1983 remains little known.

    "The Cuban Missile Crisis evolved in a very public manner," Prados says. "In 1983, the whole thing happened in secret."

    Behind the tensions of 1983 was a program devised in 1981 by then-KGB chief Andropov called Operation RYAN, not a code name but a stark Russian acronym for "nuclear missile attack." Secret orders were issued to KGB officers around the world to look for signs, however subtle, that the United States might be preparing a pre-emptive strike.

    KGB officer Oleg A. Gordievsky, who as a British agent later played a key role in the drama, watched RYAN unfold from KGB headquarters in Moscow and then as an intelligence officer posted to London. He saw a gulf between the views of worldly Soviet intelligence officers in the field and the paranoia of Kremlin leaders.

    "I never met anyone in the KGB who believed in the possibility of a first nuclear strike by the U.S.," Gordievsky says in an interview from his home in England. "Yet they all reported on the signs of such an attack because they were ordered to do it. They were afraid to say what they really thought."

    By 1983, Soviet fears were growing, heightened by the imaginary evidence of a planned surprise attack reported by the Kremlin's cowed spies and by tough talk from Reagan.

    In March, Reagan announced the Strategic Defense Initiative, a futuristic missile-defense scheme many American scientists thought technically implausible. Soviet military leaders took the plan far more seriously, worrying that it might eventually make the Soviet nuclear arsenal useless.

    That month, Reagan famously stepped up his anti-Soviet rhetoric. Speaking to the National Association of Evangelicals in Orlando, he urged the Christian group in considering nuclear freeze proposals not to "label both sides equally at fault, to ignore the facts of history and the aggressive impulses of an evil empire, to simply call the arms race a giant misunderstanding and thereby remove yourself from the struggle between right and wrong, good and evil."

    Meanwhile, NATO plans to deploy Pershing II nuclear missiles in West Germany were seen by the Soviet Union as shortening to a few minutes the warning it might have of a strike. Viewed from Moscow, the Pershing deployment created fears like those that shook Washington when U.S. spy planes discovered Soviet missiles in Cuba.

    "The danger was in the Soviet leadership thinking, 'the Americans may attack, so we better attack first,' " says Oleg D. Kalugin, a former KGB chief of foreign counterintelligence, who lives outside Washington. While the KGB and military had institutional interests in exaggerating the risk of attack, Andropov's distrust of American leaders was profound, says Kalugin, who knew him well.

    But American officials tended to interpret shrill protests from Andropov and his colleagues as empty propaganda. "The Reagan administration was committed to believing its own rhetoric - that SDI didn't threaten the Soviet Union and that the Pershings were not a first-strike weapon," Prados says.

    Then came a series of potential sparks. On Sept. 1, 1983, a Soviet fighter shot down a South Korean airliner that had strayed into Soviet airspace, killing the 269 people aboard and prompting angry accusations from both sides.

    Next, on Sept. 26, a Soviet satellite misinterpreted sunlight glinting off clouds above a U.S. intercontinental ballistic missile site in Montana as the launch of five ICBMs.

    Lt. Col. Stanislav Petrov, the duty officer overseeing the warning system, kept his cool, reasoning that a U.S. first strike would involve hundreds of missiles, not just five. Instead of sounding the alarm, Petrov checked ground radar and other data, and decided that the satellite's alert was false.

    Petrov's health was broken by the stress of the incident and its aftermath, and he soon retired from the military. When his actions that day were revealed by a Russian magazine in 1998, reporters found him infirm and surviving on a tiny pension outside Moscow.

    Petrov's unsung heroism did not end the danger. A major U.S.-NATO military exercise called Able Archer was planned for November 1983. With emotions still running high on both sides, some Soviet officials feared that the exercise might be a cover for the long-feared nuclear attack.

    But by then, Gordievsky had begun feeding documents from Operation RYAN to skeptical contacts in British intelligence, showing that Soviet leaders' war fears were genuine.

    "There was incredulity at first. The British couldn't believe the Soviet leaders could think like this," recalls Gordievsky. "The Americans were even more disbelieving."

    Nonetheless, in part because Gordievsky's warnings were being passed on by British intelligence, the United States scaled back Able Archer, which initially foresaw a role-playing part for Reagan. Tensions gradually eased, though the CIA's Fischer has found the war scare had a second phase in East Germany as late as 1985-1986. By then, Mikhail S. Gorbachev had come to power, quashed the first-strike fears and moved boldly to negotiate an end to the Cold War.

    Gordievsky, 64, escaped Russia in 1985 but was sentenced to death in absentia for treason, a penalty never canceled by post-Soviet Russian governments. He says he considers Reagan's defense buildup "brilliant" because it persuaded the Soviet leadership that they could no longer compete and sped the end of the Cold War.

    But the strategy was also extremely risky, he says.

    "If the Soviet Union had overreacted, it could have gone very badly," he says, adding in a mild understatement: "If war had come, Soviet missiles would have destroyed Britain entirely, at least half of Germany and France and America would have lost maybe 30 percent of its cities and infrastructure."


    When J. Paul Getty Helped Hitler (posted 9-5-03)

    From DW-World.de (September 2, 2003):

    Newly released documents have revealed that oil billionaire and museum founder J. Paul Getty was a friend and admirer of Adolf Hitler and even lent his support to Nazi Germany in the early days of World War II.

    He was always a man of eccentric habits and he had an even more unusual resumé. Jean Paul Getty made his first million three years after graduating from college, married and divorced five times, collected art with a passion and was reputed to be the richest man in the world at his death. He was also an early fan of Adolf Hitler and supplied the Nazis with fuel at the outset of the war.

    The intelligence linking the billionaire oil man with the German dictator has come to light through the declassification of files from the British National Archives at Kew, in southern England. Getty and his war-time activities show up in a "suspect persons" dossier that was prepared by a New York-based security team that was part of Britain's Foreign Office for the Ministry of Economic Warfare.

    Getty appears to have been at the center of a shadowy group of financiers that provided support to Nazi Germany in the early days of World War II. The dossier says Getty sold one million barrels of oil to Germany. The fuel had to be delivered via Russia, a German ally at the time, because a British blockade was in place.

    The deliveries of oil and fuel continued until Germany attacked Russia in June 1941.

    The declassified information also paints the luxurious Hotel Pierre, which Getty bought in 1938, in colors befitting a bad spy novel. The Manhattan hotel, which Getty reputedly bought because he wanted to fire a waiter there who had been rude to him, was filled with "doubtful and flashy" characters, many with Nazi connections, such as the Austrian baron, the German war veteran and the former U-boat captain who were the hotel's first new employees after Getty took control.

    The intelligence file reads: "Among residents at the Pierre -- living in luxury suites on no visible income sources -- were a Countess Mohle who spent her time making herself attractive to U.S. Army officers and was in a perpetual state of wide-eyed curiosity about military matters."

    Another long-term guest and suspicious Getty associate was a Russian-born Briton who claimed to work for British secret services but whose sympathies appeared to lie with the Nazis. Reports say he was sighted meeting with German agents in Cuba as well as members of the American far right, including the Ku Klux Klan.

    Before the outbreak of war, Getty had strong business and personal connections to Germany under Hitler.

    As a passionate art collector, he was interested in art collections that the Nazis had confiscated in Germany, the Netherlands and Austria and those that were broken up due to the "purifying" of their supposed "non-German" elements. It was also reported he showed interest in the furniture collection of the Rothschild banking family.

    In 1939, Getty returned from a trip to Europe talking jauntily about his "old friend" Hitler.

    But the Japanese attack on Pearl Harbor in December 1941 and America's subsequent entry into the war on the Allied side seem to have caused a change of heart. Getty became a patriotic American seemingly overnight, even volunteering for naval service at the age of 49.


    Preserving the Past in a Digital Era (posted 9-5-03)

    Roy Rosenzweig, writing in the American Historical Review (subscribers only) (June 2003):

    On October 11, 2001, the satiric Bert Is Evil web site, which displayed photographs of the furry Muppet in Zelig-like proximity to villains such as Adolf Hitler, disappeared from the web—a bit of collateral damage from the September 11th attacks. Following the strange career of Bert Is Evil shows us possible futures of the past in a digital era—futures that historians need to contemplate more carefully than they have done so far.

    In 1996, Dino Ignacio, a twenty-two-year-old Filipino web designer, created Bert Is Evil ("brought to you by the letter H and the CIA"), which became a cult favorite among early tourists on the World Wide Web. Two years later, Bert Is Evil won a "Webby" as the "best weird site." Fan and "mirror" sites appeared with some embellishing on the "Bert Is Evil" theme. After the bombing of the U. S. embassies in Kenya and Tanzania in 1998, sites in the Netherlands and Canada paired Bert with Osama bin Laden.

    This image made a further global leap after September 11. When Mostafa Kamal, the production manager of a print shop in Dhaka, Bangladesh, needed some images of bin Laden for anti-American posters, he apparently entered the phrase "Osama bin Laden" in Google's image search engine. The Osama and Bert duo was among the top hits. "Sesame Street" being less popular in Bangladesh than in the Philippines, Kamal thought the picture a nice addition to an Osama collage. But when this transnational circuit of imagery made its way back to more Sesame Street–friendly parts of the world via a Reuters photo of anti-American demonstrators, a storm of indignation erupted. Children's Television Workshop, the show's producers, threatened legal action. On October 11, 2001, a nervous Ignacio pushed the delete key, imploring "all fans [sic] and mirror site hosts of 'Bert is Evil' to stop the spread of this site too."

    Ignacio's sudden deletion of Bert should capture our interest as historians since it dramatically illustrates the fragility of evidence in the digital era. If Ignacio had published his satire in a book or magazine, it would sit on thousands of library shelves rather than having a more fugitive existence as magnetic impulses on a web server. Although some historians might object that the Bert Is Evil web site is of little historical significance, even traditional historians should worry about what the digital era might mean for the historical record. U. S. government records, for example, are being lost on a daily basis. Although most government agencies started using e-mail and word processing in the mid-1980s, the National Archives still does not require that digital records be retained in that form, and governmental employees profess confusion over whether they should be preserving electronic files. Future historians may be unable to ascertain not only whether Bert is evil, but also which undersecretaries of defense were evil, or at least favored the concepts of the "evil empire" or the "axis of evil." Not only are ephemera like "Bert" and government records made vulnerable by digitization, but so are traditional works—books, journals, and film—that are increasingly being born digitally. As yet, no one has figured out how to ensure that the digital present will be available to the future's historians.


    You Try Writing a Biography of Arafat (posted 9-5-03)

    Barry Rubin, writing in the Chronicle of Higher Education (September 5, 2003):

    Writing a biography of anyone is a challenging task, but narrating and analyzing Yasir Arafat's life is a particularly daunting one. Arafat has held the international spotlight for longer than almost any other politician on the planet. He has been a political activist for 55 years, head of his own organization for 44, leader of his people for 36, and head of a virtual Palestinian government for 10. He has achieved little material progress for his people, but, even in the twilight of his career, he has neither given up nor been pushed aside.

    Despite that long and dramatic history, Arafat remains largely an unknown person. Everything about him is controversial, starting with the location of his birthplace. The most basic facts about his background, thoughts, and activities are disputed. Even the emotions he evokes are passionate and opposing. To make matters still more complex, he has always used highly secretive methods as the leader of what was -- and in many ways remains -- an underground organization.

    Indeed, it has been Arafat's inability to transform himself from clandestine revolutionary, his preferred persona, to statesman on the world stage (or even pragmatic politician) that has been a key factor in his failure: He has succeeded at creating the world's longest-running revolutionary movement, the Palestine Liberation Organization, but has been unable to bring it to a successful conclusion.

    Consider Arafat in a bunker in 2002 at his headquarters in the town of Ramallah, his provisional capital, as the Israeli army advanced. Once again, he was surrounded by the enemy, the sound of gunfire echoing in his ears, the world riveted by his every word. What could be more proper, fulfilling, glorious? No one could call him a sellout. And so, once more, he achieved that state of revolutionary nirvana. What others would have thought to be his most desperate moment seemed to satisfy him far more than negotiating peace or administering his near-state in the West Bank and Gaza Strip.

    "The more destruction I see, the stronger I get," newspapers quoted him saying.

    Length and continuity of observation are important here. I have been studying Arafat for more than 30 years, and such continuity makes a difference in making connections among themes and events, even in hearing the echo of specific statements and how they hark back to earlier situations. Over time, too, one can glimpse the ability of a political figure to change -- or to be paralyzed by the inability to do so.

    Without that kind of long-term perspective on Arafat, it is much harder to understand his career. Many people have thought everything about him was obvious -- even if they could not agree on facts and interpretations. There have been journalistic biographies, some hagiographic, some apologetic. Those works are at least a decade old, written before key events and without access to material in British and American archives subsequently opened; more important, they lack the perspective that could be provided only near the end of Arafat's career.

    Still, my own research project faced major barriers. Those who have not dealt with the Middle East may be unable to conceive of how difficult it is to establish even the simplest facts. A Palestinian journalist has written that she was asked, in passing, how many people lived in her hometown, Ramallah. She spent weeks on a quest for the answer, talking to a wide variety of officials, each of whom gave wildly differing figures. When my co-author, Judith Colp Rubin, and I were trying to put together a list of the members of leading PLO bodies, officials in the West Bank were unsure and had to call the PLO headquarters in Tunis to find the answers.


    Bernard Lewis's Ahistorical Approach to Islamic History (posted 9-3-03)

    Adam Sabra, who teaches Middle East history at Western Michigan University, in an essay for Middle East Report Online about Bernard Lewis's What Went Wrong? (August 2003):

    For all the historical anecdotes which Lewis includes in his book, his approach to Islamic civilization is strangely ahistorical. For Lewis, the unity of religion and state in Islam originates in Muhammad himself, since he was both political and religious leader of the fledgling Muslim community (umma). Here Lewis, like many of the modern Islamists, accepts the idea that later Muslim institutions were already prefigured in the practice of the Prophet, an assertion of dubious historicity for which Lewis provides no evidence. For the Islamists, the state established by Muhammad provides the model against which the "Islamicness" of any subsequent Muslim society is to be judged. Those societies which most closely resemble or imitate this ahistorical ideal are considered properly Islamic, while the others are seen as in need of reform or revolution. For the less compromising of the Islamists, there can be no historical development or accommodation to local culture when it comes to God's eternal plan for society.

    While Lewis does not attribute Islamic political ideals to God, he shows a similar lack of interest in the manifest variety of Islamic societies which have developed across time and space. Lewis considers the relationship between religion and state to have been determined from the beginning, and not as a result of subsequent historical developments. He pays no attention to the fact that Muslim polities have produced very different political systems over the centuries, in large part due to developments within Islamic political institutions and the interaction of these institutions with different cultures and historical circumstances. For example, although he recognizes that Ayatollah Khomeini's theory that the jurists should rule a Shiite state in the place of the Messiah is a deviation from Imami Shiite tradition, he never explains what has inspired this change in attitude, or why some Shiite scholars have embraced it while others have maintained their traditional distance from political power. For Lewis, what counts is the clash of Islam and the West, not internal developments within the Islamic world, unless those developments somehow can be shown to enact the collision with external forces.

    Indeed, Lewis shows little interest in the entire period that falls between the early Abbasid caliphate of the eighth and ninth centuries and the late Ottoman period of the eighteenth century, which directly precedes the European colonization of the Middle East. Thus, he has nothing to say about the changes in political culture and institutions that occurred in this long period. At the beginning of it, the caliphal rule over the Islamic world was legitimized not by law, but by the caliphs' claim to be members of the Prophet's family (ahl al-bayt). This was as true of the Abbasids as it was of the Alids, the succeeding dynasty, since the two groups represented rival branches of the same extended family tree. Despite some efforts to the contrary, the caliphs never succeeded in controlling the development of Islamic theology or law. Their most famous attempt to intervene in these matters occurred during the debate over the nature of the Quran, and here they suffered an unmitigated defeat at the hands of the religious scholars (ulama). In an apparent attempt to bolster the caliphs' claim to interpret the Quran, the Abbasid Caliph al-Ma'mun tried to force government office-holders to accept the doctrine that the Quran had been created in time and was therefore not an eternal attribute of God. The eventual failure of this policy signaled the victory of the religious scholars over the caliphs in the struggle over who would determine correct theological doctrine. Since, however, the religious scholars disagreed about so many aspects of theology and law, what emerged was a range of opinions, not a universally accepted orthodoxy. During this period, a dizzying variety of theological, legal and philosophical ideas competed for influence among the scholars, each idea having its own partisans. Far from being absolute, religious doctrine in the Islamic world was highly fluid and hotly debated.


    Christians Are Ignoring History (posted 9-3-03)

    Editorial in Christianity Today (September 2, 2003):

    On any given week, of the top-selling 15 non-fiction books on The New York Times list, three to five are histories or biographies. In contrast, a weekly glance at the Christian Booksellers Association (CBA) nonfiction bestsellers list usually turns up none.

    Things are so bad that we are waving the white flag. Last spring, CBA multiplied its bestseller lists, so that separate rosters are now kept for Christian living, theology, Bible, inspirational, and other genres. But there is no list for the history/biography category. Apparently such books sell so rarely in Christian bookstores, there is no point in counting them.

    As a movement, evangelicalism seems fascinated with ministering to generations X, Y, or Z, unearthing Christian insights from The Matrix, fixing marriages, revitalizing the church, inspiring the discouraged—and on it goes. But we struggle to find time to reflect on our heroes, to treasure the great moments of our past, to "remember the deeds of the Lord … of long ago" (Ps. 77:11).

    One reason is that relatively few Christian books do the genre justice. Between hagiographies—inspirational histories—and academic treatises lie relatively few accessible and responsible treatments.

    The ones that do balance these twin responsibilities strain to get the attention they deserve. Publishers say they would publish and promote more histories if bookstores would give them more shelf space. Bookstore owners say there are not enough accessible history titles to create a lively section. Many evangelical academics are afraid to do something popular lest their colleagues think they are slumming; the popularizers are prompted by some publishers to whitewash their histories to make them more "edifying." A lot of Catch-22s here.

    The good news is that there are accessible recent titles, written from a believer's point of view. Approachable scholarly histories can be found in Allen Guelzo's Abraham Lincoln: Redeemer President (Eerdmans) and Grant Wacker's history of early Pentecostalism, Heaven Below (Harvard). Recent evangelical biographies include John Perry's Charles Colson (Broadman & Holman), Bruce Shelley's Transformed by Love (on Vernon Grounds; Discovery House), and Kevin Belmonte's Hero for Humanity (on William Wilberforce; NavPress). Myrna Grant's Sacred Legacy (Baker) pulls together the writings of nine influential women in our history. And on it goes.


    Gospel Music's Scottish Roots (posted 9-3-03)

    Sarah Bruce, writing in the Press & Journal (Scotland) (September 1, 2003):

    An American professor of music believes his country's gospel tradition owes its existence to the solemn psalm-singing of the Scottish Hebridean churches.

    The passion of black church music - a discipline that nurtured such superstars as Whitney Houston, Tina Turner and Aretha Franklin - was always assumed to come from the days of American slavery.

    But Professor Willie Ruff, of Yale University, has put forward the theory that the style of leading the congregation on each line of the lyrics owes more to the Free Church of Scotland's style of psalm singing.

    Prof Ruff, 71, said that modern-day Afro-Americans had always assumed that their gospel style had been brought from Africa when their ancestors were sold into slavery in the US.

    But he claims his research has shown this is a misconception, and the musical traditions came from Scottish slave owners who brought their religious practices and psalms with them across the Atlantic.

    Prof Ruff said: "We as black Americans have lived under a misconception. Our cultural roots are more Afro-Gaelic than Afro-American. We got our names from the slave masters, we got our religion from the slave masters and we got our blood from the slave masters."

    The professor, who has played with Duke Ellington and Dizzy Gillespie, found his curiosity was aroused when he heard a Presbyterian congregation in Alabama singing in the same style as his own Baptist church, and wondered if it had come from the white Presbyterians. Research on the history of North Carolina showed Highlanders had settled there in the 1700s, and anecdotal evidence of African slaves speaking Gaelic convinced Prof Ruff he had to visit Scotland.

    He found himself in Stornoway, listening to the precentor lead a congregation by singing each line and having them repeat the haunting psalm melodies.

    The idea of rhythmic black gospel music having its roots in solemn Presbyterian Scotland is proving hard to swallow - Glasgow University sources said it was "plausible" and "intriguing", but one American gospel leader simply said: "Gospel music is black music."

    Prof Ruff added: "There will be Scots who are uncomfortable with the relationship and the involvement in the slave trade. But the Scots are like anyone, and there were many who were abolitionists and who set up schools for black children after emancipation."


    Remembering the Slaughter of Thousands of Foreigners in Japan After the Earthquake of 1923 (posted 9-2-03)

    From Japan Times (August 30, 2003):

    The Japan Federation of Bar Associations appealed Friday to the government to prevent a recurrence of the 1923 slaughter of thousands of Koreans and Chinese by the Japanese army and vigilante groups following a massive earthquake in the Tokyo area.

    In the lead-up to the 80th anniversary of the Sept. 1, 1923, Great Kanto Earthquake, bar federation president Toru Motobayashi urged the government to prevent racial discrimination during emergencies by promoting human rights education among public servants, including Self-Defense Forces troops, police and immigration officers.

    After the massive earthquake flattened the Tokyo area, an estimated 6,000 Koreans and Chinese were killed by vigilantes and military forces when rumors spread that the foreign residents were poisoning wells, starting fires and planning an uprising.

    Motobayashi said he is making the appeal because people, "no matter their nationality or race, have the right to live in peace."

    Despite the events of seven decades earlier, Tokyo Gov. Shintaro Ishihara told troops at a Ground Self-Defense Force ceremony in April 2000 to be prepared for rioting by illegal foreigners in the event of a major disaster.

    In his remarks, which drew a firestorm of criticism, Ishihara used the term "sangokujin." The term, which literally means "people from third countries" was used as a derogatory label for people from Japan's former colonies, Korea and Taiwan, after World War II.

    Earlier this week, the bar federation recommended the government offer a state apology for the massacre, and urged it to investigate the incident in its entirety.

    The 1923 quake registered a magnitude of 7.9, flattening the capital, Yokohama and vicinities, and killing 140,000 people.


    Students Are Ignorant of History (posted 9-2-03)

    Editorial in the Arizona Republic (August 31, 2003):

    Scholars and educators at the Thomas B. Fordham Foundation, a Washington think tank that specializes in education reform, fear for the future of history - that is, the teaching of history or, as it generally is known today, social studies.

    They fear that the United States is producing a generation of students lacking an anchor in historical knowledge, notably the history of their own nation. Fordham has collected its litany of concerns into a fine little book, Where Did Social Studies Go Wrong?, written and edited by some of the nation's most accomplished education analysts.

    One of their fears is that the current emphasis on improving standardized test scores may further de-emphasize a field of study - history - that already was languishing. Such tests emphasize reading, writing and math, but they steer away from questions about the U.S. Constitution or its framers, mostly because very few states have fabricated a core curriculum for either history or its kin, civics and geography.

    The explanation for the low status of history, civics and geography is charged with politics. As Chester Finn of the Fordham Foundation observes, many of the colleges of education that provide the bulk of American teachers "interpret 'civics' as consisting largely of political activism and 'service learning' rather than understanding how laws are made and why it is important to live in a society governed by laws."

    At the same time, the teaching of history often is reconfigured today into a sort of ox-goring, with the United States as the designated ox. While it is important to teach of failures and cruelties - slavery and the plight of Native Americans, for example - it is just as important to explain how the creation of this democratic republic became a beacon of freedom like none the world has ever seen. High-school syllabuses too often seem to lack the latter.

    The terrorist attacks of Sept. 11 and the Iraq war have heightened the need for students to understand the world around them and the historical events that led us to where we are.

    To that end, Fordham has assembled another important analysis from political leaders, cultural analysts and others, entitled Terrorists, Despots and Democracy: What Our Children Need to Know. The collection of essays is a valuable primer on recent historical events and the politics that inspired them.

    It is a cliché, of course, to observe that those who are ignorant of history are doomed to repeat it. A more contemporary impetus for improving history instruction may be this: Tonight Show host Jay Leno strolling the streets of Burbank, Calif., asking questions like, "Who fought in the Spanish-American War?"

    Are you cringing yet?


    Ancient Greek Had Brain Surgery (posted 9-2-03)

    Philip Pangalos, writing for the Times (London) (September 2, 2003):

    Greek archaeologists on the eastern Aegean island of Chios have found evidence of successful brain surgery dating back to 250BC.

    The discovery - one of only a handful made in Greece providing evidence of successful ancient surgery, or trepanning, on the human skull - was made after what started as a routine appraisal for building permission on the outskirts of Chios town.

    Four tombs were found, in one of which was the body with signs of trepanning. The ancient process of drilling a hole in the skull for medical purposes was mentioned in the writings of Hippocrates, the Greek physician who lived in the 5th century BC and whose name survives in the Hippocratic oath.

    "The exciting thing about the skeleton is that it had signs of trepanation," Asterios Aidonis, an anthropologist at the Chios archaeological museum, said.

    "There's a 2cm hole in the left rear of the skull that has evidence of healing over time. This shows the person lived after the operation and there are very few signs of infection.

    "This is a very delicate operation, even in modern times. After preliminary investigations, we believe the subject survived for five to six years after the operation." He was estimated to be aged about 50 when he died.

    Mr Aidonis added: "This (find) is very significant as it is among very few such cases found in Greece."

    The process of trepanation, known to have taken place in neolithic times, is thought to have started as a ritual but over time it became a medical practice.


    The Colonial Origins of the Rwanda Genocide (posted 9-2-03)

    John Shattuck, writing in the NYT Book Review (August 30, 2003):

    Jean-Pierre Chrétien, a French historian with vast experience in the Great Lakes region of Africa, has undertaken the formidable task of tracing the roots of the region's violence and exposing the ideological myths on which the ancient-hatreds theory rests. In a monumental study that marches through two millenniums before approaching central Africa's contemporary agony, Mr. Chrétien punctures the sense of inevitability that permeates our thinking about the Rwandan genocide.

    Along the way, he illuminates the responsibility of a wide range of actors from the colonial period through the present. As warlords continue today to compete for power in a thoroughly ravaged Congo, Mr. Chrétien helps us understand how this all came about and why it matters that we know.

    The story begins with the geography of the central African highlands. Despite its equatorial location, Mr. Chrétien says, "the region is blessed with good climate, is rich with diverse soils and plants, and has prospered thanks to some strong basic techniques: the association of cattle keeping and agriculture; the diffusion of the banana a millennium ago; and the mastery of iron metallurgy two millennia ago." In this healthy environment, complex social structures evolved in which the idea of kingship and strong central authority took hold and flourished for more than 300 years before the arrival of colonial powers in the mid-19th century.


    The fertile lands around the Great Lakes were settled by indigenous Hutu cultivators, while the more mountainous areas were used for the raising of cattle by Tutsi pastoralists. In the early kingdoms of the region, agricultural and pastoral systems were integrated because they controlled complementary ecological zones and served mutually beneficial economic interests. As Mr. Chrétien argues convincingly, nowhere at this time could the "social dialectic be reduced" to a Hutu-Tutsi cleavage.

    That began to change in the 19th century. As social structures became more complex, the success of the central African kingdoms depended increasingly on territorial expansion through raiding, colonizing and annexing of neighboring lands. At the same time, Tutsi cattle raisers in search of more land began to emerge as a new elite and a driving force behind expansion. The kingdoms of Rwanda and Uganda were particularly expansionist, but were soon thwarted by the arrival of colonial powers. The immediate effect of colonialism was to reorient the stratified and dynamic societies of the Great Lakes around competing poles of collaboration with, and resistance to, the new foreign occupiers.


    Since these remote societies had been untouched by the slave trade that ravaged Africa's coastal regions, they presented the Europeans with a range of robust aristocracies and royal courts to win over.

    At this crucial point, the issue of race entered the picture. Obsessed by their theories of racial classification, 19th- and early-20th-century Europeans rewrote the history of central Africa. Imposing their own racist projection of superiority on Tutsi "Hamito-Semites" and a corresponding inferiority on Hutu "Bantu Negroes," missionary and colonial historians began to attribute the rise of the Great Lakes kingdoms to the arrival of a superior race of "black Europeans" from the north.

    Anointed by the Belgians as their administrators and collaborators in Rwanda and Burundi, the Tutsis, who never constituted more than 18 percent of the population, were presented with a poisoned chalice combining ethnic elitism with economic favoritism.

    In educating their chosen elites, the Belgians were relentlessly racist. Starting in 1928, all primary schools in Rwanda were segregated, while at the secondary-school level Rwandan (and later Burundian) Tutsis were three to four times better represented than Hutus.

    Not surprisingly, the majority Hutu population chafed at this discrimination, and in the late 1950's a Hutu counter-elite began calling for the end of "Tutsi feudalism." On the eve of independence, the growing Hutu rebellion was backed, in a catastrophic reversal, by the Roman Catholic Church and the colonial administration, which now claimed that the Hutu majority represented "democratic values." The outcome, as Mr. Chrétien shows, was that "the new Rwanda declared its national past `Tutsi' and thus despicable."


    The Stuff Museums Decline to Exhibit (posted 9-2-03)

    From NPR (August 28, 2003):

    Most museums display less than five percent of the objects they own. Some items are too fragile to display or no longer fit a museum's mission. But as Harriet Baskas found out, there are plenty of other reasons some objects stay in storage.

    Back in 1719, when Russian Czar Peter the Great opened what is considered to be the first public museum, he displayed all sorts of natural and human rarities: exotic butterflies, skeletons -- and living specimens such as Foma the Lobster Boy. Born with fused fingers and toes, Foma was popular -- even after his death. That's because he was stuffed and kept on display, which Columbia College professor Stephen Asma says "must have been horrific."

    "The level of taxidermy at that time was really rudimentary," he tells Baskas. "Sort of like the same technique you'd use upholstering chairs. It was really low-tech."

    Asma is the author of a book about the culture and evolution of natural history museums, called Stuffed Animals and Pickled Heads. He says that in 18th-century America, there was even a proposal to pickle such prominent people as Thomas Jefferson, Ben Franklin and other founding fathers.

    "Up to the present, people continue to want see something exotic, bizarre, that hits them at this emotional, passionate level," Asma explains. "And I think the trick for current curators now is to have that sort of thing in the museum but to turn that emotional experience into something that people can learn from."

    It's a balance between showing off great objects and telling a great story.

    At the Smithsonian Institution, there are hundreds of worthy objects kept off display.

    Steven Lubar is chairman of the history of technology department at the Smithsonian Institution's National Museum of American History in Washington, D.C.

    "We have a nice collection of scrimshaw," Lubar says, "and as you can imagine, occasionally sailors made vaguely pornographic scrimshaw and I don't think we'd display that."

    What's considered appropriate to display does change over time. Lubar says it took more than 125 years to transform objects collected the night of President Abraham Lincoln's assassination from ghoulish peep show to history lesson.

    "Right after President Lincoln was assassinated," Lubar explains, "artifacts that were in that room ended up at the Smithsonian and the secretary of the Smithsonian then said, 'We'll take that stuff but we'll keep it offsite, it's not what we do here, we don't display that kind of thing.'"

    Now, says Lubar, put into the right context, the objects seem appropriate. The items are currently part of the exhibit, "The American Presidency: A Glorious Burden."

    Chicago's Field Museum has a collection of shrunken heads made by an Ecuadorian tribe once known as the Jivaro. But curators at the Field Museum have decided to take the remains off exhibit. The heads were very popular, but the museum decided they weren't portraying an accurate image of the tribe.

    How Textbook Publishers Are Dealing with 9-11 as History (posted 9-2-03)

    Eleanor Chute, writing for the Associated Press (August 29, 2003):

    When hijacked airliners attacked U.S. targets on Sept. 11, 2001, major textbook publishers were in the final weeks of updating their social studies texts for the school year that began this month.

    Nearly two years after the tragedy, those books reflect the challenges of trying to write history while it's still a current event and still fresh in the minds of children.

    Some editors chose not to run photos of the World Trade Center or the Pentagon, at least in part, they say, because some of their readers may have lost a loved one in the attacks.

    "The American Nation," a high school text published by Holt, Rinehart and Winston, devotes six pages to "A Day That Changed the World." The only photo showing any destruction features the Statue of Liberty prominently, with Manhattan veiled by heavy smoke.

    "We were looking for pictures that would show the significance of the event, would be memorable pictures for what we could show of the events, but also pictures that did show the patriotic feelings that it stirred up," said Robert Wehnke, managing editor of social studies for Holt, Rinehart and Winston, a division of Harcourt Education, which also publishes "Call to Freedom" used in middle schools.

    The high school text included a mention of congressional members singing "God Bless America" on the U.S. Capitol steps and the large number of American flags that were sold.

    Another high school and middle school textbook publisher, McDougal Littell, a division of Houghton Mifflin, chose a more graphic approach in the 14 pages devoted to its section, "The War on Terrorism." Its books are called "Creating America" for eighth-graders and "The Americans" for high school students.

    The books include photos of the twin towers before the tragedy, a photo of smoke streaming from the World Trade Center after impact, and damaged sections of both the twin towers and the Pentagon.

    "How can you tell the story?" said Chris Johnson, editorial director of social studies at McDougal Littell. "We looked at literally hundreds of photos. We chose photos we felt would tell the story and give an idea of the devastation. We certainly were not going to show falling bodies and really gruesome aspects."

    But the text does note that some victims jumped from the skyscrapers to their deaths, and it spends a page recounting the story of an airline pilot who, a month after the hijackings, told passengers to beat up anyone who might try to take over the plane and gave them suggestions on how to do it.

    Johnson said McDougal Littell issued the pages early last year as a supplement to those who already owned the books, and they were well-received.

    Like some other publishers, McDougal Littell is in the process of more revisions. This time, Johnson said, it won't include the mention of people jumping or the pilot who advised self-defense.

    "I think the story has changed. I think that now we have gone in, and we have toppled Saddam Hussein. I think the emphasis is going to shift a little bit," Johnson said.

    The Holt book doesn't provide as wide a context of world terrorism as some other texts, but Wehnke said Holt wanted to concentrate on the events themselves because "we thought that is how it would be taught in a classroom situation."

    "In the balance, the events we picked up to talk about are still the events that people remember," he said.

    "In 15 years, I don't think people will remember people jumping out of the building, but they will remember President Bush's speech, what (then-New York City Mayor) Rudolph Guiliani did. They will remember the firefighters and policemen and their sacrifice."

    On the elementary level, publisher Scott Foresman, a division of Pearson Education, shows the New York skyline with and without the World Trade Center as well as a photo of firefighters raising the American flag. Firefighters and others who raced to the rescue are emphasized.

    Many of the books note that the terrorism was done by Muslim extremists. The elementary text, "Our Nation," by Macmillan, a division of McGraw-Hill, says, "Muslim-Americans were quick to denounce the terrorist attacks."

    It says that many Muslim-Americans are recent immigrants, adding, "Like earlier immigrants, the new immigrants have brought their customs, traditions and religion, and are working hard to be good citizens."

    Howard Meyer: Embracing the World Court (posted 9-2-03)

    Jennifer R. Johnson and Ami Mudd, in the course of a review of a new book on the World Court (Rowman & Littlefield, 2002) by Howard Meyer (Santa Clara Law Review, December 10, 2002):

    In 1905, Massachusetts led the United States in instituting an order for all schools to celebrate May 18, the anniversary of the first Hague Conference, an international conference set up to discuss establishing the “American idea” of substituting law for war. By contrast, international law is not even part of the standard curriculum in American law schools today; rather it is a “subject for specialists.”

    For those who want to learn about the history of the “World Court” and the United States’ role in its formation, evolution, and effectiveness, Howard N. Meyer’s book, The World Court in Action: Judging Among the Nations, is a good primer on the topic. Meyer is a lawyer and well-regarded social historian, and his familiarity and comfort with the subject matter shine through ....

    Meyer makes a strong and persuasive argument for why the United States should become a true participant in what could then legitimately be called the World Court, a court formed as a result of a successful U.S. arbitration experience and based upon the U.S. Supreme Court. ...

    As detailed in the book, in 1862 the English-built ship The Alabama, later joined by other ships, conducted a series of attacks on a total of eighty U.S. merchant ships. The U.S. public was enraged when the USS Kearsarge sank the Alabama, finding this act no less treasonous than the attacks retaliated against. Then-President [U.S.] Grant avoided further confrontation by negotiating a treaty to submit the Alabama claims to neutral, third-party arbitration. The U.S. damage award was $15.5 million (roughly the value of the destroyed ships), but the greater victory was held to be “for peace and arbitration.” This success was seen as definitive evidence of the feasibility of substituting arbitration for war, and fueled discussion by European peacemakers for a permanent arbitration tribunal, which they called the “American plan.”

    The culmination of U.S. and foreign energies resulted in the Hague Conference of 1899. Although the concept of an international court of arbitration was only one of the goals of the conference, it became the central aspect of it. What emerged was the so-called Permanent Court of Arbitration, which contained the essential features of the American plan but included no commitment to arbitrate, a feature that would cripple the Court’s virility for decades to come.

    The international response to the results of the conference was the negotiation of more than 150 treaties within a few years. However, many believed the aims of the Court were not yet satisfied. Ideas emerged for converting the Court to an entity with a true panel of permanently sitting judges and for a second Hague Conference. One flaw the first conference suffered was the exclusion of the Central and South American countries–ironic because arbitration had flourished among these nations in the nineteenth century....

    Bernard Loder, a Dutch representative, stated at the establishment of the World Court in 1920 that “administer[ing] justice between two contesting parties only after having obtained their mutual consent[,] . . . agreement on the wording of the complaint[,] and [ ] choice of judges” would be “not worth the trouble.” Meyer suggests that if viewed through the eyes of Andrew Carnegie or other peace enthusiasts of the early twentieth century, the answer to whether it was “worth the trouble” may well be “no.” The Court did not abolish or prevent war, nor has the Court become a court of “compulsory jurisdiction.” Meyer clearly sees this absent aspect as the failing point of the Court, if it has failed at all.

    Meyer documents the early support of the United States--in the form of Presidents, congressmen, Supreme Court Justices and law school deans and professors--for settling disputes through persuasive means. In so doing, Meyer suggests that the question “Was it worth the trouble?” remains a consideration only because the United States maintains its stance of immunity from the Court’s judgment.

    Meyer says that it is easy to question the effectiveness of a World Court, yet no court has power of its own–it must be given its power by the community it serves. He believes the fact that most of the Court’s decisions have been followed demonstrates that the Court has been given this power. However, Meyer makes no mention of exactly why the United States refrains from compulsory jurisdiction.

    Michael Novak: Why He Loves Mel Gibson's Movie (posted 8-29-03)

    Michael Novak, writing about The Passion for the Weekly Standard (August 25, 2003):

    It is the most powerful movie I have ever seen. In the days since watching that rough cut, I have not been able to get the film out of my mind. Although I have read many books on the death of Jesus, and heard countless sermons dwelling on its details, I would never have believed a human being could suffer as much as Gibson's Christ does. Seen through the perspective of the mother of Jesus, as this film allows the viewer to do, the suffering is doubly painful--for with her, we watch the unbearable scourging, gustily delivered by the Romans at Pilate's orders nearly to the point of death. The pillar to which Jesus is chained is less than waist-high, so that his back is bent while he must keep himself on his feet. When he is dragged away, blood lies pooled and splattered on the white marble floor. The soldiers' laughter echoes again at the moment of the awful downward push when he is crowned with thorns. And then there are the thundering falls of the scourged Christ upon his flailed and bleeding back, under the impossible weight of the cross.

    There are, in a sense, only five historical accounts of the Passion: in the gospels of Matthew, Mark, Luke, and John, and, in bare but vivid outline, in the letters of St. Paul. Paul's accounts are by some thirty years the earliest and represent in large strokes the settled beliefs of the first generation of Christians. Down the centuries, the narrative of Christ's death and its meaning have remained much the same.

    The fuller accounts of Matthew, Mark, Luke, and John supplement each other, often overlapping and sometimes contradicting one another on the sort of contingent details that eyewitnesses (or their note-takers) often report differently. But all the Christian accounts agree that Jesus Christ suffered and died for the sins of all human beings of all time, under the command of the Roman consul in Jerusalem, Pontius Pilate.

    Jewish accounts concur that Jesus was a Jew who suffered and died under the Roman authorities. His claims for himself seemed to Jewish authorities then (and since) to be blasphemous--for Christ clearly announced that he owned an authority higher than the high priests and the rabbis', said forthrightly that he was greater than Solomon, and put himself on a higher plane than Moses. He went even further, daring to call God his father.

    The claims Christ made for himself seemed at the time divisive and dangerous. Many people, the Jewish authorities told Pilate, were following this man's lead. His history, they said, showed that he worked magic, performed miracles, and consorted with demons. He had been sent by God, he as much as said, to "fulfill the Scriptures." His continued preaching might lead to riot and rebellion. But only the Romans had the power to do to Jesus what was actually done, and so it was under the authority of Pontius Pilate, and at the hands of the Roman Empire, that Jesus "was crucified, died, and was buried."


    AT THE TIME of Christ's death, Christianity was still internal to Judaism. The Christian Church itself began not at the Passion, but fifty-three days later on Pentecost, when the apostles left an "upper room" in Jerusalem speaking in tongues. With his preaching Jesus had clearly put a challenge to Judaism, expressly announcing a "new" covenant, whose mandate was to "complete" and "fulfill" the "old" covenant. And there is no doubt that Jesus' death meant a parting of the ways between Christians and Jews. Nonetheless, from a Christian point of view, the life and teachings of Jesus and his new covenant do not remove or destroy the old covenant. God cannot be unfaithful to his promises. Besides, if the Creator is not faithful to his first covenant with the Jews, how can Christians expect Him to be faithful to His new covenant with them?

    Thus, Christians hold that Christianity fulfills the hopes launched into the world by Judaism. They also hold that those Jews who reject Christianity remain vessels of God's first love. In God's mysterious plan, the continuation of Judaism in time is a grace to be respected, on the same principle on which the faith of Christians rests--the fidelity of God to his everlasting promises.

    The Jewish leaders of the generation that knew him did in fact reject Jesus and his claims, and they did accuse him of blasphemy. "Nevertheless," as the Second Vatican Council said in its statement on Judaism, "the Jews still remain very dear to God, for the sake of the patriarchs, since God does not take back the gifts he bestowed or the choice he made." The Council strictly forbids Catholics to hold Jews to be "repudiated or cursed by God, as if such views followed from the Holy Scriptures." And it deplores "all hatreds, persecutions, displays of anti-Semitism leveled at any time and from any source against the Jews." This condemnation includes the Church's own sins. The Council stressed the two covenants' common spiritual heritage and foresaw a future in which both communities would serve God "shoulder to shoulder."

    Gibson's film is wholly consistent with the Second Vatican Council's presentation of the relations of Judaism and the Christian Church. But "The Passion" will not be easy for Jews to watch. One reason is simply that its entire subject is the death of one who, for many Jews, is a figure of division, Jesus Christ. And a second reason is that it is never easy to relive a moment in which the leaders of one's community, however justified they might have been by their own lights and their own sense of responsibility, do not appear to viewers to be acting in a noble way. As a Catholic, I cringe every time I go to the theater when a pope, cardinal, archbishop, or even priest is portrayed in an unflattering light. Even when they deserve it, I do not enjoy the spectacle.

    In the first part of the gospels' account of the Passion, the high priests of Jerusalem standing before Pilate are, painfully no doubt to contemporary Jews, the voice for the prosecution. During the early scenes of the movie, which I tried to watch as if I were Jewish or seated alongside a Jewish colleague, I thought: This is too painful. Having sat through many analogous moments as a Catholic, I did not like the experience.


    VERY SOON, though, the action in the film belongs to the Romans. Roman soldiers inflict systematic pain on Jesus with gusto, lighthearted bantering, and the practiced sadism of those who know how to keep subdued populations subdued. The overwhelming drama consists in Christ's willing endurance of unbearable suffering, for the purpose of inaugurating an entirely new order in human life. The movie, like the gospels, is unmistakable in setting this meaning before our eyes. It is, somehow, our sins for which Jesus is dying.

    Irving Kristol: What Is Neoconservatism? (posted 8-29-03)

    Irving Kristol, writing in the Weekly Standard (August 25, 2003):

    WHAT EXACTLY IS NEOCONSERVATISM? Journalists, and now even presidential candidates, speak with an enviable confidence on who or what is "neoconservative," and seem to assume the meaning is fully revealed in the name. Those of us who are designated as "neocons" are amused, flattered, or dismissive, depending on the context. It is reasonable to wonder: Is there any "there" there?

    Even I, frequently referred to as the "godfather" of all those neocons, have had my moments of wonderment. A few years ago I said (and, alas, wrote) that neoconservatism had had its own distinctive qualities in its early years, but by now had been absorbed into the mainstream of American conservatism. I was wrong, and the reason I was wrong is that, ever since its origin among disillusioned liberal intellectuals in the 1970s, what we call neoconservatism has been one of those intellectual undercurrents that surface only intermittently. It is not a "movement," as the conspiratorial critics would have it. Neoconservatism is what the late historian of Jacksonian America, Marvin Meyers, called a "persuasion," one that manifests itself over time, but erratically, and one whose meaning we clearly glimpse only in retrospect.

    Viewed in this way, one can say that the historical task and political purpose of neoconservatism would seem to be this: to convert the Republican party, and American conservatism in general, against their respective wills, into a new kind of conservative politics suitable to governing a modern democracy. That this new conservative politics is distinctly American is beyond doubt. There is nothing like neoconservatism in Europe, and most European conservatives are highly skeptical of its legitimacy. The fact that conservatism in the United States is so much healthier than in Europe, so much more politically effective, surely has something to do with the existence of neoconservatism. But Europeans, who think it absurd to look to the United States for lessons in political innovation, resolutely refuse to consider this possibility.

    Neoconservatism is the first variant of American conservatism in the past century that is in the "American grain." It is hopeful, not lugubrious; forward-looking, not nostalgic; and its general tone is cheerful, not grim or dyspeptic. Its 20th-century heroes tend to be TR, FDR, and Ronald Reagan. Such Republican and conservative worthies as Calvin Coolidge, Herbert Hoover, Dwight Eisenhower, and Barry Goldwater are politely overlooked. Of course, those worthies are in no way overlooked by a large, probably the largest, segment of the Republican party, with the result that most Republican politicians know nothing and could not care less about neoconservatism. Nevertheless, they cannot be blind to the fact that neoconservative policies, reaching out beyond the traditional political and financial base, have helped make the very idea of political conservatism more acceptable to a majority of American voters. Nor has it passed official notice that it is the neoconservative public policies, not the traditional Republican ones, that result in popular Republican presidencies.

    Michael Beschloss: The One Question Voters Should Ask Themselves (posted 10-23-03)

    Michael Zitz, writing in fredericksburg.com (Oct. 22, 2003):

    American voters, presidential historian Michael Beschloss told a Fredericksburg Forum audience last night, should ask themselves one question when casting their ballots:

    Is the candidate someone who would be willing to lose an election by doing an unpopular thing that happens to be the best thing for the country?

    During his talk in Mary Washington College's Dodd Auditorium, Beschloss cited former President John F. Kennedy's efforts on behalf of civil rights as an example.

    Beschloss said Kennedy knew his push for civil rights could very well cost him the South and the 1964 presidential election.

    In an interview before the forum, Beschloss went further, saying Kennedy's civil rights policy may have cost him his life.

    One of the primary reasons Kennedy went to Texas in November 1963, he said, was to try to shore up support in a pivotal state that was vehemently opposed to his civil rights legislation.

    And, of course, Kennedy was killed during his motorcade's ride through Dallas.

    Beschloss said the Kennedy administration was well aware of how dangerous it was for the president to go to Dallas, which was a hotbed of extremism at the time.

    He said Vice President Lyndon B. Johnson was to give a speech at a fund-raising dinner for Kennedy in Austin the evening of Nov. 22. Johnson's opening line was to be a joke--'Mr. President, we are all very glad you made it out of Dallas alive.'

    Of course, that line was never delivered because Kennedy was assassinated, and Johnson returned to Washington on Air Force One that night, taking the oath of office during the flight.

    The Hindu Holocaust (posted 10-23-03)

    Francois Gautier, writing in rediff.com (Oct. 21, 2003):

    The massacre of 6 million Jews by Hitler and the persecution Jews suffered all over the world in the last 15 centuries have been meticulously recorded by the Jews after 1945 and have been enshrined not only in history books, but also in Holocaust museums, the most famous one being in Washington, DC.

    It has not been done with a spirit of revenge -- look at Israel and Germany today -- they are on the best of terms; yet, facts are facts and contemporary Germany had to come to terms with its terrible actions during World War II.

    Hindus, Sikhs and Buddhists have also suffered a terrible holocaust, probably without parallel in human history. Take the Hindu Kush for instance, probably one of the biggest genocides of Hindus. Practically no serious research has ever been done about it, and it finds no mention in history books. Yet the name Hindu Kush appears many times in the writings of Muslim chroniclers.

    In 1333 AD, Ibn Battutah, the medieval Berber traveller, said the name meant 'Hindu Killer,' a meaning still given by Afghan mountain dwellers. Unlike the Jewish holocaust, the exact toll of the Hindu genocide suggested by the name Hindu Kush is not available. 'However,' writes Hindu Kush specialist Srinandan Vyas, 'the number is easily likely to be in millions.'

    A few known historical figures can be used to justify this estimate. The Encyclopaedia Britannica recalls that in December 1398 AD, Taimurlane ordered the execution of at least 50,000 captives before the battle for Delhi; likewise, the number of captives butchered by Taimurlane's army was about 100,000.

    The Britannica again mentions that Mughal emperor Akbar ordered the massacre of about 30,000 captured Rajput Hindus on February 24, 1568 AD, after the battle for Chitod, a number confirmed by Abul Fazl, Akbar's court historian. Afghan historian Khondamir notes that during one of the many repeated invasions on the city of Herat in western Afghanistan, which used to be part of the Hindu Shahiya kingdoms '1,500,000 residents perished.' 'Thus,' writes Vyas, 'it is evident that the mountain range was named as Hindu Kush as a reminder to the future Hindu generations of the slaughter and slavery of Hindus during the Moslem conquests.'

    Or take the recent plight of the Kashmiri Pandits. Over 400,000 Kashmiri Pandits have been forced to flee their homeland. Many Pandit men, women and children have been brutally murdered. About 70,000 still languish in makeshift refugee camps in Jammu and Delhi. Scores of temples in Kashmir have been desecrated, destroyed, looted, more than 900 educational institutions have been attacked by terrorists. Properties of Pandits have been vandalised, businesses destroyed or taken over, even hospitals have not been spared.

    Did you know that this huge human tragedy is taking place in Free India?

    Burning books, looting culture is a very important part of the plan as we have seen during early Muslim invasions, where Buddhist centres of learning were ruthlessly burnt and razed to the ground.

    Cover-Up Alleged in Probe of USS Liberty (posted 10-23-03)

    Jennifer Kerr, writing for the Associated Press (Oct. 23, 2003):

    A former Navy attorney who helped lead the military investigation of the 1967 Israeli attack on the USS Liberty that killed 34 American servicemen says former President Lyndon Johnson and his defense secretary, Robert McNamara, ordered that the inquiry conclude the incident was an accident.

    In a signed affidavit released at a Capitol Hill news conference, retired Capt. Ward Boston said Johnson and McNamara told those heading the Navy's inquiry to "conclude that the attack was a case of 'mistaken identity' despite overwhelming evidence to the contrary."

    Boston was senior legal counsel to the Navy's original 1967 review of the attack. He said in the sworn statement that he stayed silent for years because he's a military man, and "when orders come ... I follow them."

    He said he felt compelled to "share the truth" following the publication of a recent book, "The Liberty Incident," which concluded the attack was unintentional.

    The USS Liberty was an electronic intelligence-gathering ship that was cruising international waters off the Egyptian coast on June 8, 1967. Israeli planes and torpedo boats opened fire on the Liberty in the midst of what became known as the Israeli-Arab Six-Day War.

    In addition to the 34 Americans killed, more than 170 were wounded.

    Israel has long maintained that the attack was a case of mistaken identity, an explanation that the Johnson administration did not formally challenge. Israel claimed its forces thought the ship was an Egyptian vessel and apologized to the United States.

    After the attack, a Navy court of inquiry concluded there was insufficient information to make a judgment about why Israel attacked the ship, stopping short of assigning blame or determining whether it was an accident.

    It was "one of the classic all-American cover-ups," said Ret. Adm. Thomas Moorer, a former Joint Chiefs of Staff chairman who spent a year investigating the attack as part of an independent panel he formed with other former military officials. The panel also included a former U.S. ambassador to Saudi Arabia, James Akins.

    "Why in the world would our government put Israel's interest ahead of our own?" Moorer asked from his wheelchair at the news conference. He was chief of naval operations at the time of the attack.

    Moorer, who has long held that the attack was a deliberate act, wants Congress to investigate.

    Israeli Embassy spokesman Mark Regev disputed any notion that Israel knowingly went after American sailors.

    "I can say unequivocally that the Liberty tragedy was a terrible accident, that the Israeli pilots involved believed they were attacking an enemy ship," Regev said. "This was in the middle of a war. This is something that we are not proud of."

    Calls to the Navy seeking comment were not immediately returned.

    David Lewis of Lemington, Vt., was on the Liberty when it was attacked. In an interview, he said Israel had to know it was targeting an American ship. He said a U.S. flag was flying that day and Israel shot it full of holes. The sailors on the ship, he said, quickly hoisted another American flag, a much bigger one, to show Israel it was a U.S. vessel.

    "No trained individual could be that inept," said Lewis of the Israeli forces.

    In Capt. Boston's statement, he does not say why Johnson would have ordered a cover-up. Later in a phone interview from his home in Coronado, Calif., Boston said Johnson may have worried the inquiry would hurt him politically with Jewish voters.

    Moorer's panel suggested several possible reasons Israel might have wanted to attack a U.S. ship. Among them: Israel intended to sink the ship and blame Egypt because it might have brought the United States into the 1967 war.

    Click here to read Jay Cristol, "USS Liberty: Israel Did Not Intend to Bomb the Ship" (HNN).

    Digging Up the Dead to Settle Historical Debates: A Good Idea? (posted 10-23-03)

    Timothy W. Maier, writing for Insight (News World Communications) (Nov. 10, 2003):

    Digging up the dead for forensic examination falls somewhere between morbid curiosity and setting the historical record straight. Or maybe it's a little bit of both. Some critics charge exhumations show disrespect for the dead and drain scarce resources that could be used to solve prosecutable crimes. Others argue that selectively collecting DNA from the grave can open the door to solving historical mysteries. Whatever the motive, few can rest in peace with the exhumation movement ghoulishly alive and kicking.

    "Exhume, exhume," wrote the late Murray N. Rothbard in one of his satirical essays. "Let's exhume the body of every president who died in office, and let's take another more scientific look." Devilish fun? Rothbard was given to that sort of thing. But why stop with presidents? Dig in the history books and pick a body for the next archaeological dig. Historians, forensic scientists and law-enforcement officers appear to be doing just that as they increasingly turn to the graveyards for clues to mysteries in a range of cases from the true identities of notorious outlaws to presidents who dropped dead under suspicious circumstances.

    Critics suggest exhumation may be occurring at an alarming rate and claim many dead historical figures are being exploited by profit-hungry vultures who would dig up their own grandmothers if it were likely to produce a headline and make a buck. Could this be happening in New Mexico, where Democratic Gov. Bill Richardson is pushing hard to dig up the truth about legendary Wild West outlaw William Bonney, alias Billy the Kid?

    In supporting the dig, Richardson, who was energy secretary during the Clinton administration, says he aims to put an end to the conspiracy story that claims Billy the Kid was not gunned down by Sheriff Pat Garrett on July 14, 1881. While historians generally agree that the Billy the Kid who robbed and murdered people was shot by Garrett and subsequently buried at Fort Sumner, a posse of historical revisionists and conspiracy theorists calls the accepted version bunk.

    In fact, several others have claimed to be the real Kid, insisting Garrett killed the wrong man. One of these was the late Ollie P. "Brushy" Bill Roberts of Hico, Texas, who surfaced in 1950 to seek a pardon from New Mexico's then-governor Thomas Mabry. The governor declined the pardon, declaring Brushy's story to be flawed. Nevertheless, Texans credited Brushy's tale and built a museum in downtown Hico to honor their man with a bronze plaque proclaiming: "We believe his story."

    Richardson has teamed up with Lincoln County, N.M., Sheriff Tom Sullivan to support a media-frenzy exhumation of the claimants that has attracted both national and international press and sparked a boost in tourism for the state. They plan to exhume four graves: those of the man history credits as being Billy the Kid, his mother Catherine Antrim, Roberts and another man some suspect survived the gun battle with Garrett. Sullivan also hopes the investigation might clear up 122-year-old suspicions about Garrett's conduct and circumstances surrounding Billy's escape from a Lincoln County jail where he reportedly stole a pistol and killed two deputies.

    Fort Sumner Mayor Raymond Lopez says Billy the Kid tourism is all he has to keep people coming to his town and opposes the exhumation. The mayor is backed by the Billy the Kid Historical Preservation Society, which recently attacked the exhumation on its Website. An article there by British historian Frederick Nolan, author of The West of Billy the Kid, says, "I hate to be a party-pooper, but this whole project is complete and utter nonsense."

    The disputations among the boosters, hobbyists, historians and scientists continue to build. And noted exhumation expert James Starrs, a law professor and forensic scientist at George Washington University, tells Insight that since the woman believed to be the mother of Billy the Kid may not be his real mother, DNA comparisons will mean nothing. In fact, there is evidence that the diggers may have to sort through several sets of remains to find what they are looking for because a flood years ago unearthed a few caskets and locals may have put them back under the wrong tombstones.

    The expected exhumation of Billy the Kid follows a long list of the famous and infamous dead who have seen the light of day in recent years. Here's a snapshot of such exhumation projects in which the departed have been regarded as human time capsules of evidence to confirm or disprove legends and tales:

    * John F. Kennedy assassin Lee Harvey Oswald was exhumed in 1981 in response to allegations that he had a double. The leaky casket in Rose Hill Cemetery in Fort Worth, Texas, yielded tremendous damage to the body. Indeed, the body had so many irregularities that doubters still didn't believe forensics authorities who maintained the body was that of Oswald.

    * President Zachary Taylor was exhumed in 1991 after a descendant of the 12th president plopped down $1,200 to check out a conspiracy theory that "Old Rough and Ready," a Mexican War hero, may have been poisoned. Indeed, medical examiners did find traces of arsenic in his body, but the eager conspiracy theorists failed to account for the minor detail that traces of arsenic are found in nearly everyone who is embalmed.

    * Dr. Sam Sheppard, convicted of murdering his wife, inspired the hit TV series and big-screen movie The Fugitive. His son, Samuel Reese Sheppard, exhumed his father's remains in 1997 to extract DNA and compare it to newly discovered blood evidence found on a mattress and a door at the crime scene. The father served 10 years for the crime before being freed in 1964 by attorney F. Lee Bailey's argument that the media had so inflamed the community that Sheppard had not received a fair trial and was convicted wrongfully. The DNA did not match, proving the presence of an intruder and confirming that Sheppard was innocent.

    Who might be next is anybody's guess. In fact, humans may not be the only ones being dug from their graves. Rumors continue to persist that former president Richard Nixon's dog, Checkers, will be exhumed from a Long Island, N.Y., pet cemetery and reburied on the grounds of the Nixon library. The most famous of all White House dogs, Checkers is credited with saving Nixon's career when his master, as vice president, denied in a 1952 speech allegations of accepting $18,000 in cash but confessed to keeping the dog as a gift to his children that he vowed not to return.

    Disinterment is not exactly a new phenomenon. For hundreds of years the dead have been moved from one cemetery to another all around the world. The remains of Medieval theologian John Wycliffe, a papal critic who was responsible for translating the Bible into English, were disinterred by papists in 1414 to be burned and scattered into the Thames. Both classical composers Ludwig van Beethoven and Johann Sebastian Bach also were dug up. Beethoven, who died in 1827, was exhumed in 1863 to be reburied in a secure casket and again was exhumed in 1888 to be moved to the Central Cemetery in Vienna. Bach, who died in 1750, was disinterred 144 years later in Leipzig because authorities wanted to know the precise location of his unmarked grave. They dug up several corpses and settled on a body which they believed might be that of Bach. Local authorities then commissioned a sculptor to reconstruct the face, compared it to the only official portrait of the famed musician and concluded that they had found the right corpse.

    In the United States digging up the historical dead has a long tradition. Betsy Ross, reputed designer of the American flag, died in 1836, was dug up twice and eventually was moved to Arch Street in Philadelphia. Nearly 35 years after Abraham Lincoln was assassinated in 1865, a tomb holding the "Great Emancipator" had to be rebuilt and his body was held in a temporary grave until reburied in 1901. Lincoln's coffin has been moved 17 times and opened five times because of vandalism and reconstruction of his tomb. Civil War general Thomas J. "Stonewall" Jackson also was exhumed at least partially. He had been seriously wounded by his own men when he returned from the battlefield at Chancellorsville in Virginia and his left arm had to be amputated. Jackson died eight days later and he was buried in Lexington, Va. In 1929, his severed arm was exhumed, placed in a small box and reburied at the Ellwood Family Cemetery near Spotsylvania.

    Historians, forensic scientists and lawyers have not always been successful in getting their night forays approved. Courts or descendants have decided against exhumation proposals for Lincoln's assassin, John Wilkes Booth, FBI director J. Edgar Hoover, the parents of accused ax-murderess Lizzie Borden, explorer Meriwether Lewis, writer Edgar Allan Poe and Hollywood sex symbol Marilyn Monroe. But that doesn't mean they have given up trying. Starrs continues to be outspoken about his desire to find out if the top G-man really died of a heart attack in 1972 since no autopsy was done. Starrs obtained a medical examiner's secret file on the Hoover body, consisting of notes by those who viewed it as it lay in wake. The secret file did not clear up the mystery of Hoover's sudden death, Starrs says.

    Right now, Starrs is appealing a court ruling that prevents him from digging up Lewis, who is buried underneath a monument near Lewis Park, about 80 miles southwest of Nashville. He is, of course, the explorer who with William Clark led the famed expedition from 1803 to 1806 to explore the American West. He died of reportedly self-inflicted gunshot and stab wounds on his way to Washington while he was governor of the Louisiana Territory. Starrs is convinced Lewis' death was the result of foul play. He has the full blessing of Lewis' descendants, about 160 people, for the exhumation project, but has been resisted by the U.S. Park Service, which fears that approval would open a Pandora's box and that the dead in its care, such as President Franklin Delano Roosevelt, might never rest undisturbed.

    "Yes, the National Park Service says, 'Look at all these people in these grave sites who might be dug up next.' But the only one I can think of in their care is Lewis well, maybe [Warren] Harding," Starrs tells Insight. "I don't want FDR." Roosevelt reportedly died of a stroke in 1945 at Warm Springs, Ga., while with a mistress. President Harding reportedly died in 1923 of natural causes, but rumors persist he may have been poisoned by his wife for his notorious womanizing or by others focused on the emerging Teapot Dome scandal.

    Starrs' projects have infuriated many who view him as a bone hunter and conspiracy theorist out to profit from media buzz. His desire to dig up Lewis set off an angry response from historian Stephen Ambrose, who wrote a best seller on Lewis and Clark, and Ken Burns, who produced an award-winning PBS series on the expedition. Both Ambrose and Burns were invested in their insistence that this military officer and superb marksman and hunter became depressed and committed suicide by stabbing himself unsuccessfully and then shooting himself twice in the head and chest. Ambrose repeatedly slammed Starrs as a publicity hound, and Starrs refers to Ambrose's book as "wonderfully written historical fiction."

    Says Starrs, who notes that he pays all the expenses himself, "I don't simply go into an exhumation to make a mystery when there isn't any evidence. I use scientific tools to resolve the mystery. I don't go by a cemetery in a ghoulish delight and come back with a shovel at two in the morning. I do legitimate work."

    New Mexico resident Elbert Garcia, who claims to be the great-grandson of the outlaw, has said he is willing to provide a DNA sample for comparison. But there appears to be some question whether Garcia is any relation to outlaw Billy, since there is no record of the gunman having married or fathered children.

    Mark von Hagen: NYT Should Give Up 1932 Pulitzer Prize for Reporting by Duranty (posted 10-23-03)

    Eric Wolff, writing in the NY Sun (Oct. 22, 2003):

    A report commissioned by the [NY] Times said the work of 1932 Pulitzer Prize-winner Walter Duranty had a “serious lack of balance,” was “distorted,” and was “a disservice to American readers of the New York Times…and the peoples of the Russian and Soviet empires.”

    According to the writer of the report, a Columbia University history professor, Mark von Hagen, a committee of Times senior staff that included publisher Arthur Sulzberger Jr. read it and then forwarded it to the Pulitzer board, along with a recommendation from Mr. Sulzberger.

    The nature of that recommendation is unknown.

    Duranty’s award is under review by a subcommittee of the Pulitzer board, as reported by The New York Sun in June.

    The study, commissioned less than a month after the resignation of the executive editor, Howell Raines, over the Jayson Blair plagiarism and fraud scandal, marks a change in position at the Times.

    In June, the paper issued a prepared statement that said, “The Times has not seen merit in trying to undo history.”

    A Times spokeswoman said she had no comment on the apparently new policy. The administrator of the Pulitzer board, Sigvard Gissler, would not comment, saying, “This is an internal matter.”

    In an interview with the Sun, Mr. von Hagen said, “I was really kind of disappointed having to read that stuff, and know that the New York Times would publish this guy for so long.”

    Mr. von Hagen’s paper said Duranty’s 1931 pieces were “very effective renditions of the Stalinist leadership’s style of self-understanding of their murderous and progressive project.”

    He said Duranty’s reporting was “neither unique among reporters” nor “particularly unusual, let alone profound.” He noted Duranty’s failure to use the diverse sources available to him, and the way Duranty “ignored the history of 20th century Russia.”

    Duranty reported that Soviet citizens celebrated their “freedom” from religion by increasing factory production on religious holidays.

    “One waits in vain for some signal of ever so slight tongue-in-cheek,” wrote Mr. von Hagen.

    Duranty’s work has been reviewed before, in 1990, prompted by Sally Taylor’s biography, “Stalin’s Apologist.” The biography suggested that Duranty was not an ideological Communist, but rather a greedy man who had made a comfortable life for himself in Moscow.

    Mr. von Hagen believes Duranty’s misdirection may have come from a vested interest in seeing the Soviet Union recognized by the United States. When Franklin Roosevelt was elected in 1933, he invited Duranty to dinner to discuss the matter.

    At the banquet at which the U.S.S.R. was formally recognized, the biggest applause, according to Malcolm Muggeridge, was given to Duranty.

    Though Duranty has achieved lasting posthumous fame for covering up the Ukrainian famine of 1932-33 in which as many as 10 million people died, the Pulitzer was awarded for his writing in 1931.

    In an effort to divest Duranty of his prize, the Ukrainian Congress Committee of America organized a postcard campaign that ultimately led to the formation of the subcommittee for review.

    A spokeswoman for the UCCA said she found the Times’s actions “very encouraging” considering Duranty’s “betrayal of the most fundamental aspects of journalism.”

    In November, they will be launching a campaign to get the Times to voluntarily return the prize, a sentiment that sits well with Mr. von Hagen.

    “I wish they didn’t give Duranty the prize in the first place,” he said. “But I think it should be rescinded now, for the honor of the New York Times, if for nothing else.”

    A New Theory of Minoan Decline (posted 10-22-03)

    William J. Broad, writing in the NYT (Oct. 21, 2003):

    For decades, scholars have debated whether the eruption of the Thera volcano in the Aegean more than 3,000 years ago brought about the mysterious collapse of Minoan civilization at the peak of its glory. The volcanic isle (whose remnants are known as Santorini) lay just 70 miles from Minoan Crete, so it seemed quite reasonable that its fury could have accounted for the fall of that celebrated people.

    This idea suffered a blow in 1987 when Danish scientists studying cores from the Greenland icecap reported evidence that Thera exploded in 1645 B.C., some 150 years before the usual date. That put so much time between the natural disaster and the Minoan decline that the linkage came to be widely doubted, seeming far-fetched at best.

    Now, scientists at Columbia University, the University of Hawaii and other institutions are renewing the proposed connection.

    New findings, they say, show that Thera's upheaval was far more violent than previously calculated -- many times larger than the 1883 Krakatoa eruption, which killed more than 36,000 people. They say the Thera blast's cultural repercussions were equally large, rippling across the eastern Mediterranean for decades, even centuries.

    "It had to have had a huge impact," said Dr. Floyd W. McCoy, a University of Hawaii geologist who has studied the eruption for decades and recently proposed that it was much more violent than previously thought.

    The scientists say Thera's outburst produced deadly waves and dense clouds of volcanic ash over a vast region, crippling ancient cities and fleets, setting off climate changes, ruining crops and sowing wide political unrest.

    For Minoan Crete, the scientists see direct and indirect consequences. Dr. McCoy discovered that towering waves from the eruption that hit Crete were up to 50 feet high, smashing ports and fleets and severely damaging the maritime economy.

    Other scientists found indirect, long-term damage. Ash and global cooling from the volcanic pall caused wide crop failures in the eastern Mediterranean, they said, and the agricultural woes in turn set off political upheavals that undid Minoan friends and trade.

    "The Reagans": The TV Movie (posted 10-22-03)

    Jim Rutenberg, writing in the NYT about a new CBS docudrama about the Reagans (Oct. 21, 2003):

    "The Reagans" takes sides on plenty of issues and incidents that are vigorously contested by biographers, and some that are historically questionable. In one early scene Mr. Reagan's talent agent, Lew Wasserman, tells him that his anti-Communist activism is hurting his career. "People know you're an informer for the blacklist," Mr. Wasserman says. Mr. Reagan replies, "I've never called anybody a Commie who wasn't a Commie."

    Mr. Reagan was long suspected of supplying names to the Hollywood blacklist but denied it. F.B.I. records show he cooperated with agents investigating communism in Hollywood, but historians disagree about whether his assistance was of any real significance.

    The script also accuses Mr. Reagan not only of showing no interest in addressing the AIDS crisis, but of asserting that AIDS patients essentially deserved their disease. During a scene in which his wife pleads with him to help people battling AIDS, Mr. Reagan says resolutely, "They that live in sin shall die in sin" and refuses to discuss the issue further.

    Lou Cannon, who has written several biographies about Mr. Reagan, said such a portrayal was unfair. "Reagan is not intolerant," he said. "He was a bit asleep at the switch, but that's not fair to have him say something that Patrick Buchanan would say."

    Elizabeth Egloff, a playwright who wrote the final version of the script, acknowledged there was no evidence such a conversation took place. But, she said, "we know he ducked the issue over and over again, and we know she was the one who got him to deal with it." She added that other biographies noted that Mr. Reagan had trouble squaring homosexuality with the Bible. In "Dutch," Mr. Reagan's authorized biography, the author, Edmund Morris, writes that Mr. Reagan once said of AIDS, "Maybe the Lord brought down this plague," because "illicit sex is against the Ten Commandments."

    Another likely controversial moment in the television movie comes in a scene that implies strongly that President Reagan's inspiration for the Star Wars space-based system was a 1940 movie in which he starred, "Murder in the Air." Some experts have said that the film may have influenced Mr. Reagan's decision to sign off on the program. Others have dismissed such claims as overemphasized by liberals.

    Mrs. Reagan, meanwhile, comes across in the script as her husband's protector, constantly fending off ambitious and amoral political operatives. But in depicting the control she exerted, not only over his schedule but over more substantive decisions, the television movie makes some controversial claims.

    The final shooting script heavily implies that Mrs. Reagan, in agitating for the resignation of Alexander M. Haig Jr., President Reagan's first secretary of state, went so far as to write his resignation letter. But no account holds that Mrs. Reagan wrote such a letter. After a consultation in response to a reporter's question, the filmmakers decided last week to remove that scene from the film, saying they would have deleted it in any case.

    Mrs. Reagan's associates said she was most likely to be upset about scenes in which she is shown keeping her children at arm's length and those in which she takes prescription pills, as detailed in "The Way I See It," the memoir of the Reagans' daughter, Patti Davis.

    Mrs. Reagan had no comment for this article. John Barletta, a former Secret Service agent who served the Reagans and maintains contact with Mrs. Reagan, said he had spoken with her about the film. "She kind of said, 'Well, hopefully it won't be that bad,' " he said.

    He said he had his own concerns about the film because "when it comes to the Hollywood people, they're all very liberal against him."

    Mr. Zadan and Mr. Meron acknowledge their liberal politics, as do the stars of the television movie, Mr. Brolin and Ms. Davis. But Mr. Meron said: "This is not a vendetta, this is not revenge. It is about telling a good story in our honest sort of way. We all believe it's a story that should be told."

    Nonetheless some involved in the making of "The Reagans" said in interviews that they were girding for a considerable outcry from some of Mr. Reagan's more die-hard supporters.

    "With the climate that has been in America since Sept. 11, it appears, from the outside anyway, to not be quite as open a society as it used to be," Ms. Davis said during an interview at her hotel in Montreal. "By open, I mean as free in terms of a critical atmosphere, and that sort of ugly specter of patriotism."

    She added, "If this film can help create a bit more questioning in the public about the direction America has been going in since the 1970's, I guess then I think it will be doing a service."

    Mr. Brolin said he, too, hoped that the film would prompt Americans to be more suspect of their leaders. "We're in such a pickle right now in our nation," he said, "that maybe we'll learn something from this."

    Mr. Morris, Mr. Reagan's biographer, said he had some misgivings about the mini-series, given the political leanings of the producers and actors.

    "The provenance of the movie makes me suspect it will not be fair," he said. But he added that it could also work as a reality check on Mr. Reagan's record.

    "The best thing one can say about a movie of this kind," he said, "is it does redress or counteract the sentimentalities that are being perpetrated all of the time in his name by his fanatical followers."

    The Battle Between History and Social Studies (posted 10-22-03)

    April Austin, writing in the Christian Science Monitor (Oct. 21, 2003):

    Pop quiz: Which description best fits Thomas Jefferson?

    A. Founding Father; third president of the United States elected in 1800; and author of the Declaration of Independence, adopted in 1776.

    B. Statesman, author, inventor, architect, but also slaveholder and member of the landowning elite.

    The answer depends on whom you are asking.

    History students are more likely to answer A. Social studies students would gravitate toward B.

    The hypothetical test question illustrates two approaches that are fighting for prominence in schools around the country. Traditional history classes would pay more attention to Jefferson's leadership, carefully placed within a framework of dates. Social studies classes, however, are more likely to study Jefferson as a multifaceted individual, with his position of wealth and privilege coming under the microscope.

    At the core lie two distinct views of education. History advocates insist on a return to traditional instruction, while opponents assert that students need context. What the argument hides is a basic agreement that schools need to do a better job of teaching history. But neither side seems prepared to listen to the other.

    In recent years, the issue has taken on added urgency. Standardized testing in math and English has forced many school districts to spend less time and money on both history and social studies. Research grants are dwindling. Recent reports on the lack of knowledge of history and civics among US students have grabbed headlines.

    But if concerns have heightened about the quality of social studies and history instruction, the debate about what should be taught and how is hardly new.

    Once upon a time, history was a staple in US public school curricula. But social studies became popular starting in the 1960s, inspired by the work of Charles Beard, an early 20th-century social reformer and Columbia University professor.

    Social studies was supposed to remedy rote learning by encouraging an interdisciplinary approach. After all, Professor Beard pointed out, history didn't occur in a vacuum. The varied perspectives of economics, geography, sociology, anthropology, and current events would add meaning and relevance to history, or so the theory went.

    In many schools, social studies was adopted for the younger grades, seen as a softer subject preparing preteens for the more rigorous study of history in high school.

    But in the view of some, a certain fuzziness crept into the field with the social studies approach and has been corrupting history classes ever since.

    History advocates sputter at the mention of social studies, a field they see as too touchy-feely and lacking in rigor. But those who favor social studies blanch at what they see as an attempt to drive history back into the territory of rote learning.

    But arguments about rigor or the lack thereof sometimes conceal another, deeper disagreement. It's an ancient conundrum: whether the purpose of education is to transmit the culture or transform it. Traditional history advocates say that learning history should enable one to join the culture, to participate as a citizen. A more liberal view deems the teaching of history a stepping-stone to improving society.

    Lately, the rhetoric has grown hostile. Social studies teachers "have contempt for history," says Will Fitzhugh, who directs the National History Club.

    "More than half of them didn't take history to begin with," he says. "The old joke that social studies is taught by athletic coaches is still sadly true in many places."

    Stephen Thornton, a professor of social studies and education at Columbia University's Teachers College, says the attack on his profession is unfounded. "What do they think we're teaching if it's not history? The problem is that they want a particular kind of history - their version."

    He continues, "They want a return to the Eisenhower era, a time when there was less open dissent in society."

    History advocates such as Chester Finn at the Thomas B. Fordham Foundation say that some consensus is needed on what kids learn.

    "Social studies is peddling bad stuff," he says. "It's teaching kids that democracy is deeply flawed, and that America's evils exceed its virtues."


    Martin Kramer: Dershowitz Vs. Finkelstein (posted 10-22-03)

    Martin Kramer, writing on his blog (Oct. 22, 2003):

    Over the last month, Alan Dershowitz and Norman Finkelstein have been going at one another over Finkelstein's charge that Dershowitz plagiarized passages of his book, The Case for Israel. I'll spare my readers the details. Israelis and Palestinians lurch from crisis to crisis, while two professors debate the finer points of the Chicago Manual of Style. I find it difficult to take the whole business seriously, but if you do want to track the controversy, here is a link that will take you to Finkelstein's charges, subsequently amplified by Alexander Cockburn; Dershowitz's rejoinder; Finkelstein again; Dershowitz again....

    So why even mention it here? The controversy provides me with a perfect opportunity to post my review of the book that Dershowitz allegedly plagiarized: From Time Immemorial: The Origins of the Arab-Jewish Conflict Over Palestine, by Joan Peters. I wrote the review more than nineteen years ago, when I was young and obscure, and it appeared in a journal that isn't exactly a must-read: The New Leader. I have no recollection of why I agreed to review the book, but I did, and in retrospect I managed to identify both its strong points and its weaknesses. Finkelstein, with his taste for hyperbole, has called the book a hoax, which it wasn't. It raised an important question about Palestinian demography, but it did so in ways that left it vulnerable to attacks by serious people. Nevertheless, other serious people have substantiated aspects of her argument, at least for certain periods.

    These days, the demographic argument is not so much about what was but what will be. Until Sandstorm approaches it, content yourself with my resurrected review of From Time Immemorial.


    Lewis and Clark Celebration Is Being Ruined by Anti-American Academics (posted 10-22-03)

    Mark Yost, writing in the Wall Street Journal (Oct. 22, 2003):

    There's a great American tale that's largely going untold. It involves a far-flung adventure by daring men who seemed to know no fear. Along the way, they encountered hostile forces, fierce weather and a formidable landscape. It's a story that should stir the heart of every 10-year-old American boy (and girl).

    Unfortunately, the 200th anniversary of the Lewis and Clark expedition, which charted President Thomas Jefferson's prescient Louisiana Purchase and opened up the American West, is getting a lukewarm reception as it makes its way across the country, tracing the footsteps of this historic journey. Tepid crowds are greeting the re-enactors, which shouldn't be a surprise in the wake of academia's roughly 40-year assault on what used to pass for conventional American history. Is it any wonder that with a curriculum that reduces the accomplishments of Jefferson and the other Founders to "slave owners" the Corps of Discovery would be viewed as a less-than-noble lot?

    The nearly two-week celebration here was the second of 15 planned Lewis & Clark Bicentennial Commemoration National Signature Events. The first was held at Monticello in January and the last is slated for St. Louis in September 2006. Stops in between include the Mandan Nation in North Dakota, the Yellowstone River in Montana, the Nez Perce tribal lands in Idaho, and, of course, the Pacific Northwest.

    Historically, the Indiana site, across the Ohio River from Louisville, Ky., was not an insignificant stop for the Corps of Discovery. While many believe the expedition started in St. Louis, it was here, near Clarksville, named after Revolutionary War hero George Rogers Clark, that it really began. It's also where Clark's little brother, William, first met up with Meriwether Lewis on the expedition. This meeting was re-created last week, amid a pelting rain that kept the crowd at just a few hundred. It was also from this frontier region that Clark culled a group of hearty mountain men who would make up about one-third of the corps.

    This was all told well in a locally made documentary shown at the Interpretive Center at the Falls of the Ohio State Park, the host of the bicentennial events. Further downriver is George Rogers Clark's ancestral home, believed to be the departure site for the expedition when it shoved off for St. Louis, roughly 250 miles away, to sit out the winter before beginning its journey up the Missouri in the spring of 1804.

    Most Louisvillians would be hard pressed to repeat these facts. A slew of elementary-school field trippers trekked through the exhibits last week, along with hard-core history buffs from the surrounding area. But for most, this historic re-enactment was a nonevent....

    Adding insult to injury, the alleged "Shawnee Village" set up on the banks of the Ohio offered more kitsch than culture. A group of ragtag--many white--re-enactors dressed in "traditional" garb fired clay pots, threw tomahawks and sold cheap souvenirs, trinkets and foodstuffs (my favorite was Shawnee Popcorn).

    With three years to go, there's still hope for rekindling genuine interest in the Lewis and Clark expedition. With his wonderful documentary, "The Civil War," Ken Burns showed that it's still possible to get people excited about unglamorized American history. Maybe all it would take is a reshowing of his superb 1997 "Lewis & Clark" documentary (or "The Far Horizons," the great 1955 film starring Charlton Heston and Fred MacMurray--one guess as to who gets Sacagawea). But I'm not holding my breath.


    Report: Unit Killed Hundreds in Vietnam (posted 10-21-03)

    From the Associated Press, as reported by the Miami Herald (Oct. 19, 2003):

    An elite unit of American soldiers mutilated and killed hundreds of unarmed villagers over seven months in 1967 during the Vietnam War, and an Army investigation was closed with no charges filed, The Blade reported Sunday.

    Soldiers of the Tiger Force unit of the Army's 101st Airborne Division dropped grenades into bunkers where villagers - including women and children - hid, and shot farmers without warning, the newspaper reported. Soldiers told The Blade that they severed ears from the dead and strung them on shoelaces to wear around their necks.

    The Army's 4 1/2-year investigation, never before made public, was initiated by a soldier outraged at the killings. The probe substantiated 20 war crimes by 18 soldiers and reached the Pentagon and White House before it was closed in 1975, The Blade said.

    William Doyle, a former Tiger Force sergeant now living in Willow Springs, Mo., said he killed so many civilians in 1967 he lost count.

    "We didn't expect to live. Nobody out there with any brains expected to live," he told the newspaper. "The way to live is to kill because you don't have to worry about anybody who's dead."

    In an eight-month investigation, The Blade reviewed thousands of classified Army documents, National Archive records and radio logs and interviewed former members of the unit and relatives of those who died.

    Tiger Force, a unit of 45 volunteers, was created to spy on forces of North Vietnam in South Vietnam's central highlands.

    The Blade said it is not known how many Vietnamese civilians were killed.

    Records show at least 78 were shot or stabbed, the newspaper said. Based on interviews with former Tiger Force soldiers and Vietnamese civilians, it is estimated the unit killed hundreds of unarmed people, The Blade said.

    Army spokesman Joe Burlas said Sunday that only three Tiger Force members were on active duty during the investigation. He said their commanders, acting on the advice of military attorneys, determined there was not enough evidence for successful prosecution.

    The only way to prosecute the soldiers was under court-martial procedures, which apply only to active military members, Burlas said.

    He also cited a lack of physical evidence and access to the crime scene, since a number of years had passed. He would not comment on why the military did not seek out the evidence sooner.

    Investigators took 400 sworn statements from witnesses, Burlas said. Some supported each other and some conflicted, he said.

    According to The Blade, the rampage began in May 1967. No one knows what set it off. Less than a week after setting up camp in the central highlands, soldiers began torturing and killing prisoners in violation of American military law and the 1949 Geneva Conventions, the newspaper said.

    Sgt. Forrest Miller told Army investigators the killing of prisoners was "an unwritten law."

    Other soldiers said they sought revenge in the villages after unit members were killed and injured during sniper and grenade attacks.

    "Everybody was bloodthirsty at the time, saying, 'We're going to get them back,'" former medic Rion Causey of Livermore, Calif., told The Blade.


    Cannibalism Has Ancient Roots (posted 10-21-03)

    Tim Taylor, writing in telegraph.co.uk (Oct. 15, 2003):

    The science of cannibalism has just become respectable, as irrefutable bio-molecular evidence that we have eaten each other for millennia spurs renewed efforts by archaeologists, geneticists and anthropologists to find out when we started to do it, and why.

    With the Lendu and Hema militias currently cooking human hearts and livers under the eyes of UN observers in north-east Congo, and the abduction of children for food in North Korea, it is hard to believe that until recently academia was dominated by politically correct assertions that cannibalism did not exist. While no one denied that psychopaths and the very hungry do it sometimes, eye-witness accounts of routine cannibalism were ignored.

    In his 1979 book, The Man-Eating Myth, the social anthropologist William Arens told a generation of scholars what they wanted to hear: stories of cannibal tribes were the racist slanders of white imperialist scientists.

    Survival cannibalism made headlines after the 1973 Andes air crash. Sixteen Catholics had stayed alive by eating those who either died on impact or subsequently. The Vatican advised that, although those who had chosen to starve were not guilty of the sin of suicide, those who practised cannibalism had not sinned either: the souls of the deceased were with God, the corpses profane husks.

    The ease with which humans switch into survival mode should have alerted the anthropologists who espoused Arens's theory that it was fictional. Archaeologically, cannibal behaviour was evident all along, from prehistoric Fiji to the Aztecs to the Neanderthals of Europe.

    There is now an overwhelming case that cannibalism is a worldwide phenomenon, stretching back to our evolutionary origins: wild chimpanzees and 70 other mammal species have been observed killing and eating each other, while the two-million-year-old Homo habilis cranium known as Stw 53 is covered with deliberate cut marks.


    The Wright Brothers' Hyperbole (posted 10-21-03)

    Roger Shattuck, writing in the NY Review of Books (Nov. 6, 2003):

    The way we clutch at centennial celebrations, one might think that many of us were still practicing numerologists. Even the United States Postal Service could not refrain from hyperbole in its official statements:

    Orville and Wilbur Wright changed the world on December 17, 1903. This colorful stamp [of the first Wright flyer] commemorates the centennial of their incredible feat near Kitty Hawk, North Carolina, their first controlled, powered, and sustained flight in a heavier-than-air flying machine.

    Most American schoolchildren learn an even more expansive version of the story, the version set out in the printed label for their 1903 airplane on exhibit in the Smithsonian Air and Space Museum. The label does not avoid jingoism. Self-educated and working alone in Dayton, Ohio, the Wright brothers "invented and built" the first airplane and "discovered the principles of human flight." Hyperbole is hard to beat back.

    The brothers' own two-page statement to the Associated Press on January 5, 1904, was more modest. It noted drily that the fourth and last flight on December 17 lasted fifty-nine seconds and covered 852 feet over the ground against a twenty-five-mile-an-hour head wind. Near the end of the statement they added two terse comments on their accomplishment. "Only those who are acquainted with practical aeronautics can appreciate the difficulties of attempting the first trials of a flying machine in a twenty-five mile gale." In other words, as their subsequent writings show, their greatest challenge was not developing lift and propulsion, but achieving control, stability, and equilibrium in turbulent conditions. They close with confidence: "The age of the flying machine had come, at last."

    Even four years later in the widely read ten-page account for Century Magazine (September 1908) of their four years of test flights in Kitty Hawk, the brothers limit their claim: "The first [flight] in the history of the world in which a machine carrying a man had raised itself by its own power into the air in free flight, had sailed forward on a level course without reduction in speed, and had finally landed without being wrecked." They could write more graphic and more accurate copy than the Post Office or the Smithsonian Institution.

    Not until 1920, in the title of a legal document he wrote to be used in court for their patent defense, did Orville use the expression "How We Invented the Flying Machine." Elsewhere they remained circumspect enough to claim only that they had carried out the first manned flight.

    Some aviation historians in the United States tend to be impatient with that sober version. Fred C. Kelly, a biographer of the Wright brothers, makes this sweeping statement in his introduction to a collection of their writings: "They knew that the stunt of flying was a minor feat, that their big achievement was inventing the machine."

    It is as if the word "invention" has special claims over these events. Peter Jakab, a curator in the Aeronautics Division of the National Air and Space Museum, has written a fine short account, Visions of a Flying Machine: The Wright Brothers and the Process of Invention. In his next-to-last chapter Jakab reproduces the genuinely historic photograph of Orville piloting the 1903 Flyer precisely at liftoff into its first successful heavier-than-air flight. The picture shows Wilbur virtually dancing on the sand beside the plane. Jakab captions this precious image: "The 'moment' of invention." I bridle. It would be more accurate and more apt to say, "The long-prepared feat of controlling an imperfect and unstable machine into free flight." The invention of the airplane occurred for no camera to catch during an extended moment lasting a hundred years from the enterprising English baronet George Cayley to the Wright brothers. True, the Wright brothers made the most substantial contribution to the sustained collaboration. But as Orville is said to have declared in an interview just before he died in 1948, "We stood on the shoulders of others."


    It's Time to Re-Evaluate FDR and Some Other Presidents (posted 10-20-03)

    Robert Bartley, writing in the Wall Street Journal (Oct. 20, 2003):

    Peace, in setting presidential reputations, far outranks its brother prosperity. I didn't realize how completely war and peace define our presidents until I was asked to think about their economic leadership.

    Our OpinionJournal.com and the Federalist Society sponsored a new rating of the presidents, and in June an expanded print version will be published in collaboration with Simon & Schuster. I was asked to join William Bennett, Richard Brookhiser, Robert Dallek and others in contributing. Asked about leadership on economic policy, I couldn't find much.

    Yes, Washington was smart enough to hire Alexander Hamilton, as Lincoln hired Jay Cooke to finance the Civil War by inventing war bonds. In hands-on terms, Andrew Jackson personally destroyed the second Bank of the United States, a medium-sized calamity for the Republic's finances. Ronald Reagan tamed the economic crisis he was elected to face, but went on to greater glory by defeating communism in the Cold War.

    The great puzzle is Franklin D. Roosevelt. He made his mark defeating Adolf Hitler, which earns him ratings as the top president of the 20th century. But he was originally elected to cure the Great Depression; how did he do there? Unemployment was still above 17% on the eve of war in 1939. Most of Roosevelt's acolytes settle for saying he lifted the nation's spirits.

    Now comes historian Jim Powell of the Cato Institute, with a new book arguing that Roosevelt's policies actually prolonged the Depression. "FDR's Folly" is endorsed by two Nobel Prize economists, Milton Friedman and James Buchanan, and cites a plethora of economic studies.

    Mr. Powell serves to remind us what Roosevelt did: Close the banks. Break up the big and diversified banks in favor of one-branch banks where the problem was. Expropriate private holdings of gold. Mark up the gold price in dollars in an attempt to raise agricultural prices. (Undersecretary of the Treasury Dean Acheson resigned in protest against this devaluation of the dollar, a point for reflection by the Bush entourage currently touring Asia to peddle a cheap dollar.)

    The New Deal tried to organize industries into cartels to keep prices up. But it also sponsored a torrent of antitrust suits against industry colluding to keep prices up. It started new welfare plans, notably Social Security, financed by a tax on employment kicking in before benefit payments did. Above all, Roosevelt raised taxes on "the rich." An "undistributed profits tax" even blocked corporations from accumulating internal capital.

    From the standpoint of the 21st century, it beggars the imagination that anyone could see this witches' brew as a recovery plan. But the mythology of the New Deal lingers today, and we badly need a new debate on this part of our history. I hope that Mr. Powell's book succeeds in sparking one.


    Faking Biblical History (posted 10-16-03)

    Neil Asher Silberman and Yuval Goren, writing in Archaeology (Sept./Oct. 2003):

    The story first exploded into the headlines on October 21, 2002, with the beginning of a skillfully orchestrated publicity campaign. At a Washington press conference jointly sponsored by the Discovery Channel and the Biblical Archaeology Society, Hershel Shanks, publisher and editor of the popular Biblical Archaeology Review, presented a large audience of reporters and TV crews with photographs and background supporting what he called "the first ever archaeological discovery to corroborate biblical references to Jesus." The discovery in question was a small chalk ossuary, or bone container, bearing the Aramaic inscription Yaakov bar Yoseph, Achui de Yeshua, "James, son of Joseph, brother of Jesus." According to Shanks, the ossuary belonged to an anonymous Tel Aviv antiquities collector who, having become aware of its significance, was now willing to allow news of its discovery to be made public.

    Authenticated as dating from the first century A.D. by renowned Semitic epigrapher André Lemaire of the Sorbonne and by some laboratory tests carried out by scientists at the Geological Survey of Israel (GSI), the ossuary caused a worldwide sensation. No previous artifacts had ever been found that could be directly connected to the gospel figures Jesus, Joseph, or James--yet here was one that might have held the very bones of Jesus' brother. In the following days, excited reports about the "James Ossuary" appeared on NBC, CBS, ABC, PBS, and CNN and in The New York Times, the Wall Street Journal, the Washington Post, and Time. Newsweek suggested that "Biblical archaeologists may have found their holy grail."

    But the story that began with trumpet blasts of spiritual triumph was destined to end as an embarrassing farce. Indeed, the pious self-deception, shoddy scholarship, and commercial corruption that accompanied the relic's meteoric rise and fall as a media sensation offers an instructive Sunday school lesson to anyone who would, at any cost, try to mobilize archaeology to prove the Bible "true."


    The Mysterious Death of Subhash Chandra Bose (posted 10-15-03)

    S. M. N. Abdi, writing in the South China Morning Post (Oct. 12, 2003):

    A fresh communique from the Taiwan government to a judge in India has given a bizarre twist to the unsolved mystery of nationalist hero Subhash Chandra Bose's disappearance 58 years ago.

    Bose was one of the icons of India's nationalist movement. But unlike Mahatma Gandhi, who championed non-violence, he believed in armed uprising.

    After escaping from house arrest in Calcutta in January 1941, Bose secretly met Adolf Hitler in Germany and sought his help to free India from British rule. When the Nazi leader declined, he turned to the Japanese.

    He arrived in Singapore via Tokyo and, with Japanese help, raised the Indian National Army (INA), mainly from Indian soldiers of the British army who had been captured by the Japanese in Singapore.

    The INA advanced to north-eastern India but the British ultimately crushed the ill-equipped, ragtag force.

    The defeated troops were driven back into Burma, thousands dying of dysentery, malnutrition or malaria on the way. Bose himself was reportedly killed on August 18, 1945, aboard a Japanese bomber which crashed seconds after takeoff from a military air base in Taipei en route to Moscow.

    But even today, 58 years after Bose vanished, the All India Forward Bloc, the political party founded by him, and millions of ordinary Indians, particularly in his home state of West Bengal, refuse to believe that he perished in the crash.

    After independence, the Indian government set up two high-powered inquiry commissions to probe Bose's death. Both investigations concluded that he died in the plane crash and the ashes lying in Tokyo's famous Renkoji temple were indeed his.

    But the Forward Bloc rejected the findings.

    In 2000, the party, which has three members in the national Parliament and 25 representatives in the provincial West Bengal legislative assembly, forced Prime Minister Atal Behari Vajpayee's government to launch a third full investigation.

    Justice Manoj Kumar Mukherjee, heading the latest probe, sought help from Taipei to unravel the mystery which has refused to die after nearly six decades.

    In a stunning disclosure earlier this month, the Taiwan government informed the commission that not a single plane, civil or military, crashed at Taipei airport from August 14 to October 25, 1945.

    The revelation has demolished the plane-crash theory about the fate of Bose, who is still venerated by Bengalis as Netaji, or The Leader.

    The information, sent by Lin Ling-san, Minister of Transport and Communications, says: "After reviewing all records during the period from 14th August to 25th October 1945, no plane ever crashed at Old Matsuyama now Taipei domestic airport carrying Subhash Chandra Bose."...

    Indian historian Sitanshu Das says that the Japanese probably faked the air crash so that their generals and their ally Bose could disappear without the Allies hunting them.

    According to Das, Bose tried to make his way back to India via Russia. But he was caught and thrown into jail by the authorities when he entered Russia.

    By the time Moscow realised that Bose was in its custody, the Russians were unwilling to admit that they had imprisoned him, and they left him to die in jail.



    What's New in Historic Preservation (posted 10-14-03)

    Joanne Ostrow and J. Sebastian Sinisi, writing in the San Diego Union-Tribune (Oct. 12, 2003):

    Historic preservation long ago moved beyond the classic image of little old ladies blocking bulldozers to save buildings where George Washington slept.

    The bulldozers remain real enough. But they've been joined by more recent concerns such as urban sprawl, protecting significant structures from terrorism and saving important post-World War II architecture that many consider too recent to care about.

    These and other issues that are changing the face of preservation were the focus of the annual National Preservation Conference held earlier this month in Denver.

    Sponsored by the National Trust for Historic Preservation, the conference drew an international contingent of preservation professionals, planners, architects and students to Denver.

    [These are some of the issues the conference examined:]

    • Preserving sites important to the cultural history of women, ethnic minorities, gay men and lesbians.

    The preservation movement has fallen short of recognizing historic minority sites because of our country's difficulty in "dealing honestly with all aspects of our history," said Western revisionist historian Patricia Limerick, who teaches at the University of Colorado at Boulder.

    "There is a lot of emotional baggage in dealing with -- or even acknowledging -- places like the site of the Sand Creek Massacre or World War II Japanese internment camps," she said.

    "But after 150 years of denial and an amputated historical view, we're now looking at a bigger picture of the American West that takes in even poor and working-class white men," she added.

    • Preserving recent structures that have played a significant role in history.

    "We don't think of things built in the last 50 years as 'historic,' " said Colorado Historical Society Director Georgianna Contiguglia.

    "But there's a large body of post-World War II architecture -- the Levittowns and early International Style, along with missile silos and other Cold War sites -- that stand as a record of their time."

    For better or worse, she said, "They mark the era some of us grew up with. They also merit preservation. America grew up at different times, and not many people would want only colonial villages called 'historic.' "


    Columbus Beat Columbus to America (posted 10-14-03)

    Will Bagley, writing in the Salt Lake Tribune (Oct. 12, 2003):

    If anything is certain about Columbus -- or even history in general -- it must be summed up in the old rhyme recited by generations of schoolchildren to teach them the year Europeans (excluding Vikings) discovered America: "In 1492, Columbus sailed the ocean blue." Now even that is open to question.

    According to the London Times [in an article published in 2001], historian Ruggero Marino claims Columbus' 1492 voyage for Spain simply repeated a discovery made about 1485 while the Genoese navigator was working for Pope Innocent VIII.

    Marino points to an inscription on the pope's tomb -- Novi orbis suo aevo inventi gloria -- attributing the glory of the discovery of the New World to Innocent's pontificate. Trouble is, Innocent died in July 1492, before Columbus weighed anchor Aug. 3. "The inscription either anticipates Columbus's success," Marino observes, "or else refers to an earlier journey."

    This might be a simple case of giving credit where credit isn't due, but the late Islamic scholar Alessandro Bausani found a more intriguing clue on the Piri Reis map, drawn on gazelle skin in 1513 by Admiral Piri Ibn Hadji Mehmed of the Ottoman Empire.

    Bausani thought the map held the "key to the mystery of Columbus and the Indies." A note on the map describing the Western Hemisphere allegedly says it was discovered "in the year 890 of the Arab era by the infidel from Genoa." In the Islamic calendar, 890 corresponds to 1485 or 1486.

    Marino points to another odd statement indicating "proof" may be in the Vatican archives.

    "When the documents related to the New World discovered by Columbus are disclosed," Pope Pius IX said in 1851, "it will be totally clear that Columbus embarked on his great venture under the input and with the help of this Apostolic See."

    This sounds impressive, but altogether it is pretty thin evidence that raises a host of questions. If Columbus discovered America in 1485, why keep it secret? To explain this, Marino ventures onto the murky waters of the Sea of Conspiracy, claiming Innocent's successor, the Spanish-born Alexander VI, hushed it up and "signed over the rights to the New World" to Spain. How Alexander could have kept the notoriously chatty Columbus from spilling the beans is another mystery.

    Most troubling is that the 1954 standard English translation of the Piri Reis map renders the critical date as 896, or our 1490 or 1491.

    Scholar Gregory McIntosh's fine book, The Piri Reis Map of 1513, gets lost in the brouhaha over Marino's claims, but McIntosh reinterprets the map and its extensive commentary to conclude it is based on a map of Columbus's Second Voyage of 1493-1496. As his publisher notes, McIntosh's work "opens up new ways of looking at the history of exploration."

    So, did Columbus sail the ocean blue in 1492? Probably, but Marino reminds us that history is not a collection of unchangeable facts. It's the process of seeking the truth about the past.


    David Greenberg: Nixon's Anti-Semitism (posted 10-13-03)

    David Greenberg, writing in the Forward (Oct. 10, 2003):

    Richard Nixon had difficulties, to say the least, with his public image, and never more so than during the dark days of Watergate. When the president found himself mired in scandal, growing numbers of Americans came to view him as a criminal, a liar and a madman — and, as of May 1974, an antisemite as well.

    It was that month that Seymour Hersh of The New York Times reported that on the White House tapes that Nixon had recorded — which at that point were still unreleased to the public — the president had on multiple occasions complained about those "Jew boys" or "those Jews down there" in the U.S. Attorney's Office who had first undertaken the Watergate prosecution.

    The revelation confirmed the worst suspicions of many longtime detractors. It dismayed his loyalists, especially those who were themselves Jewish. According to Nixon's speechwriter William Safire, several aides who had long tried to deny his animus toward Jews were distraught. Federal Reserve Chairman Arthur Burns, Safire noted, felt especially incensed about the ethnic slurs on the tapes. Leonard Garment, Herb Stein and Safire himself, for their part, all felt that sinking sensation in an especially personal way. "It fit perfectly with most Jews' suspicions of latent antisemitism in Nixon, which all of us had worked so hard to allay," noted Safire.

    Ever since that bombshell, the White House tapes have been unforgiving to Nixon on the Jewish question. For all the historical treasures the tapes hold, they are mined most enthusiastically for Nixon's racial and religious slurs. With each batch that the National Archives releases — it began making the tapes public in 1996 — reporters excitedly note the most scandalous excerpts. One recently discovered exchange between Nixon and Reverend Billy Graham featured Graham lamenting the Jewish stranglehold on the news media and Nixon concurring with the reverend's assessment. There are dozens more.

    It's hardly surprising that such statements still have the power to offend. It is surprising, though, that they retain the power to surprise, for Nixon's reputation as an antisemite long predates his presidency. As I discovered in my research for my book "Nixon's Shadow: The History of an Image," such concerns were not a sudden revelation by the White House recordings. On the contrary, they had dogged Nixon from virtually the start of his career.



    Edward Said's Dishonest Career (posted 10-13-03)

    Edward Alexander, writing for the National Association of Scholars (Oct. 2, 2003):

    Edward Said, longtime professor of English and comparative literature at Columbia University, prolific author of "cultural" literary criticism and political polemic, former member of the Palestine National Council and advisor to Yasser Arafat, died in New York on 25 September 2003 at age 67.

    If enormous influence in the academic world is a reliable indicator of intellectual distinction, then Said merited his reputation as one of America's intellectual eminences; but if reputation attests mainly to the irresistible attraction of foolish ideas, he did not. Said taught a whole generation of English professors to search for racism in writers (like Jane Austen) who did not think as the professors do. He induced a generation of Middle East scholars not only to believe that "since the time of Homer...every European, in what he could say about the Orient, was a racist, an imperialist" but to ridicule "speculations about the latest conspiracy to blow up buildings, sabotage commercial airliners and poison water supplies" as "highly exaggerated [racial] stereotyping" (this in a statement of 1997). By Said the Israel "specialists" in the political science departments were taught that "Immediately after the state of Israel was declared in 1948, every major Arab state -- Syria, Jordan, Egypt -- petitioned Israel for peace" and that after 1967 "Israel's occupation increased in severity and outright cruelty, more than rivalling all other military occupations in modern history."

    His acolytes also found meat and drink in Said's pristinely ignorant and intellectually violent pronouncements about Jews. They are not, he claimed, really a people at all because Moses was an Egyptian (he wasn't) and because Jewish identity in the Diaspora is entirely a function of external persecution. The Holocaust (which destroyed most of the potential citizens of a Jewish state) was in Said's estimation a great boon to Jews because it served to "protect" Palestinian Jews "with the world's compassion." Prior to 1948, he asserted, "the historical duration of a Jewish state [in "Palestine"] was a sixty-year period two millennia ago." (In fact, as any normally attentive Sunday-school student knows, Jewish sovereignty in the Land of Israel lasted a thousand years.) Said's recitation of preposterous falsehoods about Judaism and Israel, so far from alienating Jewish liberals, seemed to be a magnet for them. Indeed, no troubler of Zion has ever been more justified than Said in claiming that many (at times it seemed all) of his best friends were Jews, ranging from the Israeli pianist Daniel Barenboim to the apoplectic scribbler of anti-Israel diatribes, Noam Chomsky....

    Said's intense hostility to America also powerfully influenced that sizable contingent of our academics whose motto is "the other country, right or wrong." He called Operation Iraqi Freedom the crusade of an "avenging Judeo-Christian god of war," fitting into the pattern of America "reducing whole peoples, countries and even continents to ruin by nothing short of holocaust." And, as usual, he blamed the Jews for what he hated: "The Perles and Wolfowitzes of this country" have led America into a war "planned by a docile professionalized staff in ... Washington and Tel Aviv" and publicly defended by "Ari Fleischer (who I believe is an Israeli citizen)." (A New York Post journalist who attempted to find the source of Said's phony claim about Fleischer located it in the website of the White Aryan Resistance Movement.)

    Far from making him an untouchable, Said's past membership in an international terrorist organization, his Disneyland versions of history, his thinly-veiled antisemitism and blatant anti-Americanism made him a star in the academic, literary, and intellectual worlds. He was elected president of the Modern Language Association, made a fellow of the American Academy of Arts and Sciences, adored by NPR and the BBC, and given countless awards, honors and visiting lectureships; newspapers like the New York Times, the Guardian, and Ha'aretz were in thrall to him.

    Said's career in his last years seemed to lurch from scandal to scandal. In the September 1999 issue of Commentary (see also the Summer 2000 issue of Academic Questions), Justus Reid Weiner revealed that Said had "adjusted" the facts of his life to create a personal myth, often told and poignantly embellished, to fit the myth of Arab dispossession. For decades he had presented himself as an exile, an Arab who grew up in Jerusalem but who, at age twelve, when Israel was established, was (along with his family) driven out of the Talbiyeh neighborhood of Jerusalem. In fact, as Weiner massively documented and irrefutably demonstrated, Said's tragic tale was largely a fabrication. He grew up in a wealthy section of Cairo, son of a Palestinian Arab who emigrated to the U.S. in 1911, became an American citizen, then moved to Egypt. Said was educated in Egypt, not Jerusalem. His family occasionally visited cousins in Jerusalem, and Said was born during one such visit in 1935.


    Only Now Are We Finally Recognizing the Underrated Genre Of Literary Reportage (posted 10-10-03)

    Isabel Hilton, writing in the New Statesman (Oct. 13, 2003):

    In her new biography of the journalist Martha Gellhorn (Chatto & Windus, £20), Caroline Moorehead describes the young woman in Spain, wondering how to go about reporting the civil war. She was neither a novice nor an unknown writer: her book of short stories about the Depression, The Trouble I've Seen, had been widely praised. But she was in company with Ernest Hemingway who, as Moorehead writes, considered himself the foremost war correspondent in town; and Gellhorn herself felt she knew nothing of military matters. As it turned out, that may not have been such a disadvantage: she wrote about the drama lived by civilians caught up in the conflict - about the agonies of non-combatants, largely ignored by reportage - and in a way that outlasted more conventional war reporting.

    Martha Gellhorn did not invent literary reportage but she certainly practised it. This month, speaking in Berlin, one of its foremost practitioners, the Polish writer and journalist Ryszard Kapuscinski, argued that reportage began with the historian Herodotus. If so, it sets at least one record: in more than 2,000 years, it is one of the few written forms that - until just this month - had neither been widely defined nor publicly celebrated with a prize. This has now been rectified. The winner of the first Lettre Ulysses award for creative non-fiction, awarded in Berlin on 4 October, was the Russian writer Anna Politkovskaya for her second book on the war in Chechnya (Tchétchénie: le déshonneur russe, Buchet/Chastel).

    I thought of Politkovskaya as I watched George W Bush in September, congratulating Vladimir Putin on dealing with the 'problem' in Chechnya. As a member of the Ulysses jury, I already knew that she had won the prize. It would have made no difference to Bush - a man apparently proud of his lack of reading - but perhaps it would be of some encouragement to a writer who has shown extraordinary courage and tenacity in documenting the atrocities in Chechnya and the corruption of Russian society that has been one effect of this extremely dirty war. Perhaps, too, international recognition might offer her a measure of protection against Bush's good friend Putin, whose regime has shown itself more than capable of silencing those who challenge its propaganda.

    To recognise a writer such as Politkovskaya is to do more than acknowledge her prose. Her writing on Chechnya is both more powerful and more durable than any number of news reports or television images. Good reportage is more than a recitation of events: it brings the qualities of a good novelist to bear on reality, without breaking its contract with fact.


    Descendants of the Confederates Who Settled in Brazil (posted 10-10-03)

    Mike Williams, writing in the Atlanta Journal and Constitution (Oct. 8, 2003):

    The monument could stand on any small-town courthouse square in the Deep South of the United States, the names etched in white marble calling up memories of the bloody war that split the nation almost 150 years ago.

    But this monument isn't in Kennesaw or Chickamauga, Vicksburg or Antietam. Tucked beneath shade trees on a picturesque hilltop, it is more than 4,000 miles south of the land once known as Dixie.

    The families memorialized here were Southern survivors of the Civil War who left America in the wake of defeat and resettled in Brazil, hoping to build new lives in a fertile land that resembled the one they left behind.

    Their arrival signaled the start of a strange new pocket of culture in this faraway corner of South America, a mix of Southern drawl and clipped Portuguese, of U.S. notions about farming and education layered with European customs filtered through a Brazilian lens.

    "My grandfather came from Texas and built his house in the middle of a forest," said Maria Weissinger, 86, one of the few remaining U.S. descendants with living memories of the original Southern settlers. "He spoke no Portuguese and the people here spoke no English. For years they couldn't pronounce his last name, so they just called him 'John of the Woods.' "

    The settlers also brought with them a deep affection for the American South that lives on today, centered on the shaded cemetery where the monument stands. Descendants of the settlers --- who call themselves "Confederados" --- hold an annual festival in a pavilion near the cemetery that draws hundreds to a feast of fried chicken, biscuits and grits. The celebration is complete with old-time dances and period costumes, the men dressed in Confederate gray and the women in billowing hoop skirts.

    There is also an unabashed pride in the Confederate flag. The Confederate descendants have heard of the controversy the emblem has stirred in Georgia and other U.S. states, but insist the connection for them is one of family history, not ideology.

    "We aren't saying we want to fight that war again," said Allison Jones, 60, an engineer whose great-great-grandfather William Hutchinson Norris of Oglethorpe, Ga., was one of the first to settle here in 1866. "For us it's a way of honoring our ancestors. It was the flag of that time, not the flag of today. Our flag is the Brazilian flag, but this was the flag of our ancestors."

    Although there are no exact numbers, some accounts estimate that as many as 10,000 Americans moved to Brazil after the Civil War.

    Recruiters for the Brazilian government crisscrossed the South after the war, luring would-be immigrants with promises of subsidies and tax breaks from Brazilian Emperor Dom Pedro II, who was anxious to capitalize on their expertise in raising cotton.

    But the 4,000-mile journey by ship was followed by hardship for many. Although the region where they came to live resembles the rolling hills of Tennessee and Georgia, tropical diseases took their toll, along with the difficulties of adapting to a new culture.

    Brazilian historians estimate that as many as 60 percent of the settlers eventually returned home or died from disease.

    But many of those who stayed prospered, as their farms produced bountiful crops. Slowly, over the years, they intermarried with Brazilians --- many of them immigrants as well --- leading to estimates that as many as 200,000 people in the region today can claim U.S. ancestry.


    Attitudes Toward Suicide Through History (posted 10-7-03)

    Ian MacLeod and Andrew Duffy, writing in the Ottawa Citizen (Oct. 6, 2003):

    Suicide over the ages has been an honourable and heroic death, an act of religious faith, a mortal sin, a political protest, the devil's work, a crime, a sign of madness, or social disintegration.

    Suicide inspired the literary tragedies of Ophelia, Othello, Romeo and Juliet, the "Seventh Circle of Hell" in Dante's Inferno and works of art by Rembrandt, Titian, Bruegel, Warhol and others.

    Now, like other human behaviours, it is interpreted through science and medicine. Suicide is a symptom of a neurochemical imbalance, of mutant genes, of serotonin, not Satan. Suicidal behaviour is seen as a complex but controllable illness, like diabetes, high cholesterol and alcoholism. ...

    Though not universally accepted in ancient Greek and Roman cultures, suicide was acceptable, even noble, with sufficient cause: for example, to avoid dishonour, as a service to the country, to end severe pain, or as an act of loyalty toward a master or husband.

    But even the Romans outlawed certain types of voluntary death. The attempted suicide of a soldier, for example, would weaken the army and was akin to desertion.

    The Bible does not explicitly condemn suicide and includes at least seven, from the Old Testament deaths of Abimelech, Samson and Saul to the New Testament's hanging of Judas Iscariot.

    Some consider Christianity's founding event, the voluntary death of Jesus Christ for the sins of humanity, a suicide. And the later writings of his disciples invited the faithful to actively seek their heavenly rewards with God.

    Voluntary death as an act of faith was so embraced by early Christians as a way of reaching their eternal salvation that church leaders became concerned about the number of martyrs, notably the Donatists, departing terrestrial life -- and church control.

    Named after their bishop, Donatus, the Donatists were a schismatic sect of Christians in North Africa, from the fourth to seventh centuries. Their rigorous religious beliefs called for a life of penance often followed by martyrdom, commonly by flinging themselves off cliffs.

    There was fierce opposition to the Donatists from the church, and especially St. Augustine, who, in his book The City of God, decreed suicide to be an unpardonable sin that violated the commandment "Thou shalt not murder."

    Life is God's sacred property and to destroy it is to subvert his dominion and will. St. Augustine's recasting of suicide as murder, as opposed to the Roman ideal of heroic individualism, came at a time when the church was consolidating its hold over its followers. What made a suicide a sin was its voluntary nature, Ms. Lieberman says.

    "Self-destruction was prohibited because it represented an individual's choice to do wrong, a deliberate challenge to divine authority."

    St. Augustine, however, did allow for suicide in cases of divine sanction. "To kill oneself at God's command is not suicide," he wrote, a point that seems to distinguish between the death of Samson (and perhaps Jesus Christ) and the suicide of Judas. (Some argue Samson did not commit suicide, but died as a Christian soldier on the field of battle.)

    Centuries later, St. Thomas Aquinas, the great Catholic theologian and philosopher, issued a similarly influential condemnation of suicide as a mortal sin in his theological masterpiece, Summa Theologica. But he, too, distinguished between suicide and martyrdom....

    Public attitudes toward self-destruction began changing during the Renaissance.

    In 1600, Shakespeare penned Hamlet's timeless "To be, or not to be?" soliloquy:

    ....Whether 'tis nobler in the mind to suffer

    The slings and arrows of outrageous fortune,

    Or to take arms against a sea of troubles,

    And by opposing end them. To die; to sleep; ...

    From 1580 to 1620, suicide became a popular dramatic device, with more than 200 such deaths appearing on the English stage, including dozens in Shakespeare's works.

    "That figure alone points to the social phenomenon of a public that was attracted by both curiosity and apprehension," writes Mr. Minois. "Late 16th- century audiences just lapped up voluntary death."

    Around the same time, English law began to excuse suicide if the severity of a physical illness could be demonstrated, or if the person was too young or too mad to understand the consequences.


    L'Apres L'Empire? (posted 10-7-03)

    Paul Gillespie, writing in the Irish Times (Oct. 4, 2003):

    "A lone superpower that lacks true power, a world leader nobody follows and few respect, and a nation drifting dangerously amidst a chaos it cannot control."

    This bleak description of the US is offered by the distinguished American historian and world systems analyst, Immanuel Wallerstein. His theme is its imperial decline which will, he argues, usher in a chaotic world over the next 20 or 30 years.

    His work on comparative world history over the last generation has been very influential, creating a school of analysis that starts with the transnational capitalist system since the 17th century and relates it to developing inter-state systems since then. In his most recent work, The Decline of American Power: The US in a Chaotic World, he examines what lessons history has for his own country.

    The argument has attracted greater attention because of the widespread international discussion about whether imperial analogies validly apply to US power, engaging conservatives of various hues in and around the Bush administration as well as its liberal and left-wing critics, Wallerstein among them.

    The controversy may be followed on a useful website: www.globalpolicy.org/empire/analysis

    Wallerstein traces the rise of US global hegemony back to the 1870s, when the US and Germany began to acquire an increasing share of world markets, mainly at the expense of the steadily receding British economy. The period 1914-1945 is best understood as a continuous "30-years war" for world dominance between Germany and the United States.

    He divides the post-war era into three periods. Until 1967-73 the US enjoyed indomitable military, economic and political power in the two-thirds of the world secured under the Yalta agreement.

    Vietnam, 1968 and the economic recovery of western Europe and Japan undermined that, leading to a period of "late summer glow" in US power until 9/11 in 2001, in which its two primary foreign policy goals were to prevent the emergence of a politically independent European entity (including Russia) and to maintain the US military edge by restricting the spread of nuclear weapons in the South.

    We have now entered a new stage, stretching to 2025 or 2050, "one of anarchy which the US cannot control". While the neo-conservative hawks agree with him about relative decline, their policy to reverse it by force would only hasten the process, he believes. ...


    Intriguingly, another prophet of US decline is the French historian and demographer Emmanuel Todd, whose book Après l'Empire: essai sur la décomposition du système américain, published last year, has been a bestseller in France and is about to be published in English.

    His argument overlaps with Wallerstein's in several respects, including the long view of economic achievement and military overstretch. It is well to be aware of their arguments in such an uncertain world.

    The "theatrical military activism against inconsequential rogue states that we are currently witnessing plays out against this backdrop. It is a sign of weakness, not of strength.

    "But weakness makes for unpredictability. The US is about to become a problem for the world, where we have previously been accustomed to seeing a solution in them."

    Todd draws parallels with imperial Spain in the 17th century, when gold flooded in from the New World, productivity declined and the country fell into economic and technological arrears.

    He, too, says the US is falling dramatically behind in the core industrial sphere. Its trade in advanced technology went into deficit in the 1990s, filling out its overall annual $500 billion deficit funded by overseas investment.

    It is far behind in mobile communications technology, including satellites. Airbus is about to surpass Boeing, and European railway systems are far ahead.

    "The only remaining superiority is military. This is classic for a crumbling system. The fall of the Soviet Union took place in an identical context" when it embarked on military adventures, including Afghanistan, to forget their economic shortcomings.


    Myth of Robin Hood Reconfigured to Appeal to Theater Audiences in the 16th Century? (posted 10-7-03)

    From History Today (Oct. 5, 2003):

    The Robin Hood legend was embellished to appeal to London audiences, new research has indicated. The idea of Robin Hood as a nobleman fighting against evil barons was fostered from the end of the 16th century. Before this he had been perceived through ballads as a yeoman outlaw battling the tyrannical establishment. From 1598 to 1601, Robin Hood acquired noble status and featured in five plays and in Shakespeare's As You Like It. The new theory in the journal English Literary Renaissance indicates the figure was promoted for middle-class London theatre-goers. Professor Meredith Skura states: "The traditional Robin Hood was no longer viable on the stage in London… The noble outlaw triumphing over the sheriff had been a countryman's dream. But London merchants dreamed not only about triumphing over superiors, but also about moving up to join them." Another theory is that during the 1590s rebellions over land forced the establishment to elevate Robin Hood's status so that he wasn't perceived as a working-class hero.


    Historians Pledge to Be Honest in New History of Indonesia (posted 10-7-03)

    From the Jakarta Post (Oct. 6, 2003):

    A team of 80 historians tasked with revising the national history book is gathering new data and information on former president Soeharto's roles in a number of crucial events.

    Controversial issues linked to Soeharto's past that are to be included in the new history book include the abduction and murder of six military generals on Sept. 30, 1965, the Supersemar letter that led to Soeharto's rise to power, and the invasion of East Timor.

    The book is also to clarify Soeharto's role in the March 1949 battle against the return of Dutch occupation forces to Yogyakarta.

    Anhar Gonggong of the University of Indonesia, who is a member of the team, said the rewriting of the book -- which started two years ago and is expected to be completed next year -- would be based on new data and accounts from people, including eyewitnesses.

    He said the book, which would consist of eight volumes, is to replace the five-volume Sejarah Nasional Indonesia (Indonesian National History) book written by a team led by historian and ex-minister of education and culture Nugroho Notosusanto, one of Soeharto's former aides.

    The existing history book, which is still taught in schools, blames the Sept. 30 incident on the now-defunct Indonesian Communist Party (PKI).

    Anhar said the new book would reveal hitherto undisclosed facts behind the tragedy, which has been the cause of life-long trauma among many Indonesians.

    "Is it only the PKI that should be blamed, or also elements within the Army? The book will discuss it," he said, but declined to go in details.

    He said the account of the Sept. 30 incident, when six Army generals were abducted, killed and their bodies thrown into a well in Lubang Buaya, East Jakarta, would be rewritten.

    However, Anhar said reports by military-backed newspapers, as well as Nugroho's book, which claimed the generals were tortured, their eyes cut out and their sexual organs severed before their bodies were thrown into the well, would be reviewed, because these "facts" were not verified by the post-mortem examinations.

    About one million supporters and sympathizers of the PKI were killed after the incident, and their children, grandchildren and other relatives were denied entry into politics during Soeharto's 32-year rule.

    Any suspected PKI supporters, sympathizers and their families were also banned from civil service and suffered discriminatory treatment from the Soeharto administration, as well as from the general public.

    Another team member, professor of Indonesian history Aminuddin Kasdi at the State University of Surabaya, suggested that the alleged PKI victims of the Sept. 30 incident "demand compensation from the government and reconcile with other elements of the nation".

    With regard to the March 1949 incident, Anhar said the war against the Dutch re-occupation was not initiated by Soeharto, but Yogyakarta's then-Sultan Hamengkubuwono IX.

    "Of course, Lt. Col. Soeharto played an important role in the attack, but he just followed the orders of his commander Col. Bambang," he said.

    However, Soeharto later claimed the success of the battle as an individual achievement to qualify him for the presidency.

    Anhar also said the book would replace the term "integration" used in Nugroho's book with "annexation" in reference to Indonesia's invasion and annexation of East Timor in 1975-76.

    "The 'integration' was conducted unilaterally. Our Constitution has excluded East Timor from the very beginning," he argued.

    The new book is expected to draw strong opposition from Soeharto's loyalists, military officials and civilians, many of whom are still in power.

    "I will resign if they (the loyalists) -- or even President Megawati Soekarnoputri -- intervenes in the rewriting of history," Anhar vowed.


    Journalists' Anecdotes from the JFK Assassination (posted 10-7-03)

    Joe Strupp, writing in Editor & Publisher (Oct. 7, 2003):

    A new book from The Freedom Forum chronicling how the press covered the assassination of President John F. Kennedy offers some interesting tales that may not be well-known.

    Similar to its 2002 publication, Running Toward Danger: Stories Behind the Breaking News of 9/11, the new Freedom Forum book, titled President Kennedy Has Been Shot, includes first-person accounts from more than 40 journalists who covered the tragedy of Nov. 22, 1963. The book is slated for November publication to coincide with the 40th anniversary of that day in Dallas.

    Among the revelations and anecdotes in the book:

    * Because so few people showed up at Lee Harvey Oswald's funeral, seven reporters assigned to cover the event had to serve as pallbearers. "I refused," Associated Press correspondent Mike Cochran recalls in the book. "Then Preston McGraw of UPI stepped up. So then, there's one thing I knew, stupid as I may be, inexperienced as I may have been, if UPI was going to be a pallbearer, I was damn sure going to be a pallbearer."

    * Reporters aboard Air Force One for the flight from Dallas to Washington, which carried Kennedy's body and newly sworn-in President Lyndon Johnson, were so busy keeping up with the details of the day that they practically had to kick Johnson out of their compartment several times because he kept bothering them while they worked. "This was the only time in my life that I ever felt like saying to the President of the United States, 'I've got a lot of work to do,'" Newsweek's Charles Roberts recalls in the book.

    * Aboard a State Department plane carrying six members of Kennedy's cabinet and several staffers to Tokyo for trade talks at the time of the assassination, Press Secretary Pierre Salinger helped the nervous occupants pass the time as the plane headed back to Washington by organizing a poker game, in which he won $800. "I shouldn't have won," he remembers. "I should have lost. I was appalled."...

    * Sympathetic readers of the Dallas Times Herald sent the paper more than $200,000 in unsolicited donations for Marina Oswald, wife of Lee Harvey Oswald, which the paper forwarded to her.

    * After convincing Oswald's wife to be interviewed, Life magazine wanted to keep other journalists from speaking with her, so the magazine paid for her and her child to stay in a suite at Dallas's Adolphus Hotel. "We told them to order anything they wanted from room service," recalls Life's Richard Stolley. "But for God's sake, don't leave."


    Spiro Who? (posted 10-6-03)

    Lance Gay, writing for Scripps Howard (Oct. 5, 2003):

    History has not been kind to Spiro T. Agnew, the self-made son of an immigrant Greek peddler whose precipitous rise from obscurity -- and just as sudden fall back into anonymity -- reads like a Greek tragedy.
    Today Agnew is just a footnote in history -- only the second vice president to resign his office, the other being John C. Calhoun.

    Calhoun left the White House in 1832 to become a senator but Agnew resigned in disgrace on Oct. 10, 1973, after copping a no-contest plea to evading federal taxes.

    Some historians today are beginning to take a second look at Agnew and his career, and finding there's more to the story.

    Historian Justin Coffey said Agnew was the first truly suburban politician to make it to the national stage, and among the first Republican governors to break the Democratic hold on states south of the Mason-Dixon Line -- a region that in the last 30 years has been transformed into a Republican bastion.

    Agnew was the first to home in on middle-class distrust of the TV and print media, which he pilloried as "the nattering nabobs of negativism." His own political transformation from liberal to conservative Republican matches that of the party he represented.

    Coffey contends Agnew was the most controversial vice president the United States has ever seen.

    "There's a lot more to Spiro Agnew than nolo contendere," said Coffey, an adjunct professor at DePauw University who is turning his Ph.D. dissertation on Agnew's life into a biography.

    Although the Nixon-Agnew administration has coughed up hundreds of biographies and histories over the last 30 years, little new has been written about Agnew.

    U.S. Senate historian Don Ritchie said Agnew clearly deserves more study.

    "His resignation fixed his standing among historians as it did the public," Ritchie said. "But I think he was a more interesting person than he's given credit."

    Ritchie said the trajectory of Agnew's political career is remarkable, especially since he was a Republican from a state that still has solidly Democratic leanings.

    Through a series of flukes, Agnew rocketed from obscurity as county executive of suburban Baltimore County to become governor of Maryland and vice president all within three years. Then his career collapsed just as precipitously when he pleaded no-contest to tax evasion, and resigned his office.

    If he had stayed another 10 months, he might have become president, as Watergate disclosures were eating away at Richard Nixon's hold on the White House.

    But Coffey doubts that Agnew would have been the sort of conciliator and healer that his successor Gerald Ford became.


    After he left Washington, Agnew bitterly refused to give interviews and wrote an autobiography, "Go Quietly ... Or Else," in which he claimed Nixon and his acolytes railroaded him. "He naively believed that by throwing me to the wolves, he had appeased his enemies," Agnew recounted.

    He was so embittered at Nixon he refused to take the former president's phone calls, but did show up at Nixon's funeral in 1994.

    Coffey said Agnew's exile was complete, and one-time GOP colleagues also abandoned him. "He was like a Soviet official that was kicked out of the Kremlin. He never was invited to any conventions, and never asked to campaign for anyone. He very much wanted to be a part of the political arena, but he was persona non grata."

    Feeble and clearly ailing, Agnew returned to Washington in 1995 to attend the unveiling of a marble bust of himself installed in the halls of the Senate, alongside the busts of the other vice presidents. He died of leukemia Sept. 17, 1996, in Ocean City, Md., and is buried in a suburban Baltimore cemetery under a gravestone that reads only: "Agnew, Spiro T. 1918-1996."

    Agnew left his papers to the University of Maryland.

    Jennie Levine, curator for historical documents, said the collection amounts to an astonishing 406 linear feet of documents, and includes gifts Agnew received as vice president, among them a monkey cape from Kenya and a set of golf clubs.

    Levine, who was born in 1972, said she didn't know much about him until she began cataloguing Agnew's records three years ago. She said the library gets about 10 requests a year to look at the papers, which comprise newspaper clippings, speeches and notes on meetings that Agnew convened, but no diary and few personal items. Cataloguing is complete for about 70 percent of the collection, she said.

    To the Vietnam generation, Agnew remains a polarizing figure -- the archconservative who liked sharkskin suits, wore his white hair slicked back and glowered at his enemies.

    His opponents reviled his middle-class habits, his taste for pool, his association with Frank Sinatra, and his devotion to the Baltimore Colts.

    But "the silent majority" of Nixon supporters who backed an aggressive opposition to Soviet communism and a crackdown on Age of Aquarius hippies supported him.

    Agnew put himself through law school at night while working as an insurance clerk in Baltimore, and became interested in politics after going to PTA meetings.

    In 1962, he was elected county executive of Baltimore County, and four years later, when Democrats backed a race-baiting candidate who ran on the slogan "Your Home is Your Castle," Agnew comfortably won the race to be Maryland's governor as the sensible and more liberal alternative.

    "Until he became vice president, he got terrific press coverage," said Coffey, the DePauw historian.

    "He was a moderate pragmatist as governor of Maryland."

    That all changed abruptly when Nixon plucked Agnew from obscurity at the 1968 Republican National Convention -- a decision one editorial writer contended was the most eccentric appointment since the Emperor Caligula appointed his horse to the Roman Senate. "He had a thin skin toward the press," Coffey said.

    The Agnew choice was part of Nixon's "Southern strategy" designed to appeal to Democratic voters in the South. Victor Gold, once Agnew's press secretary, said Agnew was a leading theoretician of turning the South Republican. "We speak of the Ronald Reagan revolution ... Spiro Agnew was the John the Baptist for that revolution," Gold once said.

    Nixon used Agnew as a hatchet man. Initially reluctant, Agnew grew to relish his role attacking the "pusillanimous pussyfooters" and "vicars of vacillation" and "the hopeless, hysterical hypochondriacs of history."

    In 1969, Agnew lashed out at Vietnam War protesters. "A spirit of national masochism prevails, encouraged by an effete corps of impudent snobs who characterize themselves as intellectuals."

    Ironically, although Nixon ridiculed Agnew in private and schemed to dump him in favor of former Texas Gov. John Connally as his 1972 running mate, the strident speeches so endeared Agnew to the hard-core Republican base that Nixon concluded he needed him on the ticket again.

    But the next year saw Agnew's fortunes reverse abruptly with a criminal investigation of kickbacks, including $147,500 he took during the years he served as Baltimore County executive, Maryland governor and vice president. The final $17,500 in cash came to the vice president's office in discreet brown envelopes.

    Coffey said Agnew believed he was just going along with the way politics worked in Maryland. "It's the old excuse that everyone did it," Coffey said. "But in Maryland, everyone did do it."


    In the 1980s African-American Studies Was All About Black Men and Women’s Studies Was All About White Women (posted 10-6-03)

    Rachel Nearnberg, writing in the Harvard Crimson (Oct. 6, 2003):

    Black women scholars are often marginalized—and the study of black women is an afterthought—in the academy today, according to a group of panelists who spoke at the Radcliffe Institute for Advanced Study Friday.
    “Even in the best of times, America is tough on blacks, tough on women, and tough on black women scholars,” said Nell I. Painter, a professor of American history at Princeton University. “The field of black women’s history is flourishing, but I worry deeply about the toilers. Black women scholars are in danger.”

    Painter was one of six panelists who spoke at a roundtable discussion titled “Gender and Race: Together at Last.”

    More than 400 people attended the panel, which was part of the full-day “Gender, Race and Rights in African American Women’s History” conference, which honored the 60th anniversary of Radcliffe’s Schlesinger Library.

    In the 1980s, Painter said, academia welcomed women grudgingly.

    “African-American studies was all about black men and Women’s Studies was all about white women,” she said.

    Though she said “black women are in positions that black women have never been in before,” Painter noted that higher education remains a stressful environment for African-American women.

    "There are two kinds of stressors: white men who say black women got their jobs because they are black women, and black men and sexual harassment," she said.

    Painter said prizes for published work are one arena that still needs to change, arguing that black women are rarely recognized for their scholarship.

    Deborah G. White, a professor of history at Rutgers University who also spoke on the panel, said the changes that have taken place in academic publishing are only a first step.

    “We have a whole new vocabulary to speak of difference, and the publishers no longer ask us if there is an audience,” she said. “But some things have only been altered. There still seems to be a wonderment for African-Americans who don’t study African-Americans.”

    She also noted that many academics assume that African-American studies is an easier field for black scholars than for white scholars.

    “Some people think I came by my knowledge by osmosis, being black made it easier for me,” she said.

    Other panelists said they were more optimistic about the place of black women in the academy and the study of black women.


    Iain McCalman: Writing for the Popular Market (posted 10-3-03)

    Iain McCalman, president of the Australian Academy of the Humanities and director of the Humanities Research Centre at the Australian National University, writing in the Australian (Oct. 1, 2003):

    AS a semi-licensed advocate for humanities research and a professional historian, I'm used to copping a good deal of public and private scepticism. Some of it comes from politicians, businessmen and university administrators who claim that our research is too rarefied to be supported by the public purse, and some from fellow humanists who say we are being betrayed by crass commercialism and philistine government agendas.

    Scholars who have devoted lifetimes to research in subjects such as archaeology, history, languages, literature and philosophy, and whose achievements are hailed internationally, can't fathom why they should be so undervalued at home, especially when compared with their counterparts in science and technology. Young researchers in the humanities and social sciences struggle to get their ideas into print because of the decay of scholarly publishing in this country. I share this frustration.

    The most persistent popular accusation is, of course, that we professional scholars live in ivory towers, talk exclusively to each other and use a language of abstraction that no one else can understand. Critics say our work will remain undervalued until we learn to disseminate it in forms the general public can appreciate.

    The discipline of history is said to be a case in point. Australian bookshops are crammed with best-selling histories from overseas writers, such as Dava Sobel's Longitude and Simon Winchester's Surgeon of Crowthorne. Our prime-time television resonates with the nerdy syllables of Columbia's Simon Schama on A History of Britain, but where are the Australian equivalents?

    Geoffrey Blainey, Stuart Macintyre and Henry Reynolds are having to carry us all. We Australian academics relished writing reviews of Robert Hughes's The Fatal Shore under catty titles like The Shock of the Old. Still, why wasn't it one of us, rather than an expatriate art journalist, who wrote what proved to be the most widely read work of Australian history of the 20th century?

    True, accusations like these are often travesties. Popularisers are rightly suspected in academe because they often plunder the hard-won work of scholars without proper acknowledgment. I have twice had my work plagiarised in this way and it's not pleasant. Nor can all our research be produced in popular or accessible form without distorting its meaning. Like scientists, we sometimes have to use complex technical methods, deploy abstract theories and mobilise specialised vocabularies to pioneer new knowledge.

    Even so, I think that many of us would now concede that these populist critics have a point. I've been a historian for more than 30 years, and until a few years ago I had never consciously tried to write a trade book. This was in part because I didn't know how and in part because I was scared to leave the safety net provided by my academic peers.

    Recently, I've had a go and I have to say that I loved doing it. I was goaded into the attempt by a challenge from a former American publisher who is now a literary agent in NSW. She believes that this country is full of talent and she wants to make true stories by Australians a global rather than purely local attraction. Only in this way, she argues, can we overcome the economic limitations imposed by our small population. In fact, she argues, we must go further still. If professional historians are serious about reviving and extending the popular reach of our discipline, they must master the visual communication forms that have colonised the imaginations of the young, particularly digital media.

    So, about two years ago, at much the same time as I began writing my first would-be popular book, I also became involved in a series of BBC Television history productions, both as a behind-the-scenes historical adviser and a commentator in front of the camera. It has taught me something of the pleasures and perils of these popularising experiences.

    My trade book, The Seven Ordeals of Count Cagliostro, published in the US and Australia by HarperCollins, has been out a few months now. It's about an 18th-century imposter, healer, magician and freemason from the slums of Sicily who became the most infamous European alive on the eve of the French Revolution. The challenge was to produce a work of historical scholarship that could also grip non-specialised readers; a work that could entertain as well as instruct. It had to have an argument, because all art and science must have that, but it could not be didactic or over-analytical. I had to induce an unknown audience to read my book quite differently from the way my previous scholarly books have been read, if at all. I had to try to persuade a mob of strangers from a variety of countries to pay for the book and to read it hungrily from beginning to end, eager to turn each page and to know how it concludes.

    For this I found I had to relearn the art of storytelling. Narrative is the oldest of the historian's tools, yet for many years it's been in low demand in the academy. Our models are generally associated with the social sciences or with literary and aesthetic theory. Either way, we emphasise inquiry and analysis at the expense of narration and character. In the process I fear we might have lost touch with the heart of our discipline. My kindly US editor sent back the first draft of The Seven Ordeals with the words: "Now here is Iain McCalman's eighth ordeal -- to turn a rich study into a compelling story."

    To my alarm, he wanted me to fly in the face of some of the insights and approaches that I, as a professional cultural historian, hold dear. To realise a complete historical world for my readers, he said, I must learn to paint word pictures as if I had actually witnessed the events. To achieve a complete suspension of disbelief, I must not soar into abstract analysis, or assume prior knowledge in the reader or cast any doubts on the reliability of my sources. I must work chronologically rather than thematically. I must produce a rounded historical life, however complex or haphazard that life might have been, yet I must never be boring, repetitive or anti-climactic: suspense must be sustained until the last page. Most affronting of all, I must speak -- it seemed -- with the certainty of a god figure who knew exactly what had happened in the past, or, as we professionals would say, I must write as a naive positivist who believes in the complete objectivity and fixity of historical fact....

    In contrast to this rather disillusioning TV experience, I have recently worked as a historical consultant for a BBC 2 historical drama-in-progress on the life of Emma Hamilton, Lord Nelson's ravishing, monstrous and tragic mistress. Victory, as the series is presently called, will be screened in 2005 for the bicentenary of the Battle of Trafalgar.

    What was different and exciting about the process of generating this series was that the director, Mike Dormer, loves history. Not only was my contribution sought from the start, but the BBC writer Gwyneth Hughes also took the trouble to read a swag of secondary sources in preparation. The BBC took our collaboration seriously enough to send her out to Australia for 10 days, enabling intensive reading, discussion and debate before we hammered out a rough treatment that we both liked. Even though this series-in-progress is not a documentary, I think it will be better history and television than The Ship.


    Arnold as History (posted 10-3-03)

    Alan Zarembo, writing in the LA Times (Sept. 29, 2003):

    Standing before a roomful of fellow PhDs, Louise Krasniewicz wears an untucked beach shirt -- a multihued collage of musclemen and "championship" banners. Perched on a chair near her podium is a poster from Flex magazine featuring a bare-chested Arnold Schwarzenegger from his bodybuilding days.

    "We think that Arnold Schwarzenegger's extensive influence and remarkable presence in late 20th century American culture has gone beyond inspiration, hero worship and entertainment," she tells the captive audience at the School of American Research here, where she is a research associate.

    Many of the social scientists take notes.

    Schwarzenegger has been many things in his life: immigrant, weightlifter, action movie star and now gubernatorial candidate. A lesser-known role has been that of academic study subject.

    For the last two decades, Krasniewicz, a cultural anthropologist, and her intellectual partner, Michael Blitz, the tenured chair of "thematic studies" at John Jay College in New York, have examined his role in popular culture. To the bewilderment of some peers, they have collected hundreds of articles and advertisements with references to Schwarzenegger, attended the bodybuilding competition he sponsors, taped the sounds inside his restaurant bathroom, watched his 30-plus movies dozens of times -- including rare finds like the 1980 TV drama "The Jayne Mansfield Story," co-starring Loni Anderson. (Arnold played Mickey Hargitay.)

    It has been a pursuit so consuming that they regularly dream about their subject -- and have posted more than 150 of those dreams, ranging from the bizarre to the erotic, on their Web site (Google search: "dreaming arnold").

    Avowed postmodernists, the researchers say the point of their collection is not to quantify Schwarzenegger's influence as a cultural icon -- though they do contend that his importance far exceeds that of any other living celebrity -- but to arrive at some vision of America.

    "We're not really interested in studying him as a person but as a reference, a point in our culture," Krasneiwicz says.


    Coulter Loves to Generalize (posted 10-3-03)

    Frazier Moore, writing in the Montreal Gazette (Sept. 29, 2003):

    [My new book] Treason aims to spring Joseph McCarthy from history's gulag as "a wild-eyed demagogue destroying innocent lives," Coulter summed up.

    Seizing quite the opposite position, her book lionizes the 1950s Wisconsin senator for his holy war against Communist spies in the United States, a crusade she argued was done in by the soft-on-commies Democratic Party, which has since compounded the outrage by demonizing McCarthy with its "hegemonic control of the dissemination of information and historical fact," she said.

    Writing the book was a mad scramble, Coulter said. She began Treason only last October, "but I worked pretty hard," she said. "I cut down on TV (appearances). I worked every Friday and Saturday night."

    Veteran journalist and commentator M. Stanton Evans, who is writing a book on the McCarthy era, shared some of his extensive research with Coulter and "went over her manuscript on the McCarthy chapters," he said. "I can vouch for the facts. Her interpretations are obviously hers. They're obviously meant to be provocative."

    Indeed, Coulter's McCarthy makeover only sets the stage for her wildly provocative main theme: Democrats, always rooting against America, are "the Treason Party," she explained with throaty conviction.

    Democrats have "an outrageous history of shame," she said, "and they've brushed it all under the rug," racking up a shameful record that persists to present-day Iraq, where the Democrats, she claimed, are hoping for the United States' comeuppance.

    So the broad purpose of Treason, Coulter said, "is to alert people, to send out flare lights: Warning, warning! Democrats can't be trusted with national security!"

    It's all very simple.

    In Coulter's America, everything, it seems, is simple. She reigns over a bipolar realm of either right or wrong; love or hate; smart or idiotic; men or - a Coulter favourite - "girly boys," a distinction that in her book yields such questions as the language-garbling "Why are liberals so loath of positive testosterone?" as well as "Why can't liberals let men defend the country?" (By men, she means Republicans.)

    "Everything isn't black and white," countered historian Radosh, who has long contended that Communist spies posed an internal threat after the Second World War. Radosh draws the line at canonizing McCarthy for his blacklisting campaign to flush them out. "But the people who respond to her are people who already agree with her, and they don't want any nuance."

    Just mention nuance to Coulter and she scoffs.

    "As opposed to spending 50 years portraying McCarthy as a Nazi?" she said with a scornful laugh. "THAT's a very nuanced portrait! I think it's just meaningless blather, this nuanced business."

    This nuanced business only muddies the issue, she insists, whereas generalizations are, in her view, a simple, get-to-the-heart-of-it way to make a point.

    For example: "Gen-er-al-ly," she said with snide accentuation, "it's not good to play in traffic. Gen-er-al-ly, when your gut feels a certain way, you better hightail it to the bathroom or you'll be wetting your pants."


    Is Israel a Pariah Nation? (posted 10-3-03)

    David Harsanyi, writing in National Review (Oct. 13, 2003):

    In 2001, delegates to the U.N. conference in Durban voted to brand Israel a "racist apartheid state" guilty of "systematic perpetration of racist crimes including war crimes, acts of genocide, and ethnic cleansing." Denunciations of this nature have been a staple of U.N. rhetoric; the most infamous resolution came in November 1975, when the U.N. General Assembly declared that Zionism was a form of racism. From 1967 to 1988, the Security Council passed 88 resolutions aimed directly against Israel -- but zero resolutions criticizing the actions of an Arab state or body, including the terrorist PLO.

    Two essential new books set out to refute many of the harsh -- and historically inaccurate -- charges against Israel. The first, Yaacov Lozowick's Right to Exist, addresses itself to anyone "open to a moral evaluation of the facts," and comes from the hand of a former peace activist. Lozowick contends that the story of Israel is, crucially, the story of its wars -- and that any attempt to evaluate Zionism must be anchored in the larger context of the morality of war.

    The second book may be one of the most helpful written in defense of the modern Jewish state, chiefly because it is destined to be the most popular. Alan Dershowitz's The Case for Israel makes use of conventional legal arguments, as well as a deep understanding of Middle Eastern history, to make "a proactive defense of Israel." If you ever wondered why the verbose Dershowitz is considered one of the nation's top lawyers, this book will quickly convince you the accolades are well deserved.

    Dershowitz challenges 32 separate accusations often leveled against Israel. Some of the accusers are well known (Edward Said and Noam Chomsky appear all too frequently), some less notable. Dershowitz vigorously defends Israel's human-rights record, and rebuts charges of expansionism. He not only demonstrates Israel's innocence, but argues that no nation in history has faced equivalent challenges and adhered to a higher standard of human rights in the process of survival. Dershowitz then goes a step further, "proactively" noting that those who criticize Israel -- but not countries with far worse human-rights records -- are themselves guilty of international bigotry.

    Does Dershowitz or Lozowick assert that the Zionist project is without blemish? Hardly. Both acknowledge numerous mistakes by Israel, her leaders, and her defenders. Lozowick criticizes the "ineptitude, bad faith, waste, poor taste, callousness, and stupidity" found within Zionism -- characteristics, he writes, that are likewise inherent in "any other large-scale human project." But both authors point out that Israel and Zionism have attempted, more often than not, to follow an honorable path. Lozowick believes this stems from ancient Jewish traditions that remain powerfully influential in modern Israel: As a country, Israel is not overtly religious, but it acts in a very "Jewish" way, especially in the political choices it makes.


    Tom Palaima: NYT Reporter Chris Hedges Copied Hemingway ... Was It Plagiarism? (posted 10-3-03)

    Tom Palaima, a MacArthur fellow and Raymond Dickson Centennial professor of classics at the University of Texas at Austin, writing in the Austin American Statesman (Sept. 28, 2003):

    On Page 40 [of War Is a Force That Gives Us Meaning], the author, Pulitzer Prize-winning New York Times correspondent Chris Hedges, writes: "In combat the abstract words glory, honor, courage often become obscene and empty. They are replaced by the tangible images of war, the names of villages, mountains, roads, dates, and battalions." The phrasing and ideas are clearly taken from Ernest Hemingway's A Farewell to Arms: "Certain words such as glory, honor, courage or hallow were obscene beside the concrete names of villages, the numbers of roads, the names of rivers, the numbers of regiments and the dates."

    Hedges does not cite Hemingway in his endnotes or bibliography.

    In early June, I wrote to Hedges' publisher, Public Affairs, advising that the borrowing from Hemingway needed to be acknowledged. I did so after I learned that Hedges' controversial views on war were being dismissed, unfairly I thought, because of rumors of plagiarism. I also made the case to my former student Lt. Col. Ted Westhusing, who teaches at the United States Military Academy at West Point, that Hedges' plagiarism was inadvertent. His frank reply raises a crucial question:

    "Inadvertent plagiarism"? Inexcusable, especially from a New York Times commentator, reporter and author. Do you know what this would garner Hedges in the circles I run in? If truly "inadvertent," and if Hedges were a cadet, he might be lucky to garner only a 100-hour "slug." That is, he spends 100 hours of his free time marching back and forth in the hot sun in Central Area under full dress uniform pondering the consequences of his failure (a slug). If intentional, Hedges would get the boot. Kicked out. Gone.

    Indeed, why should a professional journalist be treated differently than a military academy cadet?

    After some confusing responses from Hedges' publisher, Hedges called me. Hedges later claimed that I misunderstood how he felt about the issues involved.

    But the following points are clear:

    Hedges attributed his unacknowledged use of Hemingway to careless transcription from his notepads, the same kind of "accidental copying" defense used by historian Doris Kearns Goodwin, who resigned from the Pulitzer Prize board in May 2002 after plagiarism was discovered in a book she had written 15 years earlier.

    When he discovered his oversight, Hedges changed the wording of the passage. In the paperback edition, it now reads: "The lofty words that inspire people to war -- duty, honor, glory -- swiftly become repugnant and hollow. They are replaced by the hard, specific images of war, by the prosaic names of villages and roads." The original idea is still Hemingway's. The words less so.

    When I asked Hedges why he had not simply added a citation of Hemingway to his original passage, he replied that he was concerned about increasing printing costs by changing the page layout. But a brief endnote citation would have been easy and cheap.

    I pointed out that changing words did not resolve the issue of plagiarism. Hemingway is now unacknowledged on Page 40 in all copies of War Is a Force as the source for Mr. Hedges' ideas or words or both.


    Did the Catholic Church Retard the Growth of Science? (posted 10-1-03)

    From the Chronicle of Higher Education (October 1, 2003):

    A glance at the October/November issue of "The American Enterprise": Christianity and the creation of science.

    The Christian church did not discourage the development of science but in fact was instrumental in bringing it about, says Rodney Stark, a professor of sociology at the University of Washington at Seattle.

    Because Christians saw God as a rational being, they thought the natural world must have "a rational, lawful, stable structure, awaiting (indeed, inviting) human comprehension," he writes.

    In popular representations of the Dark Ages, the church is portrayed as having plunged Europe into centuries of stagnation between the fall of Rome and the Scientific Revolution, when science suddenly burst forth. That depiction is nonsense, says Mr. Stark. The advancements of the 16th and 17th centuries were the result of generations of work, much of which was done by religious scholars at Christian universities, he says.

    To this day, he says, scientists are about as religious as the rest of the population, but social scientists, as a group, are far less religious than those in the hard sciences. That pattern may explain "why it is so widely believed that religion and science are incompatible -- after all, most of the 20th-century literature on this topic was written by social scientists," Mr. Stark writes.


    Did Kennedy Plot to Kill Diem? (posted 9-30-03)

    James Rosen, Fox News correspondent, writing in the Weekly Standard (Sept. 29, 2003):

    ON JUNE 19, 1972, two days after the Watergate break-in, an employee of the Safemasters Company, armed with a high-powered drill and accompanied by a Secret Service agent, rushed to Room 522 in the Executive Office Building. There, they bored open the safe of an obscure Nixon White House consultant named E. Howard Hunt. A 20-year veteran of Central Intelligence Agency covert operations and a prolific spy novel author, Hunt, along with G. Gordon Liddy, had planned the ill-fated break-in at Democratic National Committee headquarters.

    What the authorities found inside Hunt's safe--a treasure chest of Cold War espionage artifacts--astonished them: a .25-caliber Colt automatic pistol; electronic eavesdropping equipment; and hundreds of copies of old State Department cables chronicling events leading up to the November 1963 coup d'état against South Vietnamese president Ngo Dinh Diem, which climaxed in the bloody murder of Diem and his brother. Investigators also found two forgeries of similar cables, implicating the administration of President John F. Kennedy--himself slain three weeks after Diem--in the assassination of Kennedy's Saigon counterpart.

    When word of Howard Hunt's forged Diem cables first surfaced in 1973, they seized the imagination of Richard Nixon's critics. The disingenuous Diem cables supposedly exemplified the craving of Nixon and his men not just to win an election and cover up their crimes, but to rewrite, in Orwellian fashion, the history of the Vietnam War--to tamper with our national memory itself. One unfriendly author, Fawn M. Brodie, in her 1981 psychobiography "Richard Nixon: The Shaping of His Character," went even further, touting the Diem cables as "essential in illuminating the theme of fratricide in Nixon's life. . . . The pains to which Nixon went to try to prove that John Kennedy connived in the assassination of the brothers Diem would seem to have been one more attempt to say, 'Someone else is guilty, not I.'"

    Now, three decades later, comes evidence that Nixon and Hunt need hardly have resorted to forgery to prove their point about Kennedy, Diem, and America's trajectory in Southeast Asia. Ironically, the evidence was preserved on secret White House tapes--but not Richard Nixon's.

    On February 28, 2003, the Johnson Library in Austin, Texas, released 30 hours of recordings made surreptitiously by President Lyndon B. Johnson in early 1966. The few news organizations that reported on the tapes played up perceived similarities between LBJ and the next Texan to occupy the Oval Office, George W. Bush: Both men grumbled about coverage of their war conduct, and both, it turns out, expressed skepticism about the usefulness of the United Nations in resolving international crises. We also got further insight into Johnson's familiar torment over his failure of leadership in Vietnam ("I can't get out, I just can't be the architect of surrender").

    Yet the LBJ tapes also contained a bombshell that went unnoticed. Johnson himself believed what Richard Nixon always suspected: that the Kennedy White House did not merely tolerate or encourage the murder of Ngo Dinh Diem, but organized and executed it.

    Johnson left little doubt about this when, in a February 1, 1966, call to Senator Eugene McCarthy, he complained about the Kennedy administration and its left-wing allies in the Senate, who had supported Kennedy's entrance into the war but not Johnson's continuance of it. "They started on me with Diem, you remember," Johnson pointedly told McCarthy, recalling the words of the coup's proponents. "'He was corrupt and he ought to be killed.' So we killed him. We all got together and got a goddamn bunch of thugs and assassinated him. Now, we've really had no political stability [in South Vietnam] since then."

    Minutes later, in a call to General Maxwell D. Taylor, until recently America's ambassador to South Vietnam, LBJ expounded on his recollection, and the general echoed it. "They started out and said, 'We got to kill Diem, because he's no damn good. Let's, let's knock him off.' And we did," Johnson told Taylor. "Yeah, that's where it all started," the general agreed. "That's exactly where it started!" Johnson replied, his anger palpable. "And I just pled with them at the time, 'Please, don't do it.' But that's where it started. And they knocked him off."


    Ronald Reagan, Man of Letters (posted 9-30-03)

    Editorial in the NYT (Sept. 28, 2003):

    Critics who judged Ronald Reagan a regrettably ordinary man, disengaged even when he wasn't on stage as president of the United States, may be surprised to discover that he was dedicatedly dashing off thousands of letters to a wide assortment of pen pals, political allies and even a few global enemies. A sampling from decades of letters has just been published, revealing a hunger for contact with all manner of people.

    His hand-jotted observations on practically everything in life - from the simple joy in a starry night to executive imaginings of a Star Wars missile defense - hardly present a threat to Marcus Aurelius's "Meditations" for introspective angst and wisdom in a leader. But they do reflect an egalitarian curiosity, affability and humility before fellow humans. His letters confirm not greatness so much as exultation in ordinary life.

    The middling actor turned president wrote as plainly to Leonid Brezhnev ("Isn't it possible that some of those obstacles are born of government aims and goals which have little to do with the real needs and wants of our people?") as he did to his old Hollywood barber upon the death of his wife ("I want you to know how deeply sorry we are").

    He took amusing pains to thank Richard Nixon for his 11-page, single-spaced memo of advice ("I can't thank you enough"). More believable was his thanks to Dwight Eisenhower, for early career advice on how to campaign for California governor ("my TV appearances profited by a reduction in verbiage"). Before his mind sadly faded in retirement, Mr. Reagan kept his letters free of statesman's bloviation. "I think the Soviets are really working to become as free as we are," wrote the last cold-war president in 1992 as Communism crumbled.


    Mark Oppenheimer: The 60's Was About Style (posted 9-30-03)

    Mark Oppenheimer, the author of Knocking on Heaven's Door: American Religion in the Age of Counterculture, writing in the Chronicle of Higher Education (subscribers only) (Sept. 30, 2003):

    When I began my book on mainstream religions and the counterculture, I figured it for the holy trinity of good academic projects: The topic had hardly been written about; there was a wealth of sources; and the sources would be fun. Now, "fun" for academics is a relative concept; after all, there are those who enjoy econometrics. So let me be more precise. When I say that I expected the 1960s to be fun to research, I mean by "fun" what normal people, those with at most a bachelor's degree, mean: sex, drugs, rock 'n' roll. I expected to study rebels, people who raged against the machine. Or at least people who enjoyed the hell out of life, and wove their own clothes.

    In researching five faiths -- Catholics, Episcopalians, Jews, Southern Baptists, and Unitarians -- I indeed discovered that three decades had not dimmed the psychedelic grandeur of the Nixon years. Among the fascinating characters I unearthed were the Rev. Jim Stoll, the first openly gay mainline minister (and most likely also a pedophile); the Rev. Alla Bozarth-Campbell, a pioneering Episcopal priest and author of menstruation-themed poetry; and Terry Nichols, who shed a rural, fundamentalist background to become a leading Southern Baptist opponent of the Vietnam War. I discovered that Mountain Girl, Jerry Garcia's wife, had been raised a Unitarian -- a fact I learned from Tom Wolfe's The Electric Kool-Aid Acid Test.

    Speaking of Tom Wolfe, I also got to read -- nay, was required to read, as a professional obligation -- the journalism of Hunter S. Thompson, Norman Mailer, and Garry Wills; the autobiography of William Sloane Coffin, a former chaplain at Yale University and an antiwar activist; and old period pieces like the first editions of Our Bodies, Ourselves and The Jewish Catalog. I spent many hours listening to record albums by now-forgotten Catholic folk singers. How did I love my job? Let me put it this way: When I rented Easy Rider, I deducted it as a business expense. All three times.

    But I suffered an unexpected disappointment. The scholarship on "the '60s" -- a swath of time that I mark from the rise of Martin Luther King Jr. in 1955 to the fall of President Richard Nixon in 1974 -- is not nearly as fun as the primary sources. Three scholarly generations of historians, from John Morton Blum to Allen J. Matusow to Lisa McGirr, have written thorough and judicious books about the '60s. They are for the most part elegantly written, too. Rick Perlstein's Before the Storm, about Barry Goldwater's presidential campaign, is a real page turner. But none of them is as fun to read as Shulamith Firestone's radical feminist tracts or an early issue of Ramparts.

    In part, this is a reality of scholarly writing; it's never as lively as polemics are, no matter how hard we try. We all have our favorite scholarly writers, the ones who really do make history or literary criticism fun to read, the ones we recommend to our students to prove that footnoted prose need not be insufferably dull. But even our favorite scholars are not the most delectable reads. If forced to purge my library of either Edmund Morgan or People magazine, out goes The Puritan Dilemma.

    But in comparing the primary sources from the '60s to the scholarly books about them, I realized there was another cause for the drop-off in readability. The scholars writing on the '60s seem to be incurably addicted to ideas. Or, if not ideas, then to political process. Or questions about democracy, the will of the people, the meaning of the welfare state, the incursions of the police state. Heavy stuff like that. No fun at all. While I am quite certain that many of those writing about the '60s really did live through them -- smoked dope or listened to Bob Dylan, or paid careful attention to those who did -- they somehow came out the other side convinced that what mattered most about the era were debates over Vietnam and the Great Society.

    They did matter, of course, as did women's lib, the Warren Court, and the victories of the civil-rights movement. But the most influential aspects of the '60s were not political, but rather aesthetic. After all, our experience in Vietnam may not have taught us much about the perils of nation-building or support of dictators; President Johnson's welfare programs to a great extent have been undone; and Supreme Court decisions can easily be overturned. What cannot be undone are the ways in which the '60s rid us of certain aesthetic and stylistic inhibitions. Ladies will never again be required to wear white gloves. The notion of "Sunday best" clothes becomes more antiquated every year. The conventions of rock 'n' roll, rap, and even country music now permit profanity, and they forever will.


    Stanley Kutler: Henry Kissinger, Historian? (posted 9-30-03)

    Stanley Kutler, writing in Salon (Sept. 30, 2003):

    Henry Kissinger, ever anxious to mold his place in history, is, as Ronald Steel has said of Richard Nixon, like the Ancient Mariner, anxious to tell his story over and over again. In his new book, "Crisis: The Anatomy of Two Major Foreign Policy Crises," Kissinger now returns (once more) to two key moments in his career, largely using recently released documents to buttress his case. He first discusses the Yom Kippur War of 1973, arguably the Nixon-Kissinger team's finest hour of diplomacy; and then he turns to the "peace with honor" settlement of the Vietnam War, which Adm. Elmo Zumwalt characterized as bringing neither peace nor honor.

    Few men in public life have understood the importance of the documentary record better than Kissinger. Somehow, he managed to leave public office with his records, and then stashed them in the Library of Congress, closed to historical researchers, except for his selected chorus of acolytes. Kissinger made millions of dollars writing memoirs from that record, all the while successfully preventing others from using his papers for nearly three decades. Similarly, his former deputy, Alexander Haig (who was later secretary of state himself, under Ronald Reagan), managed to depart office with all his papers. Nice team.


    History usually is written first with memoirs by participants, and then by disinterested historians, who uncover and explore the documentary evidence. Kissinger has given us an ample record of memoirs. But now he is anxious to provide, select and edit the documentary record himself, which he controls while he is alive. Why should we trust the completeness of these materials? Kissinger acknowledges that Condoleezza Rice herself approved and released some of these documents. Would she approve similar requests from historians? Understandably, she is busy these days; but then, historians other than Kissinger are not former national security advisors.

    Kissinger first focuses on the Yom Kippur War. For Nixon watchers, this is one of the most fascinating episodes of his presidency. October 1973, when the Egyptians attacked Israel, was Nixon's cruelest month. Watergate was approaching a decisive moment, as pressure mounted on the president to release the damning White House tapes. In the meantime, he had to deal with Vice President Spiro T. Agnew's pending indictment for tax evasion and bribery, charges that resulted in Agnew's plea bargain and resignation. Rep. Gerald Ford succeeded Agnew, but he was hardly Nixon's first choice; the president's diminished power left him no alternative. Finally, special prosecutor Archibald Cox refused to back down from his insistence that Nixon surrender his tapes. The president then dismissed Cox on Oct. 20, and Attorney General Elliot Richardson and his deputy resigned in protest. The ensuing firestorm again left Nixon with no choice, and a week later his lawyers meekly agreed to make the tapes available. Two days after Cox's firing, the House began its impeachment inquiry (which would ultimately lead to Nixon's resignation the following August).

    Nixon's ability to deal with the Middle East conflict was extraordinary. This book supplements other documents and materials that have revealed that role. He was in constant touch with Kissinger, sometimes personally and at other times through Haig. It is unlikely that Kissinger has given us the totality of Nixon's role; nevertheless, there is ample material to demonstrate that the president clearly was in charge and well focused.

    Nixon intuitively saw opportunity in the conflict. He would not allow either side to win a victory that would reinforce the resentments of the past. As the war proved more difficult for the Israelis, Nixon dispatched consumable military supplies despite Pentagon resistance. But Nixon had another tack: "[W]e've got to squeeze the Israelis when this is over and the Russians have got to know it. We've got to squeeze them goddamn hard." He regularly repeated that he would save the Israelis from being overwhelmed, but consistently added that he would not rescue them again. "I don't think it's going to cost us a damn bit more to send in more ... supplies," the president said, "but only for the purpose of maintaining the balance, so that we can create the conditions that will lead to an equitable settlement. The point is, if you don't say it that way, it looks as though we are sending in supplies to have the war go on indefinitely, and that is not a tenable position."

    The administration's refusal to allow the Israelis to destroy the Egyptian Third Army resulted in a cease-fire, more or less between equals. Whatever Egyptian President Anwar Sadat's motivations in beginning the war, events soon proved his determination to change things. There is a clear line that leads from Nixon and Kissinger's 1973 diplomacy to Sadat's dramatic visit to Jerusalem in 1977, and the Camp David Accords orchestrated by Jimmy Carter in 1978. The ensuing quarter-century has not entirely fulfilled the anticipated reconciliation, but there has been no armed conflict between the parties.


    Thomas J. Curry: The Confusion Concerning Religion and the First Amendment (posted 9-29-03)

    Bishop Thomas J. Curry, writing for the Martin Marty Center (Sept. 2003):

    The modern controversy that has engulfed discussion of the historical meaning of the First Amendment dates back to the decision in the Everson case in 1947. In that controversy, one of the mainstay arguments advanced by opponents of the decision is that the position I have just taken, i.e., that the federal government has no power to make theological statements or involve itself in religious matters, flies in the face of the historical experience of the United States in the decades following the enactment of the Bill of Rights. I agree. The facts are not in dispute. Subsequent to the enactment of the Constitution and the Bill of Rights, presidents, senators, congressmen, and candidates for office repeatedly invoked God and made religious pronouncements, and government continued to support religion in a multitude of ways.

    How does one cope with this anomaly? Does the practice of the times following the enactment of the Constitution, particularly the practice of those who participated in that enactment process, become normative for interpretation of the Constitution and/or the First Amendment?

    My response to that question is no. We do not look to the practice of the time as normative for what we mean by the statement in the Declaration of Independence that “all men are created equal,” or for how we are to deal with minorities, or with women. Nor do we seek out past practices to ascertain what the objectives mentioned in the Preamble of the Constitution—justice, domestic tranquility, the common defense, the general welfare, or the blessings of liberty—mean for us in our time. Rather, I believe that the principle embodied in the First Amendment—that government has no power or jurisdiction in religious matters—was enunciated within a particular historical context that shaped and limited people’s understanding of it. Within our own different historical and cultural context, we, too, have to endeavor to recover and apply that principle in a way that results in maximum liberty and promotes the common welfare.

    Although Americans, in the years from 1789 to 1791, adopted the radical principle that government has no power in religion, their cultural understanding and experience limited the application of that principle. They applied it in those areas of Church and State that particularly engaged them, and that had been clarified in their understanding by the experience of conflict, specifically religious persecution and the financial support of churches and ministers by way of public taxation. Those topics—especially taxation for religion—were realities that troubled America. Having solved them, the vast majority of Americans saw no other existing obstacles to religious liberty. In a largely homogeneously Protestant nation, few people could even imagine, let alone challenge, practices that others would view negatively as religious and sectarian. For most, such practices were part and parcel of the common coin of civilized living.

    Despite having made a very public proclamation that their new government was powerless in religion, that there was no proper American way of being religious, Americans proceeded to assume that there was indeed an “American” way. They came to believe what Professor Howe would argue almost two centuries later: that since American religious liberty had largely emerged out of American religious evangelism, the Amendment had to be read in the context of the theology of that evangelism—that absent State support of the religion that created it, religious freedom would wither and die. Hence they did not see the de facto establishment of religion they created (a modern description, not theirs)—one based on a common cultural-religious experience, democratic or congregational churches, a shared interpretation of history, common religious devotions, and Bible reading—as religiously oppressive, but rather as the context necessary for the preservation of the religious liberty they had brought into being.

    Only conflict could broaden Americans’ understanding of religious liberty and clarify the meaning of the First Amendment for a more pluralistic, diverse America. And conflict soon came—by way of Catholic immigrants, who, by the 1820s, began to arrive in significant numbers.

    Coming from a different worldview, a different religious experience, and a different interpretation of history, Catholic immigrants experienced America’s prevailing religious-cultural system as coercive and religiously oppressive. As a result of the clashes that followed upon their continuing arrival throughout the nineteenth century and well into the twentieth, America would abandon much of its de facto establishment of religion.

    This long conflict—often, but by no means always, manifested in its most intense form in the public schools—led to two major developments.

    First, it led to the evolution of a country and a government much more secular than those which Catholic immigrants had experienced when they first began to arrive. This secularization took place in the public schools, in the way people observed the Sabbath, and generally in the culture of the nation. I have argued that the coming of Catholic immigrants transformed America and made it more open to the diversity of immigrants who would arrive in waves in the nineteenth century and up until the First World War.

    This development, however, coincided with a deepening conviction on the part of the dominant American culture that religious freedom was in danger. It arose from the belief that religious liberty was the product of Protestantism, and that its survival depended on that religion. To those who thought in these terms, the coming of what they perceived as veritable hordes of Catholics and foreigners threatened American liberties—and particularly American religious liberty. The more Catholics altered the religious and cultural status quo, the more they demonstrated that there was no American way of being religious, and the more their critics were convinced that the religion and theology on which the Constitution and the First Amendment depended were being eroded.

    The conviction that since Protestantism had contributed so much to the Constitution and religious liberty it was essential to both and was embedded in the First Amendment—i.e., that America and Protestantism were somehow connected—entered deeply into the minds and attitudes of Americans in the nineteenth century. Indeed, as late as 1998, Professor Phillip E. Hammond of the University of California at Santa Barbara could write that “protestantized religious faith . . . lies behind the Constitution. It is a faith more in process than in substance, but a discernible substance is nonetheless there” (With Liberty for All: Freedom of Religion in the United States, xv). ...

    In fact, for much of nineteenth-century America, the separation of Church and State came to mean the separation and isolation of the Catholic Church, so that the true American religion—what Douglas and Boettner referred to as “Scripture truths and Americanism”—could prosper. Contemporary Americans rightly believed that evangelical Protestantism had contributed immensely to the emergence of religious liberty. However, the presence among them of large numbers of Catholics also convinced them that religious liberty would survive only if the State upheld and protected the theology and religious practice that had led to the creation of that freedom. The Church that was to be “separated” was the Catholic Church, so that what they thought of as the religion and theology necessary to religious freedom—what they regarded as ecumenical and not really amounting to a Church—could flourish and be sustained by government. Thus did the rallying cry “Separation of Church and State” come to be an utterly theological statement.

    The “wall of separation” was enshrined in constitutional interpretation by the Supreme Court in 1947, when Justice Black wrote: “In the words of Jefferson, the clause against establishment of religion by law was intended to erect a wall of separation between Church and State.” ....

    Our modern problem arises from the fact that government—the Supreme Court especially—has determined that the free exercise of religion is something guaranteed by government, that courts are to define and protect. As a result, understanding of the First Amendment is in utter disarray. Because judges assume themselves to be the protectors of religious liberty—rather than a threat to it, as the Amendment proclaims—they assume that they are the judges of what comprises that religious liberty. Thus they read the Amendment as containing substantive theological statements, of which there are currently two major contending theological interpretations.


    How Ecumenical Was Spain Under Islamic Rule? (posted 9-27-03)

    Edward Rothstein, writing in the NYT (Sept. 27, 2003):

    [In Granada] Christian conquerors unfurled their flag in 1492, marking the end of almost eight centuries of Islamic rule in Spain. Less than a decade later, forced conversions of Muslims began; by 1609, they were being expelled.

That lost Muslim kingdom — the southern region of Spain that the Muslims called al-Andalus and that is still called Andalusia — now looms over far more than the new mosque's garden. And variations of "the Moor's last sigh" — the sigh the final ruler of the Alhambra supposedly gave as he gazed backward — abound.

    The impulse to idealize runs strong. If Andalusia really had been an enlightened society that combined religious belief with humanism and artistry, then it would provide an extraordinary model, offering proof of Islamic possibilities now eclipsed, while spurring new understandings of the West. In Spain, that idealized image has even been institutionalized. In Córdoba, a Moorish fortress houses the Museum of the Three Cultures. There was once a time, the audio narration says, when "East was not separated from West, nor was Muslim from Jew or Christian"; that time offers, it continues, an "eternal message more relevant today than ever before." In one room, statues that include the 12th-century Jewish sage Maimonides; his Islamic contemporary the Aristotelian Averroës; and the 13th-century Christian King Alfonso X are illuminated as voices recite their most congenial observations.

A more scholarly paean is offered in "The Ornament of the World: How Muslims, Jews and Christians Created a Culture of Tolerance in Medieval Spain" (Little, Brown, 2002) by Maria Rosa Menocal, a professor of Spanish and Portuguese at Yale University. Ms. Menocal argues that Andalusia's culture was "rooted in pluralism and shaped by religious tolerance," particularly in its prime — a period that lasted from the mid-eighth century until the fall of the Umayyad dynasty in 1031. It was undermined, she argues, by fundamentalism — Catholic and Islamic alike.

    But as many scholars have argued, this image is distorted. Even the Umayyad dynasty, begun by Abd al-Rahman in 756, was far from enlightened. Issues of succession were often settled by force. One ruler murdered two sons and two brothers. Uprisings in 805 and 818 in Córdoba were answered with mass executions and the destruction of one of the city's suburbs. Wars were accompanied by plunder, kidnappings and ransom. Córdoba itself was finally sacked by Muslim Berbers in 1013, its epochal library destroyed.

Andalusian governance was also based on a religious tribal model. Christians and Jews, who shared Islam's Abrahamic past, had the status of dhimmis — alien minorities. They rose high but remained second-class citizens; one 11th-century legal text called them members of "the devil's party." They were subject to special taxes and, often, dress codes. Violence also erupted, including a massacre of thousands of Jews in Granada in 1066 and the forced exile of many Christians in 1126.

    In fact, throughout Andalusian history — under both Islam and Christianity — religious identity was obsessively scrutinized. There were terms for a Christian living under Arab rule (mozarab), a Muslim living under Christian rule (mudejar), a Christian who converted to Islam (muladi), a Jew who converted to Christianity (converso), a Jew who converted but remained a secret Jew (marrano) and a Muslim who converted to Christianity (morisco).

    Even in the Umayyad 10th century, Islamic philosophers were persecuted and books burned.


    Ford and Lindbergh, Anti-Semites Who Helped Hitler (posted 9-26-03)

    Christopher Simpson, writing in the Washington Post (Sept. 24, 2003):

    Max Wallace's ambitious joint biography of American icons Henry Ford and Charles Lindbergh retells some well-known stories and provides considerable new evidence of each man's deep anti-Semitism and disturbing relationships with the Third Reich. The book is based in part on new access to the Lindbergh family archives and the recent disclosure by the Ford Motor Co. of more than 90,000 pages of records concerning its Holocaust-era operations in Nazi Germany. Contrary to comfortable myths, Ford and Lindbergh were not "country club" anti-Semites who simply shared the prejudices of their time, Wallace writes in "The American Axis." They were the real thing.

    Indeed, based on the evidence Wallace has marshaled, it is fair to say that Ford's factories and Ford himself contributed significantly to Germany's war effort, and that Lindbergh rallied more support for Nazi Germany than any other individual in the English-speaking world. Neither "caused" the Holocaust, yet both share responsibility for its devastation. That each did so while waving an American flag and preaching patriotism should give us all pause.

Wallace's text provides an interesting counterpoint to this year's public relations hoopla surrounding the centenary of the Ford Motor Co. That celebration soft-pedals or remains silent about Henry Ford's use of Ford dealerships to circulate "The International Jew," a vicious bit of conspiratorial hate literature published in pamphlet form. Many experts agree that this publication did more damage to innocent people than Adolf Hitler's turgid "Mein Kampf." True, as Wallace reports, Ford eventually apologized, sort of, for his activities as America's most influential anti-Semite. But Wallace also documents that Ford privately disavowed the substance of that apology shortly after it was released. Ford's editor on the pamphlet, William Cameron, remained on the payroll for years as director of public relations, according to company archives. Henry Ford's personal politics remained largely unchanged to his death in 1947.

    Wallace reports that aviator Charles Lindbergh's enthusiasm for Hitler's Germany emerged from his surprisingly deep contempt for democracy and his fascination with the "scientific" racial genetics of his era. Social progress could be achieved only through tight restrictions on immigration, Lindbergh believed, along with birth control for poor people and forced sterilization for those he regarded as defective.

Not least, Wallace documents some operations of a cabal of relatively influential, anti-Semitic and frequently pro-German U.S. intelligence agents between World Wars I and II. Drawing on Joseph Bendersky's 2000 study, "Jewish Threat," and on files recently released under the Freedom of Information Act, Wallace traces the clandestine support for the "Communism Is Jewish" propaganda of extremist anti-Communist émigrés, as well as the highly questionable activities of Lindbergh's mentor Truman Smith, who served as the U.S. military attaché in Berlin in the years running up to World War II....

    Put briefly, Ford's most direct, documented role in Nazi Germany was his substantial investment in German automaking during the first decade of Hitler's rule. That in turn laid a foundation for Ford Motor's profiteering from slave labor during the Holocaust as a device to maintain market share and political influence throughout Nazi-occupied Europe. (Corporate headquarters in Dearborn, Mich., today argues, in effect, that some other Ford Motor Co. was responsible for the slave labor. There are many reasons to doubt Ford's claim, as Wallace ably demonstrates.)

    It is also clear that Ford's "International Jew" played a role in Hitler's evolution as an anti-Semite, in part because parts of the text were lifted and used nearly verbatim in "Mein Kampf." Wallace's evidence buttresses the widespread belief that Henry Ford helped support the early Nazi Party and American fascist organizations, though the documentation remains thin and details probably were never recorded in the first place.


    Utah Textbook Writer Neglects Dark Side of State's History (posted 9-26-03)

    Shane Johnson, writing in the Salt Lake City Weekly (Sept. 25, 2003):

    Upon learning, decades after the fact, from tenderfoot-turned-U.S.-Senator Ransom Stoddard that he was not “the man who shot Liberty Valance,” the editor of the Shinbone Star tore his notes to shreds and told Stoddard, “This is the West, sir! When the legend becomes fact, print the legend!”

    Director John Ford’s 1962 silver-screen rendition of Liberty Valance—a fictional American frontier bully who finally gets his comeuppance—confronts the fallacies of lore in the Old West. At the same time he’s debunking those myths, however, Ford reveals that when enough is at stake, whether the reputation of a politician nearing the end of a ride paid for under false pretenses or the sensibilities of a readership clinging to that bygone era, truth rides shotgun to keeping up appearances.

    By local standards, Utah: A Journey of Discovery is a relatively radical state-history textbook. In it, author and historian Richard Holzapfel profiles the “subversive” labor organizer Joe Hill, takes business interests to task on the environment and speaks sympathetically to the plight of Utah’s underclass. His text takes pains to interweave voices from every conceivable ethnic, racial, religious and social group he could muster, no matter how obscure their contributions may have been to the overall story of Utah. But the fallout from that over-inclusiveness could be why Holzapfel stumbled through the Mountain Meadows Massacre, glossed over the segregated colony of Iosepa and erased the last recorded lynching of a black man in the West.

As any textbook author will attest, such books are not written in a vacuum, especially when history is involved. Outside pressures come to bear whenever one person is tapped to tell so many stories from at least as many points of view. Salt Lake Community College Professor John McCormick knows that all too well. A small but vocal contingent led by a southern Utah rancher, a natural resources political consultant and the conservative think tank The Sutherland Institute lobbied for a rewrite of McCormick's fourth-grade Utah history textbook in 1997. The Utah State Textbook Commission acquiesced to those demands and ordered McCormick to eliminate parts of his book dealing with American Indian religion, Anglo-settler conquest and environmentalism. Since 12,000 of the books were already in use by teachers, the commission enacted rules barring widespread trial circulations of textbooks before official state approval.

    Writing in the wake of the flap over McCormick’s book, Holzapfel said he was forced to toe a certain line. “In this state,” he said, “there are so many groups that are willing to talk that they can block a good work.”

    So it is that, with nearly a century-and-a-half of historical hindsight at his fingertips, Holzapfel’s account of the 1857 massacre at Mountain Meadows squares snugly with the LDS Church’s version of events. Yet, the only time American civilians slaughtered other American civilians with more ghastly results was in April 1995. That’s when Timothy McVeigh wiped out 168 people with an explosives-packed Ryder truck in Oklahoma City.

    Holzapfel’s telling of the United States’ second-deadliest homegrown mass murder--of some 120 settlers, most of them children--has been emblazoned unchallenged in 12-year-old minds throughout Utah.

    Unchallenged, until now.

    Before it was approved, Utah Division of Indian Affairs Director Forrest S. Cuch briefly reviewed an already printed and bound copy of Holzapfel’s textbook and made no objections. But in light of Will Bagley’s award-winning 2002 treatment of the massacre, Blood of the Prophets: Brigham Young and the Massacre at Mountain Meadows, and another book by Sally Denton, Cuch is calling for revision of the passage dealing with the massacre, as well as other portions of the book he said unfairly portray American Indians. “I’ve taken the position that the Paiutes were not involved in the actual massacre,” Cuch said. “I agree with Bagley, and Denton and the Paiutes’ assessments.”

Irritated at seeing his name listed as an adviser to the textbook, Cuch immediately sent off a letter to Layton publisher Gibbs Smith.

    “I consider it shameful and a dishonor to continue to depict the Paiutes as co-conspirators of this massacre and murderers of innocent men, women, and children, when in fact, they did not participate. We must stop villainizing the Indian people,” Cuch wrote. “To continue to promote this negative image of the Paiutes is scholastically irresponsible and promotes racial stereotypes of Indians as savages. It also implicates all Indians resulting in extreme damage to the self-esteem of all Utah Indian people, especially Indian youth.”

    Cuch said he is worried that speaking out could cost him his job but added it is time to set the record straight for Utah’s first pioneers. “You want to know why you see drunken Indians walking up and down these streets?” he asked, nodding toward State Street from the lobby of his office building. “It’s because they’ve lost their dignity; they’ve lost their dignity because of books like this.”


    Chester E. Finn, Jr.: States Are Failing to Educate Students in History (posted 9-26-03)

    Chester E. Finn, Jr., writing about the findings of a new study by the Thomas B. Fordham Foundation (Sept. 2003):

    In the post-9/11 world, it's more important than ever for young Americans to learn the history of their nation, the principles on which it was founded, the workings of its government, the origins of our freedoms, and how we've responded to past threats from abroad.

    A well-crafted K-12 curriculum has an obligation to assure that students be deeply immersed in U.S. history (as well as civics, geography, world history, and more) and that graduates be knowledgeable about America's past. Though schools cannot be held exclusively responsible for forging good citizens—that solemn duty is shared by parents, churches and myriad other institutions—they have a unique obligation to handle the "cognitive" side; i.e., to make certain that young people gain the requisite knowledge and intellectual skills.

Yet assessment after assessment and study after study shows that history is the core subject about which young Americans know least. The fraction of students (in grades 4, 8 and 12 alike) who reach the "proficient" level on tests administered by the National Assessment of Educational Progress (NAEP) is smaller in history than in any other field. The situation has not improved since 1987, when Diane Ravitch and I authored What Do Our 17-Year-Olds Know?

    Though U.S. schools include some superb history instructors who are as effective in the classroom as they are passionate about their subject, far too many teachers of history are people who have never seriously studied this field themselves. (They may have been certified as "social studies" teachers after majoring in sociology, psychology, or social-studies pedagogy.)

    In an era of "standards-based" reform, we now understand that the subjects most apt to be taken seriously and taught well in our schools are those for which the state sets high-quality standards that make clear what teachers are expected to teach and children to learn; where the statewide assessment system regularly appraises how well those things are in fact being learned; and where the "accountability" system confers rewards and sanctions—on students, educators, and schools alike—according to how well they have succeeded in this teaching and learning.

In that context, however, U.S. history has not fared well. While almost every state requires students to sit through at least one course in this subject (typically in eleventh grade), history seldom even appears in statewide testing and accountability systems. Of the 24 states that have or intend to have high school exit exams by 2008, only nine include social studies among the subjects tested and, of the nine, just two (Mississippi and New York) test specifically in U.S. history.

    Unintended Consequences

    Today, the federal No Child Left Behind (NCLB) act of 2001 is the strongest force driving U.S. schools toward standards-based reform and stronger pupil achievement. Without intending to, however, NCLB may actually worsen the plight of U.S. history. By concentrating single-mindedly on reading, math, and science, it will likely reduce the priority that states, districts, and schools assign to other subjects. And by highlighting performance (or the absence thereof) in only those three core fields, it will focus the attention of state and community leaders on their schools' results in those subjects—and deflect their attention from others.


    A problem, yes, but one that states and schools can solve if they want to. NCLB is meant as a floor, not a ceiling. Nobody said schools ought not attend with equal fervor to other vital subjects in the curriculum. Moreover, forty-eight states (all but Iowa and Rhode Island) and the District of Columbia have already established academic standards in social studies, meaning that they have at least gone through the motions of detailing what they expect their teachers to teach and students to learn in this field.

    Those standards are necessarily and properly the starting point for determining what America actually intends its young people to know about their nation's history. Insofar as a state's testing and accountability system pays attention to U.S. history, it will (or should) be "aligned" with the state's standards. Those same standards are likely also to drive teacher preparation, textbook selection, and much more.


So they need to be taken seriously. They are the recipe from which the entire education system cooks. But how satisfactorily do today's state academic standards deal with U.S. history in particular? So far as we can tell, nobody has ever asked that question before. We at the Thomas B. Fordham Foundation and Institute, and various other groups (e.g., American Federation of Teachers, Albert Shanker Institute), have periodically examined state social studies standards in general. In 1998 and again in 2000, Fordham's expert reviewers examined them with specific reference to history and (separately) geography. Penn State professor David Saxe carried out the history reviews. But he looked (as we asked him to) at history in general, not U.S. history in particular.

    After the 9/11 attacks and the enactment of NCLB, we realized that American history itself needs renewed attention in our schools and that a good first step would be to review state academic standards for social studies (or, wherever possible, for history or, best of all, U.S. history) with a particular eye to their handling of America's own history.



    Californians Seek to Right an Old Wrong for 'Repatriated' Mexican Americans (posted 9-25-03)

    Eric Roy, writing in voanews.com (Sept. 23, 2003):

    During the Great Depression of the 1930s, federal authorities and various state and local governments illegally deported or "repatriated" an estimated one to two million Mexicans and Mexican-Americans. The official reason was to free up jobs for so-called "real" Americans. Yet despite the massive scale of the decade-long campaign, it's not recorded in many history books and has largely been forgotten.

    Emilia Castaneda was born in Los Angeles in 1926. Her brother was born here, too. One day, when she was nine years old, she came home from school and her father said the family had to leave for Mexico. Right away.

    "And I don't remember what happened to our possessions, our furniture," she said. "The only thing - my dad was packing a trunk, what little belongings we had, and we were there at dawn. That I remember. It was real dark in the train station."

    Scared and sad, Ms. Castaneda was one of up to half a million Mexicans and Mexican Americans forced to leave California during the 1930s. The so-called "Mexican repatriation" campaign also uprooted perhaps another 1.5 million people in Texas, Arizona, New Mexico, Illinois, Michigan and New York. It started near downtown Los Angeles on February 26, 1931.

    "It was a lazy afternoon, so to speak," said Francisco Balderrama, a history professor at California State University in Los Angeles, when U.S. immigration agents and Los Angeles police surrounded about 400 men, women and children in La Placita, the historic Olvera Sreet Plaza that still is a gathering place for Mexicans and Mexican-Americans.

    "And what they did is, they cordoned off the area and which many people would, maybe through shopping or would find themselves sitting on a bench, etc. And the trucks cornered off the area. And there was just a drive, kind of just picking up people and just rounding them up," he said.

    Professor Balderrama says everyone in the plaza that day was shipped straight to Mexico, with no word to their families. For many, it was the beginning of a second-class existence, vilified by Mexicans as repatriados - repatriated ones - with work hard to find, meals scarce and no electricity or plumbing.

    The historian is the co-author of Decade of Betrayal, the first book to fully document the campaign to force Mexicans and Mexican Americans out of their jobs, off welfare and out of the United States.

    Professor Balderrama says the initial raids were designed to boost political support for President Herbert Hoover among his Republican base and Democratic whites-only labor unions. In fact, he says, the head of the Local Citizens Unemployment Relief Committee trumpeted the cause during a Los Angeles press conference that preceded the first round up.

    "He announced the Immigration and Naturalization Service were going to conduct a series of raids to round up those other people, Mexicans in particular, and this would help provide jobs for American citizens," said Professor Balderrama.

    "Sixty percent of those that were deported were born right here in the United States and they were U.S. citizens, as anybody in the country was, at that time," he added.

    "The best-guess estimates are that, of the 1.2 million U.S. citizens that were deported, probably somewhere around 300,000 to 400,00 are still alive," said Joe Dunn, chairman of the California state senate select committee on citizen participation.


Reagan and Thatcher: "Linked by the Lord" (posted 9-25-03)

    David Rennie, writing in the London Daily Telegraph (Sept. 23, 2003):

    The extraordinary friendship between President Ronald Reagan and Margaret Thatcher - seen by outsiders as an historic alliance of political soulmates - was viewed by Mr Reagan as evidence of divine intervention, according to letters he wrote to her.

    "Throughout my life, I've always believed that life's path is determined by a Force more powerful than fate. I feel the Lord has brought us together for a profound purpose, and that I have been richly blessed for having known you," he wrote in 1994, days after she delivered a speech at a formal 83rd birthday tribute for the retired president in Washington.

    The note is in a cache of more than 5,000 handwritten draft letters unearthed by US researchers, extracts of which appear in a new book, Reagan: A Life in Letters, published in America today.

While both were in office, the pair forged a deep bond of trust that some historians credit with reinvigorating the Anglo-American "special relationship".

    It has long been known that Mr Reagan cherished the "Iron Lady" as a counsellor and sounding board. But a previously unknown letter shows that Mr Reagan - who saw international relations in intensely personal terms of trust, much like his disciple George W Bush today - was certain that God had called them together, to wage a sacred struggle against Soviet communism, and other "evils".

    "I am proud to call you one of my dearest friends, Margaret; proud to have shared many of life's significant moments with you, and thankful that God brought you into my life," Mr Reagan wrote, signing himself: "Sincerely, Ron."

    When it was written, in February 1994, both were out of office.


    Jewish? Africans Knew It All Along; DNA: Genes Support a Tribe's Belief (posted 9-25-03)

    John Murphy, writing in the Baltimore Sun (Sept. 25, 2003):

    The Jewish community in this dusty mountain village has some unorthodox customs to mark the Jewish new year. They slaughter a cow, eat its intestines, take snuff to expel demons and then, during an all-night ceremony held inside a hut with a cow dung floor, they dance, drink and sing, summoning the spirits of their ancestors for guidance in the year ahead.

    "It's almost the same as Rosh Hashana," says Ephraim Selamolela, a 62-year-old businessman whose family has been celebrating the holiday this way for generations as members of South Africa's Lemba tribe.

    Many Jewish communities would dismiss Selamolela's claims as outrageous. Even Selamolela admits that his tribe has lost touch with mainstream Jewish traditions. But the Lemba have not lost touch with their ancestry, he says. "We are Jewish," he claims. He also has DNA that he believes proves it.

    The 50,000 Lemba scattered among the foothills of the Soutpansberg Mountains in South Africa's Limpopo region have a number of traditions that have always set them apart from other African tribes.

They practice circumcision, and they don't eat pork or mix milk with meat, as prescribed by Jewish dietary laws. They keep one day of the week holy, and they bury their dead with their heads facing north, toward Jerusalem.

    According to Lemba oral traditions, the tribe was led from the Holy Land more than 2,500 years ago by a man named Buba, to a city in Yemen, and later crossed the Red Sea into East Africa, following a star that eventually brought it to present-day South Africa.

    They say they adopted local customs during their journey, like other members of the Jewish diaspora. They intermarried with African tribes, embraced African rituals and forgot many Jewish rituals and scriptures. European colonizers later converted many of the Lemba to Christianity. The Lemba don't have rabbis, synagogues or copies of the Torah.

    But their dietary laws and cultural practices, nearly identical to those in Jewish communities around the world, survived generation to generation, as did their belief that they share an ancestry with the Jewish people.

    For years the outside world dismissed the Lemba's claims as sheer fantasy. That changed in 1999, when geneticists from the United States, Great Britain and Israel discovered some backing for the claims.

    The researchers found that Lemba men carried a DNA signature on their Y chromosome that is believed unique to the relatively small number of Jews known as the Cohanim, who trace their ancestry to the priests of the ancient Jewish Temple and, ultimately, to Aaron, brother of Moses.

    The genetic discovery might have had a greater impact on Jewish communities that had rejected the Lemba's claims than on the Lemba, who never doubted their ancestry.

    "For the Western Jewish world it was an identity crisis, but for the Lemba it was a yawn," says Jack Zeller, president of Kulanu, an organization based in Silver Spring dedicated to finding and assisting dispersed remnants of the Jewish people.


    Did Kennedy's Many Illnesses Adversely Impact His Presidency? (posted 9-25-03)

    David Kaiser, writing on H-Diplo (Sept. 9, 2003):

[Robert] Dallek has shown that JFK was very gravely ill at various periods in his life and took a good many drugs during his Presidency. These included cortisone for his Addison's disease and, during 1961 and (I think) early 1962, painkillers for his back. They also included some downers and uppers, the former very common sleeping pills at that point. (Galbraith notes in one of his memoirs that during crises he, like Kennedy, used a barbiturate to help him sleep). And during a good deal of 1961 Kennedy's back problems put him on crutches.

    However, having studied his appointment calendar in great detail, I can assure the list that none of this prevented him from doing his job or, indeed, living a relatively normal life as President. He came to the office around 9:00, worked until about 1:00 (mostly in meetings--which, as tapes show, often followed quite a relaxed pace), and then disappeared for two hours. (And yes, as I noted in my book, we all know, now, what he was doing for some of that time, on many days.) Then he returned in mid-afternoon and went back to work until about 6:00. He socialized in the evening and went away nearly every weekend, either to Florida or Hyannis Port, usually. This was in sharp contrast to Johnson, a hopeless workaholic if ever there was one, who never relaxed. Literally never. (Even at the ranch he frantically hopped from one neighbor's ranch to another.)

    In short, while Kennedy had been critically ill for much of his life and concealed it to get into the White House, there really isn't any evidence that his health kept him from functioning very effectively in that office. And actually, that's what Dallek says, too.

    I have already had my say, in print, about Hersh's book. Hitchens is simply parroting Hersh. Neither of them has spent any time on the enormous documentation (tapes included) that we now have. This allows Hitchens, for instance, to claim that JFK ordered Lumumba's assassination--quite a trick, since Lumumba was already dead on January 20, 1961. It allowed Hersh to claim that JFK tried to send Lansdale to Saigon to handle the crisis in the fall of 1963, when the record shows clearly that Kennedy REFUSED Henry Cabot Lodge's request to have Lansdale sent.


    In Defense of the CIA's Bill Colby (posted 9-25-03)

    T.C. Wales, in an H-Net review of John Prados's Lost Crusader: The Secret Wars of CIA Director William S. Colby (Sept. 2003):

    In the history of the CIA, only three intelligence professionals--lifetime employees of "the Company"--have risen through the ranks to become Director of Central Intelligence (DCI): Richard Helms, Robert Gates and William Colby. Collectively, their careers span the first four decades of the Agency's existence, and all three men make fascinating character studies. Until now, however, only Helms and Gates have been featured in first-class biographies.[1] _Lost Crusader_, John Prados' impressive new life of William Colby, has redressed this imbalance. Colby, the gray eminence of the CIA's Vietnam struggle and a poker-faced witness during the Agency's nadir at the 1975 Congressional intelligence hearings, has his motives comprehensively deconstructed. The man who emerges demonstrates a curious blend of high ideals, stubborn dedication, and a significant capacity for moral compromise. Prados humanizes a cipher without glossing over his faults. In the process, Colby comes to embody the best--and worst--tendencies of his country, his generation, and his CIA.

    The grand scope of Prados' book appears to have developed by default. _Lost Crusader_ is not an authorized biography and the author glosses over much of Colby's personal life. Instead, Prados has given us an institutional history of Bill Colby's CIA, from the arrival of an idealistic young lawyer and OSS veteran at the Agency in 1950, to President Ford's ouster of a graying DCI in 1975. Yet this is more than a simple chronological narrative. Prados uses the vicissitudes of Colby's career as an operations officer in Scandinavia, Washington and Southeast Asia to address many of the perennial themes of American intelligence history. These include a persuasive leitmotif on the frequently self-defeating nature of American covert operations, and a timely discussion on the manipulation of intelligence by policymakers. Prados does not have an ideological axe to grind and he casts the Agency in a sympathetic light, while subjecting its failures to critical analysis.

    Perhaps the most important theme in _Lost Crusader_, however, addresses the persistent myth that the Agency is out of control--dangerously aloof from both presidential and congressional oversight. This allegation, which gained widespread public currency at the end of the Watergate era, was fed by the media frenzy that surrounded the Congressional intelligence hearings of 1975. Former CIA counterintelligence chief--and Colby rival--James J. Angleton appeared to lend credence to the Agency's critics in his testimony before the Senate Intelligence Oversight Committee. "It is inconceivable," he told a committee investigator, "that a secret intelligence arm of the government has to comply with all the overt orders of the government."[2] Angleton's remarks led the committee chairman, Senator Frank Church, to describe the CIA as a "rogue elephant." Church was wrong. The CIA never went "off the reservation" by making its own covert policy--although, as Thomas Powers notes, until Watergate it often served as the "President's personal Saturday-night gun."[3] The Agency was not out of control, but without Colby's decision to draw Congress into the oversight process, the government might have succeeded in selling that story to the public. For Capitol Hill and the White House bear the most responsibility for the CIA's misdeeds, whether through active direction or passive acquiescence. The Agency is merely their instrument.

    In 1975 Colby effectively ended the executive's monopoly on authority over the clandestine service by surrendering the Agency's "crown jewels" to Congress--the classified record of CIA-sponsored assassinations and coup attempts. This decision simultaneously antagonized President Ford, Henry Kissinger, the large coterie of Kennedy/Johnson hagiographers within the Democratic Party, and most of his own colleagues. It ended his career as DCI. But Prados joins an emerging consensus among intelligence historians by asserting that Colby's act of voluntary self-flagellation effectively "saved" the CIA in its current form. To be sure, revelations that Robert Kennedy plotted with "criminal elements" (read, the mafia) to kill Fidel Castro, or that Langley had advance warning of the coup that overthrew Salvador Allende, did nothing for the CIA's image at home or abroad. By proving that the Agency had performed its dirty tricks under the aegis of Presidential authority, however, Colby successfully "passed the buck."

    Ironically, his confession, and the enhanced congressional oversight it engendered, helped preserve the status quo at Langley. Prados argues that this was Colby's intent all along. The Church Committee and its counterpart in the House failed to impose any fundamental structural reforms on the U.S. intelligence community after their reports to Congress in 1976. Senator David Boren's hearings in the aftermath of the Iran-Contra scandal a decade later were similarly ineffective. This legislative paralysis was not in the best interest of the United States--since September 11 there have been many persuasive claims that the CIA is seriously dysfunctional.[4] Yet it helps Prados prove that Colby, branded an apostate by Agency insiders, was in fact a true believer, an old-fashioned "crusader" against the Communist monolith. The DCI thought the CIA had an indispensable role to play in the life and death struggle that was the Cold War, and willingly sacrificed his career to defend the Agency's privileged status.

Patriotic selflessness accompanied by a blinkered obsession with the communist menace was the central tragedy of Bill Colby's life. He was part of the first generation of bold, deeply serious young men who earned their stripes at CIA under Allen Dulles--the legendary "great white case officer" of the 1950s. Colby's initial assignments took him to Europe, the fulcrum of superpower confrontation. He organized groups of "stay behind" partisans in the event of a Soviet invasion of Sweden, and undermined communist influence through "political action" (black propaganda) in Italy. The enemy he found there practiced the most ruthless tactics and Colby shared the conventional wisdom that Moscow was behind these malevolent activities. Steeped in the historical "lesson" of Chamberlain's debacle at Munich, the CIA was determined to confront the Soviet menace wherever it appeared.

    In retrospect, it is clear that disaster ensued because the Agency--and the United States--conflated the communist monolith in Europe with the communist-oriented nationalists of post-colonial Asia. Mao Tse-tung and Ho Chi Minh may have been horrific dictators, but their struggles against foreign occupiers earned them a measure of popular legitimacy. This insulated them from Moscow's control, but also made it extremely difficult--if not impossible--for the United States to sponsor viable non-communist leaders. The failed attempt to export orthodox Cold War doctrine to Asia became a black hole that consumed the idealism, careers and lives of many Americans.

    Bill Colby was one of them. In 1956 he was named Deputy Chief of the CIA station in Saigon. Over the next decade and a half, Colby's intelligence, talent for operational improvisation, and becoming modesty earned him a succession of increasingly important posts, while the United States became enmeshed in the Indochina war. By 1963, Colby's clandestine contacts in Saigon had transformed him into an indispensable figure at Langley: DCI John McCone's point-man on Southeast Asia.

Unfortunately, like many Vietnam hands during the 1960s, Colby may have let his long experience in Saigon cause him to overlook the innate weakness of the regime. In his capacity as station chief--and later as head of the CIA's entire Far East Division--Colby developed a personal friendship with Ngo Dinh Nhu, brother of South Vietnam's reclusive dictator, Ngo Dinh Diem. The growing Saigon press corps was less keen on the ruling clique. David Mohr, a _Time_ correspondent, called them "the most neurotic family I've ever known about, even in history. They simply were a bunch of dingbats."[5] By 1963, Diem's brutality, cronyism, and foolish oppression of the Buddhist sects had produced serious unrest; he and his brother Nhu were murdered in the course of a military coup. Until the end of his life, Colby maintained that the Vietnam War could have been "won" if the United States had stuck with Diem. Prados rubbishes these "extravagant" claims as vestiges of the elusive "perfect strategy" theory--the fantasy that "a few more guns or dollars could have put the outcome in our grasp" (pp. 334, 339). The author is correct. If we accept that it was impossible for the South Vietnamese regime to achieve long-term viability, then political "victory" was unattainable. The holy grail that crusaders like Colby and General William Westmoreland sought did not exist in Southeast Asia.

    After Colby's obsession with the Ngo brothers ended with their overthrow in November 1963, he and his Agency colleagues played a more constructive role in Vietnam. Prados notes that the CIA was among the first organs of the U.S. government to comprehend the futility of continued American aid to Saigon. In the most intriguing part of _Lost Crusader_, the author shows that by late 1963 CIA analysts had adopted a pessimistic view on the effectiveness of U.S. intervention in Vietnam (p. 135, 138, 144, 180, 211). In the face of intense pressure from the White House and the Department of Defense, the Agency continued to produce well-researched, realistic reports on the state of the war effort. To be sure, the DCI (McCone, Helms and later, Colby) sometimes directed that these assessments be watered-down with more neutral language, lest they antagonize intelligence "consumers" in government. The Agency's prescient intelligence assessments were frequently undermined by the desire to please its political masters: Senator Church's "rogue elephant" had the heart of a mouse. Overall, however, the CIA's Directorate of Intelligence served as the institutional equivalent of George Ball: the skeptical Cassandra-figure that everyone heard and ignored on Vietnam.

    Disturbingly, under the Nixon Administration the Agency's tendency toward self-censorship became much more egregious. In addition, government officials actively manipulated CIA reports to suit their own policies. In 1974, Henry Kissinger, who was serving as both Secretary of State and National Security Advisor, directed Langley to prepare an unclassified paper on the level of "communist bloc aid to Hanoi" (pp. 284-285). Kissinger, who was lobbying Congress for more aid to Saigon, hoped the report would help bolster his case. When the study actually showed that outside support for North Vietnam was declining, the furious Secretary got the National Security Council to demand a rewrite, inserted his own language into the document, and forced the CIA to accept responsibility for the whole. If allegations concerning the Bush Administration's distortion of intelligence on Iraq's weapons of mass destruction are true, there will be an international scandal. _Lost Crusader_ makes a powerful case that such political shenanigans would hardly be unprecedented. It may also serve as a cautionary tale: the White House's attempts to evade responsibility or foist blame upon an unwilling CIA led to Colby's "rebellion" before the Church Committee.

    Prados' revelations may also change our perception of the CIA's role in the Cold War, and the men who ran it. _Lost Crusader_ leaves the impression that the CIA was frequently victimized by American political leaders, particularly during the Vietnam era. Successive presidents solicited the Agency's assessments, followed them when they were politically convenient, and manipulated or ignored them when they were not. If something went wrong, or the tide of public opinion changed, Langley served as a convenient presidential scapegoat (p. 299). When an operation went bad, CIA bungling, rather than unreasonable demands from the executive, was the inevitable conclusion. The only question seems to be why agency personnel were content to take it on the chin until Colby's confession at the Church/Pike hearings in 1975. Prados' answer is that the analysts and case officers of Colby's generation had a powerful patriotic streak and an abiding faith in their own government. Old loyalties died hard.

Colby's career in Vietnam after the Diem assassination is a case in point. Although he was dubious about the effectiveness of large-scale special operations in Southeast Asia after early 1964, at the behest of the government he became civilian director of the U.S. pacification program (the office of Civil Operations and Revolutionary Development Support, or CORDS). There, he oversaw efforts to weed out communist influence in the countryside that married American dollars with "Vietnamese" interrogation methods. It was an obvious prescription for the most revolting brutalization and murder; a shameful record that reached its apogee under the infamous "Phoenix" program, which resulted in the deaths of roughly 20,000 suspected Vietcong sympathizers. Although Colby took action to ensure "greater uniformity and respect for [due] process" by American "Phoenix" advisors, he must have known that the operation invited enormous "collateral damage" (p. 216). In one notorious incident Navy Lieutenant (and future U.S. Senator) Robert Kerrey led his unit into a Mekong Delta village at night. The soldiers were searching for an alleged NLF commander. Kerrey's team never found the commander, but left twenty-one civilians dead in their wake. Despite the best intentions, this was Bill Colby's legacy in Vietnam.

It is easy to condemn the Agency for its record in Indonesia, Chile, and Vietnam during Colby's tenure as a senior member of the Directorate of Plans (later Operations) and brief stint as DCI. Perhaps the most compelling feature of Prados' book, however, is that it makes us hesitate to pass judgment. The CIA is an instrument of the U.S. government. In the final analysis, the American people and their elected representatives must shoulder the responsibility for the use or abuse of the Agency's powers. In the aftermath of the September 11 tragedy, _Lost Crusader_ serves as a timely reminder that the United States cannot afford to dismiss the intelligence community as its quirky collective id: fascinating, sinister, but ultimately harmless. Its failings reflect poorly on, and endanger, all Americans. There will always be men and women like William Colby who are patriotic, self-sacrificing, and totally dedicated to the United States and their chosen creed. It is America's responsibility to make sure their idealism is directed toward honorable ends. Meaningful intelligence reform, stillborn under Colby in 1975, is imperative.

    Notes

[1]. Richard Helms is the subject of perhaps the best biography of a former DCI. See Thomas Powers, _The Man Who Kept the Secrets_ (New York: Random House, 1979). Gates has written a controversial memoir that, while far from the last word on his tenure at Langley, presents the end of the Cold War in an interesting new light. See Robert M. Gates, _From the Shadows: The Ultimate Insider's Story of Five Presidents and How They Won the Cold War_ (New York: Touchstone, 1997).

    [2]. Quoted in Robin W. Winks, _Cloak and Gown: Scholars in the Secret War_ (New York: William Morrow, 1987), p. 327.

    [3]. Thomas Powers, _Intelligence Wars: American Secret History from Hitler to Al-Qaeda_ (New York: New York Review of Books, 2002), p. 265.

[4]. See, among others, Rhodri Jeffreys-Jones, _Cloak and Dollar: A History of American Secret Intelligence_ (London: Yale, 2002), pp. 6-9; and William E. Odom, _Fixing Intelligence: For a More Secure America_ (New Haven: Yale, 2003).

[5]. Mohr quoted in William Prochnau, _Once Upon a Distant War: Young War Correspondents and the Early Vietnam Battles_ (New York: Times Books, 1995), p. 15.


    Bio Weapons Employed by the Ancients (posted 9-24-03)

    From the Discovery Channel newsletter (September 24th, 2003):

The legendary Trojan War was won with the help of poisoned arrows, in one of the first attempts at biological warfare, according to the first historical study of the origins of bio-terrorism and chemical weapons.

    "In this celebrated epic poem about noble heroes fighting honorable battles, both sides actually used arrows dipped in snake venom," said Adrienne Mayor, author of "Greek Fire, Poison Arrows & Scorpion Bombs: Biological and Chemical Warfare in the Ancient World" (published this month by Overlook Press).

    Mayor, a classical folklorist in Princeton, N.J., gathered evidence from various archaeological finds and more than fifty ancient Greek and Latin authors, revealing that biological and chemical weapons — horrible even by modern standards — did see action in antiquity.

Toxic honey, water poisoned with drugs, scorpion bombs, choking gases, conflagrations and incendiary weapons similar to modern napalm were widely used in historical battles. Among the victims and perpetrators of biochemical warfare were prominent figures such as Hannibal, Julius Caesar and Alexander the Great.

    "The first place we see the use of any kinds of poisons is in the story of how Hercules, the super hero of Greek myth, slew the gigantic, poisonous water-serpent Hydra. He dipped his arrows in the monster's venom, creating the first biological weapon described in Western literature," Mayor said.

    The "Iliad" provides several clues to primitive biological warfare. Written about 700 B.C., the poem centers on the war between the Greeks (or Achaeans) and the Trojans, thought to have happened around 1250 B.C.

Through memorable episodes, the poem recounts the legendary 10-year siege of Troy by King Menelaus of Sparta, who sought to rescue his wife Helen from her abductor, the Trojan prince Paris.

    "Several passages hint strongly that poisoned weapons were wielded by warriors on the battlefield, although Homer never said so outright. When Menelaus was wounded by a Trojan arrow, for example, the doctor Machaon rushed to suck out the "black blood." This treatment was the emergency remedy for snake bite and poisoned arrow wounds in real life," Mayor wrote.

Indeed, snake venom does cause black, oozing wounds. The snake species used in the Trojan War were vipers, as their dried venom remains deadly for a long time when smeared on an arrowhead.

    "I think it is entirely possible that what we would now call biological weapons were used by warriors in antiquity. My favorite example is Odysseus, whose weapon of choice was arrows smeared with poison," Robert Fagles, chairman of the Department of Comparative Literature at Princeton University, and translator of the "Iliad," told Discovery News.

    Indeed, Odysseus, the archer renowned for crafty tricks, was the first mythic character to poison arrows with plant toxins, Mayor said. Homer recounts that he sailed to Ephyra, in western Greece, on a quest for a lethal plant — probably aconite — to smear on his bronze arrowheads.

According to Mayor, the possibilities for creating arrow poisons from natural toxins were myriad in the ancient world: "There were at least two dozen poisonous plants that could be used to treat arrows. The most commonly used toxins came from aconite (monkshood or wolfbane), black hellebore (the Christmas rose of the buttercup family), henbane (Hyoscyamus niger), hemlock, yew berries and belladonna (deadly nightshade)," she said.

    Other toxic substances used for arrows and spears included venomous jellyfish, poison frogs, dung mixed with putrefied blood, the toxic insides of insects, sea urchins and stingray spines. Odysseus himself was killed by a spear tipped with a stingray spine, wielded by his estranged son by the witch Circe.

    "This is an important contribution to the history of chemical and biological weapons. Mayor makes a convincing case that these weapons have roots deep in human prehistory, and that they were actually used," biochemical warfare expert Mark Wheelis of University of California, Davis, told Discovery News.


    Poles Enraged By Memorial To Expelled Germans (posted 9-24-03)

    Roger Boyes, writing in the Times (London) (September 24th, 2003):

    A fierce dispute over the fate of Germans expelled from postwar Central Europe has plunged relations between Berlin and Warsaw to a new low only eight months before Poland is due to join the European Union.

    Poles are increasingly nervous that when they enrol in the EU in May they will be flooded by demands for financial compensation from Germans whose property was taken at gunpoint after the Second World War.

    The depth of Polish animosity was exposed by the latest cover of the country's bestselling news magazine, Wprost, which depicts the head of Germany's Expellee Association in an SS uniform straddling a submissive Gerhard Schröder, the German Chancellor.

    The dispute is over proposals for a Centre for the Expelled, commemorating the 12 million Germans who were thrown out of their homes in Poland, Czechoslovakia and Hungary when the borders were redrawn in Europe. Many were pushed into cattle lorries, German women were raped and, according to German accounts, there were several individual atrocities. Witnesses recall German women being strapped to the wheels of carts by angry Czechs.

    The Expellee Association, led by Erika Steinbach, a 60-year-old German Christian Democrat MP, wants to build the centre in Berlin to allow, as she says, "the Germans to mourn and remember those killed and dispossessed". She has received broad support from senior Christian Democratic politicians, prominent writers and even Social Democrats whose families were expelled after the war.

    The Poles are concerned not only about the possibility of court cases under EU law - dispossessed Germans could demand as much as €6 billion (£4.15 billion) - but also by the prospect of historical distortion.

    Polish historians say that within a generation Germans could be portraying themselves primarily as victims of war. The only big memorial centres in Berlin would be the Holocaust memorial to the six million Jews murdered by the Nazis in Europe and an information centre about persecuted Germans. Other victims would be ignored.

    "Nobody denies that there were German victims," said Leon Kieres, a law professor and director of Poland's Institute of National Remembrance, which researches and investigates crimes committed by the Nazis and the Soviet Union."But we have to remember that Germans were victims as a result of their own actions. Hitler was democratically elected and the people have to be responsible for its government."

    Leszek Kolakowski, the Oxford philosopher, has also criticised the planned centre. "Thousands of German women were raped by Soviet soldiers: does their destiny not deserve to be remembered? Why are those who are planning the centre not ready to take up the issue of these much worse, much crueller, much more painful persecutions?"

    The reason, he said, is simple: a memorial for Germans who have suffered at the hands of the Russians would provoke Moscow and would have no chance of extracting compensation, unlike the present project.

    Herr Schröder came round to the Polish view on Monday during a summit with Leszek Miller, the Polish Prime Minister: the centre should not be constructed in Berlin, he said. The aim should be to put such expulsions - the modern term is ethnic cleansing - in a European context, and the centre should therefore be set up in Sarajevo, Geneva or Strasbourg.

    Some Polish intellectuals are ready to accept the centre in the western Polish city of Wroclaw, known as Breslau in Germany. Houses occupied by Germans there were taken over after the war by Poles expelled from the Eastern territories by Stalin. A Wroclaw centre could present the problems of both Germans and Poles.

    The dialogue, however, has become too raw. The bilateral Polish-German summit on Monday, which had been designed to clear up differences on the EU constitution, was dominated by the dispute over the memorial centre. The Poles, in particular, have been demonising Frau Steinbach, who is rather a marginal political figure in Germany. In the Polish press, however, she has become the ugly face of the new Germany. Polish journalists discovered that her claim to champion the cause of expelled Germans is compromised by her own history: she is the daughter of a German army officer who was stationed in Poland, and she never lost an ancestral home herself.

    Some Polish newspapers reminded their readers yesterday that for many years Frau Steinbach had opposed Polish entry to the EU unless compensation was paid to dispossessed German families.


    The Myth of the Lazy Native (posted 9-23-03)

    Faezah Ismail, writing in the Malaysian New Straits Times (Sept. 21, 2003):

    The Myth of the Lazy Native was conceived in 1966 when Syed Hussein, then head of the Cultural Division, Department of Malay Studies, University of Malaya, posed the following question: why were the natives of Indonesia, Malaysia and the Philippines judged as lazy by hundreds of authors from the ruling colonial regime in the course of some four centuries?

    "There are probably thousands of published references on this theme. Neither the conduct of the natives, nor pressing political exigencies required the promulgation of such a judgement," he writes.

    That scholarly investigation took seven years to culminate in a book, which was published in 1977.

    "I did not like the way Southeast Asian history was treated by colonial writers," says Syed Hussein, now a principal research fellow at ATMA, on why he wrote his 267-page hardback.

    He wanted to correct a one-sided colonial view of the Asian native and his society.

    In Culture and Imperialism, Said, University Professor at Columbia University and an internationally renowned literary and cultural critic, classifies A Rule of Property for Bengal (1963) by Ranajit Guha and The Myth of the Lazy Native as post-colonial and specialist works, addressing a smaller audience about more specific issues.

    "Both these books, the former by a Bengali political economist, the latter by a Malaysian Muslim historian and social theorist, show their authors' assiduous archival research and scrupulously up-to-date documentation, argument and generalisation," says Said, who is also the author of Orientalism.

    Syed Hussein's book, as "startlingly original" in its own way as Guha's, also details how European colonialism created an object, in this case the lazy native, who performed a crucial function in the calculations and advocacies of what the Malaysian calls colonial capitalism, he adds.

    "Generally speaking, the colonial scholars' views of the native were denigrating and very much loaded with colonial interests," says Syed Hussein.

    Writing in The Myth of the Lazy Native, he adds: "The entire concept of humanity was derived from the interest of colonial capitalism. Gambling, opium, inhuman labour conditions, one-sided legislation, acquisition of tenancy rights belonging to the people, forced labour, were all in one way or another woven into the fabric of colonial ideology and given an aura of respectability. Those outside it were derided.

    "The ideological denigration of the native and of his history and society ranged from vulgar fantasy and untruth to refined scholarship."

    Consider the following suggestion by a German scientist that the Filipinos made oars from bamboo poles in order to rest more frequently.

    "If they happen to break, so much the better, for the fatiguing labour of rowing must necessarily be suspended till they are mended again."

    Such opinions were held by other scholars and educated people, says Syed Hussein.

    "Their persistence and repetition over at least two centuries in thousands of books and reports written by administrators, scholars, travellers and journalists, revealed their ideological roots," he adds.

    "Since the independence of Malaysia, Indonesia and the Philippines (from their colonial masters), the negative image of the native is no longer conspicuous in foreign writings.

    "There are writings critical of the economic or political situations in the countries but on the whole they do not contain direct denigration of the natives, their society and history."

    As the post-independence relationship between Southeast Asia and the West has evolved, the image of the native has also altered.

    Syed Hussein observes that "the ideological elements have been transformed and have assumed a new garb. The image of the indolent, dull, backward and treacherous native has changed into that of a dependent one requiring assistance to climb the ladder of progress".

    The "lazy native" concept is synonymous with domination, derived from the "false consciousness" of the colonialists, says Associate Professor Dr Ahmad Murad Merican, fellow and chairman of the Centre for Intellectual History and Malay Thought at the Institute of Knowledge Advancement, Universiti Teknologi Mara, Shah Alam.

    "But what may be seen as the genesis of the development of an autonomous social science tradition in Asia is in Syed Hussein's attacks on Asian intellectuals who continue to reproduce in their own thinking the colonial ideology that created and sustained the "lazy native" image. Such thoughts merely reinforce the doctrine of subjugation without them realising it," he adds.

    As Syed Hussein puts it in the introduction of The Myth of the Lazy Native: "I believe in the primarily negative influence of colonialism. I believe in the need to unmask the colonial ideology for its influence is still very strong."

    "He is against captive-minded intellectuals who are uncritical, unquestioning, in other words, refusing to challenge the received body of scholarship," says Ahmad Murad.

    Syed Hussein's interest in the phenomenon of the captive mentality dates back to the early 1950s when he was studying at the University of Amsterdam in Holland.

    He defines a captive mind in the non-Western world as one that is imitative and non-creative and whose thinking is based on Western categories and modes of thought.

    In 1956, as a postgraduate at the University of Amsterdam, Syed Hussein published an article, Some Fundamental Problems of Colonialism - in which he argued against the folly of aping Western thinking - in The Eastern World, a journal based in London.


    Italy Obsessed with the Killing of Aldo Moro (posted 9-23-03)

    Daniel Williams, writing in the Washington Post (Sept. 19, 2003):

    No event of the second half of the 20th century in Italy has a stronger grip on the public imagination than the kidnapping and killing by the Red Brigades of Aldo Moro, the leader of the once-dominant Christian Democratic Party.

    His death in 1978 inspired scores of books, hundreds if not thousands of essays and articles, and enough conspiracy theories to overwhelm Oliver Stone. Moro's murder is Italy's equivalent of the Kennedy assassination.

    This year, two feature films about Moro opened in Italian theaters. One was a rehash of a theory that the United States and other sinister forces had arranged his death. But the second, a psychological portrait of one of the terrorists, created a stir. "Buongiorno, Notte" ("Good Morning, Night") ends with a dream of Moro walking free from captivity, instead of the true climax, his body found crumpled and full of bullets in a stolen car parked on a side street in downtown Rome.

    Suddenly, all sorts of Italians are fantasizing that Moro came home alive -- his captors, aging politicians who announced they would not negotiate for his release, pundits and historians. It was as if Italy had moved from historical revisionism to psychological revisionism. In the days leading up to Moro's death, his survival seemed to rank low among many Italians' priorities. Twenty-five years later, it seems to be everyone's most ardent desire.

    "A worker came up to me who saw the movie," said Marco Bellocchio, the director. "He said that he was 20 when Moro was kidnapped, and he applauded. Now, he said, he cried."

    Even former members of the Red Brigades got into the act. After viewing the movie, Anna Laura Braghetti, who was one of Moro's captors and the person on whom the main character is based, said she was against killing Moro. "I was horrified. I imagined letting Moro go, but I didn't do it. I stayed in the Red Brigades," she told the newspaper Corriere della Sera, in her only interview after the movie's showing.

    Her memoir of the killing, called "The Prisoner," was one of the sources for the film. But she did not actually dream Bellocchio's dream. After finishing with Moro, she was involved in the coldblooded killing of a university professor.

    Eventually, Red Brigades members were hunted down and put on trial. The Moro killing was the beginning of the end of what they called "armed struggle."


    McGuire Gibson: We Are Losing the Cities of Ancient Sumer (posted 9-23-03)

    McGuire Gibson, professor of Mesopotamian archaeology at the University of Chicago, writing in Newsday (Sept. 21, 2003):

    We are losing the cities of ancient Sumer, where the earliest civilization began, and are doing little to stop it. For thousands of years, these cities have lain relatively undisturbed, as mounds in the desert of southern Iraq. But now, in the chaos of occupation, they are being destroyed by illegal diggers in search of artifacts.

    In the last few months, sleepy farming towns like Fejur, Rifai and Afak near the ancient cities have come to life as markets for the illegal antiquities trade. Sellers are hoping that some foreigner, such as a journalist, soldier or contractor, will pay them more than the agents sent by dealers, who give them only a few dollars per item.

    Some of the artifacts are being sold inside the country, but the best are smuggled to dealers in Europe, who will sell them to collectors there, or transship them to the United States, Japan and elsewhere. Then, often bearing fraudulent certificates of provenance, they are sold to collectors who justify their collecting by saying that the objects are much safer with them than they are in Iraq. In truth, without their eagerness to possess the artifacts, there would be no illegal digging.

    It is easy to dismiss the damage to Iraq's cultural heritage as an unfortunate by-product of war. All wars cause destruction of standing monuments and the theft of moveable artifacts. But most of the looting in Iraq has taken place after major hostilities had ceased. The huge market in illegal antiquities did not exist at all before the 1991 Gulf War, because Iraq had a strong antiquities law. Under the United Nations embargo, illegal digging occurred only at a few sites in the south. But now, in this period of occupation, there seems to be no limit to what will be destroyed.

    All of this goes on with little hindrance from the occupation forces and little interest by news organizations, in contrast to the looting of the Iraqi National Museum from April 10 to 12. The news coverage of that event resulted in quick U.S. action to create an inventory conducted by museum personnel and U.S. customs officials. Thus far they have documented the theft of 13,000 objects, and that catastrophic figure does not include hundreds of objects smashed and left on the floors.

    But the museum losses are fast being eclipsed by the number of items being ripped out of their original archaeological context in great sites such as Nippur, Isin, Shuruppak, Adab, Umma and Larsa, all in the south of Iraq. Although the situation has gotten some media coverage and is being documented by UNESCO, the National Geographic Society and the Iraqi State Board of Antiquities, the fate of the archaeological sites has not resonated as did the destruction of the museum.

    Perhaps this is because so many people have been to a museum and been fascinated by some ancient culture or another. But few have visited an ancient site; and in Iraq, even the most important ancient cities appear to be just big hills with remnants of adobe walls in pits dug by the archaeologists.

    Left undisturbed, these sites could have been excavated scientifically for hundreds of years, providing information on the birth and development of both eastern and western cultural tradition, answering research questions that scholars do not yet know how to ask. Such long-term research would employ generations of Iraqis and attract generations of tourists. Compared to these benefits, the short-term gain that the diggers realize is minuscule. ...


    Reagan's Letters: Supporters Say They Show His Serious Side (posted 9-23-03)

    From ABCNews.com (Sept. 21, 2003):

    "Reagan: A Life in Letters," a collection of letters being published today, includes more than a thousand letters from Reagan to world leaders, politicians, family, friends and ordinary Americans.

    Nancy Reagan told ABCNEWS' George Stephanopoulos that she hopes the letters will change some long-held perceptions of her husband.

    "You know, for so long, he was not taken seriously," she said. "Nobody knew that he ever did anything like this. They said he didn't read. He always read. He never went anywhere without a book."

    Mrs. Reagan said the former president was a natural writer.

    "He just wrote," she said. "He liked to write. He didn't like the phone at all. But he liked to write, and always has."

    Personal Touch

    Former Reagan economic adviser Martin Anderson, an editor of the book, said of the letters: "They were never meant for publication, but I think that they tell us more about who Reagan was than anything else we've ever seen."

    A key letter, according to Anderson, is the one Reagan wrote early in his presidency to Soviet President Leonid Brezhnev.

    "My own personal view," Anderson said, "is that letter that Reagan wrote early in 1981 was the beginning of the end of the Cold War."

    The letter is significant because Reagan insisted on sending his own personally handwritten letter to Brezhnev in addition to a more formal letter drafted by the State Department. Brezhnev had angrily expressed the view that the United States threatened Soviet security. Reagan addressed him directly:

    My Dear Mr. President:

    I regret and yet can understand the somewhat intemperate tone of your recent letter. After all, we approach problems confronting us from opposite points of view. …

    In your letter you imply … that we have imperialistic designs and thus constitute a threat to your own security and that of the newly emerging nations. There is not only no evidence to support such a charge, there is solid evidence that the United States when it could have dominated the world with no risk to itself made no effort to do so.

    When World War Two ended the United States had the only undamaged industrial power in the world. Its military power was at its peak — and we alone had the ultimate weapon, the nuclear bomb with the unquestioned ability to deliver it anywhere in the world. If we had sought world domination, who could have opposed us?

    Reagan then appealed to Brezhnev to think about the welfare of individuals:

    The peoples of the world despite differences in racial and ethnic origin have very much in common. They want the dignity of having some control over their individual destiny. They want to work at a craft or a trade of their own choosing and to be fairly rewarded. They want to raise their families in peace without harming anyone or suffering harm themselves. Government exists for their convenience and not the other way around.

    Brezhnev knew immediately he was dealing with a different breed of American president.


    After Historic Flight, Wrights Went to Court (posted 9-23-03)

    James V. Grimaldi, writing in the Washington Post (Sept. 22, 2003):

    [O]ften left out of the history books, though described this summer by National Park Service rangers at the Wright Brothers National Memorial, are the circumstances of Wilbur Wright's death at age 45. The tangled story involves the intricacies of intellectual property law, but it could be condensed into one question: Did a lawsuit kill Wilbur Wright?

    The history of the Wright brothers and the tale of the lawsuit merge with that of the New York intellectual-property boutique firm, Fish & Neave, which was founded the year of the Wright brothers' first flight.

    Fish & Neave has "shamelessly" used its Wright brothers connection in promotional materials, says the firm's resident historian, Albert E. Fey. The firm, which opened a D.C. office last year, is eager to retell the story as the nation prepares to celebrate the 100th anniversary of the first flight. Fish & Neave also are minor sponsors of a new Wright Brothers exhibit that opens Oct. 11 at the Smithsonian National Air and Space Museum.

    After the Wright brothers' successful flight near Kill Devil Hills, N.C., close to where Hurricane Isabel came ashore Thursday, it took three years to get a patent on the device and even longer to defend it. The Wrights sued another of the great aviation pioneers, Glenn H. Curtiss, for patent infringement over their discovery that "warping" of an aircraft's wings made powered flight possible. Such warping can be seen today on jetliners' ailerons, the trailing edge of wings.

    Curtiss was an ally of Samuel P. Langley, the Smithsonian secretary who invented the "great aerodrome," an unsuccessful aircraft that rivaled the Wrights' first flight aircraft. The Wrights and the Smithsonian feuded for 40 years before the Smithsonian honored the Wrights' achievement. Curtiss's primary invention was a lightweight engine necessary for powered flight, but he borrowed heavily from the Wrights' wing design.

    Frederick P. Fish, a founder of Fish & Neave, was hired by the Wrights in the patent-infringement litigation against Curtiss, said Fey, who was an equity partner until this year when he turned 70 and became of counsel. At the trial and appeals courts, Fish prevailed. But the litigation continued for 30 years until the Curtiss and Wright corporations merged to combine their efforts. Though Orville Wright and Curtiss sat on the board together, they never spoke.

    Wilbur did not survive the litigation. In an official history, Fey wrote, "In the interest of full disclosure, I must tell you that the Wright Brothers case went on for so long it may have killed Wilbur in the process. A little known fact is that we dragged him to Boston for a deposition, where he became ill. He never recovered."

    Wilbur Wright died of typhoid fever in May 1912. His last letter to Fish complained about how long the case was taking. "Unnecessary delays by stipulation of counsel have already destroyed fully three fourths of the value of our patent," he wrote on May 4, from Dayton, Ohio. "The opportunities of the last two years will never return again. At the present moment almost innumerable competitors are entering the field, and for the first time are producing machines which will really fly."

    Said Fey, "Clients always want things to go faster."

    Wilbur Wright died four days after writing the letter.


    Native American History Needs the Native American's Perspective (posted 9-23-03)

    Ericka Schenck Smith, writing in the Missoulian.com (Sept. 20, 2003):

    It may seem obvious that you can’t write American Indian history without including Indian voices, but a new generation of writers is really the first to be taken seriously for doing just that, a panel of writers said Friday during a Festival of the Book discussion at the Missoula Public Library.

    “One of the problems with so-called Native American history is that it’s not really Native American history; it’s white history with Native Americans in it,” said David Beck. Beck, a non-Indian, teaches Native American studies at the University of Montana and has worked intensively with the Menominee in Wisconsin.

    Walter Fleming, a Montana State University professor and author of “The Complete Idiot’s Guide to Native American History,” described two common problems in writing Indian history: On the one hand, there is a distrust of white historians among native people; and on the other hand, there is a distrust of native histories on the part of many academic historians.

    Fleming said he tried to give his Idiot’s Guide a strong cultural context – including creation stories and oral histories – so that people just learning about Indian history would gain a better understanding of the people.

    “These stories are rich,” Fleming said. “These stories are important. These stories are the foundation for people’s philosophies.”

    Stories are also the basis for an upcoming book researched by Germaine White of the Confederated Salish and Kootenai Tribes’ Natural Resources Department and Thompson Smith, the non-Indian director of the Tribal History Project.

    White and Smith delved into Salish and Pend Oreille oral histories to learn about how those tribes responded to their meeting with the Lewis and Clark expedition. (They didn’t think much of it, but did take pity on the pale men who looked so cold.) The process has been painstaking, but White said it is extremely important that the tribal elders’ voices are heard.

    And most importantly, White said, their stories debunk the notion that Lewis and Clark “discovered” much of anything. They show instead how the expedition wandered into lands occupied for thousands of years by people with rich cultures and strong traditions.

    Smith said projects like the ones he has worked on with White prove that tribes can write their own scholarly histories with a native voice.

    And Fleming said those kinds of collaborations are a benefit to the history profession, spawning a whole new school of writing.


    Was Hitler Hypnotized After WWI? Did this Account for His Feeling of Destiny? (posted 9-19-03)

    Danny Heitman, writing in 2theadvocate.com (Sept. 17, 2003):

    Dr. David Post didn't set out to become a globe-trotting lecturer on Adolf Hitler's mental health. It happened quite by accident for Post, who lives in Baton Rouge and works as a forensic psychiatrist.

    In 1991, while he was doing a residency in psychiatry at LSU Medical Center in New Orleans, Post made a vacation visit to the home of his uncle, the late Robert C. Holtze, in rural Minnesota.

    Holtze, an honorary consul to the Federal Republic of Germany, had a house on Lake Superior, some 20 miles from the Canadian border. Like many people who end up as house guests with a little time on their hands, Post began perusing his uncle's bookshelves. It was there that he found a copy of "Adolf Hitler," an acclaimed biography by John Toland.

    In a small footnote, Toland raised the possibility that Hitler might have been hypnotized during treatment for battle-related trauma while serving as a German corporal during World War I. Toland suggested that as a result of this hypnosis, Hitler might have experienced hallucinations that he interpreted as a supernatural summons to lead the German people.

    Could Hitler's hypnosis have contributed to his visions of grandeur, helping set the stage for World War II?

    Post was intrigued, but Toland's references raised more questions than they answered. As a psychiatrist, Post found those questions too interesting to ignore.

    So began Post's long road in researching Hitler's possible treatment by hypnosis, and its possible impact on world history. More than a decade later, Post's perspective on the issue is attracting attention from around the world. He's been invited to lecture on the topic at conferences in Munich, Atlanta, New Orleans, Indianapolis and elsewhere, and he's been featured on CBS Radio and in the pages of the Boston Herald. A lecture before a convention of Mensa members is in the works. Mensa is an international organization of people with above-average IQs.

    Toland's primary source of information about Hitler's medical treatment during World War I was a restricted U.S. Navy intelligence report, declassified in 1973.

    His interest piqued, Post contacted the National Archives and got his own copy of the report.

    "Just looking at that document, I was very impressed," Post recalled. "It got me interested in following up."...

    The report cited by Toland had originated during World War II, as American intelligence officials tried to fathom what made Hitler tick. It was written by Dr. Karl Kroner, an Austrian nerve specialist who recalled being present at a hospital in the Pomeranian town of Pasewalk where Hitler was treated in 1918.

    In October of that year, while fighting on the Belgian front, the 29-year-old Hitler had been temporarily blinded by a mustard gas attack and taken to Pasewalk to restore his health. Gradually, Hitler began to regain his sight. But after news of Germany's surrender, Hitler again complained of blindness.

    According to Kroner, a consulting psychiatrist, Dr. Edmund Forster, concluded that Hitler's blindness was a symptom of hysteria.

    At that point, Hitler's medical history becomes cloudy. When he rose to power in 1933, the German dictator had his treatment records from Pasewalk destroyed. After being arrested as a subversive by Hitler's Gestapo and being interrogated for 13 days, Forster committed suicide, further shrouding Hitler's medical history in secrecy.

    But Forster left behind some tantalizing clues suggesting that Hitler's treatment at Pasewalk included hypnosis.

    Rudolph Binion, another Hitler scholar whom Post encountered during his research, unearthed a connection between Forster and Ernst Weiss, who helped run a newspaper in Paris for German exiles during Hitler's regime. According to Binion, Forster traveled to Paris before his death and met with the expatriate newspaper's editorial board, which included Weiss. Forster reportedly shared Hitler's medical records from Pasewalk with the editorial board, though the whereabouts of those records are unknown today.

    However, in 1938, five years after Forster's trip to Paris and subsequent suicide, Weiss wrote "The Eyewitness," which is ostensibly a novel about a German corporal named "A.H." who is blinded during a mustard gas attack and treated by a psychiatrist at Pasewalk. A.H. is described as a patient with an Austrian dialect who is prone to giving hysterical speeches to the other patients. A.H. has received the Iron Cross, loves the music of Wagner, and hates Jews.

    For Hitler scholars Toland, Binion and, subsequently, Post, the parallels between the supposedly fictional A.H. and Adolf Hitler seemed too coincidental to ignore. The real attention-grabber, in their view, is a central chapter in which the psychiatrist hypnotizes A.H. and suggests that he must recover his sight in order to lead the German people. The doctor knows that A.H. fancies himself a statesman, and he apparently sees the prospect of greatness as a convenient way to encourage the patient's recovery.

    "Perhaps you yourself have the rare power, which occurs only occasionally in a thousand years, to work a miracle," the doctor tells A.H. "Jesus did it. Mohammed. The saints … You are young; it would be too bad for you to stay blind. You know that Germany needs people who have energy and blind self-confidence."

    Did Weiss use Forster's record of Hitler's treatment as the basis for "The Eyewitness"? Only Weiss could say for sure, and as the German Army entered Paris in 1940, he, too, committed suicide....

    The possibility that Hitler's sense of destiny might have been encouraged by hypnotic suggestion fascinates Post.

    "I think there's clear and compelling evidence that he was hypnotized," said Post, who published his findings in the November 1998 edition of the Journal of Forensic Sciences. "What I tell individuals is that they need to make up their own minds."


    Remembering Japan's Occupation of Manchuria--and Its Slave Labor Camps (posted 9-19-03)

    Jim Yardley, writing in the NYT (Sept. 19, 2003):

    The strangeness of it all was not lost on Robert Rosendahl as he walked through the humming Chinese factory, the place where as a World War II prisoner of war he had been a slave laborer for the Japanese. He had hated this factory, just as he hated the Japanese prisoner of war camp that took three years of his life....

    The question of remembering is a potent one for many Chinese here in the northeastern region, historically known as Manchuria. The return of the American P.O.W.'s came, by design, on a symbolic day in Chinese history — the anniversary of Sept. 18, 1931, the beginning of Japan's brutal 14-year occupation of Shenyang and of surrounding Manchuria, when untold numbers of Chinese were slain.

    For many Chinese, the lingering resentment and anger toward Japan is great. Today, an online petition signed by more than 1.1 million Chinese called on Japan to compensate Chinese victims of buried chemical weapons left by the Japanese after World War II. Last month, Chinese construction workers unwittingly struck buried mustard gas, killing one person and injuring dozens.

    This month, China and Japan held talks on the issue of buried chemical weapons, Reuters reported. So far, Japan has rejected the idea of compensation, contending that China relinquished such a claim when it established diplomatic relations with Japan in 1972....

    The most agonizing question — still not fully resolved — concerns whether the Japanese used the Mukden prisoners in medical and germ warfare experiments. Mr. Rosendahl remembers being forced to take shots, while Mr. Allen said he did not, but knew of other inmates who died after taking them. Greg Rodriquez Jr., whose late father was a Mukden prisoner, said guards had held feathers under the noses of sleeping prisoners, a known method of spreading bacteria.

    The old prison camp is now a battered apartment building for poor Chinese. Yet as Mr. Rosendahl and Mr. Allen went inside this morning, it took them a minute to recognize the place burned in their memories. They stood in a tiny room, since converted into an apartment, and seemed stunned at how much and how little had changed.

    "You recognize any of this?" Mr. Rosendahl asked Mr. Allen.

    "Not this TV," Mr. Allen replied, and the two men laughed.


    Has Italy Covered Up Its Fascist Past? (posted 9-18-03)

    Richard Owen, writing in the London Times (Sept. 17, 2003):

    An Italian historian has broken a national taboo by challenging the notion that Italy emerged on the side of the victors at the end of the Second World War thanks to the anti-Fascist Resistance.

    In a book published yesterday Gianni Oliva, an historian from Turin, accuses Italians of "failing to face up to our unspeakable past". Publication coincides with the 60th anniversary of the fall of Benito Mussolini, the Fascist dictator, in September 1943 and the outbreak of civil war between Fascists and partisans in German-occupied Italy.

    In 1944 Italy was liberated by Allied forces, with help from the Resistance. But, in The Alibi of the Resistance, Signor Oliva argues that history has been doctored and that the role of the anti-Fascist partisans has been inflated so that Italy could blot out the memory of its defeat.

    Standard school textbooks in Italy declare that the Italian Republic was born out of the Resistance. They say that a coalition of democratic parties first forced the state authorities to rebel against Mussolini and then "led the struggle against German occupation".

    James Walston, a British scholar who teaches international relations at the American University of Rome, said that this was postwar Italy's "founding myth", based on the idea that the partisans had cleansed the country of its Fascist past and given it renewed democratic virginity.

    "Not only is the Fascist period largely a blank in schools and even universities, there is now a rehabilitation of Mussolini taking place," he said. Whereas France and Germany had faced the past, Italy had not.

    Dario Biocca, an historian at Perugia University, said the reality was that most segments of society had backed Mussolini for more than 20 years. Il Duce had been overthrown not by the Resistance but by the Fascist leadership itself - with the support of King Victor Emmanuel III - because of catastrophic war losses.

    Signor Oliva said that the 1.3 million Italians who were taken prisoner of war by the Allies were shunned when they returned home "because the country wanted to sweep the reality of its defeat under the carpet". Italians preferred to cancel out the past so as not to delegitimise the new postwar political order. "We wanted to be absolved as a nation. Parts of our collective memory have been surgically removed."

    Corriere della Sera, the Italian daily newspaper, said: "Italians think we won a war, which in fact we lost."

    The debate follows controversy this week over remarks by Silvio Berlusconi, the centre-right Prime Minister, who said that Mussolini had been a relatively benign ruler who never killed anyone but "sent them on holiday in internal exile". Later he said that he had been speaking as an "Italian patriot".

    Signor Oliva said that the Italian Left, centred around the former Communist Party, had drawn its postwar legitimacy from the partisans, but the Right had also accepted the myth, partly because it preferred not to dwell on the Fascist past.

    One of the main parties in the Berlusconi coalition, the Alleanza Nazionale, led by Gianfranco Fini, the Deputy Prime Minister, is the direct descendant of Mussolini's Blackshirts.

    Signor Oliva said that the Resistance had been used by Italians to exonerate them from the need to come to terms with their own past. "We have to ask ourselves why a country which was defeated invented a false idea of itself as a victor, and why we still cling to this myth 60 years on," he said.

