Media's Take on the News 4-25-03 to 5-30-03
- What Palestinians Can Learn From a Turning Point in Zionist History
- Bill Clinton: Presidents Should Get a Chance to Serve 3 Terms
- Dick Morris on Sidney Blumenthal: A Second Term Is a Terrible Thing to Waste
- Zionists Are Facing Their Biggest Crisis in 100 Years
- Writing a Constitution for Europe
- Bush Faces the Challenges of FDR, Truman, and JFK Combined
- The Palestinians Have a State: Jordan
- Bush and 9-11: The Movie
- Iraqi Looting of Ancient Sites Goes on Under the Nose of US Soldiers
- History for Dummies
- What We Can Learn from the Pearl Harbor Memorial
- SARS Could Well Usher in an Age of Cleanliness
- Why Deserters Are No Longer Court Martialed
- James G. Blaine, Catholics and the U.S. Supreme Court
- Why is Roosevelt's birthplace honoring a pro-communist pol?
- New Klaus Fuchs Files
- The Godfather of the Neo-Conservatives
- America's Rewriting of History Books for Iraqi Pupils Is a Dangerous Business
- Richard Posner: Plagiarism Isn't Theft
- Weekly Standard's Parody of the Iraq War
- Law Prof: Presumption of Innocence Eroding
- University in Iraq Uses Elections to Right Course
- Did Peter Jennings Mislead Viewers About the Looting of the Iraqi National Museum?
- Reports that the U.S. Is Blocking Experts' Access to the Iraqi Museum
- Profile of the American in Charge of the Investigation into the Looting of the Iraqi National Museum
- Saddam Hussein's Monument to Himself: Babylon
- Michael Kinsley: What If Republicans in the 1990s Had Actually Passed the Balanced Budget Amendment?
- Headline in Slate: "George Walker Hoover?"
- The Falluja Massacre?
- Only 12 Items Looted in 1991 Have Been Returned
- NYT: Looting May Have Been Less Severe
- Ties Between the CIA and Academia
- Paul Samuelson: Our Plutocracy
- Mid-Term Report Card on George W. Bush
- Scholars Using the Internet to Help Rebuild Iraqi Museum
- Diane Ravitch: Textbooks Ruined by the Language Police
- Brookings Institution: The Other Vietnam Quagmire to Beware Of
- Accused of Warmongering
- Dick Morris: Bush Could Lose in 2004
- Is There a Good Chance of Building a Democracy in Iraq?
- Frank Rich: The Toll of Looting
- Pure Democracy in Iraq Would Be a Disaster
- Greenspan Should Have Warned People About the Stock Market Bubble
- George Will: U.S. Record on Regime Change
- American Mistakes in the Middle East
- SARS Compared to the Influenza Outbreak of 1918
- The Books the Military Is Handing Out to the Troops
- "The Market Is the Key to Preserving the Past"
- Wall Street Journal: Chart Showing "Litany of Destructions"
- Two Soldiers and a Tank
- The Federal Government Should Pay for NYC's Anti-Terrorism Program
Ethan Bronner, writing in the NYT (May 30, 2003):
When Ariel Sharon and Mahmoud Abbas, the Israeli and Palestinian prime ministers, met last night and prepared to see President Bush next week, one of the biggest issues they discussed was ending the terrorism of renegade Palestinian groups. Mr. Abbas said that by next week he hoped to have a pact with Hamas, the main Palestinian Islamic party, to halt violence against Israelis.
Mr. Sharon and his aides say a cease-fire pact is not enough, however; what is needed is to arrest and disarm the militants. What Israelis increasingly say is that the Palestinians need "their own Altalena." Little known to the outside world, the Altalena episode is frequently invoked because, without some equivalent, the Palestinian state may never come to be.
In the final years of the British mandate in Palestine, there was not one Jewish militia but several, just as there are competing Palestinian groups today. The main one, the Haganah, was led by Mr. Ben-Gurion. A more violent and radical one, the Irgun Zvai Leumi, often called simply the Irgun, was led by Menachem Begin. The Irgun, along with an even more radical group, the Stern Gang, was responsible for a massacre of more than 200 Palestinians in the village of Deir Yassin in April 1948.
A month later, after the British walked out of Palestine and Mr. Ben-Gurion declared the state of Israel, Arab armies attacked. On June 1, the Haganah and Irgun agreed to merge into the Israel Defense Forces, headed by Haganah commanders. The accord called on Irgun members to hand over arms and terminate separate activity, including arms purchases abroad.
But there remained the question of an old American Navy landing vessel bought by the Irgun's American supporters and renamed the Altalena. The ship, whose purchase had predated the June 1 agreement, was packed with 850 volunteers, 5,000 rifles, 3,000 bombs, 3 million cartridges and hundreds of tons of explosives.
Mr. Ben-Gurion wanted every soldier and bullet he could get and ordered the ship to dock. But Mr. Begin said the arms should go to Irgun troops. Mr. Ben-Gurion refused; at that point, Irgun men headed to the beach to unload the arms.
Mr. Ben-Gurion realized the challenge he faced. As he put it in his memoir, "I decided this must be the moment of truth. Either the government's authority would prevail and we could then proceed to consolidate our military force or the whole concept of nationhood would fall apart."
He ordered the Altalena shelled. Dan Kurzman, in his biography of Mr. Ben-Gurion, "Prophet of Fire," describes the old man sitting with his cabinet just before his decision, "his eyes inflamed from sleeplessness, his hair in even wilder disarray than usual," and saying, "The state can not exist until we have one army and control of that army."
After the volunteers disembarked, Mr. Begin boarded the ship, as did other Irgun fighters. The shelling began. When one shell hit and the Altalena burst into flames, Mr. Begin was hurled overboard by his men and carried ashore. The ship sank, along with most of its arms and more than a dozen Irgun members. Others were arrested, and the Irgun's independent activities were finally put to an end. "Blessed be the cannon that shelled that ship," Mr. Ben-Gurion declared, providing his political enemies on the right with a rallying cry against him for the next generation.
In his 1953 memoir, "The Revolt," Begin says he had known hunger and sorrow in his life but had wept only twice: once, out of joy, when the state was declared, and the second time, in grief, the night the Altalena was destroyed.
The point for the Palestinians is that until their radical militias are put out of action, those groups will always be in the position of spoilers. In 1996, the Palestinian Authority showed itself capable of confrontation, making widespread arrests of extremists in the wake of several suicide bombings. Thousands of militants were arrested. But most were eventually let go. The Palestinians must do it again and in a definitive manner. The Altalena is a symbol of that task because it involved genuine confrontation yet little loss of life. As Mr. Ben-Gurion wrote in his memoir:
"The incident caused near civil war among the Jews themselves. But in the eyes of the world we had affirmed ourselves as a nation. When the smoke cleared and the indignation died down, the population at large put itself squarely behind its government. The days of private armies were past, and, in the manner of every other well-organized state, we had the makings of a central command under government control."
David Stout, writing in the NYT (May 30, 2003):
The man once taught law, and he has personal experience with the workings of the Constitution, so perhaps his views deserve attention. He said it would be good if former presidents could return to the White House, even if they had already served two terms.
"For future generations, the 22nd Amendment should be modified," he said on Wednesday in a conversation with the historian Michael Beschloss at the John F. Kennedy Library and Museum in Boston. The man, William Jefferson Clinton, said he was not thinking about himself, of course.
Sure he was, the historian Douglas Brinkley of the University of New Orleans said today. The 22nd Amendment, enacted after Franklin D. Roosevelt became the only president elected to more than two terms, "has to drive someone like Bill Clinton crazy," Professor Brinkley said with a chuckle.
Professor Brinkley said Mr. Clinton is not just a physically fit man of 56 who has been put out to pasture. He is in withdrawal from "the ultimate power palace" of the White House who yearns to make up for his failures, and not just by party fund-raising or Jimmy Carter-style good deeds with hammer and saw.
Consider Andrew Johnson, the only other president to be impeached. Professor Brinkley noted that, after barely surviving his Senate trial and finishing Abraham Lincoln's second term under a cloud, Johnson went home to Tennessee, where he achieved a comeback as the only former president ever elected to the Senate.
But that path would seem closed for Bill Clinton. One of the senators from his new home state of New York is Hillary Rodham Clinton. The Clintons have had their troubles, but they may not want to campaign against each other. And it might be a stretch to ask New York voters to elect Clintons to both Senate seats.
Nevertheless, Professor Brinkley said, Mr. Clinton made "a reasonable argument."
"There may come a time when we have elected a president at age 45 or 50, and then 20 years later the country comes up with the same sort of problems the president faced before," Mr. Clinton said. "And the people would like to bring that man or woman back." He said he was not proposing that a president be elected to three or more consecutive terms, but that he be able to get elected again after an interim.
Professor Brinkley saw Mr. Clinton's point. "British prime ministers can come back," the historian said. "Why can't American presidents?"
They can't because of the 22nd Amendment, which took effect in 1951 and bars a person from being elected president more than twice. And despite Mr. Clinton's remarks, or because of them, there will be no rush to change it, historians said today.
"It will seem self-serving if there's an ex-president sitting around who might be affected," said Allan J. Lichtman, a history professor and presidential scholar at American University in Washington.
Professor Lichtman said that enacting a constitutional amendment, through a two-thirds vote of both Houses of Congress and ratification by three-quarters of the state legislatures, often takes several years. A colleague, James A. Thurber, said, "It would take a very popular president in his second term, like Reagan" to stir real momentum for a constitutional change. Professor Thurber is director of American University's Center for Congressional and Presidential Studies.
Historians sometimes invoke images of the frail, cloak-wrapped Franklin Roosevelt shortly before his death in 1945 as an argument that a president should not serve more than two terms.
But the authors of the 22nd Amendment had pure politics as well as history in mind. In 1947, with new majorities in both houses and smarting from their years out of power, Republicans pushed the amendment through Congress. The measure took four years to gather the necessary support among the states.
Given the bitter partisanship in today's politics, Professor Lichtman, who is 56, said, "I do not expect to see another constitutional amendment in my lifetime."
Professor Brinkley also had a prediction: "The only way you'll ever see Bill Clinton in the White House again is in the role of First Man."
Dick Morris, writing in frontpagemag.com (May 30, 2003):
A second term is a terrible thing for a president to waste. Sidney Blumenthal's new book makes clear how totally Bill Clinton wasted it. He was a one-term president who lived in the White House for eight years.
The Clinton Wars speaks not about the war on terror or the war on drugs or even the war on poverty. Instead, it's about the wars that occupied Clinton in his second term: on Paula Jones, on Kenneth Starr, on the Washington Post's Susan Schmidt, on Matt Drudge, on Clinton's women and the war to get Hillary into the Senate.
For those who haven't plumbed the depths of the Clintons' denial mechanisms and their obsession with petty revenge, Blumenthal's book offers a road map into their distorted perceptions of reality. Like an account of a hallucination, he takes us into a world where Monica Lewinsky blackmails Clinton into sex, Whitewater is the epitome of innocence and Starr the personification of evil.
In Bill's and Hillary's world, no accusation is too ridiculous to make against their enemies, no transparent fraud of their own too obvious to attempt to conceal and no justified criticism too reasonable to resent. The roster of those who maliciously attack the Clintons' virtue is so long that it makes President Nixon's enemies list seem trivial by comparison. Only the list of the Mikado's Lord High Executioner is longer.
Hillary's own book is due out next month. The platitudes that are likely to festoon the former First Lady/presidential wannabe's book are not the real Hillary: Sidney's book is her true voice. In a world in which ghostwriters assist celebrity authors in their memoirs, this book is an odd role reversal. Here, Hillary is the ghost putting her prejudices, animosities, biases, resentments, fulminations, and paranoid mutterings in Sidney's mouth.
Hillary needed someone to affirm her credentials as a New York Yankee fan, so Blumenthal obliged. She wanted a benign description of her acceptance of the need to have a special prosecutor, so his book portrays her as philosophically accepting it. (By contrast, both George Stephanopoulos and I recall her obstinate refusal and tearful ranting against the appointment.) Mrs. Clinton needed to affirm that she was the author of It Takes A Village, so Blumenthal attests to it, despite the fact that her ghostwriter was paid $120,000. Hillary wants to grab some undeserved credit for the Irish peace process, so Blumenthal obligingly informs us that her "work" made it all possible.
Anything that needs doing, Blumenthal does in this book, as he did in the White House. This 800-page application for a job in a Hillary White House shows his willingness to buy any line she hands out and treat it as gospel. One can imagine Sidney as her Bob Haldeman, sitting across the Oval Office desk, willing to do anything she wants, copying down her most delusional and paranoid proposals and seeing them through to full implementation.
Michael Freund, writing in the Jerusalem Post (May 28, 2003):
As a result of this past Sunday's vote in the Israeli cabinet, Zionism now finds itself confronting the gravest identity crisis it has known in the past century.
Not since 1903, when the Sixth Zionist Congress indicated a willingness to consider Great Britain's proposal to create a Jewish national home in Uganda, has the movement come so perilously close to abandoning its ideological moorings.
Indeed, there are striking similarities between the Ugandan road map and its Palestinian equivalent, and a look at the former provides an intriguing clue as to how best to defeat the latter.
The Uganda plan was born precisely 100 years ago this past summer, when Theodor Herzl, father of political Zionism, was summoned to London for a meeting with British Colonial Secretary Joseph Chamberlain. Chamberlain had just returned from a visit to Africa and told Herzl that while he was there, "I saw a country for you: Uganda. On the coast it is hot, but in the interior the climate is excellent for Europeans - I thought to myself: that's just the country for Dr. Herzl."
Herzl, of course, was less than enthused by the idea. After all, Jews had spent the previous 2,000 years longing for the hills of Zion, not the jungles of Kampala.
But after the British Foreign Office officially presented the proposal to him in August 1903, Herzl decided to bring the Uganda Project to a vote at the upcoming Zionist Congress, which was set to meet in Basel.
Herzl and his allies portrayed the plan as a temporary solution and an emergency measure, but many of the delegates were outraged, labeling it a betrayal, and a storm of protest quickly ensued.
Eyewitnesses described tumultuous scenes which "continued into the small hours of the morning." In the end, it was only due to the personal prestige which Herzl commanded that the Congress voted to send a committee to Uganda to investigate its viability as a possible Jewish national sanctuary.
In both instances, then, we find a superpower putting a plan on the table whose underlying principles run counter to everything Zionism stands for. In 1903, the idea would have meant forgoing the Land of Israel, while in 2003 it means dividing it.
George Parker, writing in the Financial Times (London) (May 29, 2003):
Valery Giscard d'Estaing likes to compare the work of his European Convention with that of the American founding fathers, authors of the 1787 US constitution.
But one big difference emerged yesterday. While James Madison and his colleagues in Philadelphia introduced the US constitution in a single sentence, the former French president needed six paragraphs to distil the values and aspirations of the European Union.
"Flowery and pretentious," was the verdict of Andrew Duff, a British member of the Giscard convention and normally an admirer of the former French president.
Mr Giscard d'Estaing has high aspirations for his long-awaited constitutional preamble: he says he wants it to be taught in schools.
But yesterday there was a critical reaction from many, who felt it either looked like it was written by a committee, or that it had too many lyrical Giscardian flourishes....
According to historians, Mr Giscard d'Estaing should have been more ruthless with the editing. The US constitution, whose preamble famously starts "We the people of the United States", is introduced in a single pithy sentence.
The French constitution, which includes a "dedication to the rights of man", has a short two-sentence introduction. Germany's is even more concise, beginning: "Conscious of their responsibility before God and Man."
"It strikes me that there are three key differences between the European draft and the US preamble, like the US constitution itself: brevity, generality and concreteness," said Andrew Moravcsik, professor of government at Harvard University. "The European version canvasses the past, the present and the future; it touches base with every possible ideal in a pluralistic society."
Mr Giscard d'Estaing's spokesman observed that the framers of the US constitution had one big advantage when it came to keeping things tight: "European history is a bit longer than American history at the time they wrote their preamble."
Larry Siedentop, a fellow of Keble College, Oxford, said he disliked the fact that while the US constitution is written as if by the people, Giscard's version appears to be addressed to Europe's citizens.
"It's not the people speaking, it's speaking to them," he said. "There's also a lack of conviction which comes through." Andrew Duff, once described by Giscard as his "Socrates" on the convention, believes that even the first sentence is contentious. "I'm not sure what the Chinese would make of the idea of Europe bringing forth civilisation," he said. "I think it's too long, and it's flowery and pretentious in both English and French."
Mr Duff and his colleagues on the 105-member convention have just three weeks to persuade Mr Giscard d'Estaing to change his draft text. But the grand old man of European politics will not lightly surrender the preamble, which he has been mulling over for the 16-month lifespan of the European Convention.
Noemie Emery, writing in the Weekly Standard (June 2, 2003):
ALL THROUGH the Clinton administration and into the 2000 election, some said we had run out of history. It had been tapped out, like an overused resource. It had run dry, like a well. Then came September 11, and history came flooding back with a vengeance, swamping us all in a torrent of crisis and incident. We have so much history now that we have nowhere to put it. We have a history glut. Elected in peace, George W. Bush has become a war president, fighting hot wars and covert wars on terror, while trying to rebuild the Atlantic alliance and bring peace and order to the Middle East. He is making history more than he ever imagined, but he is also reliving it, in an unusual fusion of incidents. We are reliving not one but four past crises: 1938, when the League of Nations failed to stop German aggression; 1941, when the Japanese bombed Pearl Harbor; 1962, when the Soviets slipped missiles into Cuba, forcing Kennedy to threaten a preventive war to get them removed; and 1946....
IN THE COURSE of the year 1946, President Harry S. Truman came to understand that the assumptions he'd held about the world when he'd become president one year before no longer applied to the world he was living in, and the alliances inherited from World War II would have to be wholly reconfigured. Between September 11, 2001, and March 2003, George W. Bush came to understand that the assumptions he had held about the world when he'd become president one year before no longer applied to the world he was leading, and that the world order would have to be reconstructed and remade. Truman found that the coalition that had won World War II was no longer stable; that Russia was an enemy and China becoming one; and that the Western nations would have to rebuild their defeated Axis enemies as part of the new non-Communist bloc. Bush came to see that the alliance that had won the Cold War was splintering; that France and Germany were now at odds with the United States and Britain, which went to find allies among their former opponents, the once-captive Communist satellites. Both were surprised by the bad faith of their one-time war partners. Both were accused by their liberal critics of having started the quarrels themselves. ...
Truman and Bush both started from the simple desire to safeguard their country, and gradually moved to the final idea that the only way to fight communism and terror was to end the conditions that made them appealing. Truman understood that his postwar world would have no lasting security unless he turned Japan and Germany into stable democracies. Bush understands that his world can have no real security without bringing reform and order to the terror-spawning Middle East. The hardest job of the 20th century went to Franklin D. Roosevelt, but the toughest decisions belonged to Harry S. Truman, who had to name, frame, and contain a wholly new form of trouble. Truman's problems, along with those of FDR and John Kennedy, now have all come to President Bush. ...
LET US NOT play up the strains of the present by running down those of the past. No president had a worse job than Franklin Roosevelt, a worse week than John Kennedy, a worse set of choices than Harry S. Truman, so many of which could have gone wrong. But no president other than Bush ever faced so many conflicting cross-pressures and strains. FDR joined a coalition when Pearl Harbor was bombed; he did not have to create one. Truman faced his worst moments only after containment and NATO were safely in place. Kennedy had no hostile Hans Blix to contend with, and the weapons he confronted were large enough to be photographed, and too large to be easily hidden or carted away.
Rachel Neuwirth, Los Angeles-based analyst on the board of the American Jewish Congress, writing for a forthcoming issue of ChronWatch:
The creation of a second Arab/Palestinian state, run by the terrorist PLO, will cause a humanitarian crisis and economic disaster for both Jordan and Israel. If we are ever to achieve peace between Israel and the Arabs, we must know history and face reality: the Arab people who live in what is now Israel (including Judea, Samaria, and Gaza) and in Jordan are one people.
The solution to the Middle East conflict lies in recognizing that 78% of historical Palestine is today called Jordan, and 80% of Jordan's population is (so-called) Palestinian. (See maps of historical Palestine: http://masada2000.org/historical.html.) The only rational conclusion is that there is already a state for the Palestinian Arabs: the Hashemite Kingdom of Jordan.
Dr. Kadri Toukan, a former Jordanian foreign minister, declared on December 9, 1970, that ''Jordan is Palestine and Palestine is Jordan.''
Anwar Nusseibi, a former Jordanian defense minister, stated on October 3, 1970, ''The Jordanians are also Palestinians. This is one state; this is one people. The name is not important. The families living in Salt, Irbid, and Karak not only maintain family and matrimonial ties with the families in Nablus and Hebron, they are one people.''
Ahmad Shuqairy, the first president of the PLO, told the Palestine National Council in May of 1965 that, ''Our Jordanian brothers are actually Palestinians.''
Further, the Washington, D.C. website of the Hashemite Kingdom of Jordan says:
''The ... close historical and geographic relationship between Jordan and Palestine over the ages, together with ... the national affiliation and cultural position of Jordanians and Palestinians ... have endowed this relationship with a special and distinctive character. It is bolstered by the strong ties and deep common interests that exist between them'' (http://www.jordanembassyus.org/new/aboutjordan/nationalcharter.shtml#7).
This implies that King Abdullah of Jordan should have taken responsibility for the plight of his own people: the Arabs who live in Israel and the West Bank. Jordan still has enough land to absorb all the Palestinian Arabs who presently reside in the West Bank. Jordan has 35,000 sq. miles of contiguous land.
If President Bush wants to achieve peace, his road map must reflect the fact that the Hashemite Kingdom of Jordan is, essentially, ARAB ''Palestine!''
As recent history has painfully taught us, Israel and Palestinian Arabs can never live peacefully together. Most Palestinian Arabs declare day and night that their goal is to destroy Israel. And many Israeli Arabs living in Israel have proven themselves to be disloyal to the Jewish state.
However, since the 1994 peace treaty with Jordan, Israel and Jordan have lived side by side in relative peace. Only two states, the Jewish ''Palestinian'' state and Jordan, the Arab ''Palestinian'' state, are truly capable of finding a resolution to the so-called Palestinian dispute! Jordan and Israel comprise 95% of British Mandated Historical Palestine and were the only legitimate successors to British Mandated Palestine.
The separation of Arab and Jew is imperative for regional stability. Furthermore, the international funds that currently support the PLO ''thugocracy'' should be redirected to support Jordanian and Israeli diplomatic efforts toward achieving an equitable solution to the conflict in the Middle East.
Jordan should grant citizenship to all Arabs within Israel, including Judea, Samaria, and Gaza. Those who want to locate elsewhere can do so (backed by international financial support, for a limited time). Those who choose to remain in (pre-'67) Israel will be given alien status, with no political rights, since Israel is the JEWISH ''Palestinian'' state.
Doug Saunders, writing in the Globe and Mail (Canada) (May 29, 2003):
Trapped on the other side of the country aboard Air Force One, the President has lost his cool: "If some tinhorn terrorist wants me, tell him to come and get me! I'll be at home! Waiting for the bastard!"
His Secret Service chief seems taken aback. "But Mr. President . . ."
The President brusquely interrupts him. "Try Commander-in-Chief. Whose present command is: Take the President home!"
Was this George W. Bush's moment of resolve on Sept. 11, 2001? Well, not exactly. Actually, the scene took place this month, on a Toronto sound stage.
The histrionics, filmed for a two-hour TV movie to be broadcast this September, are as close as you can get to an official White House account of its activities at the outset of the war on terrorism.
Written and produced by a White House insider with the close co-operation of Mr. Bush and his top officials, The Big Dance represents an unusually close merger of Washington's ambitions and Hollywood's movie machinery.
A copy of the script obtained by The Globe and Mail reveals a prime-time drama starring a nearly infallible, heroic president with little or no dissension in his ranks and a penchant for delivering articulate, stirring, off-the-cuff addresses to colleagues.
That the whole thing was filmed in Canada and is eligible for financial aid from Canadian taxpayers, and that its loyal Republican writer-producer is a Canadian citizen best known for his adaptation of The Apprenticeship of Duddy Kravitz, are ironies that will be lost on most of its American viewers when it airs on the Showtime network this fall....
Lionel Chetwynd, the film's creator, sees nothing untoward about his role as the semi-official White House apologist in Hollywood. For him, having a well-connected Republican create the movie was a way to get the official message around what he sees as an entertainment industry packed with liberals and Democrats.
"A feeding frenzy had started to develop around this story, and a lot of people who wanted to do this story had a very clear political agenda, very clear," Mr. Chetwynd said in an interview from his Los Angeles home yesterday.
"My own view of the administration is somewhat more sympathetic than, say, Alec Baldwin's. . . . In fact, I'm technically a member of the administration [Mr. Chetwynd sits on the President's Committee on the Arts and Humanities], so I let it be known that I was also interested in doing it. I threw myself on the mercies of my friend Karl Rove."
Mr. Rove is the President's chief political adviser, so this was not a typical Hollywood pitch. But then, Mr. Chetwynd is not a typical Hollywood writer-producer: He is founder of the Wednesday Morning Club, an organization for the movie colony's relatively small band of Republicans, and he led the White House's efforts to enlist Hollywood's support after the Sept. 11 terrorist attacks.
Mr. Chetwynd's script is based on lengthy interviews with Mr. Bush, Mr. Rove, top aide Andy Card, retiring White House press aide Ari Fleischer, Defence Secretary Donald Rumsfeld and other Republican officials in the White House and the Pentagon. He says every scene and line of dialogue was described to him by an insider or taken from credible reports.
An exchange among journalists posted on Poynteronline.org (May 21, 2003):
Ellen E. Heltzel The media capitalizes on the public's short attention span, so it tends to ignore this salient fact: Americans go nuts over the subject of history, especially their own. The spillover into the book market is huge.
Take the case of Kenneth C. Davis, who 13 years ago launched what would become a series with a book called "Don't Know Much About History." Davis aimed his book at Boomers like himself, putting life into material that, like the lima beans served in the cafeteria, was reduced to dull pulp back when he was in high school. Now, 1 1/2 million copies later, his book has just been revised and updated. In 1996, James Loewen had a hit and a prize winner with "Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong." Then there's Richard Shenkman's title, "One-Night Stands with American History."
No question, we're talking history lite here, an attempt to both entertain and inform. The breezy view of history can lead to some ouchers, like this from the publisher's website describing "U.S. History for Dummies": "Get a grip on the legacy of slavery..." Get a grip? Even if seen from the perspective of your average pierced and tattooed 15-year-old, I wonder how such a serious and far-reaching topic can be understood in a quick take or two.
But hold on here. We journalists can't get too huffy about the reductive thinking required to pull off these kinds of books. That's what we do! The only difference is that we're catching the wave as it goes by instead of combing the archives. And here's my point: Even this version of history serves a purpose. Authors like Davis are telling tales that need to be told to an audience that has neither the time nor the interest to conquer the typical doorstoppers. And by virtue of their formats, the quickie versions tend to tackle history with an irony that's good for the soul. Don't we need a counterweight to the good guy-bad guy sensibilities that have resurfaced since 9/11? A plug nickel for your thoughts...
Margo Hammond: I am not going to argue against anyone learning history, but I don't share your belief that Americans have much of an appetite for it. Ambrose Bierce's infamous comment ("War is God's way of teaching Americans geography") can be applied to history as well, I'm afraid. In a country that already is talking about '90s nostalgia, collective historical memory doesn't stretch back very far here. I also don't share your conviction that history lite always tackles history with a skeptical eye -- any more than the more ponderous versions do. Viewpoints and tone -- like your "get a grip on the legacy of slavery" -- vary greatly, depending on the author. Facts are not always neutral -- or at least are not always placed in a neutral context.
I do agree, though, that a compendium of facts can be useful -- but only as long as we recognize that it is incomplete. I worry that these "histories lite" leave the impression that history can be studied as a static, unchanging subject. Yes, there are facts that can be unearthed, but the interpretation of those facts is constantly up for debate.
Isn't there a middle ground between the heavy, unreadable history books churned out by scholars who are publishing rather than perishing and the quickie versions of history? To interest people in history, I would much rather recommend history books like David McCullough's "Truman" and "John Adams" that both inform and entertain. Or how about the recently-published "A Short History of Nearly Everything," Bill Bryson's not-so-short (515 pages), but very funny and enlightening history of science? Another recent favorite of mine is "George Washington's False Teeth: An Unconventional Guide to the Eighteenth Century" by Princeton historian Robert Darnton. It doesn't tell everything about that century, but with stories of a chevalier dueling in drag and Washington being inaugurated with one tooth in his mouth (a lower left bicuspid), it sure whets the appetite.
Ellen E. Heltzel: First, the fact that Americans love history is indisputable. Check out "The Presence of the Past," a book by Roy Rosenzweig and David Thelen that includes their extensive phone survey to find out just how much folks care about history, anyway. Sure, many are hazy on general facts and figures (much as I love the subject, I'm a bit fuzzy on these details myself -- which is why I keep a reference library). But history as it relates to their own past matters a lot to them.
As an historian, David McCullough is not "history lite," of course, but he's in a league of his own -- mainly because he's not a professional historian at all, but a good storyteller who builds his tales on solid research. (Ditto fellow bestselling authors and self-taught scholars Antonia Fraser and the late Barbara Tuchman.) And for all his strengths, McCullough has a weakness, which is a tendency to become too much an advocate for his subject; John Adams was not quite as stellar a fellow as McCullough made him seem, nor Jefferson quite the dandy. But back to the subject: McCullough is deep, not wide. Ken Davis is wide, not deep. Both serve their purpose. (As for Bill Bryson, he fills a niche closer to Davis on the continuum, synthesizing and summarizing information that most of us leave to the specialists.)
This brings us back to where I began: the dull mush of high school history. Sad to say, most middle and high school history teachers don't even have history degrees. And, according to Diane Ravitch in "The Language Police: How Pressure Groups Restrict What Students Learn," history textbooks are so compromised by political wars over whose version to teach that American high school seniors score lower on U.S. history than they do on math, science, or reading. Consequently, Bush's "No Child Left Behind" plan is pouring $100 million into "remedial ed" -- not for students, but for teachers who need to "get a grip" (on slavery and everything else). In the meantime, teachers themselves use the "Don't Know Much" and "No More Lies" books because they bring the subject alive. No more lima beans!
Iraqi officials here say that they asked American military leaders as early as a month ago to help protect major archaeological sites from looters, but that for the most part, their pleas were ignored and artwork and relics from ancient Babylon are still being stolen from many locations.
Marine officials said they had taken care to protect Babylon and a handful of other famous ruins from looters. But they also made it clear in the last few days that protecting archaeological treasures was merely one of many priorities, and not necessarily the top one.
On a visit Sunday, three sites near here were pocked with freshly dug holes and littered with hastily abandoned shovels, indicating looting in the last day or two. At one spot, about two dozen people ran off when they saw approaching trucks.
At Isan Bakhriat, site of the ancient city of Isin to the north of here, more than 100 looters were openly digging out and selling urns, sculptures and cuneiform tablets.
"It's happening at almost every site," said Tofiq Abed Muhammad, director of antiquities for the province of Samawa. "They are smart. They take the antiquities that they know have value, and they know how to get them out of the country."
The plundering of Iraqi archaeological sites is the second major wave of culture theft since American forces toppled the government of Saddam Hussein in early April.
The first wave came as American soldiers were seizing Baghdad, when looters broke into and largely gutted Iraq's national museums.
The archaeological lootings could amount to even larger losses over time. Archaeologists say the sites have been so disrupted that systematic historical research there may now be impossible.
Mr. Muhammad said his first request for help was to Lt. Col. Daniel O'Donohue, the commanding officer at a Marine base just outside Samawa.
"We told them we needed American soldiers at checkpoints, in combination with Iraqi guards," he said.
Colonel O'Donohue confirmed that he had discussed the issue. But he said marines were attending to more basic needs like securing enough water, food and medical care for people in the area.
Sam Roberts, writing in the NYT (May 25, 2003):
Six decades after it was sunk by Japanese bombs at Pearl Harbor, the U.S.S. Arizona still cries black tears.
Every day, about a quart of oil bubbles up from somewhere inside the barely submerged battleship. In designing the Arizona memorial, nobody anticipated the leak; indeed, the National Park Service is seeking to protect the harbor from any environmental damage (the ship had been refueled shortly before the attack) without defiling the vessel, which still holds the remains of most of the 1,177 crewmen who died on board. And unless a park ranger points out the iridescent oil rings on the surface, they often float off into the bay unnoticed or unexplained.
But to the thousands of visitors each day who tour the memorial -- including the museum exhibits at the visitor center, the stark concrete arched monument unveiled in the harbor on Memorial Day, 1962, and the white marble wall on which the victims' names are inscribed -- it is those tears from a ship that still weeps for its crew and its country after 61 years that may provide the most poignant and enduring memory of Pearl Harbor.
The symbolism of this place -- an evocative vision of naval disaster, not victory -- and the unanticipated power of the battleship's unplanned oil leak are worth recalling as the Lower Manhattan Development Corporation advances plans for a World Trade Center memorial, itself a commemoration of disaster on 4.7 acres below street level and incorporating the slurry wall that holds back the Hudson River....
The bombings of Pearl Harbor and of the World Trade Center were both sneak attacks that punctured the myth of American invincibility.
"What I have always been struck by at the Arizona," says Professor Edward T. Linenthal of the University of Wisconsin at Oshkosh, who has studied the site extensively, "is the uniqueness of the relic itself. It is in an active Navy base where, quite literally, the world changed. It is a historic site without any of the physical boundaries that demarcate sacred space at other battlefields.
"It is not unlike the challenge in downtown Manhattan," he said.
What distinguishes the Arizona from many war memorials is that most of the victims are still entombed here, largely because of the difficulty in recovering their bodies at the time. They were officially declared buried at sea.
Sunanda K. Datta-Ray, writing in the Singapore Straits Times (May 27, 2003):
THE next time you hear children chanting 'Ring-a ring-a roses, Pocketful of posies, Hush-a, Hush-a, We all fall down!', remember the grim meaning behind their playfulness. The verse commemorates the 14th century bubonic plague epidemic when, as Boccaccio of The Decameron wrote, people 'ate lunch with their friends and dinner with their ancestors in paradise'.
The Black Death profoundly changed European living habits - for the better. Rare is the cloud without such a silver lining. Cancer explains the World Health Organisation's treaty against cigarette advertising. Recurring malaria in Calcutta prompted the authorities to drain marshland and build a smart new suburb.
When the tide of the severe acute respiratory syndrome (Sars) runs out, it should leave behind considerate and courteous people with cleaner habits.
Paul Pringle, writing in the LA Times (May 20, 2003):
Officials say today's Army takes a passive, good-riddance approach to its runaways, who account for fewer than 1% of enlistees. Prosecutions and prison sentences have become rare.
Most of the several thousand deserters who bolt each year aren't even actively pursued. Of those who do wind up in custody, more than 90% are discharged as quickly as the paperwork can be processed.
"Hunt them down? No way," said Thomas, who sat in a wind-hammered bungalow as Humvees lumbered along the dusty roads outside. "I've never heard of a court-martial" for a deserter.
The Army has been a volunteer vocation since the end of the Vietnam War-era draft, so commanders have grown increasingly content to cut loose anyone unwilling to fight.
A similar attitude prevails in the Marine Corps and Navy, officials say, adding that it hasn't changed because of the campaigns in Afghanistan and Iraq.
"We really don't look for deserters anymore," said Mark Raimondi, spokesman for the Army's Criminal Investigation Command. "If folks don't want to stay around, we don't want them."
From the Revolutionary War on, deserters have been seen as the dirty laundry of the armed forces -- the mockers of code and honor, the drags on morale.
They come in varied shadings of character and motivation: lonesome 19-year-olds with a sick mom back home. Late-blooming conscientious objectors who signed up for the college benefits. Miscreants with an appetite for drugs and street violence. And the ones who simply got scared.
During the Vietnam War, especially in its early stages, the FBI helped the military track down deserters. Courts-martial were common.
Now deserters are generally free to run until local civilian authorities happen to detain them -- often for traffic violations -- and warrant checks identify them as military fugitives. A large number turn themselves in. Others are given up by parents or spouses.
David G. Savage, writing in the LAT (May 19, 2003):
If states pay for scholarships, textbooks and other types of aid that benefit private secular schools, does the U.S. Constitution require them to do the same for church-related or religious schools?
That question is before the Supreme Court in the latest twist in the long debate over religion and its relationship with the government. Last year, the high court said states may use taxpayers' money to pay for children to go to church-related schools. The 5-4 ruling upheld a voucher program in Ohio that gives low-income parents a stipend that they can use to send their child to a church-related school. The flow of public money to a parochial school did not violate the 1st Amendment's ban on an "establishment of religion," the court ruled.
Now, religious-rights advocates and voucher proponents are urging the justices to go a step further and rule that if states are supporting nonreligious private schools through scholarships, tuition aid or other means, they must also cover costs for those at religious schools....
The case arose in 1999, when the Legislature in Washington state offered "Promise Scholarships" to top high school graduates from low-income families.
Joshua Davey qualified for a scholarship and said he planned to study "pastoral ministries" at a small college run by the Assemblies of God. His aim was to become a minister, he said.
State officials, pointing to Washington's Constitution, said students studying theology did not qualify for this public aid. "Absolute freedom of conscience in all matters of religious sentiment, belief and worship, shall be guaranteed to every individual," it says. However, "no public money or property shall be appropriated for or applied to any religious worship, exercise or instruction." Lawyers for the American Center for Law and Justice, a conservative religious-rights group based in Virginia Beach, Va., sued in federal court on Davey's behalf.
While a judge in Seattle sided with the state, the 9th Circuit in San Francisco ruled for Davey.
Lawyers for the Becket Fund for Religious Liberty and the Institute for Justice, a libertarian group that has championed vouchers, hope the court will take the case and issue a ruling that voids the state constitutional bans on aid for religion.
These bans are "the remnants of 19th century religious bigotry ... and should be nullified," said Kevin Hasson, president of the Becket Fund.
In his view -- and that of several Supreme Court justices -- these state bans are the legacy of a largely forgotten but ugly chapter in American history.
In the late 19th century, James G. Blaine, the speaker of the House and a Republican presidential candidate, led a movement to bar the use of public money to support Catholic schools. "The public schools had a distinctly Protestant character. That's why a parallel, parochial system of Catholic schools developed," says Richard D. Komer, a lawyer for the Institute for Justice. Blaine was "riding an anti-Catholic animus."
Blaine's failed presidential campaign of 1884 is remembered by historians for a supporter's claim that the Democrats were the party of "rum, Romanism and rebellion." While Blaine's proposed federal constitutional amendment fell just short in Congress, 36 states adopted a version for their state constitutions. Until recently, these measures were seen as upholding the principle of separation of church and state. But two years ago, U.S. Supreme Court Justice Clarence Thomas denounced these state amendments as having a "shameful pedigree" of anti-Catholic "bigotry."
Wall Street Journal editorial (May 23, 2003):
It's one of those fascinating "what ifs" of American history: What if Henry Wallace had still been vice president when Franklin Roosevelt was felled by a cerebral hemorrhage in 1945? Instead of a postwar American foreign policy grounded in the Truman Doctrine and the Marshall Plan, we would have had one shaped by a man who deemed Britain as much a threat as the Soviet Union, whose advisers included Soviet spies, and who once described a Siberian slave-labor camp as a "combination TVA and Hudson's Bay Company."
Which makes it all the more incredible that the Franklin and Eleanor Roosevelt Institute in Hyde Park, N.Y., would choose to affix the Wallace name to the visitor and exhibition center scheduled to open its doors this November. Wallace will become the portal through which visitors are introduced to Roosevelt.
Not that the Iowa-born Wallace was without achievements. Most notable of these was the founding of Pioneer Hi-Bred, an innovative corn-seed company that increased crop yields, helped gain Wallace his job as FDR's agriculture secretary and left his family wealthy enough to help finance this center. As institute president and CEO Christopher Breiseth tells us: "If you stopped the clock at April 12, 1945"--the day FDR died--the portrait of Wallace would be dominated by domestic policy.
Well, yes. But if you stop the clock in 1945, Ronald Reagan would be a New Deal Democrat too. In 1946 Wallace, then serving as commerce secretary, broke with Truman and two years later ran for president under the banner of a Progressive Party effectively run by communists and fellow travelers. In so doing he threatened the Democratic Party from the left at the same time that Strom Thurmond's Dixiecrats were threatening it from the right.
Certainly historians are free to speculate that a Wallace presidency might have avoided the Cold War and its attendant antagonisms. In addition to being purely speculative, however, it is a view that elevates good intentions and noble dreams over acts and consequences. In Wallace's own day, American liberals saw his naiveté for what it was: dangerous.
Arthur Schlesinger Jr. makes the point in a review of the reverential biography of Wallace, "American Dreamer," published by former Sen. John Culver and former newsman John Hyde three years ago. "Eleanor Roosevelt herself," wrote Mr. Schlesinger, "led the repudiation of Wallace in column after column." Not only did Eleanor attack Wallace for his alliance with the communists, she declared that "any use of my husband's name in connection with that party is from my point of view entirely dishonest."
How much of this complicated record visitors to the Wallace Center are likely to take away is anyone's guess, but we'll venture ours: not much. If the Wallace name is to go up on the Roosevelt grounds, shouldn't there be some public acknowledgement of the very different FDR legacy we would have been left with had Wallace and not Truman prevailed?
Neil Tweedie and Peter Day, writing in the London Telegraph (May 23, 2003):
Klaus Fuchs, the man who betrayed the secret of the atom bomb to the Soviet Union, believed he would be allowed to remain a member of the British nuclear weapons programme following his confession because of his importance as a scientist.
The arrogance of Fuchs, who in fact served nine years in prison for his treachery, is displayed in MI5 files released by The National Archives yesterday chronicling the clandestine career of possibly the most damaging spy in British history.
More than 100 Security Service files covering the immediate post-war period have been made public, representing the first major release of documents covering the Cold War. More than 20 files on Fuchs, which have been heavily censored, portray a man capable both of idealism and cold calculation, and suggest the 38-year-old mathematical physicist would almost certainly have avoided prison had he not volunteered his confession, despite 24-hour surveillance by MI5.
They also cast light on his private life, suggesting he carried on an affair with the wife of his superior and friend while living at the couple's home; and disclose the methods he used to contact his Soviet controllers, including the use of chalk marks on shop walls and magazines thrown into suburban gardens.
There is also evidence showing how the authorities repeatedly ignored concerns that Fuchs, a political refugee from Nazi Germany granted British citizenship because of his intellectual value, was an active Communist and major security risk.
Fuchs was a central player in Enormoz, the Soviet intelligence operation set up to penetrate the wartime Manhattan Project. Some believe the secrets he handed over, together with the efforts of other spies at the Los Alamos laboratory in New Mexico, allowed Stalin's Russia to construct an atomic bomb years earlier than might have been the case.
He was also the first major Soviet agent to be identified by Venona, the American operation that succeeded in cracking Soviet coded messages transmitted during the Second World War. Venona could never have been cited in evidence against Fuchs, being too precious a secret to reveal, and so the spy had to be persuaded to confess. Ironically, Kim Philby had already warned the KGB about Venona.
The man who must take most credit for the confession was Jim Skardon, the MI5 interrogator who interviewed Fuchs in January 1950. It was only during the fourth interrogation, on Jan 24, that Fuchs cracked after a pub lunch with Skardon near his workplace, the Atomic Energy Research Establishment at Harwell.
Skardon told his superiors: "He [Fuchs] was evidently under considerable mental stress, and I told him that it seemed to me that whereas he had told me a long story providing a motive for acts, he had told me nothing about the acts themselves. I suggested that he should unburden his mind and clear his conscience by telling me the full story. He said: 'I will never be persuaded by you to talk.'
"At this stage we went off to lunch. During the meal he seemed to be resolving the matter, and towards the end he suggested that we should hurry back to his house. On our arrival he said that he had decided that it would be to his best interests to answer my questions."
At a meeting the following day there was concern that Fuchs might attempt to escape or commit suicide, but Skardon calmed the fears.
"Mr Skardon stated that, in his opinion, Fuchs still had complete confidence in the likelihood of his remaining at Harwell or, as second best, obtaining a senior university post. He recalled the fact that the interrogation had shown that Fuchs regards himself as the linchpin of the Harwell organisation; in his present state of mind it is almost inconceivable to Fuchs that he might be removed.
"Skardon stated his conviction that, in any event, Fuchs now regarded his future as being in this country where, despite his past, he had acquired certain ties, loyalties and friendships. He did not therefore consider it at all likely that Fuchs would attempt to leave the country. Nor, for the same reasons - including Fuchs's great personal vanity - did he think it probable that he would try to take his own life."
William Pfaff, writing in the International Herald Tribune (May 15, 2003):
The main intellectual influence on the neoconservatives has been the philosopher Leo Strauss, who left Germany in 1938 and taught for many years at the University of Chicago. Several of the neoconservatives studied under him. Wolfowitz and Shulsky took doctorates under him.
Something of a cult developed around Strauss during his later years at Chicago, and he and some admirers figure in the Saul Bellow novel, "Ravelstein." The cult is appropriate because Strauss believed that the essential truths about human society and history should be held by an elite, and withheld from others who lack the fortitude to deal with truth. Society, Strauss thought, needs consoling lies.
He held that philosophy is dangerous because it brings into question the conventions on which civil order and the morality of society depend. This risks promoting a destructive nihilism.
According to Strauss, the relativism of modern American society is a moral disorder that could block it from identifying its real enemies. "Moral clarity" is essential. The Weimar Republic's toleration of extremism allowed the rise of the Nazi party.
Strauss made an intellectually powerful and sophisticated critique of post-Enlightenment liberalism. He saw the United States as the most advanced case of liberalism and thus the most exposed to nihilism.
He believed that Greek classical philosophy, notably that of Plato, is more true to nature than anything that has replaced it. Some critics say that his interpretation of Plato is perverse, but he said that he had recovered the "real" Plato, lost by later Neo-Platonic and Christian thinkers.
He also argued that Platonic truth is too hard for people to bear, and that the classical appeal to "virtue" as the object of human endeavor is unattainable. Hence it has been necessary to tell lies to people about the nature of political reality. An elite recognizes the truth, however, and keeps it to itself. This gives it insight, and implicitly power that others do not possess. This obviously is an important element in Strauss's appeal to America's neoconservatives.
The ostensibly hidden truth is that expediency works; there is no certain God to punish wrongdoing; and virtue is unattainable by most people. Machiavelli was right. There is a natural hierarchy of humans, and rulers must restrict free inquiry and exploit the mediocrity and vice of ordinary people so as to keep society in order.
This is obviously a bleak and anti-utopian philosophy that goes against practically everything Americans want to believe. It contradicts the conventional wisdom of modern democratic society. It also contradicts the neoconservatives' own declared policy ambitions to make the Muslim world democratic and establish a new U.S.-led international order, which are blatantly utopian.
Strauss, who died in 1973, was no friend of hegemony, American or otherwise. He said that "no human being and no group of human beings can rule the whole of the human race justly." His concern during the Cold War was that Soviet universalism invited an alternative American claim to world rule.
His real appeal to the neoconservatives, in my view, is that his elitism presents a principled rationalization for policy expediency, and for "necessary lies" told to those whom the truth would demoralize.
Christopher Bantick, writing in the Age (Melbourne) (May 21, 2003):
HISTORY is gaining a new profile in Victorian schools, as last week's Age Education reported. While this is good news for the humanities, there is no more critically urgent time to study history than now.
Teachers of history are, above all else, stewards of the past. It is imperative that students know not only their own past but how history can be manipulated. This is especially evident in the battleground of historical textbooks.
With the war in Iraq over, when the nation's children return to school in September they will open very different history books. The Bush Administration has planned detailed revisions of the Iraqi history curriculum. Good idea? Maybe.
What the US Government wants to do is expunge from memory large parts of Iraq's past. This, in short, is a demilitarisation of its past. Where once history teaching in the country was centred on battles and the identification of weaponry for use against America, Iraqi textbooks are now in danger of a distinctly victorious American spin. This is cause for concern.
The problem is that an American-influenced revisionist reappraisal of Iraq's past may end up looking and sounding like what it was supposed to replace. If the country's history was deemed too biased against America, what's to say that the American take won't be equally biased?
What history teaches - when it is well taught - is the ability to be discriminating when faced with conflicting sources of information and evidence. Without this skill, students are in danger of reading the past with limited appreciation of its contradictions and subtleties.
But, beyond analytical skills, history also teaches context. It offers a sense of why things in the past happened in a particular way.
The need for this is abundantly plain when trying to understand such things as the historical antecedents of international terrorism. The Bali bombings are a case in point.
American curriculum writers might believe they, like a latter-day Ministry of Truth, own the past. Winston Smith found to his chagrin in George Orwell's 1984 that history was decided by Big Brother. In Iraq, a similar allegation could be levelled at the Bush Administration.
In another example of rewriting the past, South Korea protested in 2001 over Japanese junior high school textbooks that failed to address Japan's wartime activities, in particular the abuse of South Korean "comfort women."
Selective history was also taught after the Balkan wars. Serbian textbooks handled the fall of Slobodan Milosevic neatly. He was simply written out of the past.
There is some truth in what Edmund Burke, the 18th-century Anglo-Irish statesman who sympathized with the American colonies, noted about the primacy of history. He said: "You can never plan the future by the past."
The urgent need to teach the "right" history is what is driving American textbook revisionists in Iraq. But is there ever a right history?
Judge Richard Posner, writing in Newsday (May 18, 2003):
Plagiarism is considered by most writers, teachers, journalists, scholars and even members of the general public to be the capital intellectual crime. Being caught out in plagiarism can blast a politician's career, earn a college student expulsion and destroy a writer's, scholar's or journalist's reputation. In recent days, for example, The New York Times has referred to "widespread fabrication and plagiarism" by reporter Jayson Blair as "a low point in the 152-year history of the newspaper."
In James Hynes' splendid satiric novella of plagiarism, "Casting the Runes," the plagiarist, having by black magic murdered one of the historians whom he plagiarized and tried to murder a second, is himself killed by the very same black magic, deployed by the widow of his murder victim.
There is a danger of overkill. Plagiarism can be a form of fraud, but it is no accident that, unlike real theft, it is not a crime. If a thief steals your car, you are out the market value of the car; but if a writer copies material from a book you wrote, you don't have to replace the book. At worst, the undetected plagiarist obtains a reputation that he does not deserve (that is the element of fraud in plagiarism). The real victim of his fraud is not the person whose work he copies, but those of his competitors who scruple to enhance their own reputations by such means.
The most serious plagiarisms are by students and professors, whose undetected plagiarisms disrupt the system of student and scholarly evaluation. The least serious are those that earned the late Stephen Ambrose and Doris Kearns Goodwin such obloquy last year. Popular historians, they jazzed up their books with vivid passages copied from previous historians without quotation marks, though with footnote attributions that made their "crime" easy to detect.
(One reason that plagiarism, like littering, is punished heavily, even though an individual act of plagiarism usually does little or no harm, is that it is normally very difficult to detect - but not in the case of Ambrose and Goodwin.) Competing popular historians might have been injured, but I'm not aware of anyone actually claiming this.
Confusion of plagiarism with theft is one reason plagiarism engenders indignation; another is a confusion of it with copyright infringement. Wholesale copying of copyrighted material is an infringement of a property right, and legal remedies are available to the copyright holder. But the copying of brief passages, even from copyrighted materials, is permissible under the doctrine of "fair use," while wholesale copying from material that is in the public domain - material that never was copyrighted, or on which the copyright has expired - presents no copyright issue at all.
From the American Revolution Round Table Newsletter (June 2003):
On the back page of The Weekly Standard, we recently found the front page of something called "Ye Newe York Times" for November 11, 1781. On the left hand side of the page was a headline: THREE WEEKS AFTER YORKTOWN, STILL NO CONSTITUTION READY. The subhead of the story, reported by R. Berke, read: "Hamilton, Fellow 'Neo-Federalists' Said To Be Eyeing Empire Across Continent." The story reported there were "troubling signs that American culture might not be compatible with democracy." Below was an analysis written by R. W. Apple with the head: "Triumph Over British Empire Was Easy Part." A subhead declared: "Mayhem, Discontent Betray Hollow Victory." A second subhead maintained: "Desire to Return to British Sovereigntye Becoming Widespread." On the right was another story, headed: "War Viewed As Disastrous By Dismayed Citzenrie." A subhead predicted: "Reconstruction Costs Doom New Nation to Destitution, Obscuritie." In a box were headlines for a four-page inside pull-out section of other stories on the war. "Calvinists and Other Extremists Planning Theocracies," was one title. A second: "Harvard Tutor: Sanctions Against George III Would Have Worked." A third: "Clog Dancing Troupe Retracts Disparagement of Washington." The leader of the troupe said they regretted the "wood-toothed slave-owning stiff" slur. Who says you can't find laughs in our favorite subject?
Lou Marano, writing for UPI (May 20, 2003):
The concept of "reasonable doubt" is being redefined in a way that makes it easier for prosecutors to gain convictions, a law professor said.
The presumption of innocence is well known, as is the principle that a juror should vote to convict a criminal defendant only if the juror believes the accused to be guilty "beyond a reasonable doubt." But University of Arkansas law professor Steve Sheppard has found that courts are increasingly changing the definition of reasonable doubt, particularly in judges' instructions to juries, in a way that is eroding the presumption of innocence....
Guilt beyond a reasonable doubt has always been a prosecutor's tool, Sheppard told United Press International in a phone interview, but it dates only to treason trials in Ireland in the 1760s and 1770s. Before that the jury was told it had to be convinced of guilt, a stricter standard. That's where we get the word conviction.
The first recorded use of guilt beyond a reasonable doubt in the American colonies was by the prosecution in the 1770 trial of the British soldiers involved in the Boston Massacre. The defense counsel -- framer of the Constitution and future President John Adams -- argued that guilt beyond a reasonable doubt lowered the standard of proof. Adams maintained that the jury should not convict unless it was absolutely convinced of guilt.
The lay jury is a very unusual office in the law, Sheppard said, and jurors didn't have anything like independence until the early 18th century. "A juror could be arrested for bringing in the wrong verdict," he said. Instructions to the jury became important.
(If you didn't accept a jury trial, you couldn't be convicted. So people would choose to be crushed with weights to keep their blood from being "tainted" and their property forfeited to the crown. The last words of those who wanted to protect their families were, "More weight." Sheppard said the last pressing he is aware of occurred in Cambridge, England, in 1741.)
Over the years, there was disagreement about what "reasonable" means. "Literally 25 different definitions bounced around the American courts throughout the 19th century, and there still are at least a dozen," Sheppard told UPI.
A judge's instruction upheld by the Second Circuit Court of Appeals defined reasonable doubt as "doubt for which you can give a reason if called upon to do so by a fellow juror."
One of the problems with this standard is that it hinders the juror whose doubt is based on the belief that the totality of the evidence is insufficient, Sheppard said in a summary prepared with Carolyne Garcia, University of Arkansas science and research communications officer. Yet these are precisely the circumstances in which the rhetoric of the law requires acquittal.
Sheppard said the requirement of articulability is subject to infinite reduction. For example, if a juror says a certain witness is not credible, he might be asked to explain why. "A juror who lacks the rhetorical skill to communicate reasons for a doubt is then, as a matter of law, barred from acting on that doubt," he said.
Patrick Healy, writing in the Boston Globe (May 19th, 2003):
In a test run for democracy, more than 500 professors elected interim leaders and deans of Iraq's elite university over the weekend, in many cases selecting onetime pariahs like Mudhaffar, sidelined by the school's Ba'ath Party leaders who left it impoverished and demoralized. (A previous president, Muhammad Rawi, was believed to be Saddam Hussein's personal physician.) "We have no Internet, we have no science journals, we have no contacts with Harvard or MIT. But at least we will now have an honorable man in charge," said Mohammad Abdul-Khader Ibrahim, a professor of biotechnology.
The elections, held as Iraq's universities reopened Saturday, were mostly peaceful, yet there were growing pains for Iraqis flexing this new privilege of one person, one vote.
Adjunct teachers were barred from the balloting, leading to spontaneous protests, and students argued that they should have a voice.
Some professors were outraged that former Ba'athists were still on the faculty, and at Baghdad University, a few had the audacity to run for the presidency. (One dropped out before the vote.) Classes begin in earnest today for an estimated 200,000 students nationwide, and US officials say they want these academic enclaves to set an example of a calm, stable Iraq, after several weeks in which a few professors and even one campus president were injured in "revenge shootings" over their Ba'athist ties.
"There is a kind of tension on campuses," said Drew Erdmann, the US adviser for Iraq higher education, a historian who attended Williams College and Harvard University and taught at Harvard's Olin Institute for Strategic Studies. "But the regime has changed, and the universities aren't for political cronies anymore."
Iraq's 15 universities and 30 technical schools share many traits with Western institutions of higher education. Students in both systems take about four years of required courses and electives to earn a degree, and those who wish to become teachers apprentice in local schools. There are soccer teams, poetry magazines, and student governments that lobby for cheaper parking on campus.
But there are anomalies that reflect Hussein's absolute rule.
A staple of the core curriculum, for instance, was a Ba'athist indoctrination course in which students learned about the party apparatus, the history of the regime, and Hussein's speeches and background. Most faculties have now scuttled that course, or will probably do so under the US "de-Ba'athification" policy in Iraq, said Army Lieutenant Colonel Stephen Curda, Erdmann's colleague and a former education professor at the University of West Florida.
Party membership was required for most professors, and the higher the party rank, the better the lab and office space. Most science facilities were prehistoric by American standards, with equipment that didn't even anticipate the World Wide Web.
"The military side of science seemed to be crowding out the academic side of science," said Erdmann, who added that he could not ascertain whether university labs were used to generate Iraqi weaponry.
Academic freedom, a bedrock principle of American campuses that stipulates the right of faculty to teach and research as they wish, was truly a foreign concept in this country, even at Baghdad University, whose status as the capital's premier school was long tarnished by its Ba'athist administration in the eyes of the faculty and the students.
Under the supervision of US monitors, and with the US Third Infantry at the school gates, the elections Saturday and yesterday were seen on campus as nothing short of the start of a new era.
An entry from the Media Research Center (May 2, 2003):
Oh, never mind. After running multiple stories about the failure of U.S. troops to prevent the looting of Iraq's national museum, with ABC's Peter Jennings going so far as to charge that the U.S. did not act according to international law to prevent it, on Thursday's World News Tonight Jennings read a short item noting that the looting at the national museum may not have been as extensive as some people first reported, and that many pieces were removed before the war.
As some people first reported. Those "some people" would be Peter Jennings' own World News Tonight, as well as other ABC News programs, though he didn't remind his viewers of that.
Jennings announced on the May 1 World News Tonight: "The looting at the national museum may not have been as extensive as some people first reported. A Marine Colonel who's been investigating tells us today that hundreds of items have been recovered from smugglers, Iraqis have returned items they may have had for safekeeping, other pieces have been found in the rubble. And it turns out that many pieces were removed before the war. 27 so-called 'significant pieces' were stolen, some of them priceless, but those who said that more than 150,000 items were looted appear to be wrong."
As noted in the April 18 CyberAlert: "Two of the Bush administration's cultural advisors in Iraq have now resigned," Peter Jennings intoned on the April 17 World News Tonight. "They were frustrated by the failure of U.S. forces to prevent the pillage of Iraq's national museum," Jennings relayed in setting up an entire story examining the variety of places the U.S. has not been protecting. For details: www.mediaresearch.org
The April 21 CyberAlert pointed out: ABC News has displayed an obsession with the looting and thefts from the Baghdad Museum which several network reporters have blamed on the U.S., culminating with Peter Jennings on Friday night charging that the U.S. did not act according to international law to prevent it.
World News Tonight has now run at least three full stories in addition to the entire April 17 Nightline devoted to the topic and a segment about it on the April 18 Good Morning America. Both Nightline and GMA featured interviews with Martin Sullivan, Chairman of a federal antiquities board, who quit and denounced President Bush.
On April 18 Jennings offered another piece on the same subject. He lectured on World News Tonight: "The country has been a living archive of man's earliest history where real connections can be made between then and now, which is why the Pentagon is being so widely criticized for not protecting the history when it captured the capital city. The U.S. is now guarding the entrance to Iraq's national museum, but the damage has already been done."
Jennings concluded the story he narrated himself by dismissing the Pentagon's defense as inadequate and mimicking the spin of critics: The Pentagon has said, in reply, look, this is war, and stuff happens, the U.S. was fired on from the museum grounds. Not a satisfactory answer for people who say that if the U.S. managed to protect the Ministry of Oil, why not this repository of civilization? Why, they ask, is neglect forgivable? For more: www.mediaresearch.org
Being a TV news anchor means never having to say you're sorry.
Simon Jenkins, writing in the London Times (May 2, 2003):
Beware of memory. For the time being, 2003 marks the fall of a hated tyrant. In years to come it may mean something else, the destruction of the greatest treasure from the oldest age of Western civilisation. We know of the sacking of the Library at Alexandria in AD624. Who cares what caused it?
Until this week only soldiers and reporters had witnessed the devastation of the National Museum of Baghdad, the seventh biggest in the world, and the burning of the National Library, containing some 5,000 of the earliest known manuscripts. On Tuesday a team led by John Curtis from the British Museum returned from Iraq and agreed with the senior archaeologist, Lord Renfrew of Kaimsthorn, that we face the greatest heritage catastrophe since the Second World War. Though it is early days, two vast repositories of world history appear simply to have vanished. ...
Much water will doubtless flow under this bitter bridge. But even the Bolsheviks protected the Hermitage during the Russian Revolution. In the Second World War, armies were under specific orders to spare historic sites and museums, even at cost to themselves. Chartres was not shelled though it contained snipers. Museums were looted, but by soldiers who respected what they were looting. They knew that a museum is not a warehouse. It is the custodian of the identity of a people. Robbing it is like seizing the crown jewels of a collective memory. It seeks to erase that memory.
On Tuesday the British Museum’s director, Neil MacGregor, began the task of reassembling that chain. He welcomed experts from Russia, France, Germany, Philadelphia and New York to begin perhaps the greatest task of cultural rescue. Black markets must be scoured for faces and torsos. Some shattered pieces may be reassembled. Offers of money came from Japan, Italy and Germany, but none from the clearly embarrassed Governments of Britain and America, scurrying for cover. Coalition troops have even refused border searches for looted items. A private donor had to pay for last week’s survey by the British Museum team.
I am now told that Washington is preventing the Iraqi antiquities staff, the most experienced in the Middle East, from conducting their own audit of what they have lost. This is an urgent task if police forces are to be warned of what might be recoverable. A US military base has been stationed in a wing of the museum. The coalition wants no more bad publicity about cultural losses. The insult could hardly be better designed to fuel the rumour machine.
Bill Glauber, writing in the Chicago Tribune (April 30, 2003):
Warrior and prosecutor, classicist and amateur boxer, Marine Col. Matthew Bogdanos faces one of the great legal and investigative challenges of his career: getting to the bottom of the plunder of the National Museum of Iraq.
Bogdanos was dispatched here last week to help investigate a case that swirls with unanswered questions over how and why one of the world's great repositories of antiquities was looted in the aftermath of the fall of Saddam Hussein....
The looting of the museum may have grown into an international scandal, but the investigation is in many ways standard, Bogdanos said.
"First, you treat this location as a crime scene with the limitations inherent with the crime scene contaminated by a mob," Bogdanos said. "You don't take fingerprints. You establish a timeline. Who came in. Who came out. Who had access.
"Second, you do a physical inspection to determine points of entry. Third, you determine what is gone, determine what was here, what's here now and then you do the math.
"That is a formidable task, and that's not completed."
Bogdanos said investigators and curators still cannot enter some of the museum storerooms because of the damage....
Bogdanos said his initial inspection of the museum took more than 12 hours, and said it appeared that axes and hammers were used to enter the storerooms. Microfilm, microfiche, video and film were spread everywhere, doors were kicked in and statues smashed.
He said it was "clear that there were a combination of groups" among those who raided the museum.
"One being a group of individuals who knew what they wanted and selected the items they wanted almost as if from a shopping list," Bogdanos said. "The second group, you're calling them looters, for lack of a better term. Those individuals destroyed as much as they stole and clearly destroyed indiscriminately."
Acting on tips, Bogdanos has tried to retrieve works, so far with little success. Appeals also have gone out to the community to return treasures. A few people have heeded the call, including one man who returned more than 40 items Friday.
"People are returning items on a daily basis with no fear of retribution," Bogdanos said, "and no questions asked."
Alan Riding, writing in the NYT (May 2, 2003):
Saddam Hussein was hardly modest. Among the legendary Iraqi heroes whom he considered his equals were Hammurabi, Nebuchadnezzar II and Alexander the Great.
Since each had ruled Mesopotamia from Babylon, Mr. Hussein built himself a palace on a man-made hill beside the footprint of the once-great city. And since the view was of a dusty excavation site, in 1987 he ordered construction of a replica of Nebuchadnezzar's vast palace.
For school groups and tourists visiting this latest Babylon, 45 miles south of Baghdad, the message was clear: Mr. Hussein, too, would be remembered centuries hence.
His connection to a glorious past was further reinforced by two museums, one devoted to Hammurabi (1792-1750 B.C.) and the other to Nebuchadnezzar (605-562 B.C.), and by a replica of the open-air theater built here by Alexander the Great around 330 B.C. And as always, portraits of Mr. Hussein were everywhere....
A few remnants of old Babylon survive, principally where archaeologists have excavated the foundation stones of some of Nebuchadnezzar's palace. Part of a mile-long Processional Way, which once went from the Ishtar Gate to the Temple of Marduk, also survives, with its original paving stones still visible. But almost everything else is pastiche.
Nebuchadnezzar's museum, which once recounted the Chaldean monarch's epic rule, including his destruction of Jerusalem in 586 B.C., has now been bricked up to protect its two valuable reliefs. But Hammurabi's museum, which recalled his famous code of laws covering all aspects of life and commerce, is open to the elements and strewn with broken glass, loose paper and remnants of an oil portrait of Mr. Hussein.
Alexander the Great, who died in Nebuchadnezzar's palace in 323 B.C. upon his return from India, had planned to make Babylon his imperial capital. But he left only a Greek-style theater. Upon its ruins Mr. Hussein built a still grander theater, with a large glassed-in box, marble floors and stucco ceilings, where the dictator sat on his one visit there, in 1987. The theater was nonetheless used annually for a festival that brought performers from nearby countries....
What should be done with Mr. Hussein's palace and the replica of Nebuchadnezzar's palace?
To tear them down might please archaeological purists, but it would be immensely costly and would also destroy a possible tourist attraction. Yet to leave them would realize Mr. Hussein's dream of being remembered long after his death.
Today's Babylon, after all, is Mr. Hussein's Babylon.
Michael Kinsley, writing in Slate (May 2, 2003):
"As Republican Members of the House of Representatives we propose not just to change its policies, but even more important, to restore the bonds of trust between the people and their elected representatives. That is why, in this era of official evasion and posturing, we offer instead a detailed agenda for national renewal, a written commitment with no fine print." -- The "Contract With America," 1995
"By the year 2002, we can have a federal government with a balanced budget or we can continue down the present path towards total fiscal catastrophe." -- Rep. Tom DeLay, R-Texas, 1995
When, in the Course of human Events, it becomes necessary for one Party to dissolve the Political Bands which have connected its leaders to one alleged core belief, and to bind themselves with equal pomp and gravity to a contradictory core belief, as the Laws of Politics and Political Winds entitle them, a decent Respect to the Opinions of Mankind requires that they have a Pretty Good Explanation. Or So you would Think. Especially when the First Belief comes Wrapped in a pompous Document with a lot of Bullying Language about how Noble its adherents are and what Scum any opponents Must be.
There are plenty of available reasons to explain why someone might have voted for the famous Contract With America in 1995, including a constitutional amendment mandating a balanced federal budget, and why today that same someone is an enthusiastic backer of President Bush's tax-cut proposal, which will take a $300 billion-plus budget deficit and push it higher. "Those pills are finally kicking in," might be one such explanation. No, on second thought, maybe the GOP is smart to avoid drawing attention to the historical contrast and to stick with the traditional politicians' philosophy that today is the first day of the rest of your re-election campaign.
Republicans aren't the only source of such fun. There is a limited amount of hilarity to be had from watching formerly or even currently wastrel Democrats as they try on the preachers' robes of frugality. But Bill Clinton pretty much ruined that joke by actually balancing the budget (OK, OK, with some help from those congressional Republicans, along with considerable hindrance and near-universal predictions of disaster). That delightful 1995 quote from Tom DeLay, now House majority leader, predicting fiscal calamity in 2002 unless the Balanced Budget Amendment is enacted, brings to mind that famous joke about the 1964 election: "They told me if I voted for Goldwater we'd soon have half a million American troops in Vietnam. So I voted for Goldwater, and they were right."
Daniel Gross, writing in Slate (April 30, 2003):
The Treasury Department is putting out word that accelerating the planned reduction of marginal tax rates, cutting taxes on dividends, and otherwise tinkering with the tax code will create more than 1 million jobs by the end of 2004. The Council of Economic Advisers is more sanguine, estimating the Bush plan will create 1.4 million new jobs by the end of 2004.
But 1.4 million jobs in 18 months isn't many jobs, and it isn't much growth. By historical standards, when it comes to job creation, Bush is shaping up to be more like Herbert Hoover than Ronald Reagan. He stands to preside over the first presidency since Hoover's in which the American economy lost jobs.
The Bureau of Labor Statistics' time series on non-farm payrolls, dating back to 1919, testifies to the amazing long-term performance of the U.S. economy. ... Even in times that were considered bad -- the oil shock of 1973, or the stagflation of the Carter years, or the early 1990s recession (which candidate Bill Clinton somewhat disingenuously labeled the worst economy in 50 years) -- payrolls continued, by and large, to rise.
There was only one downturn in payrolls that spanned a president's four-year term. Between 1929 and 1933, the notorious single term in Hooverville, the number of Americans with payroll jobs fell 24 percent, from 31.32 million to 23.69 million. (Shockingly, not until 1940 would the U.S. economy employ as many people as it did at its 1929 peak.)
The seasonally adjusted figures for the past decade should jar the tax-cut supporters who insist marginal tax rate reductions create jobs. Orthodox economic theory holds that raising taxes kills jobs and cutting taxes creates them. But in the 16 months after the passage of the 1993 Clinton budget plan, which raised marginal income tax rates on the highest earners, payrolls rose from 110.96 million to 115.92 million. In other words, the biggest tax increase in American history "created" nearly 5 million jobs in less than a year and a half. In the 22 months since President Bush signed his tax cuts in June 2001, the number of payroll jobs has fallen from 132.11 million to 130.41 million in March 2003. In other words, the biggest tax cut in American history has so far "cost" us 1.7 million jobs and counting. (Bush supporters, with more passion than evidence, insist that job losses would have been worse had the tax cuts not been passed.)
The good news for Bush is that with a base of 130 million jobs, adding 1.4 million in an 18-month period isn't out of the ordinary. In fact, 1.4 million jobs would still be below average: Over the past 84 years, the economy typically adds nearly 2 million jobs every 18 months.
The bad news for Bush is that even if the economy does add 2 million jobs by October 2004, he will still have presided over the only job-losing presidency since Hoover. And as Karl Rove surely knows, that name is never good company for a president seeking re-election. Since 1900, the only incumbent Republican presidents to lose second-term bids have been named Hoover and Bush.
Martin Sieff, Senior News Analyst, writing for UPI (May 1, 2003):
I was just a teenage kid on the streets of Belfast when the British army rolled in during the sectarian riots of August 1969 to protect the Catholic community of the western side of the city from Protestant mobs who had already burned down entire city blocks of terraced homes. The local people welcomed them as liberators.
A year and a half later, people from the same community were shooting at the army, and they were shooting to kill.
It took only 18 months for the British army to turn from being liberators to being the hated and feared oppressors in the eyes of the community they had entered the city to protect. It has taken less than four weeks for that to happen in Iraq.
The killing of 15 anti-American demonstrators by U.S. troops in the Iraqi city of Falluja this week, followed by the reported killing of at least two more Wednesday, is a dire omen for those who imagined Iraq could be quietly but firmly guided on the paths of stable, pro-American democracy in the next few months, or even years.
It is, rather, the kind of event that Thomas Jefferson called "a fire-bell in the night" -- the harbinger of infinitely worse conflict and travails to come.
In its scale and likely repercussions, the Falluja Massacre -- as it will soon clearly be known -- appears remarkably similar to the killing of 13 Northern Irish Catholics by the British army during fierce demonstrations in the city of Londonderry -- a provincial center comparable to Falluja -- on what became known as "Bloody Sunday" on Jan. 30, 1972.
That event, more than anything else, proved a windfall for the rapidly mobilizing paramilitary Provisional Irish Republican Army, at the time known popularly as the "Provos." And over the next few years, it launched a campaign of urban terror and bomb massacres that in its calculated efforts to kill and maim civilians was without parallel in Europe during the 46 years from the end of World War II to the beginning of the wars in the former Yugoslavia in 1991.
It will be surprising if we do not see the same thing in Iraq in the coming months, and possibly even in the next few weeks.
How then, should we assess the political significance of the Falluja riots and their death toll? As a proof that the pre-Saddam dynamics of Iraqi society have already asserted themselves and that the U.S. Army and the American people will rapidly become the subjects of Iraqi popular wrath.
Another vivid phrase of Thomas Jefferson sums up this new dire state of affairs. The United States has seized a wolf by the ears in Iraq. And now it dare not let it go.
Martin Gottlieb and Barry Meier, writing in the NYT (May 1, 2003):
After a rampage of looting of museums in Iraq in the wake of the Persian Gulf war of 1991, American and British archaeologists compiled a list of more than 2,000 stolen objects, a sad catalog of losses to the history of civilization. Eleven years later, experts say, no more than half a dozen of the pieces have been tracked down.
Many others are presumed to have been traded away through a thriving international market in antiquities. The poor record of returning artifacts lost after the gulf war suggests the daunting obstacles that museum officials and police investigators face as they commit to finding items recently sacked from the National Museum of Iraq in Baghdad and other sites. The plunder from 1991 added fuel to a global industry of scavengers, shippers and traders, who funneled stolen items from Iraq into the hands of private collectors overseas. While reputable dealers and owners insist they work hard to identify and avoid illicit goods, eager buyers continue to demand rare items, and the market flourishes.
"Sometimes we feel we are fighting a war we have already lost," said Manus Brinkman, secretary general of the International Council of Museums in Paris, one of many museum officials engaged in current recovery efforts.
The booty from the National Museum includes invaluable one-of-a-kind treasures as well as thousands of artifacts of everyday ancient life. John Curtis, who heads the British Museum's Near East department, said here Tuesday that paper records and microfilm were strewn about in a way that will take the staff "months if not years to sort out."
Museum curators and law enforcement officials say that the disarray and loss of documents will make it especially difficult to recoup the artifacts. To show that an item has been stolen, experts require papers tracing it to an ancient site or museum. Many Iraqi objects lost in the 1991 looting were removed from sites and understaffed museums that had no careful recording in photographs and catalogs.
"These cases can be a nightmare," said Tony Russell, a former detective sergeant with Scotland Yard's art squad, who is now with the James Mintz Group, an investigative agency. Stolen artifacts often disappear for years before emerging for sale. Other factors add to the difficulties: the ease with which material can slip through customs, the meager numbers of police assigned to art theft, and the circuitous trails of ownership in the world of trading.
Alan Riding, writing in the NYT (May 1, 2003):
Even though many irreplaceable antiquities were looted from the National Museum of Iraq during the chaotic fall of Baghdad last month, museum officials and American investigators now say the losses seem to be less severe than originally thought.
Col. Matthew F. Bogdanos, a Marine reservist who is investigating the looting and is stationed at the museum, said museum officials had given him a list of 29 artifacts that were definitely missing. But since then, 4 items -- ivory objects from the eighth century B.C. -- had been traced.
"Twenty-five pieces is not the same as 170,000," said Colonel Bogdanos, who in civilian life is an assistant Manhattan district attorney.
There is no doubt that major treasures have been stolen. These include a lyre from the Sumerian city of Ur, bearing the gold-encased head of a bull, dated 2400 B.C.; a Sumerian marble head of a woman from Warka dated 3000 B.C.; a white limestone votive bowl with detailed engravings, also from Warka and dated 3000 B.C.; a life-size statue representing King Entemena from Ur, dated 2430 B.C.; a large ivory relief representing the Assyrian god Ashur; and the head of a marble statue of Apollo, a Roman copy of a fourth century B.C. Greek original.
Even if the damage may not be as widespread as originally reported, there is still no clear answer to the most important question: just how much has been taken?
"I don't know exactly," said Jabbir Khalil, chairman of the State Board of Antiquities.
Mark Clayton, writing in the Christian Science Monitor (April 29, 2003):
Closer ties between academia and the agency do not yet rival the clubby atmosphere of the 1950s. The Vietnam War, 1970s congressional inquiries, and scandals over covert funding on campuses in the 1980s contributed to the frosty relationship. Yet all signs indicate that the tweedy set and the CIA are getting cozier.
For one, the revolving door between the agency and the ivory tower has been spinning of late. Last year, two public universities named presidents with CIA ties: At Texas A&M, former agency director Robert Gates took the helm, while Arizona State University picked Michael Crow, vice chairman of In-Q-Tel, a nonprofit venture-capital arm of the CIA.
Then, too, agency insiders and scholars cite a leap in CIA funding of academic-research contracts and conferences, though numbers are hard to come by.
The CIA has also reached out to higher education with its Officer in Residence (OIR) program, which since 1985 has sent 84 agents to 46 universities....
Ties between the CIA and academia are old and deep, but also frayed. The fledgling CIA in 1947 grew out of the old World War II Office of Strategic Services, recruiting the "best and brightest" of a generation of scholars into the agency's intelligence-analysis side.
During the 1950s, the CIA and other intelligence agencies provided the early primary-funding sources for the social sciences.
Christopher Simpson and other experts in the area say at least some funds supported notable organizations such as Columbia University's Bureau of Applied Social Research, Princeton's Institute for International Social Programs, and MIT's Center for International Studies.
The CIA became anathema on campus in the 1960s. The Vietnam War soured the relationship with academia, and the estrangement deepened when a Ramparts magazine report in 1967 revealed that the CIA had for years been covertly funding foreign activities of the National Student Association.
In 1975, Sen. Frank Church headed a Senate committee that investigated intelligence-gathering abuses during Watergate. The committee's findings revealed CIA-orchestrated efforts to overthrow elected governments in Chile and to assassinate Cuba's Fidel Castro.
Then came covert-funding scandals surrounding academic work and conferences at Harvard in the mid-1980s and at the Rochester Institute of Technology in the early 1990s.
Agency recruiters are still persona non grata on campuses such as the University of California, Berkeley. But today lines of students at CIA recruiting booths at campus job fairs easily outnumber egg throwers. Agency spokesmen report high numbers of applicants since the terrorist attacks on Sept. 11, 2001, many from graduates seeking first jobs.
Stansfield Turner was one of the earliest pioneers in trying to mend fences between the CIA and higher education, while director of the CIA from 1977 to 1981. Now a professor at the University of Maryland, he says he was the first director to send CIA officers for teaching stints on college and university campuses to try to build better ties.
"I was very concerned about the rupture with academia in the 1970s," Admiral Turner recalls. "I spent a lot of time mending fences, trying to get a few academics to be on advisory boards."
In 1985, the practice of sending CIA analysts to universities was formalized as the Officer in Residence program under Director of Central Intelligence William Casey.
"Since the end of the cold war, we've had to focus on more than 180 countries around the world," Turner says. "You don't know what's going to blow up tomorrow. The agency should develop expertise on all of that, but it's just not easy to do or maintain. So if you can find someone to supplement the in-house capability, it's very useful."
Paul Samuelson, writing in the Business Times Singapore (April 29, 2003):
What ails America now is its 'plutocratic democracy'. Republican George W Bush in the White House and the Republican majorities in the congressional Senate and House of Representatives behave neither wisely nor cleverly. Their one economic preoccupation is to cut taxes massively for the affluent while at the same time stealthily reneging on the Social Security and welfare entitlements of the median classes and the American poor.
Why has this Reagan-Bush new trend become so powerful? When 200 years ago liberal philosophers contemplated setting up democracy, they understandably feared that without checks and balances there was danger that the mob would gang up against the affluent owners of property. One person-one vote might turn out to endanger market efficiency and overall progress. The philosophers inadequately sensed the opposite danger, which now shows itself in new-century America. One thousand rich persons, by their election contributions to conservative candidates for office, can outweigh in effective political influence 100,000 rank-and-file citizens.
That is not all of it. Corporate lobbyists can motivate elected officials to pass laws that put special interests above the overall social interest. Why do voters not see and understand this? The argument devised to persuade so many to ignore their own true self-interest is radical right supply-side economics. We are urged to favour those already affluent because that is what is needed to make them more inventive and efficient.
Mike Allen, writing in the Washington Post (April 27, 2003) about a conference held at Princeton University:
President Bush has not been particularly friendly to historians: He signed an executive order making it easier for officials to classify documents, gave former presidents a veto over the release of their papers, and in most cases allows his staff to give only the most sanitized accounts of life in his White House.
But the 43rd president is providing rich fodder for those historians, offering a colorful and elusive target for a raft of professors trying to explain how a semi-prepared Texan, armed with simple eloquence and prickly certitude, managed to elevate the office but alienate much of the world after the catastrophe of Sept. 11, 2001.
Hugh Heclo, a public affairs professor at George Mason University, says Bush's presidency "is already destined for a remarkable place in the history books," not just because of his response to the terrorist attacks, but also because of his early decision to brush aside the conventional advice to proceed cautiously after the election debacle of 2000.
"The only modern president with less of a mandate was Gerald Ford in 1974, who received zero popular votes," Heclo said. But he added that in contrast to Bush's image as a slacker, "focus, self-control and unblinking perseverance prepared Bush to be a wartime president before he, or America, knew it was at war."
Bush has 43 percent of his term left -- 828 days down, 634 to go. Two-thirds of his "axis of evil" remains. Osama bin Laden as well as Saddam Hussein and Iraq's weapons of mass destruction are unaccounted for. The economy dithers. His reelection race is still in previews.
But in the era of the 24-hour news cycle, history can't wait. Fifteen well-known presidential scholars packed a lecture hall at Princeton University on Friday and today for "The George W. Bush Presidency: An Early Assessment."
Liberals and other Bush skeptics were well-represented, and one presenter joked privately that the subtext of the sessions was: "This guy's crazy. Why is he so successful?"
Fred I. Greenstein of Princeton University, an authority on presidential leadership styles who organized the conference, said the consensus was that Bush has mastered the art of doing a few things well: He is very much in charge, sets a few priorities and sticks to them, and surrounds himself with very experienced people but is not intimidated by them.
"That might not keep him from driving the country off the cliff," Greenstein said. "But he would be a very good race-car driver."
Karen M. Hult, a political scientist at Virginia Tech, drew flow charts of the West Wing and found a White House "permeated with concerns about public relations" that drive policy deliberation and initiatives. She said Bush's aides accelerated the trend, building since President Richard M. Nixon, of using the presidency to serve "the permanent campaign."
A criticism from several seminars was that Bush, 56, has favored short-term victories and may leave messes behind. Allen Schick, a specialist in public finance at the University of Maryland, said the White House has mounted a misleading "no-fault defense" for rising deficits by blaming the terrorist attacks and a fragile economy rather than the $1.3 trillion tax cut that was the signature victory of Bush's first months in office.
"The Bush White House is not clueless on the fiscal course the president has charted," Schick said. Instead, he sees Bush as embarked on a strategy of depriving government of revenue so that Congress will be forced to restrain spending and unable to rescue Social Security and Medicare.
"Just as Reagan was succeeded by presidents who boosted taxes, so too will George W. Bush," Schick said. "It does not even matter whether his successor is a Republican or a Democrat."
While Schick alleged intended consequences, two former members of President Bill Clinton's National Security Council staff warned of unintended consequences from Bush's policy of preempting attacks by crushing states that harbor terrorists. Ivo H. Daalder and James M. Lindsay, both senior fellows at the Brookings Institution, argued that administration arrogance has bred mistrust and resentment abroad, and could turn the United States into "a powerful pariah state that, in many instances, will prove unable to achieve its most important goals."
The Brookings scholars sought to debunk the notion that the terrorist attacks had changed Bush's worldview. They said today's foreign policy, which Daalder described as "killing people before they kill you," is a logical outcome of Bush's choice of "intelligent hardliners" rather than moderate Republicans as his campaign advisers. The scholars wrote that the al Qaeda hijackings affirmed Bush's conviction "that this dangerous world could be made secure only by the confident application of American power, especially its military power."
"George Bush is an agent of his own making," Daalder said. "This is a man who is in control. This is not a revolution of an administration. It is a revolution of one man."
From a report in Wired News (April 28, 2003):
Working to locate those treasures [lost in the looting] -- which reach back 7,000 years to the advent of civilization -- archaeologists are building a comprehensive, searchable image database of the tens of thousands of objects that are missing and presumed to be in the hands of professional art thieves.
The Lost Iraqi Heritage project is a joint effort of over 80 universities, museums and individuals working to create a tool that law enforcement, customs officials and art dealers can use to prevent the sale and export of stolen objects. The group, which is coordinated by professors at the University of Chicago, includes the Archaeological Institute of America, University of California at Berkeley and the University of Michigan.
Archaeologists say they are motivated by what they see as an unprecedented, incalculable loss.
"Imagine if Michelangelo's statue of David and the Mona Lisa and the Magna Carta and Botticelli's paintings and all the major Impressionist painters' works were in one museum that got looted," said Dr. Clemens Reichel, a research associate at the University of Chicago's Oriental Institute.
Nicholas Kouchoukos, an associate professor of anthropology at the University of Chicago who heads up the technical effort, says the project will be built in phases. The first online effort is to display images of some of the most famous pieces from the museum in order to show the extent of the losses.
Iraq's Lost Heritage will be the backbone of an extensive effort to catalog the losses, as well as to facilitate the objects' return and the rebuilding of the Iraq Museum. The first version will show only images of the museum's known masterpieces, but the organizers plan to turn it into a searchable database as soon as possible.
The effort faces severe challenges. Little is known outside Iraq about the extent of the holdings, which makes the process of learning what has been looted almost impossible.
The museum's own records were apparently destroyed in the two days of looting. Some say that was an attempt on the part of professional art thieves to cover their trail. The hope is that smashed computer hard drives can be salvaged.
But even if the information can be retrieved, Kouchoukos says the computerized records included only a small fraction of the collection -- the museum's access to software and hardware was extremely limited during the 12-year embargo against Iraq.
The database is being populated with images from published books and museum exhibition catalogs, as well as unpublished images from scholars' notes and from institutions that excavated artifacts in Iraq and documented them before turning them over to the museum.
Christian Bourge, writing for UPI (April 29, 2003):
Conservative and liberal special interest groups have hijacked the process of designing textbooks and standardized tests in the United States, in the process effectively lowering the quality of much of the education materials produced for American schoolchildren, according to education policy experts.
In a speech at the New America Foundation on Thursday, Diane Ravitch, a senior fellow in education policy at the liberal-centrist Brookings Institution and a research professor of education at New York University, said the problem can be traced to the state and federal government committees charged with handling special interest concerns about what is said in public school textbooks and standardized tests.
"I stumbled across what I think is a major crisis in education," said Ravitch. "It involves censorship, and special interests, and big business, and state policies that allow censorship and the political actions of pressure groups that end up dumbing down texts.
In her new book, "The Language Police: How Pressure Groups Restrict What Students Learn," Ravitch explores the evolution of the system of government sensitivity committees, which originally were established to help eliminate stereotypes and biases in school materials regarding gender, race and occupations. She concludes that the publishers of textbooks and standardized tests now self-censor books and other materials in an effort to avoid the high costs of making last-minute changes to please state education officials and special interests.
She said that states -- such as California and Texas -- that purchase education materials en masse for their public schools rely on anti-bias and sensitivity guidelines to oversee their decisions, including lists of taboo words, subjects and concepts. Ravitch said that many companies that produce these textbooks and standardized tests have adopted similar guidelines in an effort to avoid producing material that state education authorities will not buy.
"The one way to get a textbook killed is to become controversial," she said.
Charlene F. Gaynor, executive director of the Association of Education Publishers, told United Press International that although the problem is most evident in the textbook and standardized test sections of the industry, self-censorship to avoid confronting issues sensitive to special interests is rampant throughout the educational publishing field.
Catharin E. Dalpino, Fellow, Foreign Policy Studies at Brookings, writing in Slate (April 23, 2003):
Last week the United States concluded the military phase of the war in Iraq and began discussing the creation of a new political regime and a new system. During the war, just as in every other U.S. military intervention of the past decade, Washington had to face the so-called Vietnam syndrome: the fear that conflict in a foreign country will lead to quagmire, especially in a country where the native population can use guerrilla tactics to stymie superior military technology. But there's another type of Vietnam syndrome, less well-known but just as pervasive. It derives from our relationship with South Vietnam and the political quagmire that resulted from our experience as democratic imperialists there. And if we don't address it, we may very well repeat it in Iraq.
Underestimating nationalism. The alliance between the United States and South Vietnam was uneasy and ambivalent. The failure of America's "hearts and minds" campaign in the South can be attributed as much to nationalism as to the North's resolve to expel foreigners from Vietnamese territory. Americans saw themselves as protectors; many South Vietnamese viewed them as occupiers. Early warning signs were plentiful but ignored. The majority of Saigon streets were named after Vietnamese heroes who had liberated the country from a millennium of foreign occupiers: Chinese, French, and Japanese. Washington's opposition to a reunification vote was based in part on intelligence that Ho Chi Minh would win a free election in the South because his nationalist credentials trumped reservations about ideology. A similar cognitive dissonance appears to be brewing in Iraq, evidenced by the mounting anti-American demonstrations in Mosul and other areas.
Reinforcing religious and ethnic divisions. When France gained control of Vietnam in the 19th century, French administrators put many Catholics in official government positions, replacing the traditional Vietnamese leaders who had quit their posts in protest against colonial rule. By the mid-20th century, Catholics made up 20 percent of the Vietnamese population and formed the political elite. When the United States took up the struggle against Vietnamese communism, Washington, too, showed an initial preference for the Catholic elite. Many Buddhist leaders equated the repressive South Vietnamese government with its American sponsors and were an early source of anti-Americanism.
In an eerie parallel, the percentage of Sunni Muslims to Shiites in Iraq is roughly that of the Catholic-Buddhist ratio in Vietnam in the 1950s. The United States has historically been closer to the Sunni community, and some Shiite groups have issued calls to resist any American involvement in a postwar government. Conversely, some Sunni groups fear that the U.S. will try to compensate for its past slight of the Shiites and are protesting the American presence in Iraq on the grounds that they will be disadvantaged.
Michael P. Tremoglie, a former Philadelphia police officer, writing in frontpagemag.com (April 30, 2003):
Two famous brothers were once accused of "warmongering" by a Senate subcommittee. Who were they?
No, they were not George and Jeb Bush. They were Harry and Jack Warner, the founders of Warner Brothers studios. The similarities between the two incidents are worth noting. Both in the America of the 1930s and early 1940s and in the America of the 1990s and early 21st century, fifth columnists worked to change American foreign policy. Yet this was little noticed by the media and government.
In 1939, just months before Hitler invaded Poland and World War II began, Warner Brothers released a film called "Confessions of a Nazi Spy," starring Edward G. Robinson as an FBI investigator pursuing a Nazi spy ring. The movie was a fictionalized account of a real spy investigation and trial that occurred in 1937. It became a clarion call to many Americans about the Nazi threat to the United States.
"Confessions of a Nazi Spy" was controversial during production. People involved in making the movie received anonymous threats. After the movie was shown in theaters, it ignited such a firestorm that in 1941, after complaints by special interest groups (i.e., isolationists, Nazi sympathizers and those with German business interests), the Warners were subpoenaed to testify before a Senate hearing. Ironically, the Senate convened the hearing in October 1941 to investigate "warmongering propaganda" by the motion picture industry.
The accusations of warmongering leveled against the Warner brothers eerily parallel those made today against President Bush by critics who call him a warmonger or want him impeached. The allegations made against the Warners dovetail with the accusations by liberal politicians, the mainstream media, Hollywood liberals such as Tim Robbins, and activist groups like Peace Action. Their specious claims of warmongering, propagandizing, and censorship by the Baseball Hall of Fame, Fox News Channel, and Clear Channel Communications are nearly interchangeable with those of the people who sought to persecute the Warners 60 years ago.
Dick Morris, writing in frontpagemag.com (April 30, 2003):
As George Santayana put it, "Those who do not remember the past are condemned to repeat it." He might have added an injunction to remember the past accurately.
So much mythology surrounds the dramatic fall of Bush I, from dizzying heights of popularity after the Gulf War to defeat less than two years later, that the real risk Bush II runs of repeating it may be obscured, and his campaign team could draw the wrong lessons from a misguided view of history.
George H. W. Bush did not lose because of "the economy, stupid." A good economy might not have saved him and a bad one need not have doomed him. The economy provided the coup de grace. But he was laid low and rendered vulnerable by four other factors:
1. Bush I faced an opponent who took away his best issues
Bill Clinton supported the death penalty, pledged an end to "welfare as we know it," and promised a tax cut for the middle class. So Bush could not use crime, welfare, or taxes as issues, the three staples of the GOP. ...
2. Bush I screwed up his signature issue by raising taxes
3. The Gulf War lost its relevance
Once Bush Sr. left Saddam in power, the war disappeared as an issue. It was nowhere to be found in the '92 campaign.
Will the war on terror still captivate the nation's attention eighteen months from now? Oddly, it is Bush Jr.'s successes, not his failures, that may haunt him. If he succeeds in dealing with North Korea and prevents attacks at home, the political potency of the terrorism issue may evaporate before November '04.
4. Bush Sr. had no domestic policy issue with which to control events.
Since Bush I had no domestic policy agenda beyond fighting the recession and cutting the deficit, he lost control over the political dialogue. Here, Bush II faces much the same problem. He lacks a domestic policy issue. If terror fades - either because of Bush's success or because Lieberman wins the Democratic nomination - he's got no backup strategy. Tax cuts aren't the answer; nor is partial birth abortion or energy production or lawsuit limitation.
Tom Zeller, writing in the NYT (April 27, 2003):
We ask a lot of history. We believe that if we study it closely enough, we can avoid repeating it. We also believe that its lessons are comprehensible enough to give us analogies to fit any unfolding tragedy or triumph. So as Iraq presumably gets a fresh start under Jay Garner, the retired lieutenant general appointed by the Bush administration to oversee postwar reconstruction, analysts of every kind wonder if there can ever be anything new under the desert sun....
The two most frequently cited historical object lessons in support of this confidence have been postwar Germany and Japan.
"There was a time when many said that the cultures of Japan and Germany were incapable of sustaining democratic values," President Bush said in February. "Well, they were wrong. Some say the same of Iraq today. They are mistaken."
But even if one accepts those countries as apt comparisons (and there are plenty who do not), it is difficult to find other examples. Germany and Japan are among only four success stories highlighted in an otherwise rather gloomy study compiled in January by the Carnegie Endowment for International Peace, a nonpartisan research group, which found that in most instances where the United States used its military to help oust a government, democracy rarely followed.
Typically, this is attributed to conditions on the ground before American forces ever arrive. A strong national identity, previous experience with constitutional government and a reasonably experienced bureaucracy are good predictors of success. But Iraq, the study suggests, is unpromising, given its ethnic and religious divisions and its lack of an administrative structure.
"I think we're going to have big trouble ahead," said Minxin Pei, an author of the paper and a senior associate at the endowment specializing in democracy. "Germany and Japan were developed, modern societies, but developing countries like Iraq have so many internal characteristics that aren't conducive to that kind of change. High levels of inequality, a distribution of power that favors entrenched elites, a weak national identity the odds are against you even in the first 10 years."
Other success stories -- Panama after the ouster of Manuel Noriega's government in 1989 and Grenada after the American invasion in 1983 -- were far less complex. Ethnic rivalries were few, and something akin to a constitutional system was being restored rather than invented.
Multinational or United Nations-backed efforts, like that which toppled the Taliban in Afghanistan, also tend to fare better than those in which the United States goes it alone. But even in Afghanistan, the jury is still out. The government of Hamid Karzai really only controls Kabul and its environs, and newly active elements of the Taliban, apparently able to reorganize in the borderlands of Pakistan, have mounted fresh attacks, perhaps with the help of officials who are supposed to be loyal to Mr. Karzai.
To some analysts, that is evidence that success or failure at democracy may well depend less on the particulars of the country and more on the commitment of the liberators.
"The tragedy is that we haven't been very grown up in our relationships along these lines," said Joseph Montville, director of the preventive diplomacy program at the Center for Strategic and International Studies in Washington. "We can get ourselves together enough to bomb somebody that's not a problem. But after we've bombed, everything usually becomes kind of a mess. The military doesn't want to play policeman, and the administration doesn't want to get involved in nation-building."
Frank Rich, writing in the NYT (April 17, 2003):
For all the news reports of "billions of dollars" of losses, for all the golden objects shown on TV, the most devastating crime may have been the pillaging of cuneiform clay tablets and other glitter-free objects that tell us of the birth of writing, cities and legal codes in what was once Mesopotamia. This land was the cradle of our civilization, too, long before there was Islam. Most of the early chapters of Genesis are believed to have been set in what only recently has been known as Iraq.
If this history was forgotten or ignored by our ostentatiously Bible-minded administration, so was much more recent American history. In 1943, American armed forces fielded a monuments, fine arts and archives section to try to protect cultural treasures as we prosecuted the war in Europe. Lynn H. Nicholas, who wrote the definitive account of that story in "The Rape of Europa," told me that she had been invited to give lectures "to reserve units doing serious study on the securing of cultural artifacts" in recent years. "They were being prepared for the eventuality of something like this," she says. "Why weren't they deployed?" According to Mr. Rumsfeld, it would be "a stretch" to say our failure to take such measures was "a defect in the war plan." Rather, he said, the looting is just a reminder that freedom is "untidy" or, in this case, literally just another word for nothing left to lose.
Now that the pillaging of the Baghdad museum has become more of a symbol of Baghdad's fall than the toppling of a less exalted artistic asset, the Saddam statue, all the president's men are trying to put Humpty Dumpty back together again. Colin Powell was once again suited up to counter crude Pentagon rhetoric. Karl Rove has been on the phone with Mr. de Montebello. F.B.I. agents are on the case. But even if all such efforts, from Unesco's to that of the mobilized museum world, disable the black market for the major loot, nothing is going to restore the priceless library that is now ash or reconstitute the countless relics that have modest individual monetary value but collectively would have helped scholars reconstruct mankind's deepest past. "These items will appear for sale for $50 or $100 in antique stores all over the Middle East, Europe and North America or on eBay," said Oxford's Professor Robson. "The unsuspecting or the unscrupulous will buy them as novelty Christmas presents or coffee-table pieces."
It's hard to put a loss this big in perspective. I asked Mahrukh Tarapor, the associate director for exhibitions at the Met, to try. Ms. Tarapor has spent the past six years seeking Mesopotamian holdings from museums throughout the world for "Art of the First Cities," an all too timely exhibition that by coincidence is opening on May 8. "It's almost a new emotion," she said, noting that she has felt it only once before, when the Taliban destroyed the Great Buddhas of Bamiyan in central Afghanistan two years ago. "One is almost conditioned to accept even human death as part of life. The destruction of art, of our heritage, goes very deep in our unconscious. To a museum person, the worst thing you can experience is damage to an object on your watch. For the magnitude of what happened in Iraq, you have no words. You lose faith in your fellow man."
John Armor, a lawyer and author, writing for UPI (April 27, 2003):
[Iraq] MUST NOT be a pure democracy.
Few nations in the world have ever attempted to establish themselves as a pure democracy. And all that have tried have failed. Athens is cited as the first democracy. But it had a limited franchise. Only native citizens who were male and not slaves were allowed to vote. They amounted to about 10 percent of the population. And even that small group still was sufficient to permit the demagoguery that led to its destruction.
From 447 to 404 B.C., Athens had its "Golden Age" under Pericles. It had democracy, peace, and prosperity. But when, by democratic vote, the Athenians banished Gen. Alcibiades, they sealed their own doom. He left Athens and briefly joined the Spartans, which contributed to the defeat of Athens in the Second Peloponnesian War.
Aristotle's treatise, "On Politics," defined democracy as one of the corrupt forms of government. His conclusion was that any pure democracy would eventually vote itself into failure. He was right about Athens; he has been right since then, with the most recent example being France.
After its revolution, France established a pure democracy. Its government quickly degenerated into a tyranny, with each new set of elected leaders feeding their predecessors to the guillotine. Applying this lesson to Iraq leads to the conclusion that Iraq should not be established as a pure democracy, but as a constitutional republic. Leaders might be elected democratically, but their powers must be circumscribed by the constitution. And the constitution itself must be protected by a supra-majority requirement for ratification of any amendment.
Again, the lessons of history are clear. As Madison, Hamilton and Jay wrote in the Federalist, the U.S. Constitution should not be amendable "by the mere whim of a majority." The same applies to any constitution in any country, including Iraq. Only a constitution that offers protection to minorities of any type -- religious, ethnic, linguistic, etc. -- is worthy of the name "constitution." And only a supra-majority requirement can prevent any constitution from self-destruction at the hands of a temporary majority.
How long will it take for Iraq to develop and put in place a democratic government under a constitution that limits the powers of its government? Again, history provides solid answers. It took two years for Japan to put in place its new government under its new constitution after World War II. That process was, of course, strongly guided by Gen. Douglas MacArthur. It took India two years to put in place its own constitution, with its elaborate protections for religions, languages, and its constituent states.
It took the United States less than a year to write its first constitution. But that constitution, called the Articles of Confederation, failed utterly within 11 years for political and economic reasons. That failure led five states to call the Constitutional Convention in Philadelphia in 1787. In turn, that convention drafted the Constitution, which, as amended, has remained in place longer than any other constitution ever written for any other nation in history.
The United States provided in those events another critical example for those who will write the new Iraqi constitution. They should pay attention to the failure of the first American constitution.
There is no room for constitutional failure in Iraq. Its first effort must be successful. It does not have the luxury of a second chance or more, as the United States and most other nations have had. If the first Iraqi constitution fails, Turkey's influence will reach in from the north, Syria's from the west, and Iran's from the east. Iraq will then have a tripartite dictatorship to replace the single one from Saddam. The historical example here is Lebanon.
Originally, Lebanon was accurately described as the "Switzerland of the Middle East." Its divergent ethnic and religious groups existed peacefully side by side. Despite its lack of oil, it was one of the most prosperous nations in that region. When it degenerated into guerilla warfare between those factions, Syria moved into the power vacuum that resulted. Syria still dominates Lebanon, and its troops occupy the Bekaa Valley, the center of agriculture -- and terrorism -- in Lebanon.
Notice I am not suggesting that Iraq would benefit from adoption of the U.S. Constitution as is, suitably translated. That would most assuredly fail. I suggest that the Iraqis spend substantial time with the histories of constitution writing, across many societies and across the centuries. It is a record mostly of failure, but from that the Iraqis can learn what not to do.
They should take their time. Two years is not an unreasonable time for such an effort. Furthermore, the re-establishment of Iraqi government for Iraqis should not be done from the top down. A democratic republic is best established from the bottom up. The Kurdish areas in the north already have a functioning elected government. Basra should be next, since it has a relatively homogenous population. Mosul and then Baghdad should follow, because the principles of multi-ethnic and multi-religious government must and can be worked out there.
Should the United Nations be involved in the process of Iraqi constitution writing? Absolutely not. A majority of the nations of the United Nations have no use for religious or political freedom, or honest and fair courts of justice, or respect for basic human rights. Furthermore, some of its member nations, though themselves highly civilized, have economic or political reasons for interfering in Iraq -- Germany and France, for example.
Regardless of what the process is labeled, the umbrella of American and coalition power should be the guarantee of Iraqi borders and Iraqi freedom of movement, of religion, of the press, etc., until the new Iraqi constitution is completed and a new national government is established and, most importantly, functioning. All criticisms opposing that policy should be summarily rejected.
Looking at history, the odds are against Iraq succeeding in establishing a constitutional republic on the first effort, and having it survive. The best chance they have depends on the coalition maintaining the stability of Iraq until that moment. Coalition involvement in the peace is equally as important as its involvement in the war. The proper and circumscribed use of American power for a few years is essential to the long-term success of Iraq.
And lastly, as for those who accuse the United States of imperialism, those charges should be rejected summarily. As in Japan and Germany after World War II, after the war is won, and after the peace is won, the United States will not only withdraw but provide such aid as is needed. Imperial powers do not voluntarily withdraw. Throughout history, no empire has ever shrunk by choice. Once the United States withdraws from Iraq, it will prove, once again, that it is not an imperial power interested in empire.
Economist Dean Baker, commenting on two articles: John M. Berry, "Bush Signals Another Term For Greenspan," in the Washington Post (April 23, 2003) and Richard W. Stevenson, "President Willing to Give Greenspan New Term at Fed," in the New York Times (April 23, 2003):
These articles report on President Bush's statement that he intends to reappoint Alan Greenspan as chairman of the Federal Reserve Board. Both articles include only a passing mention of the stock market bubble. The stock market bubble was the largest financial bubble in the history of the world. It would be difficult to imagine a more serious mistake for a central banker than to allow this sort of bubble to develop and grow.
Tens of millions of families lost much of their retirement savings because they placed their money into a hugely over-valued market. The 2001 recession and the subsequent period of slow growth are direct outcomes of the collapse of the bubble. The fact that President Bush would opt to reappoint a person whose tenure was marked by such a calamity deserved more attention. The negative effects of the collapse of the housing bubble and the dollar bubble, both of which Mr. Greenspan has also fostered, are likely to be comparable to the impact of the collapse of the stock market bubble.
At one point the Times article cites an assertion by Alan Greenspan that it is difficult to recognize financial bubbles. In fact, the over-valuation of the stock market in the late nineties was quite easy to recognize (see e.g. "Too Much of the Bubbly on Wall Street"). This should have been noted.
The Post article cites critics who blame Mr. Greenspan for not raising interest rates to deflate the stock market bubble. It is not clear that it would have been necessary to raise interest rates to deflate the bubble. When Mr. Greenspan made his famous "irrational exuberance" comment in 1996, the market quickly plunged. It bounced back when he subsequently backed away from this comment. If Mr. Greenspan had used his stature and his public platform to carefully explain why the market valuations of the late nineties did not make sense, it seems unlikely that the bubble would have grown to the extent it did.
The Post article also includes a reference to the "double taxation" of dividends. This is a term used by proponents of President Bush's dividend tax cut. The use of this term is comparable to the use of the term "death tax" by people who propose eliminating the estate tax. Since corporations and individuals are legally separate entities, it is not accurate to assert that dividends are subject to double taxation.
George Will, writing in frontpagemag.com (April 25, 2003):
Since the Second World War, which culminated in many regime changes, the United States has had at least a hand in shaping regimes in many places beyond Japan and Germany, as in Italy, where the CIA helped the democratic parties turn back the Communist challenge in the 1948 elections. U.S. actions have determined, or helped to determine, the nature of regimes in Iran (1953), Guatemala (1954), South Vietnam (the coup that killed President Ngo Dinh Diem in October 1963), Chile (1973), Panama (1989), Nicaragua (via the Contras, 1990) and Afghanistan (2001), among other places.
Particularly instructive is the U.S. experience in South Korea, which was still occupied as a colony by Japanese forces at the end of the Second World War. In his history of the Korean War, Max Hastings writes that when U.S. officials arrived on the peninsula in September 1945, their not altogether helpful instruction was "to create a government in harmony with U.S. policies."
Americans had no immediate alternative to confirming Japanese colonial officials in their civil administration duties. And Japanese soldiers and police continued to be responsible for maintaining order. When, after four months, 70,000 Japanese civil servants and 600,000 Japanese soldiers and civilians had been sent home, the American military government replaced them mainly with Koreans who had government experience--because they had collaborated with Japan's detested colonial administration.
America's chosen leader for South Korea was Syngman Rhee, who had a Harvard M.A. and a Princeton Ph.D.--he was the first Korean to receive an American doctoral degree. He had lived in America for the previous 35 years. So, although he lacked the credential of having been active in the resistance to the Japanese, he was free from the taint of having collaborated with them. Rhee proved to be autocratic and corrupt.
To fathom today's challenge of political reconstruction in Iraq, consider three things: How much ingenuity was required for Americans in the 1780s and beyond to construct a permanent replacement for the colonial system of governance. How many political geniuses were found for that task. And how much easier America's task, although Herculean, was than Iraq's will be.
Uri Dan, Middle East correspondent of the NY Post, writing in the Jerusalem Post (April 24, 2003):
The American president who made the gravest mistake in the Middle East since World War II was Nobel Peace Prize laureate Jimmy Carter. In the name of human rights Carter applied pressure to the Shah of Iran and caused the collapse of his regime for the sake of another one that trampled human rights, headed by the Ayatollah Khomeini. The CIA at that time rejected the proposals made by Israelis to eliminate Khomeini while he was still in exile in France, where the French government naturally supported him. The regime of Ayatollahs in Teheran created the greatest secret state organization in the world for perpetrating terrorist acts in the name of Islam and Allah. In fact, Carter deserves the Nobel Prize just as two other laureates deserve it - Yasser Arafat and Shimon Peres. Carter was actually the first American president who overthrew a regime in the Middle East. Instead of the semi-democratic regime of the Shah, he established the black dictatorship of the Ayatollahs.
When president Ronald Reagan came into office he began his own campaign of errors, in order to try and correct Carter's historical mistake. When Saddam Hussein, who had just begun his regime of evil in Baghdad, went to war against the regime of the Ayatollahs, the Reagan administration gave him aid. I well recall the meetings between US Ambassador Samuel Lewis and Defense Minister Ariel Sharon in 1981, when Sharon protested against the secret US arms shipments to Iraq. Sharon explained that Saddam was a dangerous enemy, and the weapons supplied to him by the US were liable to reach the Palestinian terrorist organizations.
All that I heard then from people close to Lewis was that Sharon, the Israeli foreign minister, was the one endangering US policy in the Middle East, and not Saddam Hussein.
Even earlier, when Sharon served as minister of agriculture but was a member of the Ministerial Committee for Security, he helped prime minister Menahem Begin to take the secret decision to bomb and destroy Saddam Hussein's nuclear reactor in Baghdad.
Tiny Israel did not need a tremendous coalition and a huge American expeditionary force in order to carry out this strategic move, one of the most important ones that any country has made in the post-war era.
Instead of thanking Israel, US Defense Secretary Caspar Weinberger condemned and punished Israel. For example, he delayed an additional shipment of fighter bombers purchased in the US by the Jewish State.
Caspar Weinberger was even less forgiving to Sharon when he succeeded, as Defense Minister, in expelling Yasser Arafat and his ten thousand terrorists from Beirut in August 1982. I was present at various meetings between Sharon, Weinberger and other senior American officials, in Israel, in the Pentagon, and in Beirut. I obtained the impression that they couldn't forgive Sharon, Begin, and Israel for succeeding, for the first time in history, in besieging an Arab capital, Beirut, in order to expel the largest terrorist organization in the world, the PLO. Sharon explained, till his throat was hoarse, that it was essential to continue and uproot the Muslim terror in Western Beirut, which was supported by Iran and Syria, in order to free the Middle East and not just Israel from this threat. I heard the replies of Weinberger and I saw his face expressing unconcealed hostility to Sharon and to Israel.
US Secretary of State Alexander Haig, who understood and supported Sharon's policy from the beginning of that war, was forced to resign. "They didn't want to listen to me," he recently admitted in an interview on the Fox News Channel. Weinberger and the State Department wanted Israel to give in immediately to the Palestinians according to the Reagan plan. Consequently the Reagan administration was pleased when the blood libel against Sharon was concocted regarding the massacre of Palestinians by Christians in Sabra and Shatila in September 1982. Sharon was thrown out of the Ministry of Defense and Begin resigned shortly afterwards.
WASHINGTON HOPED that now Israel would do precisely what the US wanted in the war against terrorism, instead of acting independently as Sharon had done. "We want Sharon's policy without Sharon," the Reagan administration said to the successors of Sharon and Begin.
But Israel became impotent. It lost its initiative and its belief in its ability to defend itself. Israel had never been America's policeman in the Middle East. However, the presence of the sole democracy in the region and its stand against Palestinian terror in particular and Muslim terror in general, removed the need for US military involvement in the region in order to protect its interests.
From the very moment that Israel was made impotent in Lebanon by the Reagan administration, with the active cooperation of the Labor Party and the Left in Israel, the security situation in the region degenerated. From Tripoli, Libya, to Baghdad the dam burst open to Islamic fascism and terrorism.
About 250 Marines were slaughtered in October 1983 in Lebanon, and shortly afterwards the US retreated from there because of the attacks initiated by Teheran and Damascus.
About two years later, when Muammar Qaddafi killed Americans with a bomb in a Berlin discotheque, Reagan was forced to send an armada of bombers to try and kill Qaddafi. Because France refused to permit the bombers to overfly its territory, the American bombers were delayed by an hour and Qaddafi was no longer in the estimated place. The then French foreign minister, Roland Dumas, boasted of this recently.
The weakness displayed by the US, and the military paralysis that took hold of Israel produced a lack of deterrence against Saddam, and in 1990 he began overtly threatening the use of weapons of mass destruction, and even invaded Kuwait that same year.
The senior president Bush and his secretary of state James Baker were forced at that time to wage the first American war in the Middle East, in order to expel Saddam from Kuwait.
Just as in a Greek tragedy, the drama merely intensified. Bush and Baker thought that after that justified, but uncompleted, war, they must appease the Arabs. They in fact caused Prime Minister Yitzhak Shamir to agree for the first time to make concessions to Yasser Arafat, through the "Peace Conference" held in Madrid in October 1991. They then acted in order to defeat the Shamir-Sharon government and aid Yitzhak Rabin to be elected as prime minister.
This opened the road to hell in the Middle East. The Israeli-American front surrendered to terror. Shimon Peres succeeded in bringing Yasser Arafat to the White House in September 1993, and from there to the gates of Jerusalem. Arafat became a regular guest of Bill Clinton. Israel fled from southern Lebanon. The road to the terrorist offensive of Arafat's Muslim suicide bombers was open.
William Watson, writing in the Montreal Gazette (April 23, 2003):
If you want an example of a pandemic that really was worth panicking about, consider influenza in 1918. Estimates of the death toll differ, with some ranging as high as 100 million deaths worldwide, but none is less than 20 million, almost double the body count in the Great War, which was ending just as the flu hit.
The U.S. lost almost 600,000 people, Canada 30,000 - that in a country of just 9 million people. And the flu did most of its killing in just six months, not the four years the war took.
The latest SARS count, adjusted upward after China's admission of Enron-like accounting techniques, is 3,948 cases worldwide, with 229 deaths, over a period now measured in months. That doesn't compare to 1918, when New York City's worst day was 851 deaths. In the worst U.S. month, October 1918, 195,000 Americans died - the equivalent of more than 60 World Trade Centre bombings. In some places, bodies were picked up in dump trucks, and graves were dug with steam shovels. In November 1918, 2,500 people died from flu in Saskatchewan alone. More precisely, they drowned from the flu, as bloody fluid filled their lungs.
Treatment was rudimentary. There were no anti-bacterial drugs, let alone anti-viral agents. As they do today, people wore masks. "In gauze we trust," is the title of one account of treating the epidemic, though there's little evidence masks helped. People who caught the deadly flu were bathed and given aspirin. Whisky was another popular remedy. Under public pressure, Saskatchewan even allowed it to be sold without a prescription.
Whether modern treatments would have worked is the subject of some debate. Experiments with manufactured viruses thought to be similar show some success. But current estimates are that it will take at least a year to develop a SARS vaccine. In 1918, most of the damage was done faster than that.
One of the most striking things about accounts of the 1918 epidemic is the stories of widespread volunteering. In New York, despite the risk of infection, hundreds of women answered appeals to help with nursing. It's hard to believe you'd get the same reaction today. On the other hand, nurses occasionally were kidnapped by families desperate for treatment - which has a comfortingly contemporary stench of panic about it.
Margie Burns, English teacher at the University of Maryland, writing in Style (April 23, 2003):
Shakespeare's Henry V contains some of the great warlike speeches of all time ("Once more unto the breach, dear friends, once more!"), and its poetry has been pillaged for every English war since it was written in 1599.
Understandably, Henry V is one of four books now being given to military personnel in the Pentagon's Legacy Project, recently revived. During World War II more than 123 million Armed Services Editions (ASEs) were handed out to U.S. troops overseas.
It was the largest free distribution of fiction and nonfiction books in the history of the world. More than 1,300 titles in all were published, including classic works of literature by such authors as Hemingway, Steinbeck and Melville.
My father read those books in the Army in World War II; someone shipped a crate of classics to New Guinea. He mentioned later that in six months he read most of the greatest novels ever written.
According to the project's Web site, the program was discontinued in 1947; the revived project now has several publishers distributing free ASEs to American troops throughout the world and on U.S. warships.
It would be hard to fault the World War II project. The current project, however, is drawing criticism, largely because Hemingway, Steinbeck and Melville seem to have been passed over in favor of more militaristic content. Sun Tzu's The Art of War, Mike Wallace and Allen Mikaelian's Medal of Honor, and War Letters: Extraordinary Correspondence from American Wars (ed. Andrew Carroll) have all made the cut. Mark Twain's The War Prayer has not.
But the criticism underestimates Shakespeare, if not the Pentagon. Anyone who assumes Henry V is gung-ho jingoism should read its first scene.
The play is about Henry's invading France, to conquer it and take it over for England. (Some scenes trash the French exactly the way they're being trashed today.)
But the invasion is instigated not by Henry but by two eminences behind the throne, the political archbishop of Canterbury and bishop of Ely. As the play opens, they are worried, not about France, but about a threat closer to home: The House of Commons is considering a bill to confiscate half the church's possessions....
When the bishop asks repeatedly how they can ward off this threat, the archbishop comes up with his campaign idea: they will urge Henry to conquer France instead, and will give him the money to do it:
Which I have open'd to his grace at large,
As touching France, to give a greater sum
Than ever at one time the clergy yet
Did to his predecessors part withal.
France, they will argue, offers bigger booty than the church.
The severals and unhidden passages
Of his true titles to some certain dukedoms
And generally to the crown and seat of France
Derived from Edward, his great-grandfather
and along with the money, they will also give his invasion the church's blessing. Incidentally, they also treat the invasion as inevitable.
And thus begins a great play, in its own ironic way, with a truly enjoyable, smarmy kickoff. The conquest of France, of course, turns out to be a premier example of be-careful-what-you-wish-for: Henry V dies young; France rebels against the English interlopers (even in the English history plays) and expels them ignominiously within a few years, leaving English politicians quarreling for two generations over who lost France, and leaving a breach between France and England, natural allies and trading partners, that lasted three centuries.
Art dealer Andre Emmerich, writing in the Wall Street Journal (April 24, 2003):
Contrary to what some believe, trade in ancient objects is not the enemy of preservation. The great contribution the art market makes to this cause is to endow works of art with value. When objects have no value they are inevitably at grave risk of destruction because preserving them is a costly enterprise. Storing, safeguarding, heating and air conditioning, and conserving art can only be done for a relatively few things. In practice, there is a constant triage which saves a few treasured objects while consigning the remainder to destruction through benign neglect.
Recognition of the usefulness of the art market in reclaiming as much as possible of the Iraq museum's looted objects came with a proposal by Philippe de Montebello, director of the Metropolitan Museum in New York, and others to offer amnesty and a stipend to anyone turning any in. As for the larger world market outside of Iraq, no responsible art dealer or antiquarian will touch any Mesopotamian object unless there is positive proof of provenance, or ownership history, dating prior to the second Iraqi war.
In the meantime, archaeologists continue to advocate prohibiting the export of archaeological art from its countries of origin by nationalization and by banning exports into this country. They support laws and treaties which will slowly but surely strangle the art market and access to such art for museums and collectors. And they support the efforts of foreign countries to reclaim objects and denude the holdings of museums and collectors. Greece's continuing campaign to repatriate the Elgin Marbles is the most visible example of this program. The tragedy is that by bandying about such terms as "stolen art," "smuggled," and "looted," the retentionists claim the moral high ground. In fact higher morality, as so often, is best served by the free market.
One curious point in this debate is that the present-day population of so many archaeologically well-endowed regions consists of the descendants of the invaders who destroyed the very cultures whose remnants their modern governments now so jealously claim as exclusively theirs. Turkey's Aegean coast is rich in ancient Greek art -- but in the 1920s, the remnant of its Greek population was expelled in an early instance of ethnic cleansing. Most modern Latin Americans are descendants of the Spanish Conquistadors who destroyed the Aztec and Inca empires and all their works within reach. Do these descendants have a better moral claim to the buried artifacts of earlier civilizations than the rest of humanity?
A case must also be made for American exceptionalism. We are a country of immigrants, coming from countries all over the world, and surely have a moral claim to reasonable access to the buried treasures of our common ancestors. It should also be remembered that as American museums and collectors have purchased a part of this international cultural heritage, American scholarship has more than repaid any debt connected with such acquisitions. For example in the field of pre-Columbian art alone, well over half the existing scholarly literature has been produced in the U.S., along with such feats as the recent successful deciphering of Maya writing by American archaeologists.
From the Wall Street Journal (April 24, 2003):
World War I: Germans shell the library at Louvain, Belgium, and Rheims Cathedral, France.
World War II: The Nazis bomb London and other historic British cities; they deliberately destroy palaces, churches and synagogues in Russia and Poland, and flatten Warsaw. Bombing by the Allies reduces most German cities to rubble; they bomb the Italian monastery of Monte Cassino, which had become a German artillery emplacement.
India, 1947: Fighting after independence produces major destruction of monuments.
Egypt, 1960: Construction of the Aswan High Dam floods hundreds of square miles of archaeological sites.
Italy, 1966: The Arno river floods Florence, causing immense destruction in the cradle of the Renaissance.
China, 1966-76: The Cultural Revolution causes still unmeasured destruction.
Italy, 1972: A vandal attacks Michelangelo's "Pietà" in St. Peter's Basilica.
U.S., 1974: A vandal attacks Picasso's "Guernica" at the Museum of Modern Art.
Cambodia, 1975-79: Temples of Angkor Wat are looted during the civil war.
Balkans, early 1990s: The breakup of Yugoslavia produces the wanton destruction of the ancient city of Dubrovnik, its library and the Mostar Bridge, along with much else.
Italy, 1993: The Sicilian Mafia bombs the Uffizi Gallery in Florence, home of some of the greatest masterpieces of the Renaissance.
Italy, 1997: An earthquake hits Assisi, destroying murals by Giotto.
Greece, 1999: A little-publicized earthquake hits Athens, destroying a significant part of the National Museum's ancient Greek vases, displayed on glass shelves.
Afghanistan, 2001: The Taliban destroy giant statues of the Buddhas of Bamiyan.
Reza Baraheni, president of Pen Canada, writing in the Toronto Star (April 22, 2003):
Joseph Goebbels once said: "When I hear the word culture I reach for my revolver."
No one has seen anyone reaching for his revolver to destroy the Baghdad Museum, the cradle of Eastern and Western civilizations and cultures, and the Iraqi National Library, the collective memory of the many peoples who lived in the Middle East or passed through it to reach other lands. Looters did the job, instead.
The list of these civilizations is very long: Assyrians, Akkadians, Sumerians, Babylonians, Arabs, Jews, Medes, Persians, Parthians, Turks and Mongols.
Baghdad, one of the most ancient cities of the world, is made of two words: the Indo-Iranian Bagh and the Persian dad. The first of these meant "god" in the ancient world among Arabs, Persians, Turks and Greeks. The name of the Greek god Bacchus has the same root. The second word means "gave."
So "god-given" would be the correct meaning of the word Baghdad. It seems that God gave the people of Iraq two things: oil, and a collective memory, with its books and artifacts gathered in their national library and museum....
Museums and libraries bring together cultural manifestations of different periods under a contemporary roof. In them, to paraphrase T.S. Eliot, "time past and time present point to one end, which is time future."
We turn to them to see what happened, or rather what is happening in the tricky trip of time. By securing the past we do not look into the past for its own sake. We look forward to the future. The remembrance of things past is the reinvention of memory, the method that shows the way to the future.
A museum shows that people do not grow like plants, and do not die like plants. Human beings have shadows. A museum, a library, is a shadow. The past watches us as we watch the future.
This war deprived humanity of one of its precious shadows....
Two soldiers and one tank would have been sufficient to keep off the looters from the museum and the library.
It is difficult to count all the dead in a war. Saddam Hussein, assisted by Rumsfeld, killed a million Iranians during the war between Iran and Iraq.
We wept for them in prose and poetry.
We weep now for the dead in Iraq, because of American aggression. And we weep with Nabihal Amin for the death of a museum, and the death of a library.
Jason Mazzone, a professor of constitutional law at Brooklyn Law School as of the fall of 2003, writing in the NYT (April 24, 2003):
Operation Atlas, New York City's plan to protect itself from terrorist attacks, is likely to cost $700 million a year, much of it in overtime pay for police officers and firefighters. Mayor Michael R. Bloomberg has asked the federal government for money that would offset the costs of the program. While Congress has offered some $200 million in security spending, it has no intention of footing the entire bill. A close reading of the Constitution, however, suggests that it should.
Article IV, Section 4 of the Constitution says, "The United States . . . shall protect each of [the states] against Invasion." Unlike other provisions that merely authorize governmental action, this article imposes on Washington an obligation to defend states and their cities from foreign attacks. If New York City needs Operation Atlas, the federal government must pay for the program.
Article IV embodies a fundamental structural change that occurred when the former American colonies ratified the Constitution. While the states would oversee domestic police work, they gave to the new national government the power to keep troops, wage war, enter into treaties with foreign nations and regulate the admission and movement of foreigners.
Eighteenth-century Americans who were as worried about sneak assaults from foreign agents (and British sympathizers) as they were about the arrival of enemy gunships off the coastline would have understood that attacks like those of 9/11 fall within the scope of Article IV. The Bush administration itself has repeatedly characterized terrorism as an act of war....
The framers of the Constitution anticipated such situations, imagining that state militia units, rather than a large, federal military force, could provide security in times of need. The Constitution therefore gave Congress authority to "provide for calling forth the Militia to . . . repel Invasions," and it put those militia under the authority of the president. If, however, the federal government chose to rely on state militiamen and employed them "in the Service of the United States," it had to foot the bill for "organizing, arming, and disciplining" them.
Hence, beginning with a $200,000 annual appropriation under the Militia Act of 1808, Congress provided money to arm militias and pay them for periods of federal service. While today's police officers and firefighters are of course not members of a state militia, comparable logic would nonetheless require Congress to cover the costs of employing them to prevent terrorism.