Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: tomdispatch.com (blog of Tom Engelhardt) (8-30-09)
A week ago, two convicted mass murderers leaped back into public consciousness as news coverage of their stories briefly intersected. One was freed from prison, continuing to proclaim his innocence, and his release was vehemently denounced in the United States as were the well-wishers who welcomed him home. The other expressed his contrition, after almost 35 years living in his country in a state of freedom, and few commented.
When Abdel Baset al-Megrahi, the Libyan sentenced in 2001 to life imprisonment for the 1988 bombing of Pan Am Flight 103 over Lockerbie, Scotland, was released from incarceration by the Scottish government on "compassionate grounds," a furor erupted. On August 22nd, ABC World News with Charles Gibson featured a segment on outrage over the Libyan's release. It was aired shortly before a report on an apology offered by William Calley, who, in 1971 as a young lieutenant, was sentenced to life in prison for the massacre of civilians in the Vietnamese village of My Lai.
After al-Megrahi, who served eight years in prison, arrived home to a hero's welcome in Libya, officials in Washington expressed their dismay. To White House Press Secretary Robert Gibbs, it was "outrageous and disgusting"; to President Barack Obama, "highly objectionable." Calley, who admitted at trial to killing Vietnamese civilians personally, but served only three years of house arrest following an intervention by President Richard Nixon, received a standing ovation from the Kiwanis Club of Greater Columbus, Georgia, the city where he lived for years following the war. (He now resides in Atlanta.) For him, there was no such uproar, and no one, apparently, thought to ask either Gibbs or the president for comment, despite the eerie confluence of the two men and their fates.
Part of the difference in treatment was certainly the passage of time and Calley's contrition, however many decades delayed, regarding the infamous massacre of more than 500 civilians. "There is not a day that goes by that I do not feel remorse for what happened that day in My Lai," the Vietnam veteran told his audience. "I feel remorse for the Vietnamese who were killed, for their families, for the American soldiers involved and their families. I am very sorry." For his part, al-Megrahi, now dying of cancer, accepted that relatives of the 270 victims of the Lockerbie bombing "have hatred for me. It's natural to behave like this... They believe I'm guilty, which in reality I'm not. One day the truth won't be hiding as it is now. We have an Arab saying: 'The truth never dies.'"
Calley was charged in the deaths of more than 100 civilians and convicted of the murder of 22 in one village, while al-Megrahi was convicted of the murder of 270 civilians aboard one airplane. Almost everyone, it seems, found it perverse, outrageous, or "gross and callous" that the Scottish government allowed a convicted mass murderer to return to a homeland where he was greeted with open arms. No one seemingly thought it odd that another mass murderer had lived freely in his home country for so long. The families of the Lockerbie victims were widely interviewed. As the Calley story broke, no American reporter apparently thought it worth the bother to look for the families of the My Lai victims, let alone ask them what they thought of the apology of the long-free officer who had presided over, and personally taken part in the killing of, their loved ones.
Whatever the official response to al-Megrahi, the lack of comment on Calley underscores a longstanding American aversion to facing what the U.S. did to Vietnam and its people during a war that ended more than 30 years ago. Since then, one cover-up of mass murder after another has unraveled and bubbled into view. These have included the mass killing of civilians in the Mekong Delta village of Thanh Phong by future senator Bob Kerrey and the SEAL team he led (exposed by the New York Times Magazine and CBS News in 2001); a long series of atrocities (including murders, torture, and mutilations) involving the deaths of hundreds of noncombatants largely committed in Quang Ngai Province (where My Lai is also located) by an elite U.S. unit, the Tiger Force (exposed by the Toledo Blade in 2003); seven massacres, 78 other attacks on noncombatants, and 141 instances of torture, among other atrocities (exposed by the Los Angeles Times in 2006); a massacre of civilians by U.S. Marines in Quang Nam Province's Le Bac hamlet (exposed in In These Times magazine in 2008); and the slaughter of thousands of Vietnamese in the Mekong Delta during Operation Speedy Express (exposed in The Nation magazine, also in 2008). Over the last decade, long suppressed horrors from Vietnam have been piling up, indicating not only that My Lai, horrific and iconic as it may have been, was no isolated incident, but that many American veterans have long lived with memories not unlike those of William Calley.
If you recall what actually happened at My Lai, Calley's more-than-40-years-late apology cannot help but ring hollow. Not only were more than 500 defenseless civilians slaughtered by Calley and some of the 100 troops who stormed the village on March 16, 1968, but women and girls were brutally raped, bodies were horrifically mutilated, homes set aflame, animals tortured and killed, the local water supply fouled, and the village razed to the ground. Some of the civilians were killed in their bomb shelters, others when they tried to leave them. Women holding infants were gunned down. Others, gathered together, threw themselves on top of their children as they were sprayed with automatic rifle fire. Children, even babies, were executed at close range. Many were slaughtered in an irrigation ditch.
For his part in the bloodbath, Calley was convicted and sentenced to life in prison at hard labor. As it happened, he spent only three days in a military stockade before President Richard Nixon intervened and had him returned to his "bachelor apartment," where he enjoyed regular visits from a girlfriend, built gas-powered model airplanes, and kept a small menagerie of pets. By late 1974, Calley was a free man. He subsequently went on the college lecture circuit (making $2,000 an appearance), married the daughter of a jeweler in Columbus, Georgia, and worked at the jewelry store for many years without hue or cry from fellow Americans among whom he lived. All that time he stayed silent and, despite ample opportunity, offered no apologies.
Still, Calley's belated remorse evidences a sense of responsibility that his superiors -- from his company commander Capt. Ernest Medina to his commander-in-chief President Lyndon Johnson -- never had the moral fiber to shoulder. Recently, in considering the life and death of Johnson's Secretary of Defense Robert McNamara, who repudiated his wartime justifications for the conflict decades later ("We were wrong, terribly wrong."), Jonathan Schell asked:
"[H]ow many public figures of his importance have ever expressed any regret at all for their mistakes and follies and crimes? As the decades of the twentieth century rolled by, the heaps of corpses towered, ever higher, up to the skies, and now they pile up again in the new century, but how many of those in high office who have made these things happen have ever said, 'I made a mistake,' or 'I was terribly wrong,' or shed a tear over their actions? I come up with: one, Robert McNamara."
Because the United States failed to take responsibility for the massive scale of civilian slaughter and suffering inflicted in Southeast Asia in the war years, and because McNamara's contrition arrived decades late, he never became the public face of slaughter in Vietnam, even though he, like other top U.S. civilian officials and military commanders of that time, bore an exponentially greater responsibility for the bloodshed in that country than the low-ranking Calley.
Butchery in the Mekong Delta
A few weeks after McNamara's death, Julian Ewell, a top Army general who served in two important command roles in Vietnam, also passed away. For years, the specter of atrocity had swirled around him, but only among a select community of veterans and Vietnam War historians. In 1971, Newsweek magazine's Kevin Buckley and Alex Shimkin conducted a wide-ranging investigation of Ewell's crowning achievement, a six-month operation in the Mekong Delta code-named Speedy Express, and found evidence of the widespread slaughter of civilians. "The horror was worse than My Lai," one American official told Buckley. "But… the civilian casualties came in dribbles and were pieced out over a long time. And most of them were inflicted from the air and at night. Also, they were sanctioned by the command's insistence on high body counts."
As word of the impending Newsweek article spread, John Paul Vann, a retired Army lieutenant colonel who was by then the third-most-powerful American serving in Vietnam, and his deputy, Colonel David Farnham, met in Washington with Army Chief of Staff General William Westmoreland. At that meeting, Vann told Westmoreland that Ewell's troops had wantonly killed civilians in order to boost the body count -- the number of enemy dead that served as the primary indicator of success in the field -- and so further the general's reputation and career. According to Farnham, Vann said Speedy Express was, in effect, "many My Lais."
A Pentagon-level cover-up and Newsweek's desire not to upset the Nixon administration in the wake of the My Lai revelations kept the full results of the meticulous investigation by Buckley and Shimkin bottled up. The publication of a severely truncated version of their article allowed the Pentagon to ride out the coverage without being forced to convene a large-scale official inquiry of the sort which followed public disclosure of the My Lai massacre. Only last year did some of the reporting that Newsweek suppressed, as well as new evidence of the slaughter and the cover-up, appear in a piece of mine in The Nation, and only in the wake of Ewell's death was it mentioned in the Washington Post that a long-secret official Army report, commissioned in response to Buckley and Shimkin's investigation, concluded:
"[W]hile there appears to be no means of determining the precise number of civilian casualties incurred by US forces during Operation Speedy Express, it would appear that the extent of these casualties was in fact substantial, and that a fairly solid case can be constructed to show that civilian casualties may have amounted to several thousand (between 5,000 and 7,000)."
A year after the eviscerated Buckley-Shimkin piece was published, Ewell retired from the Army. Colonel Farnham believed that the general was prematurely pushed out due to continuing Army fears of a scandal. If true, it was the only act approaching official censure that he apparently ever experienced, far less punishment than that meted out to al-Megrahi, or even Calley. Yet Ewell was responsible for the deaths of markedly more civilians. Needless to say, Ewell's civilian slaughter never garnered significant TV coverage, nor did any U.S. president ever express outrage over it, or begrudge the general his military benefits, let alone the ability to spend time with his family. In fact, in October, following a memorial service, Julian Ewell will be buried with full military honors at Arlington National Cemetery.
Chain of Command
In his recent remarks, William Calley emphasized that he was following orders at My Lai, a point on which he has never wavered. The Army's investigation into My Lai involved 45 members of Medina's company, including Calley, suspected of atrocities. In a second investigation, 30 individuals were looked into for covering up what happened in the village by "omissions or commissions." Twenty-eight of them were officers, two of them generals, and as a group they stood accused of a total of 224 offenses. Calley, however, was the sole person convicted of an offense in connection with My Lai. Even he ultimately evaded any substantive punishment for his crimes.
While an opportunity was squandered during the Vietnam era, Calley's apology and the response to al-Megrahi's release offer another chance for some essential soul-searching in the United States. In considering Calley's decades-late contrition, Americans might ask why a double standard exists when it comes to official outrage over mass murder. It might also be worth asking why some individuals, like a former Libyan intelligence officer or, in rare instances, a low-ranking U.S. infantry officer, are made to bear so much blame for major crimes whose responsibility obviously reached far above them; and why officers up the chain of command, and war managers -- in Washington or Tripoli -- escape punishment for the civilian blood on their hands. Unfortunately, this opportunity will almost certainly be squandered as well.
Similarly, it's unlikely that Americans will seriously contemplate just how so many lived beside Calley for so long, without seeking justice -- as would be second nature in the case of a similarly horrific crime committed by an officer serving a hostile power elsewhere. Yet he and fellow American officers from Donald Reh (implicated in the deaths of 19 civilians -- mostly women and children -- during a February 1968 massacre) to Bob Kerrey have gone about their lives without so much as being tried by court martial, let alone serving prison time as did al-Megrahi.
In the immediate wake of Calley's contrition, it wasn't a reporter from the American media but from Agence France Presse (AFP) who thought to check on how Vietnamese survivors or relatives of those massacred at My Lai might react. When an AFP reporter spoke to Pham Thanh Cong, who saw his mother and brothers killed in the My Lai massacre (and now runs a small museum at the village) and asked what he thought of Calley's apology, he responded, "Maybe he has now repented for his crimes and his mistakes committed more than 40 years ago." Maybe.
Today, some of Calley's cohorts, the mostly anonymous others who perpetrated their own horrors in Southeast Asia and never faced even a modicum of justice for their crimes, go about their lives in American cities and suburbs. (Others, who have committed unpunished offenses in the Global War on Terror, are still on active duty.) As a result, the outrage over what happened to the only man convicted of the terrorist act against Pan Am Flight 103 over Lockerbie, Scotland, has a strikingly hollow ring.
A failure to demand an honest accounting of the suffering the United States caused the Vietnamese people and a willingness to ignore ample evidence of widespread slaughter remains a lasting legacy of the Vietnam War. So does a desire to reduce all discussion of U.S. atrocities in Southeast Asia to the massacre at My Lai, with William Calley bearing the burden -- not just for his crimes but for all U.S. crimes there. And it will remain so until the American people do what their military and civilian leadership have failed to do for more than 40 years: take responsibility for the misery the U.S. inflicted in Southeast Asia.
Posted on: Monday, August 31, 2009 - 22:59
SOURCE: froginawell.net (8-31-09)
Last time I lived in Japan, the LDP lost control of the Diet, and for a year and a half there was a Socialist Prime Minister in charge of an implausible coalition between the Japanese Socialist Party (JSP) and the Liberal Democratic Party (LDP). The Democratic Party of Japan, which just took control of the lower house of the Diet, was formed in the aftermath of that coalition: the more liberal elements of the LDP combined with the more moderate elements of the JSP.[1] This left a more conservative LDP and a more Socialist SDP, and also, as a side effect, left the LDP again in charge of the government, in coalition with the Komeito and other conservative groups. Another side effect: the bushy eyebrows and grandfatherly face of Murayama Tomiichi were immortalized in the “Ton-chan” dolls sold by the JSP; I bought one, thinking that this might be “historic.”[2]
You could hardly tell from the news reports coming out of Japan at the moment.[3] I suppose that I’m not surprised by the lack of respect given to the mid-90s political turmoil: it was inconclusive and sloppy, not the kind of clear-cut “historic” event that makes for banner headlines. But what came out of it was an LDP that was, honestly, destined to fail: instead of representing the middle two-thirds of the Japanese political spectrum, it represented a heavily right-oriented one-third, while the DPJ took a big chunk of what was left. Essentially, the LDP split, probably the natural end to a party that was a coalition to begin with, formed out of a Cold War fear that Japan’s leftist parties might put aside their differences long enough to win control of the Diet. While it took a few elections, and another decade of disappointing economic stagnation, the left wing of the former LDP has overtaken the right wing of the former LDP, and a former member of the LDP is going to be Prime Minister.[4]
Is this “historic”? Well, it depends, of course. If the DP turns out to be more or less just like the LDP, then it’s no more historic than Pepsi™ overtaking Coca-Cola™. If the DP turns out to be a genuinely center-left party which reduces international entanglements while successfully fostering economic development, it could actually be a revival of the Yoshida Doctrine. That might actually be interesting, especially since it could mean a shift away from the normalization discourses we’ve been hearing so much of. I guess it’s a bit too soon to write the new narrative.
1. This is a rough approximation. The faction politics of the LDP did not neatly divide along ideological lines, but some sense of policy alignment was starting to become clearer when the split happened.
2. Actually, I bought two: one for me and one for my parents.
3. I want to thank Adam Richards for his tireless political blogging during this election, possibly the best reportage in English this time around.
4. I don’t think anyone’s going to make plush toys out of Hatoyama Yukio, though he’d make a credible daruma.
Posted on: Monday, August 31, 2009 - 22:56
SOURCE: China Beat (8-31-09)
Dear Mr. President,
As a (perhaps the least prominent) member of your Asia Foreign Policy Group during the campaign, I am thrilled that you are soon headed for China. If your trip is, for you, anything like my trip was for me (albeit more than 32 years ago), you will be fascinated, impressed, and perhaps sobered at how much there is to see and know and how little time you have to accomplish all that you might want to.
Here are a few random tips on how to make your visit most successful; from what I have seen of you as president, most of the things I offer have long since come naturally to you anyway, and your personal grace and dignity, as well as your intellect and grasp of issues, will prove the guarantors of your successful visit. Still, here are a few thoughts.
1. Make a point of listening attentively. The pace of high-level meetings can be slow; don’t try to force it by pushing ahead before your counterpart has finished. If you do not fully understand, in translation, something your host has said to you, ask for clarification. Allow time for silence between deliveries. Sometimes the Chinese side waits for a while to be sure that the American visitor has finished his remarks; unable to tolerate the silent interval, the American starts talking again. Let things settle in any back-and-forth.
2. Avoid verbal pyrotechnics and culture-bound American colloquialisms. You are blessedly well spoken anyway, but popular culture terms, US sports jargon, and humor based either on purely American experiences or on English language word play don’t work. We veterans of the early days will remember Doonesbury’s figure Honey (still very much alive and active in real life in Beijing, by the way) telling her Chinese official boss, “The American is making a joke; laugh now.”
3. Ask questions, particularly on topics generically similar to those of your own nation and your personal experience. How does China approach, say, the need for better health care delivery to the poor? How is China approaching environmental challenges such as mushrooming automobile-generated pollution in the cities?
4. Emphasize your commitment to promoting more extensive and farther-reaching cooperation between the US and China as uniquely significant global partners, but emphasize as well that despite differences in wealth and other national characteristics, such intensified “win-win” cooperation will require flexibility and realistic compromise by both countries.
5. While it is unlikely, there is the possibility that your Chinese hosts will subject you to one or another set-piece statement on an issue of contention. Your staff knows these issues well and you will be well briefed. My recommendation is to reply with a simple “Thank you for making your position clear,” without elaboration or attempt at response.
6. You will be well supplied by your staff with appropriate gifts and souvenirs for presentation at certain occasions. I hope they will include copies of your book, Dreams from My Father, which is a major contribution to the world’s understanding, not only of you, but of your country. If only Chinese leaders produced works of comparable insight, the world would be a better place.
7. Celebrate the contribution that Americans of Chinese descent have made, and continue to make, to the United States. This need not simply be a list of Nobel winners and world-renowned artists. The traditions of excellence, hard work, family and social community solidarity, and economic advancement that Americans of Chinese descent so widely embody are a credit both to the influences of their Chinese heritage and to the opportunities and rewards that America makes available to its citizens.
8. If Mrs. Obama, and especially your children, are with you, be sure to give them some China time. China can be a mind-expanding experience for kids, and the Chinese people love children and parents who love their children.
9. And finally: if you have to cite “old sayings,” cite ours, not theirs. The American experience has produced wonderful aphorisms; use them and explain them. Don’t pretend to instruct the Chinese on their own “old sayings” by reciting them back to your hosts. The Chinese will very likely provide you with some of theirs anyway.
Posted on: Monday, August 31, 2009 - 22:48
SOURCE: GLORIA Center (8-20-09)
One of Israel’s highest priorities in negotiations with the Palestinian Authority (PA) is recognition by the PA and Arab states as a “Jewish state.” The purpose of this demand is to ensure a lasting peace with Israel as it exists rather than some formal declaration which would thereafter be subverted in every possible way.
Remember, after all, that the Middle East is full of countries whose self-definition you accept when you recognize them. Here are some of the names you accept in doing so: the Arab Republic of Egypt, the Syrian Arab Republic, the Islamic Republic of Iran, or even—as in the Hashemite Kingdom of Jordan or Saudi Arabia—names designating a country as being under the rule of a single family.
The Palestinian Authority’s constitution for a Palestinian state—which will probably have the word “Arab” and possibly “Islamic” in its name—states that the country is Arab in nationality and that the official religion is Islam.
But the most important reason is to counter various tricks like that of the “Right of Return,” which is based on a false reading of a single non-binding UN document that the Palestinians and Arabs rejected more than fifty years ago. Note what this demand means: that all Palestinians who ever lived in what is now Israel, or who are descendants of such people, can come and live in Israel. Naturally, their first goal would be to destroy that country, and the result would be horrible violence, bloodshed, and instability.
Don’t believe anyone who tells you this isn’t a serious demand on the PA’s part or that they will—as they tell credulous people in private—not really implement it once Israel promises to let them do it. It is an absolutely central demand, and if any Palestinian leader dared give it up publicly his life span—politically at least—would be very limited.
Now, however, Professor Shlomo Avineri, possibly Israel’s greatest public intellectual at present, has given a good explanation as to why recognition as a Jewish state is so important for Israel:
“Israel has never called into question the existence of the Egyptian political entity. On the other hand, the Palestinians, through their rejection of the UN Partition Plan, refused to recognize the Jewish state and embarked on a war to destroy it. This is, after all, the root of the conflict. Indeed, the Palestinian narrative is based on the rejection of the existence of a Jewish nation-state in any part of the territory they call Palestine.
“If you declared war against the Jewish state, does not the signing of a peace treaty with that state obligate you to accept it? This does not mean the Palestinians are asked to accept the Zionist narrative, but it is incumbent upon them to alter their narrative, which rules out the existence of a Jewish state.
“This is exactly what Israel did at Camp David and Oslo. Under the terms of binding international agreements, Israel committed itself to recognizing “the legitimate rights of the Palestinian Arab nation.” [Prime Minister] Menachem Begin was the first to do this. This is not tantamount to relinquishing the Zionist narrative; it is a willingness to accept the legitimacy of a competing narrative and to seek a compromise. We only ask of the Palestinians that which we ourselves have done in the past.”
Note by the way something extremely important: To accept the existence of a Palestinian Arab state, neither Israel nor Zionist ideology has to make any change whatsoever in its world view. It is not exclusionary. Palestinian nationalism is. For it to accept the existence of Israel--in real terms or even by signing a final peace treaty--requires a political and intellectual revolution.
And one of the ways you know peace is not near is that this revolution has barely begun. Examine Palestinian media, education, the statements (in Arabic) of leaders, mosque sermons, and so on, and you find few hints that there is acceptance of Israel's long-term, much less permanent, existence. Of course, Hamas makes little secret of its view on the subject.
Fatah's view is more complex. In private, some of its leaders know they cannot defeat Israel but won't say so publicly and hope that a long-term battle of attrition will do what force of arms cannot.
Avineri's last point is particularly important: Israel has already recognized the Palestinians as an Arab people who will, of course, have an “Arab state.” Remember that it is on this very basis that the Palestinians will always demand that every Jewish settler must be removed from their territory.
A two-state solution is supposed to mean: Two states for two peoples. That is the best solution, though of course this doesn’t mean there will be a solution for a very long time, a distinction many people seem not to understand.
Posted on: Sunday, August 30, 2009 - 22:10
SOURCE: The Daily Beast (8-29-09)
Obama’s mid-year train wreck puts him in the inglorious company of previous Democratic presidents. How the heat derailed Johnson, Carter, and Clinton, too.
For recent Democratic presidents, in summertime the living isn’t easy. President Barack Obama’s bucolic vacation at Martha’s Vineyard takes place after two extraordinarily difficult months, and was interrupted by the death of his close friend and ally, Sen. Ted Kennedy. The virulent protests at the town-hall meetings have damaged public approval for the president’s health-care plan. Attorney General Eric Holder’s announcement of a preliminary review into CIA interrogation techniques opens up politically dangerous questions about national security that Obama was hoping to avoid. A broken Republican Party is now full of energy and hope. Many liberals express frustration with the administration for its willingness to compromise on key issues. The fate of health care remains uncertain. After encouraging comparisons to Lincoln and FDR, it is not inconceivable that Obama will emerge from his first year with an economic stimulus bill and not much more.
Summers past do not offer a sunny guide. In August 1968, President Lyndon Johnson and Vice President Hubert Humphrey watched as their party disintegrated into civil war at the Democratic Convention in Chicago. Inside the convention hall, insurgents demanded a peace plank to the party platform and protested the war in Vietnam. Outside the convention, thousands of activists gathered in Grant Park. Chicago Mayor Richard J. Daley ordered police to be tough with the protesters. Humphrey found himself winning the party’s nomination just as police pummeled and tear-gassed protesters in the streets and marshals inside the convention hall roughed up and dragged away TV network correspondents on air. Americans watched aghast as Walter Cronkite, personification of Middle American moderation and calm, voiced outrage at a party that seemed out of control. Richard Nixon was smiling.
Fast forward to the summer of 1979: President Jimmy Carter already had experienced one bad summer in 1977, when his strong approval ratings started to plummet as his close adviser and friend, Office of Management and Budget Director Bert Lance, became embroiled in a financial scandal, tarnishing Carter’s image as a pristine reformer. By the summer of 1979, conditions had worsened. Unemployment rates were high, inflation rose, and OPEC had increased oil prices several times. On July 15, Carter, whose approval ratings had plunged to less than 30 percent, tried to uplift the national mood and his political fortune by delivering a speech in which he called on Americans to change their consumption patterns and accept the need for sacrifice. Though he didn’t use the word, it was dubbed his “malaise” speech because he appeared to blame the problems afflicting the country on the public’s bad psychology. His speech gave him a blip in support, as addresses to the nation billed as answers to crises usually do, until within two weeks he purged his Cabinet, once again giving the impression of disarray and confusion. Americans reacted negatively. They saw a president who could not and did not want to lead, instead pointing the finger at them. By late August, voters were expressing even stronger disapproval of Carter. And Democratic congressional leaders urged Sen. Ted Kennedy to challenge Carter in the Democratic primaries. Ronald Reagan was smiling...
... Fifteen years later, even as the economy has slowly started to gain steam this summer, health-care reform is once again on life support. President Obama has invested his political capital and reputation in health care and lost control of the debate. The administration severely underestimated the intensity of conservative grassroots activists as well as Astroturf operations, and overestimated its ability to win over Republican support. During the one critical moment when Obama had the nation’s attention at a press conference in July to articulate his plan, with many Americans waiting to hear him deliver the same kind of powerful speech that voters saw when he tackled the issue of race on the campaign trail, the president instead delivered a lackluster, meandering performance and uttered a poorly thought-out comment about a white Cambridge, Massachusetts, policeman acting “stupidly” in the arrest for disorderly conduct of distinguished black Harvard Professor Henry Louis Gates.
When President Obama returns from the Vineyard, he’ll have to shake off these summertime blues. If not, this summer will be another for the history books, followed by a winter of our discontent.
Posted on: Sunday, August 30, 2009 - 21:29
SOURCE: Huffington Post (8-29-09)
Today President Obama honored the late Senator Ted Kennedy by calling him "the greatest legislator of our time." These were fitting words for a man who demonstrated just how much can be accomplished by learning the ways and means of Congress.
President Obama's words were a reminder that losing the 1980 Democratic nomination to President Jimmy Carter might have been one of the best things to ever happen to Senator Ted Kennedy. Much of Kennedy's earlier career had been consumed with hopes of winning the presidency. Although Kennedy proved to be a skilled legislative tactician from the moment that he arrived on Capitol Hill, there was always speculation about whether he would be the next member of the family to inhabit the White House. The Chappaquiddick scandal in 1969 forever undermined his ability to achieve that goal, but he did not stop trying during the 1970s.
By 1978, Senator Kennedy had become frustrated with Jimmy Carter. Like many liberals, Kennedy felt that Carter had moved too far to the center, focusing on issues like inflation over unemployment and abandoning problems like national health care. At the Democratic midterm convention in Memphis, Kennedy finally unleashed on the president: "Sometimes a party must sail against the wind," he said, "now is such a time."
In November 1979, Kennedy announced that he would challenge the president. Carter said he didn't care. "I'll whip his ass," the president said. But polls showed that Kennedy was favored by as much as two-to-one.
But Kennedy's campaign did not go well. During a television interview that was broadcast shortly before Kennedy officially announced his candidacy, the senator could not explain why he wanted to be president. Given his eloquent speech at Memphis, nobody thought he would have a tough time with the question. But he did, perhaps reflecting his assumption that he was always destined to run.
Kennedy won some primaries, including New York and California, yet he was outmaneuvered by the president who ran up the delegate count. According to biographer Adam Clymer, his staff had failed to conduct adequate polling before he ran and underestimated how much Chappaquiddick would define his image. One of Kennedy's advisors noted that the senator had responded to every question about the incident, but that didn't matter: "They've all been asked and all been answered. It's that people don't like the answers."
At the Democratic Convention, the tension between Carter and Kennedy was on public display. Kennedy delivered a powerful speech. He said: "I am confident that the Democratic Party will reunite on the basis of Democratic principles, and that together we will march towards a Democratic victory in 1980. And someday, long after this convention, long after the signs come down and the crowds stop cheering, and the bands stop playing, may it be said of our campaign that we kept the faith. May it be said of our Party in 1980 that we found our faith again."
Carter's speech paled in comparison. As the convention ended, a large number of Democrats appeared on the stage to stand alongside Carter and show their support. The crowd waited for Kennedy for fifteen minutes. When Kennedy finally walked on stage, he raised his fist to the Massachusetts delegates. Then he curtly shook Carter's hand and walked away after a few minutes. Kennedy had practiced a more enthusiastic embrace but decided not to do it. Nor did he lift Carter's arm--the traditional sign of party unity. After Kennedy left, the delegates chanted "We Want Ted!" The senator returned. At that point, it appeared as if Carter was chasing him, only to have Kennedy merely put his hand on the president's shoulder. Ronald Reagan took note.
Carter felt that Kennedy should have healed the divisions and that his challenge had hurt the Democrats in the general election.
Although Kennedy did not abandon his presidential ambitions after 1980, it had become evident that he had little chance of becoming the president of the United States, particularly after Ronald Reagan and the conservative movement seemed to have captured the heart of America.
But the loss in 1980 was an unexpected blessing, as it was responsible for focusing Kennedy on his career as a legislator. And Kennedy turned out to be outstanding at the job. What made him so unique was his ability to retain a broader ideological commitment while simultaneously mastering the art of compromise. When Kennedy had first entered the Senate in 1962, Georgia's Richard Russell told him that "you go further if you go slow." Kennedy took Russell's maxim to heart.
After 1980, he worked on fighting for liberalism one bill at a time. He joined the tradition of liberals like New York Senator Robert Wagner and Missouri Democratic Representative Richard Bolling who made Congress their home base as they fought for their political values. He was an unreconstructed Great Society liberal who was determined to fight for health care, civil rights, and social justice. When Kennedy made deals with Republicans, everyone was sure that he would be back the next year to fight for more. It was the second part of this equation that was crucial to understanding his legislative style. This is why the most ardent liberals respected him so much at the very same time that Republicans genuinely appreciated his role as dealmaker.
Kennedy offers an important lesson for politicians of the future. Too often, newcomers to Washington have their eyes set on the Oval Office from the moment they arrive in town. But up-and-coming stars should remember that members of Congress who do their job well can leave behind a legislative record that few presidents ever achieve. Kennedy also used the bully pulpit of Congress to caution against the use of military power and in favor of diplomacy and arms reduction.
When John F. Kennedy was assassinated in November 1963, he died a deeply frustrated president because most of his domestic agenda had been bottled up on Capitol Hill by a coalition of southern Democrats and Republicans. Freed from his own presidential aspirations after the 1980 primaries, Ted Kennedy was able to concentrate on taking the fight directly to Congress. In doing so, he made liberalism a legislative reality--even in an era of conservatism--and gradually inscribed his ideals into the nation's laws.
Posted on: Sunday, August 30, 2009 - 21:23
SOURCE: The Cutting Edge (8-24-09)
The summer of 2009 has been rife with misplaced fears about government death panels arising from proposed insurance reform. These fears are not based on anything in the proposed legislation. But government death panels and mass euthanasia were always a public option during the first decades of the twentieth century. This campaign to exterminate all those deemed socially or medically unworthy was not conducted by the worst segments of our society but by the elite of the American establishment. They saw themselves as liberals, progressives, do-gooders—and even utopians—trying to create a more perfect society.
The mission: eliminate the existence of the poor, immigrants, those of mixed parentage, and indeed anyone who did not approximate the blond-haired, blue-eyed type they idealized. This racial type was termed Nordic, and it was socially deified by a broad movement of esteemed university professors, doctors, legislators, judges and writers. They called themselves eugenicists. This widely accepted extremist movement was virtually created and funded by millions in corporate philanthropy from the Carnegie Institution, the Rockefeller Foundation and the Harriman railroad fortune through a complex of pseudoscientific institutions and population tracking offices at Cold Spring Harbor, Long Island. From there, leading academics supported by big money led a termite-like proliferation of eugenics into the laws, social policies and curricula of the nation. During these turbulent decades, eugenics enjoyed the active support of the government, especially the U.S. Department of Agriculture, which wanted to breed men the way it bred cattle, and many state and county offices.
Indeed, eugenics was enacted into law in some 27 states during the first decades of the twentieth century, and then exalted as the law of the land by the U.S. Supreme Court. In a famous 1927 opinion, revered jurist Oliver Wendell Holmes compared social undesirables to bacteria to be wiped out. The sanctioned methods to be used were nothing less than a combination of pseudoscientific raceology, social engineering, ethnic cleansing and abject race law, designed to eliminate millions in an organized fashion. More specifically, the American eugenics movement sought to continually subtract the so-called “bottom tenth” of America. These were to include Blacks, Native Americans, Southern Italians, East Europeans, Jews, Hispanics, the poor, criminals, the intellectually unaccepted, the so-called “shiftless,” and many others. The drive for perfection even included excising the existence of Appalachians with brown hair, frequently rounded up by county officials for confinement. When this effort began in the early twentieth century, some fourteen million Americans were targeted for elimination.
To eliminate entire bloodlines of undesirables, American eugenics advocated marriage prohibition and marriage voiding for those deemed racially or socially undesirable. Such laws were enacted from coast to coast. These criminal sanctions for interracial marriage were not completely negated until 1967, when Loving v. Virginia struck such laws down.
Eugenics advocated detention or confinement camps—some would call them concentration camps. These were established throughout Connecticut, Massachusetts, New York, New Jersey and other states to quarantine those considered otherwise unsuited to exist in society, especially the so-called “feeble-minded,” a never-defined and widely abused intelligence caste. Among the camps shrouded behind high-sounding names were the Vineland Training School in New Jersey and the Virginia Colony for the Epileptic and the Feebleminded. Forced surgical sterilization of the undesired was imposed in jurisdictions across America. Some 60,000 individuals in 27 states, mostly young women, were forcibly sterilized, many without their knowledge, often through trickery involving misidentified medical procedures. Untold additional thousands were coercively or stealthily sterilized by federal programs. California led the nation in forced sterilizations. But marriage restriction, concentration, and forced sterilization were always the B Plan.
For American eugenics, mass murder was always a public option...
Eugenicide and Public Gas Chambers
In 1911, the leading pioneer eugenicists, supported by the U.S. Department of Agriculture, the American Breeders Association and the Carnegie Institution, met to propound a battle plan to create a master race of white, blond, blue-eyed Americans devoid of undesirables.
Point eight of the Preliminary Report of the Committee of the Eugenic Section of the American Breeders Association to Study and to Report on the Best Practical Means for Cutting Off the Defective Germ-Plasm in the Human Population specified euthanasia as a possibility to be considered. Of course, euthanasia was merely a euphemism—actually a misnomer. Eugenicists did not see euthanasia as a “merciful killing” of those in pain, but rather a “painless killing” of people deemed unworthy of life. The method most whispered about, and publicly denied, but never out of mind, was a “lethal chamber.” The lethal chamber first emerged in Britain during the Victorian era as a humane means of killing stray dogs and cats. Dr. Benjamin Ward Richardson patented a “Lethal Chamber for the Painless Extinction of Lower Animal Life” in the 1880s. Richardson’s original blueprints showed a large wood- and glass-paneled chamber big enough for a Saint Bernard or several smaller dogs, serviced by a tall slender tank for carbonic acid gas, and a heating apparatus. In 1884 the Battersea Dogs Home in London became one of the first institutions to install the device, and used it continuously with “perfect success” according to a sales proposal at the time. By the turn of the century other charitable animal institutions in England and other European countries were also using the chamber.
This solution for unwanted pets was almost immediately contemplated as a solution for unwanted humans—criminals, the feebleminded and other misfits. The concept of “the lethal chamber” was in common vernacular by the turn of the century. When mentioned, it needed no explanation; everyone understood what it meant.
In 1895, the American novelist Robert Chambers penned his vision of a horrifying world twenty-five years into the future. He wrote of a New York where the elevated trains were dismantled and “the first Government Lethal Chamber was opened on Washington Square.” No explanation of “Government Lethal Chamber” was offered—or necessary. Indeed, the idea of gassing the unwanted became a topic of contemporary chitchat. In 1901, the British author Arnold White, writing in Efficiency and Empire, chastised “flippant people of lazy mind [who] talk lightly of the ‘lethal chamber’…”
In 1905, the British eugenicist and birth control advocate H. G. Wells published A Modern Utopia. “There would be no killing, no lethal chambers,” he wrote. Another birth control advocate, the socialist writer Eden Paul, differed with Wells and declared that society must protect itself from “begetters of anti-social stocks which would injure generations to come. If it [society] reject the lethal chamber, what other alternative can the socialist state devise?”
The British eugenicist Robert Rentoul’s 1906 book, Race Culture; Or, Race Suicide?, included a long section entitled “The Murder of Degenerates.” In it, he routinely referred to Dr. D. F. Smith’s earlier suggestion that those found guilty of homicide be executed in a “lethal chamber” rather than by hanging. He then cited a new novel whose character “advocate[d] the doctrine of ‘euthanasia’ for those suffering from incurable physical diseases.” Rentoul admitted he had received many letters in support of killing the unfit, but he rejected them as too cruel, explaining, “These [suggestions] seem to fail to recognize that the killing off of few hundreds of lunatics, idiots, etc., would not tend to effect a cure.”
The debate raged among British eugenicists, provoking damnation in the press. In 1910, the eugenic extremist George Bernard Shaw lectured at London’s Eugenics Education Society about mass murder in lethal chambers. Shaw proclaimed, “A part of eugenic politics would finally land us in an extensive use of the lethal chamber. A great many people would have to be put out of existence, simply because it wastes other people’s time to look after them.” Several British newspapers excoriated Shaw and eugenics under such headlines as “Lethal Chamber Essential to Eugenics.”
One opponent of eugenics condemned “much wild and absurd talk about lethal chambers.…” But in another article, a eugenicist writing under the pseudonym of “Vanoc” argued that eugenics was needed precisely because systematic use of lethal chambers was unlikely. “I admit the word ‘Eugenics’ is repellent, but the thing is essential to our existence… It is also an error to believe that the plans and specifications for County Council lethal-chambers have yet been prepared.”
The Eugenics Education Society in London tried to dispel all “dark mutterings regarding ‘lethal chambers.’” Its key activist, Caleb Saleeby, insisted, “We need mention, only to condemn, suggestions for ‘painless extinction,’ lethal chambers of carbonic acid, and so forth. As I incessantly have to repeat, eugenics has nothing to do with killing.…” Saleeby returned to this theme time and again. When lecturing in Battle Creek, Michigan, at the First National Conference on Race Betterment in 1914, Saleeby emphasized a vigorous rejection of “the lethal chamber, the permission of infant mortality, interference with [pre]-natal life, and all other synonyms for murder.”
But many British eugenicists clung to the idea. Alfred F. Tredgold was a leading expert on mental deficiency and one of the earliest members of the Eugenics Education Society. His academic credentials eventually won him a seat on the Brock Commission on Mental Deficiency. Tredgold’s landmark Textbook on Mental Deficiency, first published in 1908, completely avoided discussion of the lethal chamber. But three subsequent editions published over the next fourteen years did discuss it, with each revision displaying greater acceptance of the idea. In those editions Tredgold equivocated: “We may dismiss the suggestion of a ‘lethal chamber.’ I do not say that society, in self-defense, would be unjustified in adopting such a method of ridding itself of its anti-social constituents. There is much to be said for and against the proposal.…” By the sixth edition, Tredgold had modified the paragraph to read: “The suggestion [of the lethal chamber] is a logical one… It is probable that the community will eventually, in self-defense, have to consider this question seriously.” The next two editions edged into outright, if limited, endorsement. While qualifying that morons need not be put to death, Tredgold concluded that for some 80,000 imbeciles and idiots in Britain, “it would be an economical and humane procedure were their existence to be painlessly terminated…The time has come when euthanasia should be permitted.”
Leaders of the American eugenic establishment also debated lethal chambers and other means of euthanasia. But in America, while the debate began as an argument about death with dignity for the terminally ill or those in excruciating pain, it soon became a palatable eugenic solution. In 1900, the physician W. Duncan McKim published Heredity and Human Progress, asserting, “Heredity is the fundamental cause of human wretchedness… The surest, the simplest, the kindest, and most humane means for preventing reproduction among those whom we deem unworthy of this high privilege [reproduction], is a gentle, painless death.” He added, “In carbonic acid gas, we have an agent which would instantaneously fulfill the need.”
By 1903, a committee of the National Conference on Charities and Correction conceded that it was as yet undecided whether “science may conquer sentiment” and ultimately elect to systematically kill the unfit. In 1904, the superintendent of New Jersey’s Vineland Training School, E. R. Johnstone, raised the issue during his presidential address to the Association of Medical Officers of American Institutions for Idiotic and Feebleminded Persons. “Many plans for the elimination [of the feebleminded] have been proposed,” he said, referring to numerous recently published suggestions of a “painless death.” That same year, the notion of executing habitual criminals and the incurably insane was offered to the National Prison Association.
Some U.S. lawmakers considered similar ideas. Two years later, in 1906, the Ohio legislature considered a bill empowering physicians to chloroform permanently diseased and mentally incapacitated persons. In reporting this, Rentoul told his British colleagues that it was Ohio’s attempt to “murder certain persons suffering from incurable disease.” Iowa considered a similar measure.
By 1910, the idea of sending the unfit into lethal chambers was regularly bandied about in American sociological and eugenic circles, causing a debate no less strident than the one in England. In 1911, E. B. Sherlock’s book, The Feebleminded: a guide to study and practice, acknowledged that “glib suggestions of the erection of lethal chambers are common enough.…” Like others, he rejected execution in favor of eugenic termination of bloodlines. “Apart from the difficulty that the provision of lethal chambers is impracticable in the existing state law…,” he continued, “the removal of them [the feebleminded] would do practically nothing toward solving the chief problem with the mentally defective set…, the persistence of the obnoxious stock.”
But other eugenicists were more amenable to the idea. The psychologist and eugenicist Henry H. Goddard seemed to almost express regret that such proposals had not already been implemented. In his infamous study, The Kallikak Family, Goddard commented, “For the low-grade idiot, the loathsome unfortunate that may be seen in our institutions, some have proposed the lethal chamber. But humanity is steadily tending away from the possibility of that method, and there is no probability that it will ever be practiced.” Goddard pointed to family-wide castration, sterilization and segregation as better solutions because they would more broadly address the genetic source.
In 1912, Carnegie-financed eugenicist Harry Laughlin and others at the Eugenics Section of the American Breeders Association considered euthanasia as the eighth of nine options. Their final report, published by the Carnegie Institution as a two-volume bulletin, enumerated the “Suggested Remedies” and equivocated on euthanasia. Point eight cited the example of ancient Sparta, fabled for drowning its weak young boys in a river or letting them die of exposure to ensure a race of warriors. Mixing condemnation with admiration, the Carnegie report declared, “However much we deprecate Spartan ideals and her means of advancing them, we must admire her courage in so rigorously applying so practical a system of selection…Sparta left but little besides tales of personal valor to enhance the world’s culture. With euthanasia, as in the case of polygamy, an effective eugenical agency would be purchased at altogether too dear a moral price.”
William Robinson, a New York urologist, published widely on the topic of birth control and eugenics. In Robinson’s book, Eugenics, Marriage and Birth Control (Practical Eugenics), he advocated gassing the children of the unfit. In plain words, Robinson insisted: “The best thing would be to gently chloroform these children or to give them a dose of potassium cyanide.” Margaret Sanger was well aware that her fellow birth control advocates were promoting lethal chambers, but she herself rejected the idea completely. “Nor do we believe,” wrote Sanger in Pivot of Civilization, “that the community could or should send to the lethal chamber the defective progeny resulting from irresponsible and unintelligent breeding.”
Still, American eugenicists never relinquished the notion that America could bring itself to mass murder. At the First National Conference on Race Betterment, University of Wisconsin eugenicist Leon J. Cole lectured on the “dysgenic” effects of charity and medicine on eugenic progress. He made a clear distinction between Darwin’s concept of natural selection and the newer idea of simple “selection.” The difference, Cole explained, “is that instead of being natural selection it is now conscious selection on the part of the breeder.…Death is the normal process of elimination in the social organism, and we might carry the figure a step further and say that in prolonging the lives of defectives we are tampering with the functioning of the social kidneys!”
Paul Popenoe, leader of California’s eugenics movement and coauthor of the widely used textbook Applied Eugenics, agreed that the easiest way to counteract feeblemindedness was simple execution. “From an historical point of view,” he wrote, “the first method which presents itself is execution… Its value in keeping up the standard of the race should not be underestimated.”
Madison Grant, who functioned as president of the Eugenics Research Association and the American Eugenics Society, made the point clear in The Passing of the Great Race. “Mistaken regard for what are believed to be divine laws and a sentimental belief in the sanctity of human life tend to prevent both the elimination of defective infants and the sterilization of such adults as are themselves of no value to the community. The laws of nature require the obliteration of the unfit and human life is valuable only when it is of use to the community or race.”
The Black Stork
On November 12, 1915, the issue of eugenic euthanasia sprang out of the shadows and into the national headlines. It began as an unrelated medical decision on Chicago’s Near North Side. At 4 A.M. that day, a woman named Anna Bollinger gave birth at German-American Hospital. The baby was somewhat deformed and suffered from extreme intestinal and rectal abnormalities, as well as other complications. The delivering physicians awakened Dr. Harry Haiselden, the hospital’s chief of staff. Haiselden came in at once. He consulted with colleagues. There was great disagreement over whether the child could be saved. But Haiselden decided the baby was too afflicted and fundamentally not worth saving. It would be killed. The method: denial of treatment.
Catherine Walsh, probably a friend of Anna Bollinger’s, heard the news and sped to the hospital to help. She found the baby, already named Allan, naked and alone in a bare room. He had clearly been lying in one position for a long time. Walsh urgently called for Haiselden, “to beg that the child be taken to its mother,” and dramatically recalled, “It was condemned to death, and I knew its mother would be its most merciful judge.”
Walsh pleaded with Haiselden not to kill the baby by withholding treatment. “It was not a monster—that child,” Walsh later told an inquest. “It was a beautiful baby. I saw no deformities.” Walsh had patted the infant lightly. Allan’s eyes were open, and he waved his tiny fists at her. She kissed his forehead. “I knew,” she recalled, “if its mother got her eyes on it she would love it and never permit it to be left to die.” Begging the doctor once more, Walsh tried an appeal to his humanity. “If the poor little darling has one chance in a thousand,” she pleaded, “won’t you operate and save it?”
Haiselden laughed at Walsh, retorting, “I’m afraid it might get well.” He was a skilled and experienced surgeon, trained by the best doctors in Chicago, and now chief of the hospital’s medical staff. He was also an ardent eugenicist.
Chicago’s health commissioner, Dr. John Dill Robertson, learned of the deliberate euthanasia. He went to the hospital and told Haiselden he did not agree that “the child would grow up a mental defective.” He later recollected, “I thought the child was in a dying condition, and I had doubts that an operation then would save it. Yet I believed it had one chance in 100,000, and I advised Dr. Haiselden to give it this one chance.” But Haiselden refused.
Quiet euthanasia of newborns was not uncommon in Chicago. Haiselden, however, publicly defended his decision to withhold treatment as a kind of eugenic expedient, throwing the city and the nation into moral turmoil amid blaring newspaper headlines. An inquest was convened a few days later. Some of Haiselden’s most trusted colleagues were impaneled on the coroner’s jury. Health Commissioner Robertson testified, “I think it very wrong not to save life, let that life be what it may. That is the function of a physician. I believe this baby might have grown up to be an average man.…I would have operated and saved this baby’s life.…”
At one point Haiselden angrily interrupted the health commissioner’s testimony to question why he was being singled out when doctors throughout Chicago were routinely killing, on average, one baby every day, under similar circumstances. Haiselden defiantly declared, “I should have been guilty of a graver crime if I had saved this child’s life. My crime would have been keeping in existence one of nature’s cruelest blunders.” A juror shot back, “What do you mean by that?” Haiselden responded, “Exactly that. I do not think this child would have grown up to be a mental defective. I know it.”
After tempestuous proceedings, the inquest ruled, “We believe that a prompt operation would have prolonged and perhaps saved the life of the child. We find no evidence from the physical defects that the child would have become mentally or morally defective.” The jurors concluded that the child had at least a one-in-three chance—some thought an “even chance”—of surviving. But they also decided that Haiselden was within his professional rights to decline treatment. No law compelled him to operate on the child. The doctor was released unpunished, and efforts by the Illinois attorney general to indict him for murder were blocked by the local prosecutor.
The medical establishment in Chicago and throughout the nation was rocked. The Chicago Tribune ran a giant banner headline across the width of its front page: “Baby Dies; Physician Upheld.” One reader in Washington, D.C., wrote a letter to the editor asking “Is it not strange that the whole country should be so shaken, almost hysterical, over the death of a babe never consciously alive?” But the nation was momentarily transfixed.
Haiselden considered his legal vindication a powerful victory for eugenics. “Eugenics? Of course it’s eugenics,” he told one reporter. On another occasion he remarked, “Which do you prefer—six days of Baby Bollinger or seventy years of Jukes?”--referring to a mythical family of degenerates fabricated by academicians to justify ethnic cleansing.
Emboldened, Haiselden proudly revealed that he had euthanized other such newborns in the past. He began granting high-profile media interviews to advertise his determination to continue passively euthanizing infants. Within two weeks, he had ordered his staff to withhold treatment from several more deformed or birth-defected infants. Haiselden would sometimes send instructions via cross-country telegraph while on the lecture tour that arose from his eugenic celebrity. Other times he would handle it personally, like the time he left a newly delivered infant’s umbilical cord untied and let it bleed to death. Sometimes he took a more direct approach and simply injected newborns with opiates.
The euthanasia of Allan Bollinger may have begun as one doctor’s controversial professional decision, but it immediately swirled into a national eugenic spectacle. Days after the inquest ruling, The Independent, a Hearst weekly devoted to pressing issues of the day, ran an editorial asking “Was the Doctor Right?” The Independent invited readers to sound off. In a special section, The Independent published supportive letters from prominent eugenicists, including Carnegie-funded eugenic kingpin Charles Davenport himself. “If the progress of surgery,” wrote Davenport, “is to be used to the detriment of the race…it may conceivably destroy the race. Shortsighted they who would unduly restrict the operation of what is one of Nature’s greatest racial blessings—death.”
Slaughterhouse in Lincoln, Illinois
Haiselden continued to rally for eugenic euthanasia with a six-week series in the Chicago American. He justified his killings by claiming that public institutions for the feebleminded, epileptic and tubercular were functioning as lethal chambers of a sort. After clandestinely visiting the Illinois Institution for the Feebleminded at Lincoln, Illinois, Haiselden claimed that windows were deliberately left open and unscreened, allowing drafts and infecting flies to swarm over patients. He charged that Lincoln consciously permitted “flies from the toilets, garbage and from the eruptions of patients suffering from acute and chronic troubles to go at will over the entire institution. Worse still,” he proclaimed, “I found that inmates were fed with the milk from a herd of cattle reeking with tuberculosis.”
At the time, milk from cattle with tuberculosis was a well-known cause of infection and death from the disease. Lincoln maintained its own herd of seventy-two cows, which produced about 50,000 gallons of milk a year for its own consumption. Ten diseased cows had died within the previous two years. State officials admitted that their own examinations had determined that as many as half of the cows were tubercular, but there was no way to know which ones were infected because “a tubercular cow may be the fattest cow in the herd.” Lincoln officials claimed that their normal pasteurization “by an experienced employee” killed the tuberculosis bacteria. They were silent on the continuous handling of the milk by infected residents.
Medical watchdogs had often speculated that institutions for the feebleminded were really nothing more than slow-acting lethal chambers. But Haiselden never resorted to the term lethal chamber. He called such institutions “slaughterhouses.”
In tuberculosis colonies, residents continuously infected and reinfected each other, often receiving minimal or no treatment. At Lincoln, the recently established tuberculosis unit housed just forty beds for an estimated tubercular population of hundreds. Lincoln officials asserted that only the most severely infected children were placed in that ward. They stressed that other institutions for the feebleminded recorded much higher mortality rates, some as high as 40 percent.
Eugenicists believed that when tuberculosis was fatal, the real culprit was not bacteria, but defective genes. The Carnegie and Rockefeller-financed Eugenics Record Office, headquartered at Cold Spring Harbor, Long Island, kept special files on mortality rates resulting from hereditary tuberculosis. The data was compiled by the Belgian eugenicist Albert Govaerts, among others.
Tuberculosis was an omnipresent topic in textbooks on eugenics. Typical was a chapter in Davenport’s Heredity in Relation to Eugenics (1911). He claimed that only the “submerged tenth” was vulnerable. “The germs are ubiquitous,” he wrote. “Why do only 10 percent die from the attacks of this parasite? …It seems perfectly plain that death from tuberculosis is the result of infection added to natural and acquired non-resistance. It is then highly undesirable that two persons with weak resistance should marry.…” Popenoe and Johnson’s textbook, Applied Eugenics, devoted a chapter to “Lethal Selection,” which operated “through the destruction of the individual by some adverse feature of the environment, such as excessive cold, or bacteria, or by bodily deficiency.”
Some years earlier, the president of the National Conference on Charities and Correction had told his institutional superintendents caring for the feebleminded, “We wish the parasitic strain…to die out.” Even an article in Institution Quarterly, Illinois’s own journal, admitted, “It would be an act of kindness to them, and a protection to the state, if they could be killed.”
No wonder that at one international conference on eugenics, Davenport proclaimed without explanation from the podium, “One may even view with satisfaction the high death rate in an institution for low grade feeble-minded, while one regards as a national disaster the loss of… the infant child of exceptional parents.”
Haiselden himself quipped, “Death is the Great and Lasting Disinfectant.”
Haiselden’s accusations of deliberate passive euthanasia by neglect and abuse could neither be verified nor dismissed. Lincoln’s understaffed, overcrowded and decrepit facility consistently reported staggering death rates, often as high as 12 percent per year. In 1904, for example, 109 of its epileptic children died, constituting at least 10 percent and probably far more of its youth population; cause of death was usually listed as “exhaustion due to epileptic seizures.” Between 1914 and 1915, a bout of dysentery claimed eight patients; “heat exhaustion” was listed as the cause. During the same period, four individuals died shortly after admission before any preliminary examination at all; their deaths were categorized as “undetermined.”
For some of its most vulnerable groups, Lincoln’s death rate was particularly high. As many as 30 percent of newly admitted epileptic children died within eighteen months of admission. Moreover, in 1915, the overall death rate among patients in their first two years of residence jumped from 4.2 percent to 10 percent.
Tuberculosis was a major factor. In 1915, Lincoln reported that nearly all of its incoming patients were designated feebleminded; roughly 20 percent were classified as epileptics; and some 27 percent of its overall population was “in various stages of tubercular involvement.” No isolation was provided for infected patients until the forty-bed tuberculosis unit opened. Lincoln officials worried that the statistics were “likely to leave the impression that the institution is a ‘hot-bed’ for the spread of tuberculosis.” Officials denied this, explaining that many of the children came from filthy environments, and “the fact that feebleminded children have less resistance, account[s] for the high percentage of tuberculosis found among them.”
Lincoln officials clearly accepted the eugenic approach to feeblemindedness as gospel. Their reports and explanations were laced with scientific quotations on mental deficiency from Tredgold, who advocated euthanasia for severe cases, and other doctors who extolled the wisdom of castrations performed in Kansas. Lincoln officials also made clear that they received many of their patients as court-ordered institutionalizations from the Municipal Court of Chicago; as such, they received regular guidance from the court’s supervising judge, Harry Olson. Eugenical News praised Olson for operating the court’s psychopathic laboratory, which employed Laughlin as a special consultant on sterilization. Olson was vital to the movement and hailed by Eugenical News as “one of its most advanced representatives.” In 1922, Olson became president of the Eugenics Research Association.
Moreover, staff members at Lincoln were some of the leading eugenicists in Illinois. Lincoln psychologist Clara Town chaired the Eugenics Committee of the Illinois State Commission of Charities and Corrections. Town had helped compile a series of articles on eugenics and feeblemindedness, including one by her friend, Henry H. Goddard, who had invented the original classifications of feeblemindedness. One reviewer described Town’s articles as arguments that there was little use in caring for the institutionalized feebleminded, who would die anyway if left in the community; caring for them was little more than “unnatural selection.”
For decades, medical investigators would question how the death rates at asylums, including the one in Lincoln, Illinois, could be so high. In the 1990s, the average life expectancy for individuals with mental retardation was 66.2 years. In the 1930s, the average life expectancy for those classified as feebleminded was approximately 18.5 years. Records suggest that a disproportionate percentage of the feebleminded at Lincoln died before the age of ten.
Haiselden became an overnight eugenic celebrity, known to the average person through his many newspaper articles, speaking tours, and outrageous diatribes. In 1917, the film industry came calling; the result was The Black Stork. Written by Chicago American reporter Jack Lait, it was given a massive national distribution and promotion campaign. Haiselden played himself in a fictionalized account of a eugenically mismatched couple counseled against having children because their offspring are likely to be defective. Eventually the woman does give birth to a defective child, whom she then allows to die. The dead child levitates into the waiting arms of Jesus Christ. It was unbridled cinematic propaganda for the eugenics movement.
In many theaters, such as the LaSalle in Chicago, the movie played continuously from 9 A.M. until 11 P.M. National publicity advertised it as a “eugenic love story.” Sensational movie posters called it a “eugenic photoplay.” One advertisement quoted Swiss eugenicist Auguste Forel’s warning: “The law of heredity winds like a red thread through the family history of every criminal, of every epileptic, eccentric and insane person. Shall we sit still…without applying the remedy?” Another poster depicted Haiselden’s office door with a notice: “BABIES NOT TREATED.” In 1917, a display advertisement for the film encouraged: “Kill Defectives, Save the Nation and See ‘The Black Stork.’”
The Black Stork played at movie theaters around the nation for more than a decade.
Gassing the unwanted, the lethal chamber and other methods of euthanasia became part of everyday American parlance and ethical debate some two decades before President Woodrow Wilson, in General Order 62, directed that the “Gas Service” become the “Chemical Warfare Service” and instructed it to develop toxic gas weapons for world war. The lethal chamber was a eugenic concept more than two decades before Nevada approved the first such chamber for criminal executions in 1921 and then used cyanide gas to execute a Chinese-born murderer, the first such execution in the world. Davenport declared that capital punishment was a eugenic necessity. Popenoe’s textbook, Applied Eugenics, listed execution as one of nine suggested remedies for defectives—without specifying criminals.
In the first decades of the twentieth century, America’s eugenics movement inspired and spawned a world of look-alikes, act-alikes and think-alikes. The U.S. movement also rendered scientific aid and legitimacy to undisguised racists everywhere, from race-tracking bureaucrat Walter Plecker in Virginia right across Europe. American theory, practice and legislation were the models. In France, Belgium, Sweden, England and elsewhere in Europe, each clique of raceological eugenicists did its best to introduce eugenic principles into national life; perhaps more importantly, each could always point to the recent precedents established in the United States.
Germany was no exception. German eugenicists had formed academic and personal relationships with Davenport and the American eugenic establishment from the turn of the century. Even after World War I, when Germany would not cooperate with the International Federation of Eugenic Organizations because of French, English and Belgian involvement, its bonds with Davenport and the rest of the U.S. movement remained strong. American foundations such as the Carnegie Institution and the Rockefeller Foundation generously funded German race biology with hundreds of thousands of dollars, even as Americans stood in breadlines.
Germany had certainly developed its own body of eugenic knowledge and library of publications. Yet German readers still closely followed American eugenic accomplishments as the model: a biological court, forcible sterilizations, detention for the socially inadequate, euthanasia debates. As America’s elite were describing the socially worthless and the ancestrally unfit as “bacteria,” “vermin,” “mongrels,” and “subhuman,” a superior race of Nordics was increasingly seen as the final solution to the globe’s eugenic problems.
Fan Mail from Germany
America had established the value of race and blood. In Germany, the concept was known as Rasse und Blut. Yet the catch phrase was developed by David Starr Jordan, the racist president of Stanford University. U.S. proposals, laws, eugenic investigations and ideology were not undertaken quietly out of sight of German activists. They became inspirational blueprints for Germany’s rising tide of race biologists and race-based hatemongers, be they white-coated doctors studying Eugenical News and attending congresses in New York, or brown-shirted agitators waving banners and screaming for social upheaval in the streets of Munich.
One such agitator was a disgruntled corporal in the German army. He was an extreme nationalist who also considered himself a race biologist and an advocate of a master race. He was willing to use force to achieve his nationalist racial goals. His inner circle included Germany’s most prominent eugenic publisher. In 1924, he was serving time in prison for mob action. While in prison, he spent his time poring over eugenic textbooks, which extensively quoted Davenport, Popenoe and other American raceological stalwarts. Moreover, he closely followed the writings of Leon Whitney, president of the American Eugenics Society, and Madison Grant, who extolled the Nordic race and bemoaned its corruption by Jews, Negroes, Slavs and others who did not possess blond hair and blue eyes. The young German corporal even wrote one of them a fan letter.
In The Passing of the Great Race, Madison Grant wrote: “Mistaken regard for what are believed to be divine laws and a sentimental belief in the sanctity of human life tend to prevent both the elimination of defective infants and the sterilization of such adults as are themselves of no value to the community. The laws of nature require the obliteration of the unfit and human life is valuable only when it is of use to the community or race.”
One day in the early 1930s, AES president Whitney visited the home of Grant, who was at the time chairing a eugenic immigration committee. Whitney wanted to show off a letter he had just received from Germany, written by the corporal, now out of prison and rising in the German political scene. Grant could only smile. He pulled out his own letter. It was from the same German, thanking Grant for writing The Passing of the Great Race. The fan letter stated that Grant’s book was “his Bible.”
The man writing both letters to the American eugenic leaders would soon burn and gas his name into the blackest corner of history. He would duplicate the American eugenic program—both that which was legislated and that which was only brashly advocated—and his group would consistently point to the United States as setting the precedents for Germany’s actions. And then this man would go further than any American eugenicist ever dreamed, further than the world would ever tolerate, further than humanity will ever forget.
The man who sent those fan letters to America was Adolf Hitler.
SOURCE: The American Task Force on Palestine (8-27-09)
The US has long seen itself as playing a crucial role in bringing about Israeli-Palestinian peace.
Yet, US policy toward Israeli actions in and around Jerusalem has shifted over time.
Initially, the Johnson administration took a strong line, with UN representative Arthur Goldberg explaining a week after the 1967 war ended that "the United States does not accept or recognise these measures as altering the status of Jerusalem."
Ironically, it was the administration of Jimmy Carter, who today says Israeli policies in Jerusalem are leading to apartheid, which first saw a significant change in US rhetoric.
It was his administration which moved away from calling on Israel to maintain the status quo toward recognising the desirability of maintaining Jerusalem "undivided" in any peace agreement.
This view was shared by the Reagan administration. In the words of President Reagan, Jerusalem’s final status "should be decided through negotiations."
By the time Bill Clinton, the former US president, took office in 1993, the US government no longer offered more than mild criticism of increasing Israeli settlement activities across the eastern part of the city and the surrounding West Bank lands.
Most crucially, Clinton refused to allow the UN Security Council to address settlements in Jerusalem or elsewhere, arguing that what was once understood by the US, and the world at large, to be a clear violation of international law, should be left to bilateral Israeli-Palestinian negotiations.
The official US imprimatur for Israel's policies came during the George Bush presidency, when he wrote in a 2004 letter to Ariel Sharon, the then-prime minister, that: "In light of new realities on the ground, including already existing major Israeli population centres, it is unrealistic to expect that the outcome of final status negotiations will be a full and complete return to the armistice lines of 1949, and all previous efforts to negotiate a two-state solution have reached the same conclusion."
It is not known what documentation or arguments led Bush to assume that all previous negotiations led inexorably to the understanding that Israel's constantly increasing control over East Jerusalem and the West Bank would be accepted as a fait accompli by Palestinians.
This is more bewildering given that the settlement system in the West Bank, of which the Jerusalem region is the heart, makes the creation of a Palestinian state geographically, politically, and economically impossible to achieve.
Nevertheless, Bush's words have placed Obama in something of a bind.
During the campaign, the then-candidate Obama caused a stir when, at the annual convention of the American Israel Public Affairs Committee lobby, he argued that "Jerusalem will remain the capital of Israel, and it must remain undivided." ...
SOURCE: The Wall Street Journal (8-27-09)
Over the course of a long and distinguished career, Sen. Edward Kennedy, who died Tuesday at the age of 77, was like a cat with nine lives who used every one of them. He came from a family touched by greatness, even as it was riddled with unfathomable tragedy. He was the torchbearer for liberalism, even when it was a fading voice on the political scene.
If his life was the stuff of rich biography—his memoir, for which he was reportedly paid $8 million, is due out in just over two weeks—the question remains: What will history think of him? Despite all the encomiums, it is too early to tell.
Surveying his impressive 47-year career as one of the lions of the Senate, it is hard not to recall the Senate's "Golden Age," an age before the Civil War when senators every bit as much as presidents were the stewards of the nation's future. Too often we forget this. We shouldn't. The renowned British Prime Minister William Gladstone once called the United States Senate "the most remarkable of all the inventions of modern politics."
Consider the great Senate debate of 1850 about the future of the American union, widely considered the most significant in Congress's history. There, in a tense drama spanning seven months of speechmaking and tireless cloakroom bargaining, was the flamboyant Senate triumvirate that had long overshadowed American presidents and whose names still ring out: Henry Clay, John Calhoun and Daniel Webster. Each had been born during the Revolution, each had devoted his career to preserving the edifice the Founders had built, and each sought to avert civil war. Like Kennedy, all had failed in their own bids for the presidency. And they violently disagreed with each other.
Clay and Webster rose day after day to speak as nationalists, while a dying Calhoun, representing the Southern sectionalists, was so weak he could no longer speak for himself. Emaciated and huddling in blankets, he sat mute as James Mason of Virginia read his twilight speech to the Senate, a Cassandra-like warning about impending disaster for the nation. In the end their maneuverings staved off civil war for a decade. Even in 2009, few know much about Presidents Zachary Taylor, Millard Fillmore or Franklin Pierce—but every educated student recalls Daniel Webster's immortal words, "I wish to speak today, not as a Massachusetts man nor as a Northern man, but as an American." Remarkably, that same Senate era also produced other such towering figures as Stephen Douglas, Jefferson Davis, William Seward and Salmon Chase.
Do Ted Kennedy's accomplishments rival those of these pre-Civil War debates? Or, for instance, the legendary actions of Sen. Arthur Vandenberg, the chairman of the Senate Foreign Relations Committee and an isolationist-turned-internationalist who became a central architect of the bipartisan foreign policy that led to the Truman Doctrine, the Marshall Plan and NATO? Probably not. Unlike these other Senate greats, he never held much sway over the all-important questions of war and peace. Still, it is hard not to take notice of his distinguished 47 years of service—third longest in Senate history—or his long list of legislative accomplishments that touched millions of lives, in education, health care and civil rights...
SOURCE: The Huffington Post (8-27-09)
Many things have happened in the PRC this year that echo phenomena discussed in China in 2008: A Year of Great Significance, a book I co-edited with Kate Merkel-Hess and Kenneth L. Pomeranz (both of whom, like me, are historians based at UC Irvine who sometimes write for the Huffington Post and are co-founders of the China Beat). The most recent example of a 2009 variation on a 2008 theme has been the renewal of Shanghai protests relating to train lines. As Associated Press reporter Elaine Kurtenbach notes in her valuable dispatch on the latest demonstration, the 2009 agitation has so far been on a smaller scale than the early 2008 one discussed in our book. The most recent protests have also been directed toward a more conventional kind of railway (albeit one that moves very fast) rather than a magnetic levitation (Maglev) one.
Nevertheless, Kurtenbach's summary of the situation (in this case regarding a line that would head out of the city in order to link Shanghai to Hangzhou, as opposed to one that would run through the heart of the metropolis to connect its eastern and western districts) describes a familiar source of discontent. Here's how she puts it: "China's top-down style of governing and state-controlled media allow for scant public input, and increasing affluence has left many residents expecting more opportunities to be heard."
The developing situation seems similar enough overall to that of the early 2008 anti-Maglev "strolls" (a term used by protesters to suggest a reasonable and non-confrontational call to be heard rather than a militant action) and some other urban struggles of the last couple of years (e.g., the 2007 Xiamen demonstrations triggered by plans to build a chemical plant) that it seems useful to provide a few links here to commentaries on those events of the recent past. The use of the acronym "NIMBY," standing for "Not in My Backyard," seems appropriate again (it is a term that some of us commenting on the anti-Maglev protests used at the time), since the 2009 railway protests again involve homeowners and renters trying to protect the livability of neighborhoods and sometimes also the health of their children and their property values.
If you happen to have China in 2008 handy, you can find a good deal of background reading that helps put the latest railway protests into perspective. On pages 15-21, for example, you will find two views of Chinese NIMBY protests--protests that, it is worth noting, have sometimes achieved at least some degree of success, delaying if not always derailing (pardon the cheap pun) the development plans to which the demonstrators involved objected.
The first of these two pieces from the book that I have in mind is a short commentary on the subject that I wrote in January 2008, which first appeared and is still available online at the Nation's website here, where you will find it accompanied by a YouTube clip of an anti-Maglev demonstration. (Much that I say there dovetails with what others wrote about the subject before or after I weighed in on it, but I think I am still the only one to have placed the Shanghai protests into a historical context that takes in not just the Xiamen ones of the previous year but the actions of rickshaw pullers worried about the introduction of streetcars that threatened their livelihoods early in the 1900s.)
The other relevant contribution to China in 2008 I was thinking of is a reprint of an interview that blogger and freelance journalist Angilee Shah did with political scientist Benjamin L. Read, who has been doing important work on homeowners' associations in both the PRC and Taiwan. That interview, which first appeared as a very early China Beat post, can be found online here.
Other valuable takes on the phenomenon that are just a click away include:
1) This smart piece on the anti-Maglev protests of early 2008 (again with a YouTube video accompaniment and nods back to Xiamen) by Maureen Fan.
2) This extended analysis of "strolls" and other forms of non-confrontational protests (and their possible impact over the long run) by two social scientists, George J. Gilboy and the aforementioned Read, which appeared in the Summer 2008 issue of The Washington Quarterly.
3) This look at Chengdu protests of May 2008, with nods back to Xiamen and Shanghai, by Jeremy Goldkorn of the invaluable Danwei.org site, who quotes liberally from a New York Times report but also makes some additional points of his own and lets the interested reader know the characters used for a couple of the terms mentioned (in case they were wondering what "stroll" looks like in Chinese, for example, though alas we do not get his gloss of NIMBY in Mandarin). There are also some interesting comments from readers appended to the piece.
4) This useful report by Jonathan Watts of the Guardian on Beijing NIMBY protests (by people who wore surgical masks to highlight their concern over pollution) in the aftermath of the Olympics.
5) This wide-ranging and thoughtful essay surveying the rise, during the years immediately preceding the Xiamen protests, of various forms of environmental activism, much of which relied on the use of new media of communication of the sort that have figured in all of the actions just mentioned. Assessing the potential of a new "green public sphere," this article was co-written by Guobin Yang (who also deals with many related issues in his important new book on the Chinese Internet) and Craig Calhoun (President of the Social Science Research Council and author of one of the best book-length studies of the Tiananmen Uprising of 1989).
SOURCE: John Q. Barrett (8-27-09)
For the Jackson List:
In 1989, off-duty Savannah, Georgia, police officer Mark MacPhail, responding to a report that a homeless man was being beaten in a restaurant parking lot, was shot multiple times and died. Another man, Troy Davis, subsequently was arrested and charged with the murder. Although at trial he denied guilt and claimed that another man was the shooter, the jury heard and apparently believed witnesses who implicated Davis—it convicted him and sentenced him to death.
In subsequent years, seven of the State’s nine trial witnesses against Davis signed affidavits recanting their testimony. Another of the State’s trial witnesses reportedly confessed to shooting Officer MacPhail. The Georgia Supreme Court, the United States Court of Appeals for the Eleventh Circuit and the Georgia Board of Pardons each has, in various and complex proceedings, nonetheless rejected Davis’s claim of “actual innocence” and affirmed his conviction and death sentence.
* * *
Last week, the Supreme Court of the United States, acting during its summer recess, granted Davis’s request for federal court review of his innocence claim. A majority of the Justices, acting on Davis’s petition seeking a writ of habeas corpus directly from the Supreme Court itself, transferred his claim to a Federal District Judge in the Southern District of Georgia. The Supreme Court directed the District Judge to “receive testimony and make findings of fact as to whether evidence that could not have been obtained at the time of trial clearly establishes [Davis]’s innocence.”
Justice Antonin Scalia, dissenting from the Court’s action, filed an opinion calling Davis’s claim a “sure loser.” Justice Scalia’s view is based on his understanding of the law, not the facts of Davis’s case. According to Justice Scalia, the Supreme Court has never held that the United States Constitution bars a State from executing an innocent person, so long as that person was convicted of capital murder and sentenced to death through trial processes that met all constitutional requirements. Because Davis is not contesting the fairness of trial procedures, wrote Justice Scalia (joined by Justice Clarence Thomas), the Federal District Judge is explicitly barred by the federal habeas corpus statute from granting Davis any relief. The Supreme Court has, according to Justice Scalia, sent the Federal District Judge “on a fool’s errand….”
Justice Scalia’s arguments in In re Davis, including his claim that a federal judge who finds a convicted prisoner to be actually innocent would be barred by the federal habeas corpus statute and under no constitutional duty to provide relief, did not go unaddressed in the Court. Justice John Paul Stevens, joined by Justices Ruth Bader Ginsburg and Stephen Breyer, filed a concurring opinion in which he explained his view that the District Judge could, if facts show Davis’s innocence, grant him relief, and that the Court’s decision to transfer the case to the District Court thus “is by no means ‘a fool’s errand.’” (Presumably at least two more Justices voted with these three to make up the Court’s majority. The Court announced that newly appointed Justice Sonia Sotomayor did not participate in the decision but did not specify how Chief Justice John Roberts, Justice Anthony Kennedy or Justice Samuel Alito voted.)
* * *
Neither Justice Scalia nor, in his rejoinder, Justice Stevens indicated anything about the source and original context of the phrase “a fool’s errand” that is the contested characterization of their legal disagreement. Dictionaries define “a fool’s errand” as a “completely absurd, pointless or useless” task, as “a fruitless mission or undertaking.”
In our idiom, the phrase came from the title of a hugely popular, best-selling novel of 1879, A Fool’s Errand, written by an anonymous author who is identified on the book as merely “One of the Fools.” The novel tells the story of a former Union soldier who after the Civil War buys a decayed southern plantation, moves his family to this new home and becomes known and hated by his neighbors as a Yankee “carpetbagger” working with former slaves and against the Ku Klux Klan. Near the end of the story, the protagonist offers these reflections on his experiences:
The North and the South are simply convenient names for two distinct, hostile, and irreconcilable ideas,—two civilizations they are sometimes called, especially at the South. At the North there is somewhat more of an intellectual arrogance; and we are apt to speak of the one as civilization, and of the other as a species of barbarism. These two must always be in conflict until the one prevails, and the other falls. To uproot this one, and plant the other in its stead, is not the work of a moment or a day. That was our mistake. We tried to superimpose the civilization, the idea of the North, upon the South at a moment’s warning. …[W]e tried to build up communities there which should be identical in thought, sentiment, growth, and development, with those of the North. It was A Fool’s Errand.
The author of the book, it came to be known, was Albion Winegar Tourgée, a lawyer and writer who achieved great prominence in the late 19th century. The book is Tourgée’s fictionalized account of and reflections on his experiences as a Reconstruction-era judge in North Carolina.
After the collapse of Reconstruction, Tourgée returned to the North from his “fool’s errand.” He bought a grand home in Mayville, Chautauqua County, New York. He wrote this novel and many other books and pursued publishing ventures. He also pursued legal work and other causes. Under Presidents McKinley and Theodore Roosevelt, Tourgée represented the United States abroad. He died in France in 1905.
* * *
All of this connects quite directly to Justice Robert H. Jackson. For most of the years 1913-33, Jackson practiced law in Jamestown, Chautauqua County, New York. Jackson often litigated cases in the county seat courthouse in Mayville, which is located at the north end of Chautauqua Lake opposite Jamestown. On a Wednesday afternoon in fall 1924, thirty-two-year-old Jackson, fellow Chautauqua County lawyers and others attended an auction in Mayville, in Albion Tourgée’s former home, of the late author’s effects. I do not know that Jackson bought anything that had once been Tourgée’s, but Jackson knew of the man and his work.
By 1950, Robert Jackson had been a United States Supreme Court justice for more than nine years (occupying the seat that is today occupied by Justice Scalia, who frequently states his admiration for Jackson). That spring, Justice Jackson and his fellow Justices were deciding three cases challenging the legality, including the constitutionality, of racial segregation laws pertaining to, respectively, a state university’s law school, another state’s graduate school, and interstate rail transportation. In his research, Jackson reread Plessy v. Ferguson, the Supreme Court’s 1896 decision affirming the constitutionality of Louisiana’s law requiring “separate but equal” railroad accommodations for black and white passengers. Jackson noticed, for the first time, the name of Homer Plessy’s attorney in the Supreme Court: Albion Tourgée. Jackson retrieved from Supreme Court archives and read Tourgée’s brief in the case. It of course had been, for him, another “fool’s errand”—the plaintiff Plessy and his lawyer Tourgée lost the case, their constitutional argument against racial segregation and, indeed, the votes of all but one Justice.
While the 1950 cases were pending, Justice Jackson confirmed confidentially with his former colleagues and continuing friends in Chautauqua County that Plessy’s lawyer Tourgée was the same man who had been their Mayville predecessor. And within weeks, in those cases (Sweatt v. Painter, McLaurin v. Oklahoma State Regents, and Henderson v. United States) as in Brown v. Board of Education four years later, Justice Jackson and each of his Court colleagues voted to strike down the laws of racial segregation.
What had once been, in law, a “fool’s errand” was no longer.
SOURCE: Common Dreams.org (8-27-09)
In the 1950s and early 1960s, it was the accepted view among many social scientists that, as ethnic assimilation advanced, ethnic group identities would fade away. But in fact, ethnicity continued to have a significant impact on political life. Why was that?
Acculturation and Assimilation
In 1967, I published an article in the American Political Science Review arguing that assimilation would not wipe out ethnic politics and ethnic identities in the foreseeable future because assimilation was not happening.
I suggested that we needed to distinguish between culture and social systems. A culture is a system of beliefs, values, images, lifestyles, and customary practices including language, law, arts, and the like. A social system consists of the structured relations and associations among individuals and groups both formal and informal: family, church, school, workplace, and other networks of roles and status. The culture is mediated through the social system or social structure, as it is sometimes called.
To become well practiced in a prevailing culture is to acculturate. To become absorbed into the dominant social structure is to assimilate. Since the beginning of the American nation, the Anglo Protestant nativist population has wanted minority ethnic groups to acculturate but not necessarily assimilate. The “late-migration” Southern and Eastern Europeans were expected to discard their alien customs and appearances offensive to American sensibilities. A new verb was invented: they had to “Americanize.”
To make matters worse, these immigrants of the late nineteenth and early twentieth centuries settled mostly in the large urban centers of the Northeast and Midwest (where the jobs were), places that small town Protestant America already loathed as squalid and decadent hellholes.
The public schools became special agencies of acculturation to be imposed on the immigrant children. As a child in a classroom full of Italian-American grade-schoolers in New York City, I was treated to patriotic tales about George Washington, Nathan Hale, Paul Revere, and others of our “heroic founders.” We recited the Pledge of Allegiance and sang the “Star-Spangled Banner” and “America the Beautiful.” And I recall at least one of my teachers telling us in an annoyed tone: “Tell your parents to speak English at home.”
By the second generation (the children of the immigrants), the ethnics had already undergone a substantial degree of acculturation in language, dress, recreation, entertainment tastes, and other lifestyle practices and customs, while interest in old world culture became minimal if not nonexistent.
However, such acculturation was most often not followed by social assimilation. The group became Americanized in much of its cultural practices, but this says little about its social relations with the host society. In the face of widespread acculturation, ethnic minorities still maintained social group relations composed mostly of fellow ethnics.
The pressure to acculturate was not accompanied by any invitation to assimilate into Anglo Protestant primary group relations within the dominant social structure. It seems the nativist bigots well understood the distinction between acculturation and assimilation, even if they never actually used such terms. In a word, “You must Americanize but not in my social circle.”
Dual Identities and Group “Traumas”
Many of the crucial images that a marginalized ethnic group has of itself do not come from itself but from the dominant culture and dominant social order. For us Italians, the immigrant generation was reduced to a Luigi caricature, a simple soul who spoke in a pasta-laden accent. Then came the perennial Mafia mobster, recently given new life with The Sopranos. Also still going strong are the television commercials portraying large boisterous Italian families gathered around the dinner table to shovel immense amounts of food into their mouths and at each other in what resembles an athletic contest.
Another enduring stereotype is of the Italian American as a working-class boor, a dimwit proletarian, visceral, violent, and thoroughly unschooled. There is nothing wrong with being working-class but there is plenty wrong with a vulgar class caricature that defames all working people (whatever their ethnic antecedents). Left out of such scripts are the realities and struggles of workers, a subject seldom treated in the mainstream news or entertainment media.
Media stereotypes aside, there exists a duality in the Italian-American self-identity: on the one hand, a strong in-group pride regarding our heritage and an assertion of our worth as Italians to counteract the wretched stereotypes, along with strong family involvements that remain ethnically tinged, to say the least.
On the other hand, there are the strenuous assertions of our “100 percent Americanism” as a way of overcoming social marginalization. This is what I have called cultural ambidexterity, the promotion of both ethnic pride and Americanism all at the same time, usually accompanied by a political conservatism.
It is an image duality that fits into the acculturation/assimilation model: we acculturate to the American identity, often with a compensatory militancy because of our being somewhat marginalized and unassimilated. This marginalization at the same time adds to our determination to hold to an Italian group awareness and loyalty.
There are ethnic groups that have sustained enormous historic trauma, leaving them with an enduring mega-narrative. For them, ethnic identity is an especially strong imperative. A few prominent examples:
African Americans who have braved centuries of slavery, plus a century of segregation and lynch-mob rule, and today the persistent poisonous sting of White racism.
Jewish Americans who have been victimized by two thousand years of Christianist inspired anti-Semitism and violence capped by the historic trauma of the Holocaust.
Latino Americans and Asian Americans who would be much further along the assimilation track, having been fairly early arrivals, but whose ranks have been lately infused with millions of newcomers, thereby raising fresh issues of acculturation, economic hardship, and even residential legality, all of which heightens a defensive awareness of group identity. In the case of Asian Americans—and to some extent with Latinos also—there is the additional mark of racism with which to contend.
Arab Americans and Persian Americans, who are in this country in relatively smaller numbers and with less visibility, but who in recent years have been unjustly harassed and stigmatized as being associated with terrorist groups.
In each of the above examples, group identity retains a special saliency because it is linked to past or present issues of discrimination and mistreatment. What I wrote in my 1967 article still seems to hold: while much ethnic militancy and group boosterism is considered clannish, it “is really defensive. The greater the animosity, exclusion, and disadvantage, generally the more will ethnic self-awareness permeate the individual’s feelings and evaluations.”
In addition, let it not be forgotten that people retain ethnic attachments not only because their group is under assault but because the attachment provides a nurturing social and cultural experience. So along with the negative-defensive identity we have the positive-enjoyment ethnic experience. This is as true of Italian-Americans as of any group in the United States.
The Future Has Arrived
Does assimilation happen to any ethnic group? Yes indeed. Those northern European Protestants who invaded this country in what are called the “early migrations” of the seventeenth, eighteenth and early nineteenth centuries, riddled as they were with sectarian enmities and national differences, pretty much blended into the nativist melting pot by the mid-nineteenth century. Today persons of long settled and much mixed northern European Protestant lineage are often notably vague about their antecedents. Their ethnic identity is a matter of no great urgency and has a low saliency to their self-identity.
Some white Protestant immigrants are relatively recent arrivals, yet they have enjoyed a fast-track assimilation, given their linguistic, physical, religious, and temperamental resemblance to the Anglo-American Protestant prototype. British workers who migrated here at about the same time as Italians, Greeks, and Jews were more or less well assimilated within one generation.
The 1967 article I wrote about ethnic assimilation--or rather the absence of assimilation—focused on the Southern and Eastern European groups of the “late migrations” of 1870-1914. I concluded that ethnic identity would persist and would continue to play a role in political life well into the distant future. My article relied on data from the early 1960s but also from the 1940s and 1950s, in other words, as much as sixty or seventy years ago.
With this passage of time, one might say that the “distant future” has arrived, at least for the white ethnics: the Irish, Poles, Italians, Greeks, Portuguese, and others. Today the ethnic identities of these late migration groups have much less saliency. This can be seen most dramatically in the political realm where references to a candidate’s ethnicity have become quite rare unless the individual is African-American, Latino, or Asian.
Recall how John F. Kennedy’s Irish Catholic antecedents were such a touchy issue in the 1960 presidential contest. But by 2004 it no longer mattered that Democratic presidential candidate John Kerry was a Roman Catholic. And in the 2008 election, it went largely unmentioned that vice presidential candidate Joe Biden was Irish Catholic.
In 2006 the media took no notice that Nancy Pelosi was the first Italian-American Speaker of the House; instead attention dwelt almost exclusively on the fact that she was the first woman to occupy that post.
For years Italian-American organizations had called for an Italian-American appointment to the Supreme Court. By 2006 there were two Italian-Americans on the Supreme Court, Antonin Scalia and Samuel Alito, both conservatives. Progressive Italian Americans like myself were not dancing in the streets bursting with pride. If anything, we opposed both nominations. Obviously the politics of such jurists were of more significance to us than their ethnic antecedents.
In fact, as far as I could observe, no one took note that there were two Italians on the High Court except for the several conservative Italian-American organizations that ran full-page ads in 2006 designed to misrepresent those who opposed Alito on political grounds as being opposed to him out of ethnic bias. Thus the ads argued that Alito was being derisively called “Scalito” (true) because of some anti-Italian prejudice (untrue). Actually the conflation of names was invited by their similarity and impelled by the fact that Alito was a right-wing reactionary twin of Scalia’s.
Pre-election opinion polls and exit polls published in the mainstream press reveal what groups are still in the public consciousness and what groups have faded into the background. After the 2008 presidential election, the New York Times devoted an entire page to how various groups in America voted. The Times broke down the electorate by income, gender, education level, region, and religion, but no ethnic groups other than Latinos and African-Americans.
Gone were the old time Harris-poll and Gallup-poll reports on how Italians, Irish, Poles, Germans, Hungarians, Portuguese, Greeks, and the like have voted. The white ethnics of the late migrations seem to have assimilated into the electoral mainstream, at least as distinct voting groups.
Those of us who are Italian-Americans might ask, is assimilation our ultimate fate? And does assimilation mean disappearance as a distinct ethnic entity? Is it our destiny to be melted down by the melting pot, going the way of the earlier Anglo-Protestant groups?
Be aware that social assimilation also leads to a high degree of biological assimilation, in other words, intermarriage and interbreeding with increasing numbers of offspring who are of mixed heritage. This growing tendency toward intermarriage has been a source of concern among some Jewish organizations that posit the question: “Will intermarriage succeed in doing what 2000 years of oppression could not do?” (namely bring about the disappearance of the Jewish people).
For Italian Americans at the present time ethnic awareness still retains a significant saliency even among those who attain high levels of education and professional training--as demonstrated by the rich offering of scholarly papers presented at the yearly meetings of the American Italian Historical Association.
There is no reason to assume that a person’s identity choices are mutually exclusive rather than multifaceted. Multiculturalism can obtain not only in society but within individuals. And individuals of mixed descent can enjoy multiple identities and loyalties.
In addition, as noted earlier, ethnic identity is not only reactive but proactive, not only a defense against derogatory stereotypes, not only a compensatory assurance of group worth, but a positive enjoyment, a celebration of our history and culture in this country and in Italy. It is a way of connecting with others in what too often is a friendless and ruthless market society, a nurturing identity that is larger than the self yet smaller than the nation.
It would do well if we could bring more of a social content to our ethnic identity. The Italian-American Political Solidarity Club has just published a book whose title urges as much: Avanti Popolo: Italian-American Writers Sail Beyond Columbus. The book urges that on Columbus Day instead of celebrating conquest we should acknowledge those who fought for the rights of all immigrants and for social justice.
Indeed, Italian Americans need to bring substance to the symbolic politics that have been fed to us. We do not need another statue to Columbus. Some, such as Diane Di Prima, Tommi Avicolli-Mecca, and Juliet Ucelli, have organized “Dumping Columbus” readings and other events that challenge the iconic image of the Great Navigator and instead commemorate the Native Americans he enslaved and murdered.
Philip Cannistraro and Gerald Meyer (of German Protestant lineage) edited a book, The Lost World of Italian-American Radicalism, that reclaims some of the history of radical Italian-American immigrants, labor leaders, union organizers, antiwar activists, and political protesters, a history long neglected or repressed.
To frame the Italian-American experience within a context of struggle for social justice and economic survival is to give it a dimension that goes beyond nostalgia and sentimentality, and flies in the face of the stereotypes that weigh down upon us Italians. Thus do we not only realize more of ourselves but we connect to more of the world, especially to the class realities that compose so much of life yet remain too often unmentioned and unnoticed.
SOURCE: The Huffington Post (8-26-09)
Two front-page New York Times stories (August 25, 2009) appeared concerning our treatment of alleged terrorists and past practices of torture. First, Attorney General Eric Holder announced the naming of a special prosecutor to investigate the long-ignored, long-suppressed, and now much-redacted Inspector General's report of the Central Intelligence Agency's physical and mental abuses of detainees. (May we say, "torture"?)
The news emphasized the attorney general's agreement with the president's oft-repeated insistence that he was unwilling to investigate and prosecute past misdeeds that occurred under the previous administration. But Holder stated he was compelled to "follow the law" and appoint a prosecutor. Attorneys general are not noted for their courage in acting contrary to the president's wishes. Perhaps Holder is signaling that the president wants us to know that his heart is in the right place.
But more consistent with President Obama's zigs and zags of late, the Times's second story reported that the administration will maintain the Bush-Cheney policy of sending terrorism suspects to third countries for detention and interrogation. The Times quickly added the administration's press-release pledge "to closely monitor their treatment to ensure they are not tortured." Apparently, the FBI will do the monitoring -- which should be interesting.
The qualification of monitoring is at odds with the president's statement in 2007, just after he announced his candidacy for president. We must, he then insisted, "end the practice of shipping away prisoners in the dead of night to be tortured in far-off countries, ... [and] of maintaining a network of secret prisons to jail people beyond the reach of the law." Several months ago, the president announced that the secret network of CIA-run prisons would end, but the news now is that harsh interrogations will continue in foreign-run prisons. A distinction with no difference.
Common sense is in order here: why bother with the time and expense of dispatching suspects to third countries for further interrogation -- if not because we prefer that others dirty their hands, and not our folks? If they are going to be properly questioned, why not in the United States? What will we monitor abroad -- Syrians serving coffee to their prisoners, or Egyptians serving tea? Must we continue to use unemployed Eastern European secret police from the Communist past to facilitate our interrogations?
The administration's obvious contradiction probably will be ignored by reporters covering the White House or the Department of Justice. Will they dare ask such questions in the face of their imagined fears of losing "access"? The president should be challenged. His campaign words on torture are empty, rendered meaningless by his "new" policy on rendition.
Posted on: Thursday, August 27, 2009 - 21:48
SOURCE: OpEdNews.com (8-23-09)
Recent developments in the former Soviet Union have been fascinating for the specialist and might have puzzled the layman: Why have Europe's two largest countries developed in such different ways? Russia has returned to authoritarianism while Ukraine seems to be maturing towards a real democracy. How did this happen – in spite of these nations' similar Eastern Slavic Orthodox cultures and intertwined histories?
Ukraine's Cultural Divisions versus Russia's Marginal Minorities
Valuable answers by social scientists have often focused on the particular circumstances of Ukraine's and Russia's transformations since 1990. The Ukrainian nation's division into two regional and political cultures, recently compared by Ivan Katchanovski with the more tragic split of Moldova, creates numerous problems, but it has also had the effect of supporting pluralism. The stalemate between the historically distinct regions of pro-Western Galicia, Transcarpathia, Volhynia and Bukovina, on the one side, and of pro-Russian Eastern and Southern Ukraine, on the other, has meant that the country's political landscape has become, as Lucan Way put it, “pluralistic by default.” While not per se a democracy-promoting factor, the geographical differentiation of Ukraine's population constituted, as Paul D'Anieri showed, a hindrance to excessive centralization of power.
Post-Soviet Russia, in contrast, is culturally more homogeneous. Often seen as a multi-national country, the Russian Federation is, in fact, dominated by its 80% ethnic Russian majority, among whom there is little cultural-regional differentiation. The remaining 20% are split among small nationalities and diasporas who play an important role in Russia's self-definition as a poly-ethnic state. Yet these minorities do not represent a consolidated political force exerting direct influence on Moscow's foreign and domestic policies comparable to that of the South Eastern or North Western parts of Ukraine. In addition, Russia has suffered from what Michael McFaul called an “unfinished revolution” under Boris Yeltsin. Russia's incomplete transition of the early 1990s, especially the underdevelopment of its new democratic institutions, had always been a liability. Ultimately, it led to Putin's autocratic restoration. Further arguments explaining Russia's unsuccessful reforms have focused on the democracy-subverting role of Yeltsin's Chechnia adventure of 1994, or the dubious political repercussions of Russia's large energy reserves.
While these explanations are valid, they do not fully answer the question of why Ukraine is, so far, the only 1922 USSR founding republic on the way toward a consolidated democracy. Additional explanation can be found in Ukraine's history, or rather in Kyiv's historical mythology.
Ukraine's Pro-Democratic Historical Mythology
First and foremost, Ukraine's centuries-old struggle for political autonomy and independence from foreign dominance – Mongol, Muscovite, Polish, Lithuanian, Ottoman, Austro-Hungarian and Soviet – is the guiding idea of modern Ukrainian historiography. The preservation of sovereignty is supported, to one degree or another, by most relevant decision-makers and intellectuals. As Taras Kuzio has argued, Ukraine can be seen as a post-colonial country where a largely emancipatory, anti-imperial and liberationist nationalism supports rather than counteracts democratic tendencies. In spite of having a relatively low 3-4% barrier in its parliamentary elections (conducted according to proportional representation), the Verkhovna Rada, Ukraine's parliament, has never featured a radically right-wing, ultra-nationalist faction since the country's independence in 1991. This is in stark contrast to most other East European and even many West European countries of the post-Cold War era.
Moreover, identity politics in contemporary Ukraine more often than not refers to pre-Soviet proto-democratic experiences seen as constitutive for the Ukrainian nation and demonstrating its embeddedness in Europe.
Thus, the idea of democratic rule is traced back to the era of Kyiv Rus, from the 9th to the 12th century, considered the Golden Age of Ukrainian pre-national statehood. Kyiv Rus is seen as a state that had grass-roots democracy in local assemblies (vicha), developed the relatively sophisticated legal code of Ruska Pravda, and made tentative attempts to establish an elective monarchy. With the rise of the Cossack Hetmanate in the 16th century, there emerged another Ukrainian proto-state that plays an important role in contemporary national identity. Armed peasants, the Cossacks formed, along the Dnipro riverbank, a military republic with a male assembly – the Rada – that chose its Hetman, a military leader, by election. The Cossacks' love of freedom and semi-democratic rule influences Ukrainian self-images to this day. The Cossack Hetman Pylyp Orlyk drafted, as early as 1710, one of the world's first constitutions, which sought to transform the Hetmanate into an electoral monarchy. Orlyk's basic law never entered into force and is, by contemporary standards, a simplistic text. Yet in its time it was recognized as a serious document, and it served as a blueprint for future constitutional designs.
During the turmoil of the revolution of 1917-1918, there emerged briefly the Ukrainian People's Republic, with an assembly composed of delegates from all relevant political parties. This short-lived yet, by today's standards, already largely democratic experiment was particularly noteworthy for its leaders' commitment to the rule of law.
Even more important than the actual course of history is that these and some other trends inform the historical myths defining national identity today. Thus, Ukrainians see themselves as having a tradition of individualism and love of freedom, and their country as always having been diverse, lacking strong rule, and even ungovernable. Sometimes cited as factors explaining Ukraine's seeming inability to secure national self-determination, these features have, more recently, been supportive of democratic transition. They promoted the moderate and consensus-seeking elite behavior demonstrated in the 2004 Orange Revolution and the 2007 confrontation between parliament and the president.
The Russians' View of Russia's Imperial Legacy
This is different from the historiography and autostereotypes dominant in Russia today. To be sure, Muscovite history too had a number of proto-democratic tendencies. Russia can also lay claim to the heritage of Kyiv Rus and Cossack self-rule. Moreover, in the Middle Ages, the famous city-states of Novgorod and Pskov featured a collective ruling organ representing the nobility (veche), elections of executive power holders, as well as rudimentary checks and balances. Later, the Zemsky Sobor (Assembly of the Lands) elected the first Romanov Tsar, Mikhail. His descendant Alexander II, in 1861, started the so-called Great Reforms. Alexander's comprehensive project included the liberation of the serfs, legal reforms, the introduction of local self-government, and the creation of a legislative organ. Eventually, the transformation started by the “Tsar Liberator” and continued in the 1905-1907 revolution might have led to the emergence of a constitutional monarchy along West European lines. Yet Russia's first attempt to transition to democracy was subverted by the devastating effects of World War I.
While these facts are, of course, well-known in Russia, they play a relatively minor role in Russian national mythology and self-perception. Instead, affirmative assessments of figures like Alexander Nevsky or Peter the Great, and, partly, even of Ivan the Terrible and Josef Stalin dominate popular historical memory today. These men were successful military leaders and often modernizers of sorts. Yet, they also concentrated power and did not tolerate checks on their prerogatives. Even the most pro-Western among Russia's Tsars, the almost universally adored Peter the Great, played an ambivalent role in Russian history: The modernized state that Peter left was also highly centralized, if not proto-totalitarian.
In a way, Putin's meteoric (if oil-price-driven) rise can be explained against this background. Although a lucky rather than a great leader, the current prime minister and de facto ruler of Russia seems to fit the image of a new Peter – an authoritarian yet (seemingly) effective modernizer.
It is less the glorious history of the Russian people and their many geniuses (more often than not harassed by their government) that defines Russians' view of their fatherland's history. Instead, Russia's imperial legacy of military might, territorial expansion and victories in wars is what, many Russians feel, makes them unique. Most Ukrainians see the Dnipro Cossack republic not as a militaristic order (which, in some ways, it was), but as a stronghold of freedom. Kyiv's elite welcomed the break-up of the Soviet Union as liberation. In contrast, the Russians are currently rediscovering Muscovy's Byzantine legacy of caesaropapism. Their deep frustration over the loss of their empire and super-power status in 1989-91 has led some observers to speak of a “Weimar syndrome” and to compare post-Soviet Russia with pre-fascist Germany.
History is obviously not everything. As post-fascist Germany's development shows, countries can change rapidly. However, as long as Russia and other post-Soviet republics keep a national mythology that pays little attention to the proto-democratic beginnings in their history, they will remain trapped in their authoritarian traditions. Ukraine provides an example of how a country can break with an unusable past and create a pluralistic polity drawing on appropriate (if sometimes idealized) precedents in its national history.
Posted on: Thursday, August 27, 2009 - 11:43
SOURCE: Informed Comment (8-26-09)
Of course, the Likud government of Israel is all about preying on the Palestinians' land and resources and is die-hard opposed to a two-state solution. Israel is strangling the Palestinian economy.
The Israelis are restricting the Palestinians' water supply, essentially using the Palestinians' own water at four times the rate that the Palestinians themselves do.
Dozens of Palestinians in East Jerusalem have been pushed out of their property by Israeli squatters and are now forced to sleep in the streets.
Israeli illegal immigrants into the Palestinian West Bank routinely act like thugs, beating up on Palestinians and stealing from them.
Israel has 11,000 Palestinians behind bars, and has repeatedly blocked family visits to prisoners, which the Red Cross has called a violation of the Geneva Conventions and international law more generally.
The Israeli military justified the attack on an unarmed American peace protester as a 'justifiable act of war.' (He is in a perhaps permanent coma).
Much of the US press, as usual, is ignoring the belligerent statements of Likudniks in the Israeli government and misrepresenting the Palestinians, whose statelessness (and consequent lack of human and legal rights) is imposed on them by a brutal Israeli military occupation and/or perpetual siege and blockade.
If there is going to be a two-state solution, as Obama insists and toward which the Fatah government in the West Bank is now moving quickly, it will depend on level-headed Israelis who recognize that the occupation of the Palestinians is actually a threat to Israel. I'm not optimistic that the rumored turn to a harder line against Iran by Obama in return for Likud acquiescence in a Palestinian state will actually work. A gasoline boycott on Iran won't be effective, and the Likud will likely drag its feet so that in the end it gets everything its leaders want-- no real Palestinian state, continued subjection and exploitation of the Palestinians, plus bad US-Iran relations and sanctions on Iran.
Now may be a time for Avi Shaked, the multi-billionaire internet entrepreneur, to make another offer.
On April 4, 2009, Haaretz revealed that Avi Shaked had been an extremely influential figure in the Geneva Accord Track II negotiations between liberal Israelis and the Palestinians.
In 2006, Shaked made headlines by offering then Prime Minister Ehud Olmert and Palestinian Authority President Mahmoud Abbas a billion dollars (h/t to this poker site for preserving this news item) if only they would just make peace already. Given how corrupt both governments are, it is amazing that no one took Shaked up on his offer.
I'm being a little tongue in cheek, of course. But influential and prominent pro-peace Israelis like Shaked do exist, and even the right of center Kadima Party has accepted the need for two states. I keep hoping that Netanyahu's government will fall and that those Israelis who want to do the right thing, for themselves and for the Palestinians, will get in in time for Obama to finally settle this dispute, which has poisoned the Middle East against the United States and generates enormous violence and tragedy on both sides.
Posted on: Wednesday, August 26, 2009 - 21:35
SOURCE: The Weekly Standard (8-21-09)
Comparisons between our current efforts in Afghanistan and the Soviet intervention that led to the collapse of the USSR are natural and can be helpful, but only with great care. Below are a number of key points to keep in mind when thinking about the Soviet operations, especially when considering the size of the U.S. or international military footprint.
The war did not begin in 1979 when the Soviets invaded. It started in 1978, following the Saur Revolution, in which Nur M. Taraki seized power from Mohammad Daoud. Taraki declared the establishment of the Democratic Republic of Afghanistan and set about bringing real socialism to the country.
Soviet advisors recommended that Taraki proceed slowly with social and economic reforms. They recognized that the socialist party (People's Democratic Party of Afghanistan or PDPA) had the support of a tiny minority. They feared that Taraki's plans for aggressive "modernization" would generate an awful backlash. They were right.
The PDPA instituted a number of critical reforms after its seizure of power, including eliminating the "brideprice" that a bride's family received from the groom's family and redistributing land on a large scale. These reforms struck at the heart of Afghan society by destroying key pillars in the social structure. Land redistribution upended rural tribal relations, and the elimination of "brideprice" payments destroyed an important traditional method for bonding families following a marriage. It also struck at the role of women in Afghan society, a broader theme the PDPA pushed that alienated wide sections of a very conservative country. In general, there is no faster way to antagonize a population than by attacking property rights and the status of women. Taraki did both.
By early 1979, the Afghan countryside was in revolt against the PDPA. Forces that would become the mujahideen were already mobilizing across the country to fight against the Taraki government even before the Soviets became involved. Afghan army units in Herat mutinied in March 1979, briefly seizing the city on behalf of Ismail Khan.
Factionalism within the PDPA weakened the government, leading in September 1979 to the assassination of Taraki and his replacement by the brutal and incompetent Hafizullah Amin. The insurgency continued to grow. Insurgents attacked government and military convoys on the roads, and interdicted movement from Kabul to the north along the Salang road. In October, the U.S. Embassy in Kabul cabled: "When government troops and their armor do occasionally venture forth out of their defensive positions to show the flag, the rebels repossess the real estate after they have passed, like the waters of the Red Sea closing in behind Moses and his followers."
By December 1979 the Soviets had reluctantly decided that the PDPA government would either fall or throw in its lot with the United States if they did not intervene decisively. Their intervention took the form of a brilliantly-executed regime take-down at the end of the year during which they killed Amin and installed Babrak Karmal as his successor. They intended to stay briefly and then hand responsibility to Karmal and the Afghan military...
... In sum, neither insurgency nor violence in Afghanistan results primarily from opposition to external forces. It results instead mainly from internal problems related to the collapse of Afghan society and governance following the Saur Revolution of 1978. The presence of foreign forces and external support to insurgents has raised or lowered the level of violence and its effectiveness, but it has not been the cause of that violence in the last three decades. Nor is the footprint of foreign forces at issue.
The Soviet invasion followed the collapse of security in a period when the USSR maintained only a few thousand advisors. The first months of the Soviet "occupation" saw deliberate and systematic attempts by the Red Army to put the Afghans out in front and support them from fixed bases. The Limited Contingent was drawn into direct combat operations only when that strategy had clearly failed.
The Limited Contingent maintained relatively little force among the rural population in Afghanistan at any time--most of its efforts were focused on securing the lines of communication and the major cities. Most Afghans encountered the Soviets only through the Limited Contingent's deliberate terrorist campaign, waged both from the air and from the ground.
For all of these reasons, there is absolutely no basis for assessing that an increased ISAF/US military presence along the lines being considered will result in some kind of "tipping point" at which local Afghans turn against us because they see us as a Soviet-style occupation force.
Posted on: Wednesday, August 26, 2009 - 20:57
SOURCE: Daily Mail (UK) (8-27-09)
Sometimes it is right to speak ill of the dead. The truth matters, even when it is deeply unsavoury. The truth about Ted Kennedy is certainly unsavoury.
Not that you'd know it from yesterday's tributes, dominated by sycophantic humbug.
'A great and good man,' said a fawning Tony Blair. 'A true and constant friend of the peace process in Northern Ireland,' said Northern Ireland Secretary Shaun Woodward.
Gordon Brown was 'proud to have counted him as a friend and proud that the United Kingdom recognised his service earlier this year with the award of an honorary knighthood'.
Proud? He should be ashamed. Kennedy was a formidable and Machiavellian political operator in the U.S., but he was no friend of Britain. In fact, he was one of our most committed and unrelenting enemies on Capitol Hill.
In his anti-British sentiments, he took after his father, Joseph P. Kennedy, who was unable to hide his bigoted views during a shameful spell as U.S. ambassador to Great Britain.
Ted did his father proud. As a politician dependent on Irish-American votes, this master of empty rhetoric had no scruples about spreading the bitter message of Irish republicanism, especially if there was an election at stake.
Indeed, his pro-republican record was unblemished, though he was never in favour of violence. When Northern Ireland descended into violence, it was Kennedy who, in 1971, gave aid and comfort to the IRA by comparing British attempts to prevent civil war with the U.S. invasion of Vietnam.
He, like the IRA, supported the republican Troops Out movement, and demanded that Ulster Protestants opposed to a united Ireland should 'go back to Britain'.
He also blamed the 1981 hunger strikes on the 'insensitivity' of the Thatcher government rather than cynical republican leaders sacrificing prisoners for electoral advantage.
Later in life, as he came under the influence of the Irish government, he began to moderate his stance on Northern Ireland. But he remained a vociferous critic of the British government and the Royal Ulster Constabulary.
It was never enough that he occupied a safe Senate seat, 'inherited' from his father, who had bought it with a fortune made from bootlegging.
He believed he was entitled to the U.S. presidency, too. And had he not caused the death of Mary Jo Kopechne at Chappaquiddick in 1969, he might well have been able to ride his brothers' reputations all the way to the White House.
While it cost him the presidency, it failed to dampen his obscene sense of entitlement...
Posted on: Wednesday, August 26, 2009 - 19:55
SOURCE: New Republic (8-26-09)
Over the past 40 years, Edward Moore Kennedy was the grand statesman of the Democratic liberalism that emerged out of the 1960s. He was loyal to his principles even when those principles fell completely out of fashion. He overcame personal flaws and searing travails to become a masterful legislator, congressional infighter, and builder of unlikely coalitions. Ironically, he achieved all of this only after he had surmounted the political entitlement that made his career possible in the first place. In an even deeper irony, of the three famous and powerful Kennedy brothers, it was the youngest, Teddy the latecomer, who arguably had the greatest impact on American politics and government. These travails and these ironies gave Kennedy's life dimensions of American triumph as well as of prideful, classical tragedy.
When Kennedy first ran for the Senate in 1962, at the age of thirty, for the seat that JFK vacated when he was elected president, the rap against him was that if his name were Edward Moore his candidacy would be a political joke. It was true, of course: Although he was hardly a reluctant candidate, he was very much the dutiful, slightly wayward, much younger brother in what some viewed warily as a family that was busily acquiring a new firm, otherwise known as the federal government of the United States. An assistant district attorney for Suffolk County three years out of law school, Kennedy had little more going for him than his surname, and everybody knew as much--but that name, and the presumptions about the politics and connections that went with it, were magic in Massachusetts. Kennedy turned out to be an effective campaigner on the stump, and his supporters even took some knowing delight in upbraiding his high-minded critics. At one tense meeting in Cambridge, the eminent law professor Mark DeWolfe Howe waxed apoplectic about the young Kennedy's manifest lack of qualifications. "Relax, Mark," replied Kennedy supporter Arthur M. Schlesinger, Jr. "Ted's a candidate for the United States Senate, not the faculty of Harvard Law School." Kennedy won his primary and handily defeated his Republican opponent, the scion of another Massachusetts political dynasty, George Cabot Lodge II. Never again, through eight re-election campaigns, would Kennedy win less than 55 percent of the vote.
Amid catastrophe and near-catastrophe, Kennedy continually summoned the will to transcend himself, even as he struggled with self-doubt about living up to his brothers' examples. He had been in the Senate barely a year when the world collapsed in Dallas; and then the world kept on collapsing for the rest of the 1960s, beginning with his own near death (and permanent injury) in a plane crash only seven months after President Kennedy's assassination. Surviving the crash, though, awakened a new sense of purpose and seriousness in the young senator. He consulted with members of JFK's brain trust about how he might most usefully spend his long convalescence in reading and writing projects (much as his brother had during his own medical ordeals). He began taking an active interest in the inadequacies of the nation's health care system, an issue he would make one of his own. But another pattern also developed: As Kennedy matured into a leading force on Capitol Hill, he repeatedly drifted and dashed against the rocks.
For many years, he did not understand how the incident at Chappaquiddick in July 1969 foreclosed the possibility that he would ever succeed JFK to the presidency or fulfill the promise of RFK's presidential campaign in 1968. In part, this was because he would never be able to explain his actions and inactions of that night adequately (except, perhaps, to the forgiving voters of Massachusetts). But the events also marked the beginning of what would become a convergence of celebrity scandal mongering and cynical prurience that forever changed the rules of American political journalism--and from which Kennedy, with his personal demons, would not escape for decades.
The disgrace of Chappaquiddick helped cost Kennedy his position as Senate Majority Whip in 1971; and though he seriously toyed with the idea, he also turned aside his admirers' stubborn hopes that he would challenge President Richard Nixon in the 1972 election. Yet Kennedy picked himself up once again, and by concentrating on his Senate committee work and focusing on some key issues, he accomplished a great deal in the '70s. As chair of the Senate subcommittee on health care, he helped lead the fight for what became the National Cancer Act in 1971, and pushed hard, albeit fruitlessly, for more ambitious health care reform. He also became a leading voice for campaign finance reform, for the overdue deregulation of the trucking and airlines industries, for negotiating a settlement to the sectarian fighting in Northern Ireland, and for aiding escapees from political and military turmoil around the globe.
Kennedy's decision to challenge the incumbent Democratic president, Jimmy Carter, in 1980 was the worst political error he ever made. He was incensed at what he considered Carter's betrayal: specifically, of health care reform, and, more generally, of liberal principles. The effort was unmistakably a Kennedy restoration campaign, of a kind that some Democrats had been awaiting for more than a decade. Yet at first, Kennedy sounded strangely ineffectual, unable, when caught off-guard by a television interviewer, to articulate a coherent rationale for his candidacy. He also carried the weight of a collapsing marriage, as well as of the public's lingering outrage about Chappaquiddick. After taking his campaign all the way to the Democratic convention, Kennedy made a strong impression with a concession speech that stirred liberal idealists--but he then spoiled the political moment by snubbing Carter. The public slight widened a split in the party that contributed both to Ronald Reagan's victory in November, and to the Democrats' loss of the majority in the Senate for the first time in more than a quarter of a century...
Posted on: Wednesday, August 26, 2009 - 19:03
SOURCE: WSJ (8-25-09)
What is at stake in the debate over health care is more than the mere crafting of policy. The issue is now the identity of the Democratic Party.
By now we know that Democrats can bail out traditional Republican constituencies like Wall Street, but it remains to be seen whether they can enact a convincing version of their own signature issue, health-care reform.
At this point, it's fair to ask whether Democrats remember why health care is their issue in the first place. As health-care debates always have done, this one has pushed to the fore all the big questions about the rightful role of government, and too many Democrats have sought to avoid them with mushy appeals to consensus and bipartisanship. The war is on, and if Democrats want to win, they need to start fighting.
In the early years of the campaign for national health insurance, the battle lines were more clearly drawn. Back in the '40s, the issue was part of an "economic bill of rights," a grand Rooseveltian idea pushed by President Harry S. Truman.
Truman had a knack for populist phrasing. "In 1932 we were attacking the citadel of special privilege and greed," he declared in accepting the Democratic presidential nomination in 1948. "We were fighting to drive the money changers from the temple. Today, in 1948, we are now the defenders of the stronghold of democracy and of equal opportunity, the haven of the ordinary people of this land and not of the favored classes or the powerful few."
The Democrats won that particular battle with "the powerful few" but, fighting among themselves as usual, failed to enact national health insurance. Health-care reform nonetheless remained their great cause, their high-voltage appeal to average voters, even those who otherwise saw them as a Harvard-and-Hollywood elite. And even during feeble reform campaigns like President Bill Clinton's 1993 attempt, the opposite half of the populist melodrama—in that case, the insurance industry—duly acted out its corporate bad-guy role.
This year things were supposed to be different. Democrats hold good-sized majorities in both houses of Congress and are led by an eloquent president who won an undeniable mandate last November. This time, the Democrats got the traditional opponents of health-care reform on board: "Ex-Foes of Health-Care Reform Emerge as Supporters" declared a headline in the Washington Post in March, over a story describing a friendly summit meeting between Mr. Obama and various health-care industry representatives.
This time the health-care fight was to be what official Washington loves: an act of cold consensus, not of hot idealism or Trumanesque populism...
Posted on: Wednesday, August 26, 2009 - 17:39
SOURCE: CNN (8-26-09)
In one of the funniest scenes in the film "The Big Lebowski," the hot-headed Vietnam veteran Walter Sobchak, played by John Goodman, explains to the Dude, played by Jeff Bridges, how much he hates nihilists because they don't believe in anything, they have no "ethos."
Unfortunately, many Americans share these sentiments about our politicians. Too often, the political system seems biased toward elected officials who only care about re-election. Politicians are eager to please interest groups who contribute to their campaign funds and activist organizations who will deliver the vote.
Americans suspect that a majority of politicians are willing to switch their position on any given day, depending on which way the political winds are blowing. Everyone, we sometimes fear, is a flip-flopper.
This was certainly not the case with Sen. Edward "Ted" Kennedy. He was a refreshing presence in Washington for many Americans, even those on the right who hated the political ideas that he championed. Love him or hate him, as Walter Sobchak might say, at least Kennedy stood for something.
Since his election to the Senate in 1962, Kennedy was a steadfast champion of modern liberalism. Throughout the 1960s, Kennedy pushed for civil rights legislation and immigration liberalization. In the aftermath of Watergate in 1974, Kennedy was one of the most vocal supporters of the campaign finance legislation that introduced public funds and contribution limits into the process.
When Democrats moved to the center, Kennedy was always there to remind them of what they were leaving behind. In the 1980 Democratic primaries, he challenged President Carter on the grounds that he had abandoned liberals on issues such as health care.
In a stirring speech at the 1980 Democratic National Convention, Kennedy told delegates, "The commitment I seek is not to outworn views but to old values that will never wear out. Programs may sometimes become obsolete, but the ideal of fairness always endures. Circumstances may change, but the work of compassion must continue.
"It is surely correct that we cannot solve problems by throwing money at them, but it is also correct that we dare not throw out our national problems onto a scrap heap of inattention and indifference. The poor may be out of political fashion, but they are not without human needs. The middle class may be angry, but they have not lost the dream that all Americans can advance together." The crowd gave him a standing ovation. Madison Square Garden was filled with chants of "Ted-dy!"
Nor did Kennedy become timid when Ronald Reagan was in the White House. He was the voice of liberalism in the age of conservatism. The senator remained at the forefront of the Democratic opposition to the administration even when liberalism was no longer cool.
When Bill Clinton and a group of Democrats formed the Democratic Leadership Council to push the party toward the center, Kennedy refused to budge. He was a key ally of the nuclear freeze movement, which placed immense pressure on the White House to enter into arms negotiations with the Soviets and to cool down the vitriolic rhetoric that President Reagan used in his first years.
When Reagan nominated Robert Bork to serve on the U.S. Supreme Court, Kennedy spearheaded the opposition on ideological grounds. He rejected Bork's conservative philosophy, warning that it posed dangers to the country....
Posted on: Wednesday, August 26, 2009 - 13:36