Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: Pajamas Media (10-19-10)
[Ronald Radosh is an Adjunct Senior Fellow at The Hudson Institute, and a Prof. Emeritus of History at the City University of New York's Queensborough Community College. He is a Presidential appointee to the Public Interest Declassification Board, for a term extending from 2007 to 2010. He is the author or co-author of 14 books, including The Rosenberg File (1983 and 1997) and Commies: A Journey through the Old Left, the New Left and the Leftover Left (2001).]
Having spent a good deal of time writing about the crude left-wing history of our country by charlatans like Howard Zinn and Oliver Stone, I have become wary of politicized history in general, whether it comes from the precincts of the far Left or the far Right.
This time the culprits are on the Right, one of the biggest examples being Glenn Beck. On this website, some time ago, I wrote about Beck’s failure to understand Martin Luther King, Jr. A Senior Editor of Reason, my friend Michael Moynihan, wrote about Beck’s history and insightfully pointed out that a “tiny bit of knowledge…combined with an enormous Fox News constituency and an unflappable trust in one’s own wisdom, is a dangerous thing. Beck doesn’t demonstrate the perils of autodidacticism, but the perils of learning the subject while at the same time attempting to teach it.”
Now, from the precincts of the left, come two important critiques of both Beck’s and the Tea Party’s historical narrative. The first is a new book from Jill Lepore, a Harvard historian of America’s colonial and revolutionary period. Her book, The Whites of Their Eyes: The Tea Party’s Revolution and the Battle over American History, should be required reading....
Posted on: Tuesday, October 19, 2010 - 16:16
SOURCE: Yale Daily News (10-18-10)
At the turn of the 18th century, Yale was founded to stop a Harvard-based “social network” from diverting its holy Puritan mission toward one emphasizing worldly “works” and wealth in a society connected, but flattened, by commerce.
The world isn’t flat, Yale’s founders insisted. It has abysses, and students need a faith that can plumb them: one that can defy worldly power in the name of a Higher one. Harvard was losing that faith and turning society into a slippery swamp of contracts and deals. Yale, sanctimonious and inward-turning, produced Jonathan Edwards 1720, Nathan Hale 1773 and other dissenters, up through Yale chaplain William Sloane Coffin Jr. ’49, Doonesbury creator Garry Trudeau ’70, Howard Dean ’71 and, maybe, you....
The drive for fame and power also troubled a News editorialist in 1955, during the Cold War, as he pondered another Yalie’s advice on the Opinion page. “The man who clocks his business mind out with his time card at night should not enter the sales end of the brokerage business,” his classmate argued. “You have to eat, drink, play and perhaps even more, with your customers without seeming commercial about it.”
But now, things are different. Yale teaches that the world is flat, thanks to globalized engines of wealth creation, driven by rational investors and consumers and guided by grand strategists. “One thing the Cold War did accomplish was to vindicate democracy and capitalism,” wrote professor John Lewis Gaddis in 1999. “These institutions are now sufficiently deeply rooted that we can view the future with confidence. The only people who doubt this reality lack the power to do anything about it.”...
But Yale’s economic-determinist confidence in materialism would horrify our founders, Adam Smith and even Marx, whose materialism has indeed invaded Wall Street and “The Social Network.” The real “social network” is collapsing along with millions of American homes and jobs amid road rage; lethal store-opening rampages; extreme or “cage” fighting; TV shows that gloat over others’ humiliation; rising crime in New Haven; and rising Christine O’Donnells and Linda McMahons, who bypass Americans’ brains and hearts on the way to our lower viscera, wallets and post-republican despair.
Yale’s founders and the 1955 News editorialist are warning those who can hear them that this can’t last and that, when an emperor has no clothes, we need enough faith to say so and to stop giving him false drapery. Fortunately, Yale has a long tradition of Truth-telling from which to draw.
Posted on: Monday, October 18, 2010 - 21:25
SOURCE: TomDispatch (10-17-10)
[Tom Engelhardt, co-founder of the American Empire Project, runs the Nation Institute's TomDispatch.com. His latest book, The American Way of War: How Bush’s Wars Became Obama’s (Haymarket Books), has recently been published. You can catch him discussing war American-style and his book in a Timothy MacBain TomCast video by clicking here. This was originally a talk given to students attending Hofstra University's lecture series, The International Scene.]
When you look at me, you can’t mistake the fact that I’m of a certain age. But just for a moment, think of me as nine years old. You could even say that I celebrated my ninth birthday last week, without cake, candles, presents, or certainly joy.
I’ve had two mobilized moments in my life. The first was in the Vietnam War years; the second, the one that leaves me as a nine-year-old, began on the morning of September 11, 2001. I turned on the TV while doing my morning exercises, saw a smoking hole in a World Trade Center tower, and thought that, as in 1945 when a B-25 slammed into the Empire State Building, a terrible accident had happened.
Later, after the drums of war had begun to beat, after the first headlines had screamed their World-War-II-style messages (“the Pearl Harbor of the 21st century”), I had another thought. And for a reasonably politically sophisticated guy, my second response was not only as off-base as the first, but also remarkably dumb. I thought that this horrific event taking place in my hometown might open Americans up to the pain of the world. No such luck, of course.
If you had told me then that we would henceforth be in a state of eternal war as well as living in a permanent war state, that, to face a ragtag enemy of a few thousand stateless terrorists, the national security establishment in Washington would pump itself up to levels not faintly reached when facing the Soviet Union, a major power with thousands of nuclear weapons and an enormous military, that “homeland” -- a distinctly un-American word -- would land in our vocabulary never to leave, and that a second Defense Department dubbed the Department of Homeland Security would be set up not to be dismantled in my lifetime, that torture (excuse me, “enhanced interrogation techniques”) would become as American as apple pie and that some of those “techniques” would actually be demonstrated to leading Bush administration officials inside the White House, that we would pour money into the Pentagon at ever escalating levels even after the economy crashed in 2008, that we would be fighting two potentially trillion-dollar-plus wars without end in two distant lands, that we would spend untold billions constructing hundreds of military bases in those same lands, that the CIA would be conducting the first drone air war in history over a country we were officially not at war with, that most of us would live in a remarkable state of detachment from all of this, and finally -- only, by the way, because I’m cutting this list arbitrarily short -- that I would spend my time writing incessantly about “the American way of war” and produce a book with that title, I would have thought you were nuts.
But every bit of that happened, even if unpredicted by me because, like human beings everywhere, I have no special knack for peering into the future. If it were otherwise, I would undoubtedly now be zipping through fabulous spired cities with a jetpack on my back (as I was assured would happen in my distant youth). But if prediction isn’t our forte, then adaptability to changing circumstances may be -- and it certainly helps account for my being here today.
I’m here because, in response to the bizarre spectacle of this nation going to war while living at peace, even if in a spasmodic state of collective national fear, I did something I hardly understood at the time. I launched a nameless listserv of collected articles and my own expanding commentary that ran against the common wisdom of that October moment when the bombing runs for our second Afghan war began. A little more than a year later, thanks to the Nation Institute, it became a website with the name TomDispatch.com, and because our leaders swore we were “a nation at war,” because we were indeed killing people in quantity in distant lands, because the power of the state at home was being strengthened in startling ways, while everything still open about our society seemed to be getting screwed shut, and the military was being pumped up to Schwarzeneggerian dimensions, I started writing about war.
At some level, I can’t tell you how ridiculous that was. After all, I’m the most civilian and peaceable of guys. I’ve never even been in the military. I was, however, upset with the Bush administration, the connect-no-dots media coverage of that moment, and the repeated 9/11 rites which proclaimed us the planet’s greatest victim, survivor, and dominator, leaving only one role, greatest Evil Doer, open for the rest of the planet (and you know who auditioned for, and won, that part hands down)!
Things That Go Boom in the Night
I won’t say, however, that I had no expertise whatsoever with a permanent state of war and a permanent war state, only that the expertise I had was available to anyone who had lived through the post-World War II era. I was reminded of this on a recent glorious Sunday when, from the foot of Manhattan, I set out, for the first time in more than half a century, on a brief ferry ride that proved, for me, as effective a time machine as anything H.G. Wells had ever imagined. That ferry was not, of course, taking me to a future civilization at the edge of time, but to Governor’s Island, now a park and National Monument in the eddying waters of New York harbor and to the rubble of a gas station my father, a World War II vet, ran there in the early 1950s when that island was still a major U.S. Army base.
On many mornings in those years, I accompanied him on that short ride across the East River and found myself amid buzzing jeeps and drilling soldiers in a world of Army kids with, among other wonders, access to giant swimming pools and kiddy-matinee Westerns. For a dyed-in-the-wool city boy like me, it was my only real exposure to the burbs, and it proved an edenic one that also caught something of the exotically militarized mood of that Korean War moment.
As on that island, so for most Americans then, the worlds of the warrior and of abundance were no more antithetical than they were to the corporate executives, university research scientists, and military officers who were using a rising military budget and the fear of communism to create a new national security economy. An alliance between big industry, big science, and the military had been forged during World War II that blurred the boundaries between the military and the civilian by fusing together a double set of desires: for technological breakthroughs leading to ever more efficient weapons of destruction and to ever easier living. The arms race -- the race, that is, for future good wars -- and the race for the good life were then, as on that island, being put on the same “war” footing.
In the 1950s, a military Keynesianism was already driving the U.S. economy toward a consumerism in which desire for the ever larger car and missile, electric range and tank, television console and submarine was wedded in single corporate entities. The companies -- General Electric, General Motors, and Westinghouse, among others -- producing the large objects for the American home were also major contractors developing the big ticket weapons systems ushering the Pentagon into its own age of abundance.
More than half a century later, the Pentagon is still living a life of abundance -- despite one less-than-victorious, less-than-good war after another -- while we, increasingly, are not. In the years in-between, the developing national security state of my childhood just kept growing, and in the process the country militarized in the strangest of ways.
Only once in that period did a sense of actual war seem to hover over the nation. That was, of course, in the Vietnam years of the 1960s and early 1970s, when the draft brought a dirty war up close and personal, driving it into American homes and out into the streets, when a kind of intermittent warfare seemed to break out in this country’s cities and ghettos, and when impending defeat drove the military itself to the edge of revolt and collapse.
From the 1970s until 2001, as that military rebuilt itself as an all-volunteer force and finally went back to war in distant lands, the military itself seemed to disappear from everyday life. There were no soldiers in sight, nothing we would consider commonplace today -- from uniforms and guns in train stations to military flyovers at football games, or the repeated rites of praise for American troops that are now everyday fare in our world where, otherwise, we largely ignore American wars.
In 1989, for instance, I wrote in the Progressive magazine about a country that seemed to me to be undergoing further militarization, even if in a particularly strange way. Ours was, I said, an “America that conforms to no notions we hold of militarism… Militarization is, of course, commonly associated with uniformed, usually exalted troops in evidence and a dictatorship, possibly military, in power. The United States, by such standards, still has the look of a civilian society. Our military is, if anything, less visible in our lives than it was a decade ago: No uniforms in the streets, seldom even for our traditional parades; a civilian elected government; weaponry out of sight… the draft and the idea of a civilian army a thing of the past.
“In the Reagan-Bush era, the military has gone undercover in the world that we see, though not in the world that sees us. For if it is absent from our everyday culture, its influence is omnipresent in corporate America, that world beyond our politics and out of our control -- the world which, nonetheless, plans our high-tech future of work and consumption. There, the militarization of the economy and the corporatization of the military is a process so far gone that it seems reasonable to ask whether the United States can even be said to have a civilian economy.”
Of course, that was then, this is now. Little did I know. Today, it seems, our country is triumphant in producing only things that go boom in the night: we have a near monopoly on the global weapons market and on the global movie market, where in the dark we’re experts in explosions of every sort. When I wrote in 1989 that the process was “so far gone,” I had no idea how far we still had to go. I had no idea, for instance, how far a single administration could push us when it came to war. Still, one thing that does remain reasonably constant about America’s now perpetual state of war is how little we -- the 99% of us who don’t belong to the military or fight -- actually see of it, even though it is, in a sense, all around us.
From a remarkable array of possibilities, here are just a few warscapes -- think of them as like landscapes, only deadlier -- that might help make more visible an American world of, and way of, war that we normally spend little time discussing, questioning, debating, or doing anything about.
As a start, let me try to conjure up a map of what “defense,” as imagined by the Pentagon and the U.S. military, actually looks like. You can find such a map at Wikipedia, but for a second just imagine a world map laid flat before you. Now divide it, the whole globe, like so many ill-shaped pieces of cobbler, into six servings -- you can be as messy as you want, it’s not an exact science -- and label them the U.S. European Command or EUCOM (for Europe and Russia), the U.S. Pacific Command or PACOM (Asia), CENTCOM (the Greater Middle East and a touch of North Africa), NORTHCOM (North America), SOUTHCOM (South America and most of the Caribbean), and AFRICOM (almost all of Africa). Those are the “areas of responsibility” of six U.S. military commands.
In case you hadn’t noticed, on our map that takes care of just about every inch of the planet, but -- I hasten to add -- not every bit of imaginable space. For that, if you were a clever cartographer, you would somehow need to include STRATCOM, the U.S. Strategic Command charged with, among other things, ensuring that we dominate the heavens, and the newest of all the “geographic” commands, CYBERCOM, expected to be fully operational later this fall with “1,000 elite military hackers and spies under one four-star general” prepared to engage in preemptive war in cyberspace.
Some of these commands have crept up on us over the years. CENTCOM, which now oversees our wars in Afghanistan and Iraq, was formed in 1983, a result of the Carter Doctrine -- that is, of President Jimmy Carter’s decision to make the protection of Persian Gulf oil a military necessity, while both NORTHCOM (2002) and AFRICOM (2007) were creations of the Global War on Terror.
From a mapping perspective, however, the salient point is simple enough: at the moment, there is no imaginable space on or off the planet that is not an “area of responsibility” for the U.S. military. That, not the protection of our shores and borders, is what is now meant by that word “defense” in the Department of Defense. And if you were to stare at that map for a while, I can’t help but think it would come to strike you as abidingly strange. No place at all of no military interest to us? What does that say about our country -- and ourselves?
In case you’re imagining that the map I’ve just described is simply a case of cartographic hyperbole, consider this: we now have what is, in essence, a secret military inside the U.S. military. I’m talking about our Special Operations forces. These elite and largely covert forces were rapidly expanded in the Bush years as part of the Global War on Terror, but also thanks to Secretary of Defense Donald Rumsfeld’s urge to bring covert activities that were once the province of the CIA under the Pentagon’s wing. By the end of George W. Bush’s second term in office -- think of that map again -- Special Operations forces were fighting in, training in, or stationed in approximately 60 countries under the aegis of the Global War on Terror. Less than two years later, according to the Washington Post, 13,000 Special Operations troops are deployed abroad in approximately 75 countries as part of an expanding Global War on Terror (even if the Obama administration has ditched that name); in other words, Special Ops troops alone are now operating in close to 40% of the 192 countries that make up the United Nations!
And talking about what the Pentagon has taken under its wing, I’m reminded of a low-budget sci-fi film of my childhood, The Blob. In it, a gelatinous alien grows ever more humongous by eating every living thing in its path, with the exception of Steve McQueen in his debut screen role. By analogy, take what’s officially called the “IC” or U.S. Intelligence Community, that Rumsfeld was so eager to militarize. It’s made up of 17 major agencies and outfits, including the Office of the Director of National Intelligence (ODNI). Created in 2004 in response to the intelligence dysfunction of 9/11, ODNI is already its own small bureaucracy with 1,500 employees and next to no power to do the only thing it was really ever meant to do, coordinate the generally dysfunctional labyrinth of the IC itself.
You might wonder what kind of “intelligence” a country could possibly get from 17 competing, bickering outfits -- and that’s not even the half of it. According to a Washington Post series, Top Secret America, by Dana Priest and William Arkin:
“In all, at least 263 organizations have been created or reorganized as a response to 9/11… Some 1,271 government organizations and 1,931 private companies work on programs related to counterterrorism, homeland security and intelligence in about 10,000 locations across the United States… In Washington and the surrounding area, 33 building complexes for top-secret intelligence work are under construction or have been built since September 2001. Together they occupy the equivalent of almost three Pentagons or 22 U.S. Capitol buildings -- about 17 million square feet of space.”
Oh, and keep in mind that more than two-thirds of the IC’s intelligence programs are controlled by the Pentagon, which also means control over a major chunk of the combined intelligence budget, announced at $75 billion (“2 1/2 times the size it was on Sept. 10, 2001,” according to Priest and Arkin), but undoubtedly far larger.
And when it comes to the Pentagon, that’s just a start. Massive expansion in all directions has been its m.o. since 9/11. Its soaring budget hit about $700 billion for fiscal year 2010 (when you include a war-fighting supplemental bill of $33 billion) -- an increase of only 4.7% in otherwise budget-slashing times -- and is now projected to hit $726 billion in fiscal year 2011. Some experts claim, however, that the real figure may come closer to the trillion-dollar mark when all aspects of national security are factored in. Not surprisingly, it has taken over a spectrum of State Department-controlled civilian activities, ranging from humanitarian relief and development (aka “nation-building”) to actual diplomacy. And don’t forget its growing roles as a domestic-disaster manager and a global arms dealer, or even as a Green Revolution energy innovator. You could certainly think of the Pentagon as the Blob on the American horizon, and yet, looking around, you might hardly be aware of the ways your country continues to be militarized.
With that in mind, let’s consider another warscape, one particularly appropriate to a moment when numerous commentators are pointing out that the U.S. seems to be morphing from a can-do into a can’t-do nation, when the headlines are filled with exploding gas lines and grim reports on the country’s aging infrastructure, when a major commuter tunnel from New Jersey to Manhattan, the sort of project that once would have been tattoo-ably American, has just been canceled by New Jersey’s governor.
Still, don’t imagine that the old can-do American spirit I remember from my childhood is dead. Quite the contrary, we still have our great building projects, our pyramid- and ziggurat-equivalents. It’s just that these days they tend to get built nearer to the ruins of actual ziggurats and pyramids. I’m talking about our military bases, especially those being constructed in our war zones.
I mean, no sooner had U.S. troops taken Baghdad in April 2003 than the Pentagon and the crony corporations it now can’t go to war without began to pour billions of taxpayer dollars into the construction of well fortified American towns in Iraq that included multiple bus routes, PXes, fast-food joints, massage parlors, Internet cafés, power plants, water-treatment plants, sewage plants, fire stations, you name it. Hundreds of military bases, micro to mega, were built in Iraq alone, including the ill-named but ginormous Victory Base Complex at the edge of Baghdad International Airport, with at least nine significant sub-bases nestled inside it, and Balad Air Base, which -- sooner than you could say “Saddam Hussein’s in captivity” -- was handling air traffic on the scale of O'Hare International in Chicago, and bedding down 40,000 inhabitants including hire-a-gun African cops, civilian defense employees, Special Ops forces, the employees of private contractors, and of course tons of troops.
And all of this was nothing compared to the feat the Pentagon accomplished in Afghanistan where the U.S. military now claims to have built something like 400 bases of every sort from the smallest combat outposts to monster installations like Bagram Air Base in a country without normal resources, fuel, building materials, or much of anything else. Just about all construction materials for those bases and the fuel to go with them had to be delivered over treacherous supply lines thousands of miles long, so treacherous and difficult in fact that, by the time a gallon of fuel reaches Afghanistan to keep those Humvees and MRAPs rolling along, it’s estimated to cost $400.
At some level, of course, all of this represents a remarkable can-do achievement and tells you a great deal about American priorities today, about where our national treasure and can-do efforts are focused.
Ziggurats or Tunnels?
And I could go on. The Pentagon and the military make going on easy. After all, the list is unending, the militarization of our American world ongoing, and it’s all happening in your time, on your watch. This is the world you are going to walk out into. I may be nine years old in TomDispatch terms, but I’ve been around for 66 years and this won’t be my world for that much longer.
So let me ask you: Are you sure that you want the U.S. military to be concerned with every inch of the planet? Are you sure that you want your tax dollars to go, above all, into building pyramid-equivalents in Iraq or Afghanistan instead of tunnels at home, or into fighting a multigenerational war on terror planet-wide, instead of into putting the unemployed to work here? If you can’t imagine reducing the American military mission and “footprint” on this planet significantly, then, of course, it’s probably best to ignore this talk. But rest assured: you won’t save our country that way, you’ll destroy it.
A decade ago, when I was born as TomDispatch.com, many of you were only ten or eleven years old, as were many of our soldiers now in Afghanistan and Iraq. A decade from now, if the war in Afghanistan (and increasingly Pakistan) is still being fought, most of you will be entering your fourth decade on this planet and you may even have a 10-year-old of your own. A decade from then, if -- as some top Washington officials insist -- the global war on terror is “multigenerational,” that child may be fighting in Pakistan or Yemen or Somalia or some other military “area of responsibility” somewhere on the planet. A decade from then…
Of course, whatever skills we may lack when it comes to predicting the future, all things must end, including the American war state and our strange state of war. The question is: Can our over-armed global mission be radically downsized before it downsizes us? It will happen anyway and it won’t take forever either, not the way things are going, but it will happen in an easier and less harmful way, if you’re involved, in whatever fashion you choose, in making it so. Had I had a birthday cake with candles on it for that ninth birthday of mine and blown them out, that, I think, would have been my wish.
Posted on: Monday, October 18, 2010 - 17:18
SOURCE: Dissident Voice (10-18-10)
Madeleine Albright is infamous for her reply to the question posed by 60 Minutes’ Lesley Stahl about the sanctions against Iraq in May 1996.
“We have heard that a half million children have died,” stated Stahl. “I mean, that’s more children than died in Hiroshima. And, you know, is the price worth it?”
“I think this is a very hard choice,” replied Albright, “but the price–we think the price is worth it.”...
Albright in her memoirs expresses regret for her “it was worth it” statement in the 1996 interview. And she told Newsweek in 2006, “I’m afraid that Iraq is going to turn out to be the greatest disaster in American foreign policy—worse than Vietnam.” But she bears partial responsibility for the December 1998 bombing of Iraq (“Operation Desert Fox”), a prelude to the 2003 invasion. She helped produce the disaster....
Throughout the last decade the neoconservatives have been the leading warmongers. But they have no monopoly on imperialist arrogance, contempt for truth and indifference to human life. Madeleine Albright is proof of that.
Posted on: Monday, October 18, 2010 - 15:16
SOURCE: Counterpunch (10-15-10)
The recent victories for the Tea Party brigade have brought an enduring and fascinating ideological battle in America back into focus.
Ever since the web of American ideas of freedom was spun – the Declaration of Independence, the Constitution, the Bill of Rights, the Mayflower Compact and the concept of the American Dream itself – there has been a contest for the inherent meaning of the freedoms and rights that engender so much ferocious pride.
The current surge by the Tea Party is another chapter in this battle for the very idea of what America stands for. The 'American dream' can be claimed by the political right as a championing of individual rights and personal liberty, an 'anyone can make it if you work and play by the rules', 'pull yourself up by the bootstraps' mentality. Its mantra is small government and low taxes.
One of the many reasons that the Tea Party has been provoked into such a frenzy is the fact that President Obama appears to them to represent the very essence of the alternative, more leftist, meaning inherent within the American Dream, and so the ideological struggle for American identity has become acutely polarised. The other meaning is a much more collective interpretation of what the country represents, that it's the land of the free, a melting pot, a country that lives the creed of all men being born equal (and treated equally, no matter whether they are, say, non-white or Muslim), and sees federal government as a way of helping to realise equality. Within this version of a more open, tolerant America, Tea Partiers are becoming as concerned with race, ethnicity and the disappearance of a white majority, often perceiving the implications of this as a green light for the Islamification of the country, as they are about economic issues....
Posted on: Monday, October 18, 2010 - 15:14
SOURCE: CNN.com (10-18-10)
President Obama has recently blasted the influx of money from undisclosed donors flowing into the midterm campaigns. He repeated a claim, which major media outlets have not been able to substantiate, that foreign funds may have been used in the United States.
At a recent rally in Philadelphia, Pennsylvania, the president said "American people deserve to know who is trying to sway their elections."
"You don't know: It could be the oil industry. It could even be foreign-owned corporations. You don't know because they don't have to disclose."
In making these attacks Obama is returning to a central theme that animated his 2008 campaign: the need to change the campaign finance system. As a candidate, Obama railed against the way that money influenced politics. He reiterated a long-standing theme of reform candidates: that unless the political process changed, policies would remain the same and Americans would never gain confidence in their government.
But Obama broke from these principles almost as soon as he made the argument. During the campaign, Obama disappointed many campaign reform advocates when he announced that he would not use public funds in the general election campaign so that he could raise an unlimited amount of money in his race against Sen. John McCain....
Posted on: Monday, October 18, 2010 - 15:12
SOURCE: CHE (10-17-10)
When critics decry the "corporatization" of the university, they are referring to a number of trends that have transformed the character of higher education in recent years. Outsourcing a variety of nonacademic jobs that were once performed by university employees to large, external companies; forging entangling relationships with the corporate world; encouraging the growing presence of corporations to run bookstores, food services, etc. on campus; providing salaries for college officials that mirror those of corporate executives; imposing work speedups that exploit adjunct instructors and other low-paid employees; downsizing staffs; busting trade unions (including those designed to protect graduate students)—all those developments point to the overt application of marketplace logic to the practice of higher learning.
The fact that the culture of corporations has steadily leached into the veins of academe should come as no surprise. The university has always been, in part, a business. At the very least, colleges and universities in the West have long played a critical role in rationalizing and legitimizing the expropriation of the world's human and mineral wealth for the private profit of the elites who sponsor such institutions in the first place. The Western university has always relied upon bureaucratic management to fulfill its bourgeois aims. It has always produced more reactionaries than revolutionaries. Of course, the university has also generated some of the most trenchant critics of the social order.
What has happened in recent years is the dissolution of the innate tension between the university as the bastion of reaction and the cradle of dissidence. In place of that tension has arisen an institution that has enthusiastically dedicated itself to the orthodoxy of neoliberalism—the idea that the essential task of the state and of all social authorities is to smooth the path of transnational corporations....
Posted on: Monday, October 18, 2010 - 11:27
SOURCE: CS Monitor (10-15-10)
My brother recently e-mailed me to express his disappointment with President Obama, whom he and 67 million other hopeful Americans voted for in 2008. Two years later, he isn't the only one disillusioned. "I thought we were getting FDR," he wrote. "We did," I responded. "Just not the one we thought we were getting."
The 67 million of us who voted for Obama two years ago did so for a variety of reasons. Some cast their vote because he is a black man, some because of his eloquence, some because he opposed the Iraq War, some because his policies benefitted the poor and middle classes, and some simply because he seemed the antithesis of George W. Bush.
Within weeks of Mr. Obama's election, Time magazine had imposed his image onto one of Franklin D. Roosevelt sitting in the back of his car, iconic cigarette holder jutting out from his grinning lips. The magazine's headline declared the arrival of "The New, New Deal," referring primarily to the economic crisis that Obama inherited and was now responsible for ending.
A product of nostalgia, not reality
The FDR that Time alluded to is the one that most of us know – the charming man who repaired the US economy, conquered the fascists, defended the rights of minorities, and had the support of just about everyone in the United States. The problem is, that FDR is the product of nostalgia. In reality (as is often the case with reality), things were a whole lot more complicated.
In fact, FDR's actual record raises criticisms very much akin to the posthype gripes about Obama....
Posted on: Sunday, October 17, 2010 - 13:28
SOURCE: Dissent (11-1-10)
On June 21, residents of Fremont, a small meatpacking town just outside Omaha, Nebraska, voted by 57 percent to deny work and shelter to undocumented immigrants. Why Fremont, Nebraska, and why now? Some observers, not knowing the Fremont measure was cooked up by the same coalition that passed Arizona’s law—Kansas City lawyer Kris Kobach, for example, was involved in both measures—are calling it a homegrown, heartland, good ole Nebraskan approach to solving the immigration problem. The fact is that numerous dynamics have combined to make immigration particularly explosive in Fremont: ambitious politicians across Nebraska and nationwide; widespread economic turmoil combined with fast-paced globalization; and neoliberal policies that limit governments’ abilities, both in Mexico and the United States, to respond to these widespread transformations. Tying all of it together is the global journey of one transformative commodity: corn. Following Nebraska corn as it travels across the United States, to foreign countries like Mexico and back to meatpacking plants in Nebraska, illuminates the forces that made immigration a hot-button issue in Fremont.
Starting with corn comes naturally to me. I grew up surrounded by it, on our family farm about thirty miles southwest of Fremont. Back in 1891, my German great-grandparents acquired the farm, buying the land from the man who had homesteaded it. We occasionally find arrowheads and flint lying around in our fields, left by the Pawnee men and women who called the place home long before the Homestead Act. When I was growing up, my immigrant grandparents could still be heard speaking German—especially if they didn’t want us kids to understand what they were saying.
Like farms across Nebraska these days, ours grows mostly corn. Corn is the undisputed king—not since the 1930s has the crop so dominated agriculture in the state. Its popularity is partly due to demand for ethanol, but also due to the fabulous market conditions that exist for U.S. corn around the world. Early-twentieth-century developments in the hybridization of corn, more recent genetic modifications (85 percent of U.S. corn seed is now genetically modified), and the use of fertilizer products made with petrochemicals have radically increased the productivity of corn farms over the last fifty years. In 1932, Nebraska produced 250 million bushels of corn; by 2009 that figure had risen to 1.5 billion bushels, while the amount of acreage devoted to corn production dipped slightly. Meanwhile, massive government subsidies allow farmers to sell their corn for much less than it costs to produce it. Our farm receives more than $10,000 in direct government subsidies, plus another $15,000 or so for conservation techniques such as planting grass buffers or using GPS technology for maximal efficiency when we spray herbicide across our 450 acres. This $25,000 means that some years as much as one-third of our profit comes from the federal government.
Posted on: Friday, October 15, 2010 - 11:36
SOURCE: Huffington Post (10-13-10)
Gutenberg started this craze for ink-and-print. Before then, we had woodcuts and copyists, laboriously transcribing the prophets, the words of the Almighty, the words about the Almighty, and almighty words, mostly onto parchment... After Gutenberg, though, sales of books about non-religious subjects began to proliferate.
Finally, five hundred years after Gutenberg, in the wake of World War II, everyone went to college, either on the GI Bill or on his or her parents' wallet, and a vast new market for non-fiction books mushroomed out of what had been paper-marshland. It was like agribusiness -- with a proliferation of publishers' imprints. Soon, every self-respecting graduate (whether of college or the school of hard knocks) wanted to "be published." It was Everyman's fifteen hours of fame calling to us, long before Andy Warhol reduced it to minutes for the pictorati.
New technology aided and abetted this publishing trend -- simplifying typesetting, mass binding, and printing itself. The advent of paperbacks allowed publishers to appeal to two classes of consumers: libraries and the "general public" -- with richer customers still buying the library hardbacks for their private libraries. Despite radio, film, and television, the dream of "being published" lived on in our culture -- indeed, the rewards of being published were, in hindsight, remarkable. Where very few writers in history had ever been able to live off their royalties as authors, suddenly there was a whole class of post-WWII "professional writers" who often supplemented their pay with journalism and teaching, but relied primarily on the fruits of their book-labors -- fictional and non-fictional.
Money was both a lure and a problem, though, for the publishers -- and just as whole industries saw the rise of mergers, takeovers and "multi-nationals" that came to dominate what had once been multi-owned, geographically dispersed, independent firms, so mainstream publishing gradually devolved onto a few international parent companies, who amassed subsidiary imprints and benefitted from the supposed economies of scale. Even bookselling succumbed to the trend -- Barnes & Noble, Borders and other chains crowding out the independents.
For authors who were successfully "marketed" by such conglomerates (such as the Bertelsmann, Hachette, News Corporation, and Pearson groups) the pickings were wonderful while they lasted. Rather like the housing market -- and Wall Street!
But then came the dark cloud: a cloud that has seen the publishing industry go into sudden crisis mode -- unseen, unreported and unknown to the general public.
Have you heard of the Frankfurt Book Fair, the world's largest annual publishing get-together fest? Heard that it took place last week? Heard it reported on television, radio, or in our newspapers? Heard anything about it?
No? Small wonder. The Frankfurt Book Fair, this year, has gone unreported in America, even though the bleak future that was discussed there behind closed doors (and in publishers' booths) represents probably the biggest change in writing and publishing since Johannes Gutenberg, a goldsmith, experienced what he called "a ray of light," and started up his first-ever movable-type press in 1440 in Strasbourg, then in Mainz.
Five hundred and seventy years later, Frankfurt-am-Main, on October 5, 2010, found publishers from across the globe "in buoyant mood," according to London's Guardian newspaper last week -- having banished 2009's "mood of austerity," and busy toasting Alfred Knopf's advance of $2.5 million to second-time Indian novelist Kiran Desai. (Heard of her?) This offer was based on a four-page proposal given out by Andrew Wylie, the New York literary agent. Brilliant!
But wait a minute! Wasn't it Andrew Wylie who, in July, said he was going to sell digital rights to his authors' books direct to Amazon, without bothering with publishers -- like Knopf -- at all?
Mmm. When in doubt, read the small print. On the web, that is. Track down, if you will, the London Times' brave reporter, Helen Rumbelow, who wrote a piece called "Dead Or Alive" last week from Frankfurt itself. She said the people "in charge of the world's books" had gathered for a great junket -- and had encountered instead "a bloodbath"! "Nearly a quarter of a million people will arrive today at the glowering conference hall in Germany with an unprecedented mixture of fear and excitement," she wrote. "The reason is digital." She quoted one agent talking of "an industry in total flux and chaos," another saying: "it's like wrestling in fast-setting concrete." And one former best-selling chair of the Society of Authors opining: "I hope to God I'm being apocalyptic, but I'm deeply worried for the writers of the future."
Summarizing, Helen noted: "The role of agents, publishers and retailers is up for grabs." Print goes down the sink, digital takes over among young people -- but without them being willing to pay the sums people did in the old days for a hardback book, and without the same number of books actually being read. Ergo: little or nothing left (at 10 per cent) for the poor author!
One literary agent declared the secret mood of Frankfurt as "vague hysteria" -- with no-one having any idea what to do, or how to do it, "in a market changing so quickly."
Well, authors: welcome to the same world that recording musicians have known for some time! It's not the end of music, or even the end of people listening to music. It's the end of being paid to make it! None of our author-societies -- or newspapers here in America -- is willing to tell the plain truth, but it is staring us in the face, and it's called ruin by any other name!
An editor from a university press gave a confidential talk to my biographers group, here in Boston, a couple of weeks ago; she said her university press was reduced to printing only 300 copies of a new hardback.
Three hundred copies? Anyone wanting to live on (let alone fund research on) the 10 per cent royalties from 300 copies, please raise your hand!
Understand why I've been in a funk since visiting with publishers and agents in London last spring, as I explained in an earlier blog (Born Again Biographer)?
And my epiphany, as the garage door squeaked and protested, but finally opened over my dusty, four-year-old hybrid (a trusty Ford Escape that I and my dog, Harvey, love)?
The printed book is dead -- at least, the book as we have known it since Gutenberg. It's going digital not only among the young, but even the ancient. And nothing can halt that -- any more than medieval copyists and aficionados of hand-made parchment Bibles could in Gutenberg's day. E-reading is a'comin'. The party's over -- and authors will have to adapt.
Part of that adaptation involves re-thinking the roles of agents and publishers. Did authors in Gutenberg's day employ such middlemen? No -- the printer acted as publisher/bookseller, securing advance subscriptions from interested customers. So what is to stop the modern -- or postmodern -- author from getting Amazon to print, market and distribute his or her work, from the manuscript e-text? Even audio-book it, if the author is willing to read it onto a digital tape? Why bother with an agent? Why bother with a publisher?
The editor who spoke to us the other week predicted the wholesale collapse of publishers, as an industry, in America in the next 24 to 48 months. Out of the ashes, yes, there will be niche areas of paper publishing -- as in educational textbooks, perhaps -- where specialist knowledge of a defined market will prevent the complete collapse of an imprint. But in terms of general printed books, fiction and non-fiction? Their future is science fiction, metaphorically as well as literally. It's a brave new world in which the author is guaranteed: Nothing! You will be your own agent/publisher -- responsible for your own digital editing, your own digital typesetting/formatting, your choice of e-jacket, your "book's" advance publicity, and its marketing -- the latter in co-operation with Amazon. Or Google, once they go into distribution.
My epiphany -- such a big word for such a simple realization -- is that my days of plenty are over. The years of the locust lie ahead. Out of the back of my beloved Ford hybrid I shall, grey-haired as I am, be in the future encouraging people to buy and download my digitized work, or hauling boxes of instant-printed books to sell at readings/talks I shall give around the country, into my dotage, on my chosen topics: the American presidency, military history, German literature, biography...
"Hang on to your day jobs!" one of our members remarked, at our biographers meeting. (Each of us is devoted to resurrecting a chosen life in biographical form, and we gain comfort from sharing our common concerns with fellow biographers.)
You know what: she's right. Writing's a great and wonderful craft, like painting or pottery, or playing jazz in front of aficionados. A few, like Kiran Desai, may actually make it big in the looming digi-world, especially if film rights attach; but for the rest of us, it's going to be a matter of returning to our roots as authors rather than as commodities -- and that may not be a bad thing. If we know at whom we're aiming our work, we can surely match 300 copies -- even exceed that modest total. We won't get rich -- but we'll be published. And proud.
Posted on: Friday, October 15, 2010 - 10:34
SOURCE: Truthdig (10-13-10)
Mercifully, the midterm election cycle is nearing its end. Both parties, we learn, are planning their “postmortem assessments.” The Daily Beast’s recent headline is a sign of the times: “Why Obama Can’t Lose in 2012.” Plan ahead....
In the 1934 midterm elections, two years after the launching of Franklin D. Roosevelt’s New Deal, the president and Democrats vigorously defended their programs. No, they had not solved the Depression—not by a long shot—but nevertheless they fought hard to retain their authority.
FDR burst on the scene with his nomination acceptance speech in 1932, boldly announcing “a new deal for America.” After his election he brought new ideas and new faces to Washington. After serving in three previous administrations, Andrew Mellon, the self-advertised “greatest secretary of the treasury since Alexander Hamilton,” was gone. FDR appointed no Summers, Geithner or Bernanke to continue the failed policies of the past.
FDR said in his inaugural address, “Our primary task is to put people to work.” Along the way, he offered a cast of villains he believed responsible for the economic disaster, and he never let his audiences forget. Americans had clear, constant reminders of Herbert Hoover, the “money-changers in the temple” and “economic royalists.” He knew the perps, accomplices and accessories that “caused” the Great Depression. Such attacks today would be almost unthinkable—unless one gave up campaign contributions....
Democrats in 1934 routed the Republicans, increasing their margin in the House from 313 to 322 and their Senate majority from 60 to 69 (of 96 members), with the GOP losing 10 seats, including that of Robert La Follette of Wisconsin, who shifted to a “Progressive” label. The new Democratic ranks included a young Sen. Harry Truman. Today’s Republican scare machine has stirred the passions about Obamacare and the Democratic “socialist” program. But when Republican candidates are caught advocating the privatization of Social Security, they hastily retreat, promising to “save” Social Security. And then we have the well-financed tea party folk howling that government must keep hands off their Medicare. Social Security and Medicare are great historical achievements. Democrats dutifully defend them, so why now shy from activist, interventionist programs? Rush Limbaugh, Mitch McConnell, et al., doth make cowards of them all....
Posted on: Thursday, October 14, 2010 - 20:01
SOURCE: Yale Global (10-13-10)
IRVINE: Since the late 1980s, China's leaders have embraced globalization in a bid to remake the nation, and they have recently sought to leverage its growing economic power in a grand re-branding exercise. China, the exercise tried to show, was no longer Mao's backward revolutionary country, but a modern superpower. Unfortunately for China, the same interconnected world that enabled its economic surge has sometimes stymied the nation's public relations efforts.
The re-branding drive has overlapping, but somewhat different domestic and international ambitions. President Hu Jintao and his comrades strive to convince China’s citizens that they can simultaneously raise living standards, maintain stability and garner international respect. The emphasis abroad, meanwhile, has been on convincing residents of foreign countries, including people of Chinese ancestry, that the People’s Republic of China 2.0, though still run by a Communist Party, has been utterly transformed.
The biggest successes of this re-branding drive have depended on Beijing’s ability to ride the tide and take advantage of distinctively global aspects of the current era. Without far-flung supply chains and fast-flowing foreign investment – including that of ethnic Chinese in Taiwan and other locales – China could not have surged past Japan to become the world’s second largest economy. And without satellite television and the internet, the visually stunning opening ceremony of the Beijing Olympic Games and strong showing by Chinese athletes could not have had the dramatic impact they did in 2008, helping dispel at last the lingering visions of China as a technologically backward “sick man” of Asia....
Posted on: Thursday, October 14, 2010 - 19:27
SOURCE: Guardian (UK) (10-13-10)
Norway's Nobel peace prize committee has done the right thing in awarding this year's prize to Liu Xiaobo. The furious reaction of the Chinese state shows just how complicated doing the right thing will become as we advance into an increasingly post-western world.
Liu Xiaobo is exactly the kind of person who deserves this prize, alongside Andrei Sakharov, Aung San Suu Kyi and Nelson Mandela. For more than 20 years, he has consistently advocated nonviolent change in China, always in the direction of more respect for human rights, the rule of law and democracy. He has paid for this peaceful advocacy with years of imprisonment and harassment. Unlike last year's winner, Barack Obama, who got the prize just for what he had promised to do, Liu gets it for what he has actually done.
The Chinese government tried hard to prevent him getting it. They directly threatened the Nobel committee with negative consequences for Chinese-Norwegian relations. They have since described the award as an "obscenity", forbidden any mention of it in the censored Chinese media, placed Liu's wife under house arrest, detained other critical intellectuals, cancelled talks about Norwegian fishery exports to China – and are now doubtless debating, at the highest level, how to play it from here. Will they, for instance, allow his wife, the photographer Liu Xia, to travel to Oslo to receive the prize on behalf of her imprisoned husband?
Meanwhile, in the capitals of the west, many are quietly questioning whether this really was such a good decision. These questions are important and need to be addressed, but one hypocritical or self-deceiving argument must be demolished at once. This is the claim that it will not be good even for the dissidents if a leading dissident receives the Nobel prize. One used to hear a similar case made by western politicians who, for example, declined to meet Sakharov, Lech Walesa or Václav Havel. Commenting on an American elder statesman's visit to Moscow, one Russian writer told me: "He says it would not be good for Sakharov if they met, but what he really means is that it would not be good for him if he met Sakharov."
It is for the dissidents to decide what is good for the dissidents...
Posted on: Thursday, October 14, 2010 - 19:25
SOURCE: Politico (10-14-10)
According to POLITICO’s John Harris, just as former White House chief of staff Rahm Emanuel has departed from Washington, many of his congressional recruits from the class of 2006 — when he chaired the Democratic Congressional Campaign Committee — may soon be gone as well.
Emanuel’s strategy of recruiting more Democrats from conservative “red” districts swam against the tide of history. For almost four decades, Democrats and Republicans had been sorting themselves out ideologically, so that there were fewer moderates in either caucus....
This was a big contrast from the state of the nation’s political parties throughout much of the 20th century.
Until the 1970s, Democrats were sharply divided between Southern and Northern wings — with members from Dixie being much more conservative on matters related to race relations and unionization. The GOP had been divided between Northeastern liberals and Midwestern conservatives.
In the 1950s and 1960s, according to political scientist Sarah Binder, approximately 30 percent of House and Senate members were identified as centrists....
Posted on: Thursday, October 14, 2010 - 15:11
SOURCE: National Review (10-14-10)
We will learn in November just how angry the public is about a lot of things, from higher taxes to massive unemployment.
But the popular uproar over those issues pales in comparison with the sense of humiliation over the fact that we Americans are quite broke. In 2008, the public was furious at George W. Bush, not because he was too much of a right-wing tightwad, but because he ran up a series of what were then thought to be gargantuan deficits. The result was that under a supposedly conservative administration, and despite six years of an allegedly small-government Republican Congress, the national debt nearly doubled, from $3.3 trillion to $6.3 trillion, in just eight years....
...[T]here is a growing sense of despair that even vastly increased income taxes cannot cover the colossal shortfalls. At least the high Clinton tax rates of the 1990s balanced the budget. But should we bring them back, we would still run a deficit of more than $1 trillion in 2011 — given the vast increases in federal spending.
That bleak reality creates hopelessness — and anger — among voters, who feel they are being taken for fools by their elected officials. Americans oppose tax hikes not because they don’t wish to pay down the debt, but because they suspect the increased revenue will simply be a green light for even greater deficit spending....
We are humiliated by what we owe. If we cannot pay it back, we at least want political payback.
It’s that simple this year.
Posted on: Thursday, October 14, 2010 - 13:15
SOURCE: National Review (10-14-10)
It is certainly time that the West considered systematically whether it has irreconcilable differences with Islam. The belligerence of many Islamic spokesmen and the unassimilable quality of many Muslim immigrants in the West, as well as the spectacular terrorist provocations of extreme Islamic groups, make this a very legitimate question. But it is not so easy to answer. Some passages of the Koran, and some of Muhammad’s more purposeful remarks, certainly incite the inference that mortal conflict is inevitable, an impression heightened by the neurotic obsession of a great many Muslims with the red herring of Israel. It is hard for Westerners to know what to make of Islam. It speaks through an infinite number of clerical and secular leaders, and in a range of vocabularies from fraternal to genocidally hostile....
To many Westerners, there is an ingrained Muslim caricature of the swarthy peasant raising sinew-lean arms to the heavens, having been commanded to do so by a voice from a minaret loudspeaker; the serried ranks of men pressing their foreheads to the floor and elevating their posteriors in a gesture that is, in our culture, unserious; shady, long-unsuccessful nationalities; and recent, and not overly dynamic, colonies. Many Western Muslim populations are sinister and fractious, and their spokesmen are often unbecomingly hostile to the host nations. Their conditions are inferior, but so are their standards of civic participation....
Non-Muslim countries and regions should make it clear that we are not prepared to be condescended to as infidels, that the Judeo-Christian traditions of the West antedate those of Islam (we are all Abrahamists and Gabriel called on our preceptors first), and that the widespread mistreatment of Christian minorities in some Muslim countries should produce proportionate retaliation, but not at the expense of the civil rights of our own Muslim minorities. The Muslim massacre of a million Christian blacks in the Sudan should have received a much more energetic and righteous response than it has. And the mad idea of a large mosque almost adjacent to the World Trade Center site should never have gained any traction at all. That debate makes our entire society look like idiots, with Michael Bloomberg, Maureen Dowd, Katie Couric, et al. all thoughtfully holding hands as proverbial "useful idiots." The less house-trained Islamists who now frolic in and degrade the United Nations and some of its agencies and commissions should be sent packing. Militant Islam should be recognized as an antagonist, and moderate Muslims should be courted, much more systematically than they have been; Indonesia should be treated as a major power in the world, despite having a (very talented) president who rejoices in the name of Bambang. The debate should not be between ourselves about how to deal with Muslims, it should be between Muslims about the unwisdom of provoking us all.
Posted on: Thursday, October 14, 2010 - 13:13
SOURCE: HistoryNet (10-13-10)
In the words of that great screenwriter Cameron Crowe, I'm "almost famous."
Last week I was contacted by Joshua Green, Senior Editor at Atlantic Monthly. Seems there is a candidate running for Congress in northwestern Ohio who has been part of a Waffen-SS re-enactor group. Their aim, like that of re-enactors everywhere, was to "live history," in this case the history of the 5th SS Panzer Division, a multinational mechanized formation nicknamed "Wiking." Green wanted to know my thoughts about the Division and those who would re-enact it. I said some negative things, and I stick by them...
I'd like to remind my re-enactor friends, though, to beware of the company they keep. I don't personally have the re-enactor gene, but I have, over the years, been a member of another misunderstood community that has had to endure its share of mockery. I am a wargamer: board wargames, that is, the ones with hexagon maps and all those little cardboard counters. I own somewhere between 100 and a bazillion, from all the classic companies–Avalon Hill, SPI, GDW–as well as their numerous modern successors.
While I had a ball, especially back in graduate school when I actually had time to set up and play a monster game like Drang nach Osten, I can tell you one thing about those days. There was a fringe element in the hobby that worshiped the Wehrmacht, the Waffen-SS, and, I sometimes suspected, Hitler himself. Ask anyone who was wargaming back in the '70s and '80s, and I'm sure they'll confirm what I'm saying. The number of wargames back in the day that seemed to be channeling the Wehrmacht on their box covers–usually with a cover image of a German army or SS officer in a heroic pose–was a topic discussed constantly in the wargaming press....
Posted on: Thursday, October 14, 2010 - 12:28
SOURCE: Foreign Policy Journal (10-14-10)
The Camp David Accords signed between Egypt and Israel in 1978 have endured despite a long period of extreme volatility in the region. In 1993, the Oslo Mutual Recognition Pact between Israel and the PLO took a further step toward overall peace between Israel and its Arab neighbors. Over the past three decades, the United States Government has assumed the role of guarantor and active promoter of these agreements, visualized as the foundations of peace. In order to assure the smoothness of the process, it designed and managed a multi-faceted program intended to build trust and friendship between peoples on opposite sides of the conflict. The contention of this article is that certain aspects of that program were flawed in their fundamental design and that their effects ran contrary to the goal of bringing the two sides closer together. In fact, we argue that U.S. efforts to bolster peace through programs of economic development called "regional cooperation" actually promoted many of the inequalities and hostilities they were trying to mitigate....
The Price of Peace
When Israel and Egypt signed the Camp David Peace Accords on March 26, 1979, it was expected that this landmark treaty would lead to an unprecedented "normalization" of relations, the fruits of which would include cultural exchange, trade, exchange of ambassadors, and transfer of technology. A brief overview of the architecture of U.S. assistance shows that the American peace assurance plan came in three parts. First, using some perverse logic, it provided billions of dollars of military assistance to both Israel and Egypt, which, by 2007, amounted to more than 102 billion dollars. Second, two hours after the signing of the peace treaty on March 26, 1979, the United States signed a separate "memorandum of agreement" with Israel that included a pledge that it "will take such remedial measures as it deems appropriate, which may include diplomatic, economic and military measures" in Israel's defense. And finally, it promised non-military assistance to Israel that by 2007 amounted to more than fifty-three billion dollars, while it committed to Egypt delivery of the largest economic development package in history. By 2007, U.S. non-military aid to Egypt totaled close to forty billion dollars. Each year since 1979, Israel has deposited a check into its national treasury, using a portion to purchase U.S. Treasury notes. For the Egyptians, much of their economic development package was used to purchase Western science and technology in the form of American training, technical assistance and American-made goods and services.
A very small part of this massive aid, hardly noticeable among all of the hundreds of American projects being launched by the United States Agency for International Development (USAID) in Egypt, was a tiny program funded annually at a level of five to seven million dollars. Its purpose was to encourage cooperation between high-level Israeli scientists working at scores of well-funded Israeli research institutions, on the one hand, and a handful of top Egyptian research scientists struggling with very limited resources on the other. This program, known as MERC (Middle East Regional Cooperation), is the subject of our inquiry.
The assumption made by the technocrats who crafted the peace package was that the Israelis possessed a technological and scientific research capacity that their Arab neighbors were lacking. It was believed that the sharing of this knowledge would be an important contribution to the “normalization” process. Moreover, the cooperation at the scientific level would be accompanied by friendly exchanges that would open closed borders and allow Israelis and Egyptians to travel back and forth with ease. In retrospect, we see major flaws in this modest, good-faith, rapport-building initiative that led to unanticipated and disappointing consequences. The American effort to expose a small cadre of Arab academic elites to better-endowed Israeli scientific institutions was fated to achieve little in the way of trust and cooperation. Instead it accentuated a perception of technological inequality, threatened to alienate Arab academics from their own societies, and increased regional anxieties regarding Western cultural and economic hegemony....
Posted on: Thursday, October 14, 2010 - 11:52
SOURCE: Daily Caller (10-13-10)
From 1935 until 1947, it was legal for closed shops to exist. If you wanted a job in a unionized factory, you had to join the union. Congress then passed the Taft-Hartley Act, restricting the power of union political action committees and allowing states to pass right-to-work laws. Taft-Hartley has been the law governing labor relations ever since.
Labor unions have been trying to repeal Taft-Hartley since 1947, but they have been unable to do so as a coalition of Southern Democrats and Republicans blocked repeal. Sherman’s new legislation can be seen as a continuation of that cat-and-mouse game in Congress....
Unions blame right-to-work laws for their plight. But union jobs have declined increasingly because the companies where unions were dominant — the Big Three auto makers, for instance — could not remain competitive under the old economic model. High wages and costly pension and health benefits hurt the ability of companies governed by the closed shop to compete. Steve Miller, chairman of Delphi Corporation (a General Motors spinoff) when it was going through bankruptcy, said the company simply couldn’t compete with its $65-per-hour “all-in” labor cost (pay and benefits for current and retired employees).
Posted on: Thursday, October 14, 2010 - 11:47
SOURCE: NYT (10-14-10)
NEARLY 63 years after the United Nations recognized the right of the Jewish people to independence in their homeland — and more than 62 years since Israel’s creation — the Palestinians are still denying the Jewish nature of the state. “Israel can name itself whatever it wants,” said the Palestinian Authority president, Mahmoud Abbas, while, according to the newspaper Haaretz, his chief negotiator, Saeb Erekat, said that the Palestinian Authority will never recognize Israel as the Jewish state. Back in 1948, opposition to the legitimacy of a Jewish state ignited a war. Today it threatens peace.
Mr. Abbas and Mr. Erekat were responding to the call by the Israeli prime minister, Benjamin Netanyahu, for the Palestinians to recognize Israel as the nation-state of the Jewish people, enabling his government to consider extending the moratorium on West Bank construction....
The core of the Israeli-Palestinian conflict has been the refusal to recognize Jews as a people, indigenous to the region and endowed with the right to self-government. Criticism of Israeli policies often serves to obscure this fact, and peace continues to elude us. By urging the Palestinians to recognize us as their permanent and legitimate neighbors, Prime Minister Netanyahu is pointing the way out of the current impasse: he is identifying the only path to co-existence.
Posted on: Thursday, October 14, 2010 - 09:07