MythicAmerica explores the mythic dimension of American political culture, past, present, and future. The blogger, Ira Chernus, is Professor of Religious Studies at the University of Colorado at Boulder and author of Apocalypse Management: Eisenhower and the Discourse of National Insecurity.
I’m old enough to remember when Social Security was the “third rail” of American politics -- too dangerous for even the most conservative politician to touch. You’re probably old enough to remember that, too. It wasn’t very long ago. As recently as the 2012 Republican primaries, Mitt Romney defended Social Security against attacks from other candidates (notably Rick Perry), and Romney emerged the GOP standard-bearer.
How things have changed in just a year. It’s not merely that a Democratic president is offering, very publicly, to cut Social Security benefits. There’s something much more important: In the mainstream of American political conversation, this revelation was not treated as very big news.
Oh, it made headlines. But at best it shared billing with -- and was often buried beneath -- a host of other stories. You could easily get the impression that the most important thing Barack Obama did on April 5 was to remark on the good looks of California’s state attorney general and then apologize to her.
As I wrote this, his budget proposal including the Social Security cuts was number five on the Washington Post website’s “most viewed stories” list; his apology to that attorney general was number one. The very idea of cutting Social Security benefits is no longer a big deal in the media, so it’s no longer a big deal to the public. And that change is a very big deal indeed in the mythology of American politics.
Until recently, successful Republican and Democratic politicians alike avoided any hint of tampering with Social Security. It’s a bipartisan tradition going back to Dwight Eisenhower. When he was still a private citizen, he warned that the liberal Democrats, who had created Social Security a decade earlier, wanted to “advance us one more step toward total socialism, just beyond which lies total dictatorship.” In letters to wealthy friends he pledged to “combat remorselessly all those paternalistic and collectivistic ideas.”
Once in the White House, Eisenhower changed his tune. Even though his Republican Party controlled both houses of Congress, he told aides that it would be politically impossible to change the Social Security system; the public would never stand for it.
That was precisely the way Franklin D. Roosevelt had planned it. Knowing conservatives were eager to pounce on every New Deal program, he designed Social Security so that no one could attack it as a “government giveaway.” The benefits would not -- and still do not -- come out of the government treasury. They come out of a special pool of money funded by payroll taxes. To hammer home that point, FDR made sure the first benefits would not flow until some time after the first taxes were paid in.
In this way FDR created the myth of Social Security: When you retire or are disabled and stop working, the checks you get in the mail don’t come from the government. They are your own money -- the money you’ve set aside over the years -- being returned to you, fair and square. No American worker is going to let the government touch his or her own money. That’s how FDR made sure Social Security would be the “third rail.”
Like most myths, this one is compounded of fact and fiction. It’s true that Social Security is a separate fund and an insurance program of sorts. You don’t take out of that fund unless you’ve paid in. But the amount you take out is rarely directly proportional to what you’ve put in. And the separation between Social Security fund and federal treasury disappeared long ago, because the government has raided the fund regularly.
However, the proportion of fact to fiction has little to do with the power of a myth. The myth of Social Security has been such a staple of American political life because it is so simple and seems to capture so well the basic idea of fairness and equity: If you work hard and set aside some of your earnings every paycheck, when you can no longer work you will still have enough to live a decent life. The government won’t give you the money; it’s your money. But government will guarantee that the money will be there.
That sounds a lot like the recent rhetoric of Barack Obama: If you work hard and play by the rules, it’s the government's responsibility to make sure you can live a decent middle-class life, no matter how long you live.
Yet now Obama’s policies have diverged from his rhetoric, as well as from the policies of all successful politicians since FDR’s day -- and it’s not very big news at all. How can that be?
The short answer is that the old myth of Social Security no longer has the power it has held since FDR’s day. We are watching a time-honored political myth begin to die.
Myths don’t die because they are debunked by facts. Myths die when new, more persuasive myths come along to take their place. Soon, the demise of the old myth just doesn’t seem so important any more.
The new myth, in this case, is a story about the baby-boomers who will soon all be retired or disabled. They’ll draw down huge amounts of Social Security money, the myth says, far more than they have put in and far more than the younger generation can provide for them. If we don’t cut their benefits, the nation will go broke.
This myth thrives despite the mountain of facts that contradict it. Social Security is in fine financial shape until the youngest baby-boomers are at least 90, maybe 100. If the system needs more money after that, there’s a simple solution: Raise the cap on the Social Security payroll tax. Right now, no matter how many millions you may earn, you pay that tax on only the first $113K of your income. Start taxing income over $113K (which would affect only about the top 5% of wage-earners) and the Social Security fund begins to swell, covering all future needs. It’s a solution that gets huge support from the public when pollsters ask about it.
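To make the cap concrete, here is a back-of-the-envelope sketch of my own, assuming the 6.2 percent employee rate and the $113,700 cap in effect in 2013 (the figures are illustrative, not drawn from any particular reform proposal):

% Social Security payroll tax under the cap (employee share only)
\[
\text{SS tax} \;=\; 0.062 \times \min(\text{wages},\; \$113{,}700)
\]
% Wages of $50,000 owe 0.062 x 50,000 = $3,100.
% Wages of $113,700 owe about 0.062 x 113,700 = $7,049.
% Wages of $1,000,000 owe the same $7,049 -- everything above the cap goes untaxed.
% Lift the cap, and the million-dollar earner owes 0.062 x 1,000,000 = $62,000.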
But, to repeat, myths don’t die because they are debunked by facts. They die when they are eclipsed by new, more powerful myths. The myth that Social Security benefits must be cut is one part of the myth of the “debt crisis,” which is, in turn, just part of the much larger mythology of homeland insecurity -- the narrative that says America is teetering on the cliff, ready to be plunged into disaster, very possibly even extinction, by some evil force or other. That’s been America’s master narrative, our most powerful myth, for decades.
The fact that Social Security is sound, and could be made more sound by raising that cap on the payroll tax, doesn’t make headlines or get much air time because it doesn’t fit into our ruling myth of insecurity. And Obama’s decision to propose Social Security cuts doesn’t make a huge political wave because it fits so well into that ruling myth.
Sure, polls consistently find people opposing those cuts. But the more important point is that -- judging from the media coverage of Obama’s decision and the weakness of the opposition to it -- the public now accepts those cuts as inevitable, because the myth of insecurity is the political water we all swim in.
Cutting Social Security benefits even a small amount will be another victory for the myth of homeland insecurity. Once that myth comes to control the public view of Social Security, there’s no limit to the cuts in benefits. Everything and anything must be given up for the sake of national security: That’s been America’s creed for a very long time. When Social Security cuts become routine news, that creed gets a huge boost, opening the door to even larger benefit reductions.
So there’s much more at stake in the Social Security debate than the amounts of the monthly payments. It’s really a fundamental debate about the mythology that shapes American political life. Will we maintain the New Deal myth, the basic social contract, which Barack Obama voices so eloquently: If you work hard and set aside part of each paycheck, you’ll be guaranteed a decent life after you stop working?
Or will we scrap it in favor of the insecurity myth, which creates such a different kind of social contract: Since danger threatens America so massively and so imminently, each of us must sacrifice to save a whole nation on the brink of collapse -- and the poorer you are, the more you must sacrifice? That’s the direction Obama’s policy is taking us in.
Yet both options remain open. Like FDR and Eisenhower, Obama will go whichever way the political wind blows. The choice is up to us.
Should the United States government be allowed to assassinate its own citizens? That question was in the air briefly not long ago. April 4 is an excellent day to revive it: On April 4, 1968, the government was part of a successful conspiracy to assassinate the Rev. Dr. Martin Luther King, Jr.
That’s not just some wing-nut conspiracy theory. It’s not a theory at all. It is a fact, according to our legal system.
In 1999, in Shelby County, Tennessee, Lloyd Jowers was tried before a jury of his peers (made up equally of white and black citizens, if it matters) on the charge of conspiring to kill Dr. King. The jury heard testimony for four full weeks.
On the last day of the trial, the attorney for the King family (which brought suit against Jowers) concluded his summation by saying: “We're dealing in conspiracy with agents of the City of Memphis and the governments of the State of Tennessee and the United States of America. We ask you to find that conspiracy existed.”
It took the jury only two-and-a-half hours to reach its verdict: Jowers and “others, including governmental agencies, were parties to this conspiracy.”
I don’t know whether the jury’s verdict reflects the factual truth of what happened on April 4, 1968. Juries have been known to make mistakes and (probably rather more often) juries have made mistakes that remain unknown.
But within our system of government, when a crime is committed it’s a jury, and only a jury, that is entitled to decide on the facts. If a jury makes a mistake, the only way to rectify it is to go back into court and establish a more convincing version of the facts. That’s the job of the judicial branch, not the executive.
So far, no one has gone into court to challenge the verdict on the King assassination.
Yet the version of history most Americans know is very different because it has been shaped much more by the executive than the judicial branch. Right after the jury handed down its verdict, the federal government’s Department of Justice went into high gear, sparing no effort to try to disprove the version of the facts that the jury endorsed -- not in a court of law but in the “court” of public opinion.
The government’s effort was immensely successful. Very few Americans are aware the trial ever happened, much less that the jury was convinced of a conspiracy involving the federal government.
To understand why, let’s reflect on how history, as understood by the general public, is made: We take the facts we have, which are rarely complete, and then we fill in the gaps with our imaginations -- for the most part, with our hopes and/or fears. The result is a myth: not a lie, but a mixture of proven facts and the fictions spawned by our imaginings.
In this case, we have two basic myths in conflict.
One is a story Americans have been telling since the earliest days of our nation: Back in not-so-merry old England, people could be imprisoned or even executed on the whim of some government official. They had no right to prove their innocence in a fair, impartial court. We fought a bloody war to throw off the British yoke precisely to guarantee ourselves basic rights like the right to a fair trial by a jury of our peers. We would fight again, if need be, to preserve that fundamental right. This story explains why we are supposed to let a jury, and only a jury, determine the facts.
(By odd coincidence, as I was writing this the mail arrived with my summons to serve on a local jury. The website it directed me to urged me to feel “a sense of pride and respect for our system of justice,” because “about 95 percent of all jury trials in the world take place in the United States.”)
Then there’s another myth, a story that says the federal government has only assassinated American citizens who were truly bad people and aimed to do the rest of us harm; the government would never assassinate an innocent citizen. Most Americans devoutly hope this story is true. And most Americans don’t put MLK in the “bad guy” category. So they resist believing what the legal system tells us is true about his death.
Perhaps a lot of Americans would not be too disturbed to learn that the local government in Memphis or even the Tennessee state government was involved. There’s still plenty of prejudice against white Southerners. But the federal government? It’s a thought too shocking for most Americans even to consider. So they fill in the facts with what they want to believe -- and the myth of James Earl Ray, “the lone assassin,” lives on, hale and hearty.
Since that’s the popular myth, it’s the one the corporate mass media have always purveyed. After all, their job is to sell newspapers and boost ratings in order to boost profits. Just a few days after the trial ended, the New York Times, our “newspaper of record,” went to great lengths to cast doubt on the verdict and assure readers, in its headline, that the trial would have “little effect” -- an accurate, though self-fulfilling, prophecy.
Imagine if the accused had been not a white Southerner but a black man, with known ties not to the government but to the Black Panther Party. You can bet that the trial verdict would have been bannered on every front page; the conspiracy would be known to every American and enshrined in every history book as the true version of events.
None of this necessarily means that the federal government and the mass media are covering up actual facts. Maybe they are, maybe they aren’t. Again, I don’t claim to know what really happened on April 4, 1968.
But there surely were people in the federal government who thought they had good reason to join a conspiracy to get rid of Dr. King. He was deep into planning for the Poor People’s Campaign, which would bring poor folks of every race and ethnicity to Washington, D.C. The plan was to have them camp out on the Mall until the government enacted major economic reforms to lift everyone out of poverty. That meant redistributing wealth -- an idea that made perfect sense to Dr. King, who was a harsh critic of the evils of capitalism (as well as communism).
It also meant uniting whites and non-whites in the lower income brackets, to persuade them that the suffering they shared in common was stronger than the racial prejudice that divided them. Dr. King did not have to be a prophet to foresee that the longer whites blamed non-whites, rather than the rich, for their troubles, the easier it would be to block measures for redistributing wealth. The unifying effect of the Poor People’s Campaign spelled trouble for those whose wealth might be redistributed.
At the same time, Dr. King was the most famous and respected critic of the war in Vietnam. By 1968 he was constantly preaching that the war was not just a tragic mistake. It was the logical outgrowth of the American way of life, based on what he called the inextricably linked “triplets” of militarism, racism, and materialism. Had he lived, the Poor People’s Campaign would have become a powerful vehicle for attacking all three and showing just how inseparable they are.
Yes, plenty of people in the federal government thought they had good reason to put an end to the work of Dr. King. But that hardly proves federal government complicity in a conspiracy to kill him.
So let’s assume for a moment, just for the sake of argument, that the jury was wrong, that James Earl Ray did the shooting and acted alone. The federal government would still have good reasons to suppress the conspiracy story. Essentially, all those reasons boil down to a matter of trust. There is already immense mistrust of the federal government. Imagine if everyone knew, and every history book said, that our legal system has established as fact the government’s complicity in the assassination.
If the federal government has a convincing argument that the jury was wrong, we all deserve to hear it. There’s little advantage to having such uncertainty hanging in the air after 45 years. But the government should make its argument in open court, in front of a jury of our peers.
In America, we have only one way to decide the facts of guilt or innocence: not through the media or gossip or imagination, but through the slowly grinding machinery of the judicial system. At least that’s the story I want to believe.
FDR with Ibn Saud, first king of Saudi Arabia, in February 1945.
In a column I’ve just posted on Tomdispatch.com I summarized the tremendous task Barack Obama seemed to commit himself to, in his recent Middle East trip, as he once again took on the role of peacemaker:
[He] must satisfy (or mollify) both the center-left and the right in Israel, strike an equally perfect balance between divergent Israeli and Palestinian demands, march with [Israeli Prime Minister Benjamin] Netanyahu up to the edge of war with Iran yet keep Israel from plunging over that particular cliff, calibrate the ratcheting up of punishing sanctions and other acts in relation to Iran so finely that the Iranians will, in the end, yield to U.S. demands without triggering a war, prevent the Syrian civil war from spilling into Israel, which means controlling Lebanese politics too -- and do it all while maintaining his liberal base at home and fending off the inevitable assault from the right.
That’s a tall order, indeed. But in American political culture we expect no less from any president. After all, he is “the most powerful man in the world” -- so he should be able to walk such a high wire adroitly, without fretting too much about the consequences should he fall.
Whenever an American president travels abroad, his overriding plan is to act out on the world stage the fantasy that so many Americans love: Their leader, and the nation he embodies, have unlimited power to control people and events around the globe.
In this imaginary scenario, the president can do all because he knows all. He is above every fray, understanding the true needs of both sides in every conflict. That’s why he can go anywhere and tell the locals what is true and right and how they should behave.
With his awesome wisdom and omnipotence, the mythic president can deftly maneuver his way across the most challenging and dangerous situations and settle every dispute with god-like justice. He can be all things to all people. So he never has to make painful sacrifices or suffer any losses, as he proves that the American way must eventually triumph over all.
Historians should wonder: How did this mythic image of the all-powerful president arise? Already in late eighteenth-century writings we can find confident claims that the fledgling United States of America is destined to play a unique role in bringing peace to the world.
But the idea that the president would personally have such power has its earliest seed in Theodore Roosevelt’s successful mediation to end the Russo-Japanese War in 1905. TR was probably motivated mostly by concern that the war would interfere with burgeoning U.S. trade interests in East Asia. But the Nobel Peace Prize he received seemed to mark him as less interested in national power than world peace.
That image of a disinterested pursuit of peace and justice was magnified manifold by Woodrow Wilson, who truly founded the myth of the omnipotent president on the global stage. Wilson deftly blended appeals to the idealism of the Progressive era (and the Christian Social Gospel) with warnings that Americans would never be safe until the world was “safe for democracy.”
To what extent was he an idealist, and to what extent did his idealistic words mask a crafty pursuit of U.S. interests? Historians will probably debate that question forever. But there’s no debating his profound influence on the image of the presidency as an office responsible for bringing peace and justice to all lands.
That image languished during the Republican presidencies of the 1920s, waiting to be revived by Franklin D. Roosevelt. He always cited “cousin Teddy” and Wilson as his two great political heroes. So it’s not surprising that FDR followed their lead. In private conversations and letters, he promoted a vision of a unified democratic capitalist order spanning the entire globe, with America leading the way. And he was more than ready to play the role of omnipotent dispenser of peace and justice to maintain that order.
But he knew that the American people were hardly ready to see the nation, or the president, take on that level of international involvement. Even when war broke out in Europe, FDR had to confront a public deeply divided on whether they and their country should get involved. Like Wilson, FDR mounted a major public relations campaign to gain support for his efforts to use America’s mighty power to control events around the world. Like Wilson, he appealed to both idealistic traditions and vivid depictions of threats to U.S. interests and American lives.
Once the U.S. entered World War II, resistance to this new global role pretty much evaporated. But FDR continued to worry that, once the war ended, the public would revert to its “isolationist” tendency to ignore issues of peace and justice in the rest of the world.
Roosevelt had underestimated his own achievement. By the war’s end, his skillful rhetoric had persuaded nearly all Americans that their own safety depended on their government's -- and especially their president’s -- ability to control events everywhere. With all the other major powers devastated, the U.S. had such preponderant power that the fantasy of total control seemed quite realistic.
Josef Stalin’s Soviet Union quickly burst that bubble. But by the late 1940s American public discourse had settled on a seemingly comfortable consensus that lasted through the Cold War era: The U.S. would control everything of significance that happened in the “free world,” on our side of the Iron Curtain, while exercising enough control over the communist bloc to “contain” it.
Once the Cold War ended, FDR’s vision of a single global order seemed genuinely within reach. So there was even more reason to embrace the mythic vision of the president’s unlimited power.
There’s a good argument to be made that the most important results of U.S. foreign policy ever since the 1940s -- for better and for worse -- have flowed directly from this image of the omnipotent president, representing the omnipotent nation, trying to exercise unlimited control.
The most vivid lessons came from presidential (some call it imperial) overreach, most notably in Vietnam and Iraq. Yet despite this remarkable evidence to the contrary, many Americans still cling to the mythic narrative of “the most powerful man in the world,” able to control events in every corner of the globe. Why?
The question can be answered in many ways. If we stay strictly within the confines of the study of myth, one explanation seems most compelling.
The claims for presidential control have always grown hand in hand with fears about what we now call homeland security. There’s a straight line leading from Wilson’s warning that Hun victory would spell the end of all civilized (read: American) values to the Obama administration’s warnings about North Korea’s nuclear weapons and Syria’s chemical weapons. All the fears built up along the way created what I call the myth of homeland insecurity: the conviction that the very existence of America is constantly in peril.
The best way -- perhaps the only way -- to allay that fearful belief has been, and apparently still is, to accept the myth of the omnipotent president: “The most powerful man in the world” can manage every situation, no matter how perilous, with wisdom and skill. He can give a cleverly calculated prod here and a perfectly calibrated nudge there, pull all the strings with unfailing precision, without ever losing his perfect balance. Thus he can guarantee a safe outcome for America.
How reassuring it must be to believe that. And how predictable it is that, as long as this mythic story prevails, presidents will continue to overreach their true, limited power, with results that most of America will come to regret.
Yet the irony is obvious: The more regret, the more insecurity; the more insecurity, the more powerful the appeal of the myth of the all-powerful president. And the cycle just keeps on turning.
Barack Obama and John Kerry at the Church of the Nativity in Bethlehem. Credit: U.S. State Department.
The real Barack Obama was clearly on display in his quick trip to Israel and Palestine. Wherever you are on the political spectrum, he always gives you something you want with one hand, while he takes away something equally important with the other hand.
When Obama spoke in Jerusalem, I cheered as loudly as the audience of liberal Jewish students who shared my views, which the president voiced so eloquently: The occupation is really bad for Israel; Prime Minister Benjamin Netanyahu must lead his nation to a just peace with an independent, viable Palestinian state.
I cheered most when I heard Obama say words that I never thought I’d hear an American president say in Israel: The occupation is not merely harmful to Israel’s national interests, it’s downright immoral: “It is not fair that a Palestinian child ... lives with the presence of a foreign army that controls the movements of her parents every single day. ... It is not right to prevent Palestinians from farming their lands ... or to displace Palestinian families from their home.” Bravo!
But Obama is no starry-eyed idealist. He crafts such idealistic words for practical political purposes. In this case he was pushing Israeli liberals and centrists further toward the peace camp, widening the gap between them and the Netanyahu-led right wing. Down the road, he can use the political tensions he stirred up to move Israel toward the kind of peace agreement he wants.
The pundits who declared him finished with the peace process were obviously wrong. (Even Thomas Friedman can make mistakes.) The president gave me something I want very much: A promise of more American pressure on Israel to make a just peace, for moral as well as practical reasons.
Predictably, though, at the same time Obama took away something equally important: his demand that Israel stop the main roadblock to peace, its expansion of settlements in the West Bank. Instead he fell back on the vague language we’ve heard from many presidents before: “We do not consider continued settlement activity to be constructive, to be appropriate”; “Settlement activity is counterproductive to the cause of peace.”
In Ramallah, standing alongside Palestinian Authority president Mahmoud Abbas, Obama called the settlements merely an “irritant.” He urged Abbas not to use settlements as an “excuse” to refuse direct negotiations.
There’s some evidence that the PA had already received and perhaps accepted this message from Washington. Talking points prepared for Abbas suggested that he should agree to negotiations after getting only private assurances from Netanyahu on stopping settlement expansion. How much could those assurances be worth?
Back down on moral principle and tolerate an evil for the sake of a greater good: That seems to be Obama’s message now on the settlements. As usual, the president gives and at the same moment takes away. Does it make him just another crass politician, maneuvering to score the next victory, bereft of any principle?
Not necessarily. Biographies of Obama suggest that, from his college days, he has been a devotee of a consistent set of principles: the script laid out over 80 years ago by the famous theologian Reinhold Niebuhr in his classic book, Moral Man and Immoral Society -- though Niebuhr supposedly said, years later, that he should have called it “Immoral Man and Very Immoral Society.”
Indeed. Because Niebuhr’s basic point is that we are all doomed to tolerate and even embrace evil. We are all selfish, always out to get more than the other guy, simply because we are human. It’s the old story of original sin -- the myth of Adam and Eve eating the forbidden fruit, expelled forever from paradise -- dressed up in twentieth-century clothing.
If individuals are bound to be nasty and brutish to each other, it’s worse in relations among nations, Niebuhr argued. Never expect anything from a nation except greed and lust for power. Even on the rare occasion that a nation pursues a relatively good aim, it’s bound to use evil means. And that includes Niebuhr’s homeland, the good old US of A.
It pained him to see his theology become the dominant narrative of Cold War America with one huge twist: U.S. presidents and policymakers exempted America from the universal stain of sin -- at least in public, where they insisted that America would, and could, do no wrong.
In private, the cold warriors acted upon (and occasionally admitted to each other) the principle that Niebuhr said all nations will inevitably use: accepting evil means to pursue even the best goals. The twenty-first-century warriors against terrorism, Democrats as well as Republicans, have followed the same Niebuhrian script. Now Obama has brought it to the Middle East.
In fact Americans have always practiced such hypocrisy, Niebuhr argued, although they generally denied it and claimed that their nation was as pure as Eden. That’s The Irony of American History (as he titled his other most famous book).
Obama surely understands this irony very well. He never quite comes out and admits that he is embracing evil for the sake of a greater good. But he doesn’t boast of America’s perfect purity in the way the early cold warriors, or his predecessor George W. Bush, did.
Obama addresses almost every issue in the Niebuhrian way he spoke of the settlements: “The politics there are complex ... It’s not going to be solved overnight,” because there is no absolute good or evil; we always deal in shades of gray; we all make compromises; sooner or later, we all become hypocrites.
But I wonder whether Obama ever stops to think about the other irony of American history, since Niebuhr became a guiding light of its foreign policy.
When he wrote Moral Man and Immoral Society, Niebuhr thought he was showing a better path toward hope and change than the idealistic Christian liberalism of the Progressive era. You’ve got to get your hands dirty in the political process if you want to improve the world: That was the essence of the myth that he intended to create.
History played a trick on him, though -- just as his own theory predicted. The main message that American readers and leaders took from his book is that the world is a dangerous place; everyone is out to get us; self-protection is the name of the international game; so do evil unto others before they do it unto you.
This is the foundation of what I call the American mythology of homeland insecurity. It’s the narrative that dominates U.S. foreign policy -- and Israeli foreign policy too, though the Zionists didn’t need Niebuhr to teach them. They developed their own myth of insecurity before he ever wrote a word.
The same narrative dominated Obama’s rhetoric in Israel. He wrapped his calls for peace in endless recitation of the supposed dangers that Israel faces, dangers that are largely imaginary. He may have meant it as a pragmatic move, to convince Israeli Jews that he really does care about their fate.
But irony always wins in the end, Niebuhr taught. So Obama’s powerful reinforcement of Israel’s insecurity is likely, in the end, to undermine his call for Israel to compromise and take risks for peace. As long as the Israeli Jews, and their supporters here in the U.S. (mostly gentile conservatives), believe that they are as endangered as Obama says, they are not likely to take any risks at all. They are more likely to do evil to others, because their fearful imaginings tell them that others are about to do evil to them.
Myths of insecurity always block the path to hope and change. Barack Obama, the faithful Niebuhrian, always gives hope and change with one hand and takes it away with the other.
“The United States is taking the threat of a ballistic missile attack from North Korea very seriously,” Melissa Block informed us on NPR the other day, sounding very serious herself. To protect us from that threat, the U.S. will station 16 more anti-missile missiles in Alaska.
“The big questions, of course, are this,” NPR’s Tom Bowman explained: “Would North Korea actually launch a missile against the United States, and would these missile interceptors work? And frankly nobody knows for sure, but the Pentagon says, we have high confidence.”
High confidence that the missile defense will work or that the North Koreans would attack the U.S.? No doubt Bowman meant the former (though “the testing has been a bit spotty,” as he tactfully put it).
But the whole project, with all its ballyhoo and its $1 billion price tag, makes no sense unless the Pentagon also has high confidence that North Korea might indeed attack the United States.
Seriously? North Korea is unlikely to have the technical means to attack the U.S., at least not for many years to come. If they ever get that capacity, their nuclear arsenal, like their military capability as a whole, would still be infinitesimal compared to ours. It takes a microscope -- or whatever equivalent CIA analysts use -- even to see it. That’s not going to change.
Any gesture of attack would give the U.S. license to devastate the small, poverty-stricken Asian land. The exercise would be rather effortless for the U.S. North Korea’s leaders must know that attacking the world’s mightiest nation would mean instant national suicide.
The whole idea of the North Korean mouse attacking the American elephant seems rather absurd, to say the least. Yet the “threat” is widely reported in the U.S. mass media as if it were an undeniable fact. Why?
Bowman offered an important clue when he said that U.S. anti-missile missiles “are the ones that would actually hit an incoming enemy missile from, let's say, North Korea.” His “let’s say” implies that we are defending against a generic threat, of which North Korea is merely one example. North Korea is just the current actor filling the role of “enemy attacker” that the generic script calls for.
It’s much the same mythic scenario that white Americans have been acting out, and basing policy on, ever since the first colonial militias were formed to fend off the Indians -- the scenario 20th-century Americans came to know (and often love) as “cowboys versus Indians.” Now, some nation or other has to play the role of Indians.
Since the United States was created, only one other nation -- Great Britain -- has actually launched an attack on U.S. soil. That was a full two centuries ago. But the mythology of homeland insecurity, with its picture of an America constantly at risk of enemy invasion, remains powerful. This deeply rooted and long-regnant mythology -- with America playing the cowboys and some (any) enemy nation the Indians -- is the lens through which the mainstream of American culture sees the world. It seems totally natural. That’s one reason it’s so easy for the U.S. media, and so many American people, to believe in “the North Korean menace.”
What’s more, in our traditional national narrative the “bad guy” enemy is, by definition, “savage” and thus bereft of reason. So he might well do something as totally self-destructive as attacking us. How often have we heard that North Korea’s leaders are erratic, irrational, and indeed “crazy”?
Digging deeper we find other, more paradoxical, sources for this old myth’s staying power.
It’s getting harder to see the world through the familiar lens of fear. After the Cold War ended, the Chairman of the Joint Chiefs of Staff, Colin Powell, complained: “I’m running out of demons.” He knew that an enormous military budget needs “demons” to sustain it.
A mythology of insecurity needs those “demons” just as badly. American culture is deeply invested in its mythology. Many millions of Americans can hardly imagine what it would mean to be a patriotic American if we did not have potential attackers to resist at all costs. In America, the sense of security that comes from a taken-for-granted mythic narrative needs some nation or other to play the Indians.
But if Powell worried about the absence of “demons” twenty years ago, how much more might a Chair of the Joint Chiefs worry now. The whole tradition of courageous resistance to enemy nations may soon be just a quaint relic of a bygone era, unless we keep on finding “threats to our national security.” North Korea may be just what we need to save the worldview of American patriotism.
Let’s go another level deeper. I’ve been re-reading Alan Trachtenberg’s fine study of the “Gilded Age,” The Incorporation of America. Trachtenberg makes a key point about the “cowboys versus Indians” narrative. The story depends on a cowboy using his unique combination of skill and courage to save a whole (white) community from “savagery.” The mythic cowboy is a throwback to the knightly bravado of Launcelot and Galahad. He is popular culture’s way of celebrating the same heroic individualism that Frederick Jackson Turner celebrated academically in his famous “frontier thesis”: “that dominant individualism ... that masterful grasp of material things, lacking in the artistic but powerful to effect great ends.”
Guns were certainly high on the list of material things that the frontier hero masterfully grasped -- and used, though only in self-defense, to effect great ends, as the story assures us.
Here’s the paradox that Trachtenberg points out: The “cowboys versus Indians” narrative first came to dominate popular culture precisely when the era of rugged individuals determining the destiny of anyone or anything in the West was rapidly dying. The real emerging power in the post-Civil War era was the corporation: the ever-growing army of anonymous technicians, managers, and accountants who, each day, gained more and more power over the resources, the culture, and the lives of people in the West.
The growing supremacy of corporations triggered a cultural crisis because it raised such a fundamental question: How could Americans continue to base their lives on their familiar worldview and values? Those had grown up at a time when Americans still had reason to believe that they might control a substantial part of their lives through their own choices and actions. Would that old way of life have to be abandoned altogether? Or could some part of it be saved?
One way to save it, Trachtenberg argues -- perhaps the only way -- was in imagination, by creating the mythic tale of the heroic cowboy: “Through such popular fictions, the West in its wildness retained older associations with freedom, escape from social restraint, and closeness to nature.” The ultimate, though unseen, point of the story was to hold on to an old worldview precisely because new realities were rapidly rendering it irrelevant.
That may well be the point of today’s popular story, too -- the one that casts North Korea as the “bad guy” who must be defeated by the heroic U.S. military. Enemy “demons” have disappeared because resistance to U.S.-led multinational corporate capitalism has been largely extinguished in the few places it remained: Serbia, Libya, Iraq, Afghanistan. Apart from North Korea, only Iran remains on America’s list of inarguably evil nations. And the Obama administration still insists that it might be possible to negotiate our differences with Iran.
That leaves North Korea as the sole irredeemable “bad guy,” the only nation left to play the role of Indian in our long-cherished national tale of global “cowboys versus Indians.”
In a more theoretical vein, analysts of world affairs have been debating for many years whether nation-states will remain significant actors on the global stage in an age of multinational corporate capitalism. Some argue that even now national borders have become, for all practical purposes, irrelevant as they’re swallowed up by the multinational monoliths. Headlines announcing an increase in U.S. anti-missile defenses to meet the North Korean threat may seem to prove them wrong.
But the lesson Alan Trachtenberg draws from the 19th-century cowboy narratives teaches just the opposite: Stories may well become prominent precisely because they are irrelevant to, and stoutly deny, the actual facts of life.
That lesson is all the more convincing if we look at another, closely related aspect of the “Gilded Age”: the growing call for a more “muscular” American military. The president who trumpeted that call the loudest was the “Rough Rider,” Theodore Roosevelt. TR saw the military as one crucial way to revive “the strenuous life” of rugged individualism and its masculine virtues, which he claimed to have learned from the cowboys on the Dakota frontier.
It’s no coincidence that TR was also the first president to fight against the monopolistic practices of corporations. For him and many of his generation, corporations threatened to sap the individualistic vigor that was essential to the American way of life. Tales of heroic cowboys and soldiers, both defending Americans against savages, pointed the way toward averting the threat posed by the corporate way of life.
Now, with this threat grown global, stories of a muscular American military response to a savage enemy may serve much the same purpose: reassuring Americans (even if unconsciously) that they, as individuals, still matter and still have some control over their ever more corporatized lives.
The ultimate irony, which was already becoming evident in the “Gilded Age,” is totally obvious today: In the story that Americans tell, their security depends on highly technological weapons built by multinational corporations and wielded by anonymous, bureaucratized military managers. The 21st century military “cowboy,” the mythic figure so many depend on to resist total corporate domination, has been completely corporatized.
While the two major parties plot strategy for the next battle in the federal debt-reduction war, another war rages among economists over the question, “Is debt really the federal government's biggest problem?” Some insist that unless Washington cuts spending substantially to reduce the debt quickly, we are headed for disaster. Others insist with equal fervor that growth is the number one priority: Aggressive pro-growth policies will reduce the debt in the long run with far less pain.
If the pro-growth economists could gain public support they would give liberal Democrats a powerful weapon to resist the Republicans’ budget-slashing ax. But the pro-growth faction makes little headway in the public arena because the political wind is blowing so strongly against it. Why should the wind blow that way?
It’s not because voters have studied the competing theories carefully and concluded that the debt-crisis faction has the stronger case. When it comes to economic theory, few of us draw any conclusions at all. We get lost in the esoteric arguments so quickly that we give up trying to understand. Politicians know this; most of them are probably just as lost as the rest of us amidst the esoteric arguments.
But the best politicians know something else: Few voters care much about theories at all. Few of us make up our minds through careful logical analysis of the facts. Instead we rely on myths to organize the vast barrage of information constantly bombarding us.
The two economic theories represent two myths, deeply rooted in American political culture, that have competed for dominance throughout our history. The debt-crisis theory is so powerful now because it evokes the more compelling myth.
(Again, to clarify: when I speak of myth, I don’t mean an outright lie. Like most historians of religion, I take myths to be the stories -- compounded of fact and fiction -- that we take for granted, stories we use, often unconsciously, to make sense out of life and turn it into meaningful experience.)
The pro-growth view is summed up by its most prominent spokesman, Paul Krugman: The size and danger of the debt are overstated. And the best way to reduce the debt we do have is to spend more government money now to stimulate growth. Yes, it will raise the federal debt for a while. But soon the expanded economy will be putting enough back in the government coffers to erase that increased debt.
Here we have the cherished American myth of progress, or, as Barack Obama rechristened it, the myth of hope and change: America’s mission is to make a better life for all its people and, in the process, for the whole world. Head out to a new frontier. Believe in your vision of the future. Invest in it. When that future arrives, things are bound to be better than they are now. So take some risk. Show some courage. That’s what America is all about.
But the debt-crisis party won’t buy it. The heart of their economic theory is fear, especially of federal debt. As Republican economic guru Bruce Bartlett put it, “The debt limit is the real fiscal cliff.” A government that spends beyond its means, they say, is like an individual or family doing the same: the party can continue only so long. Then comes the day of reckoning, when it’s impossible to pay back the borrowed money: Bankrupt!
No one can predict when that tipping point will come, the prominent columnist Robert Samuelson has written. Like any cliff, it can remain unseen until we go over it, suddenly and unexpectedly.
He quotes economist Barry Eichengreen, “a leading scholar of the Great Depression,” who warns that if the U.S. debt grows large enough bond traders will stop funding it: “This scenario will develop not gradually but abruptly. Previously gullible investors will wake up one morning and conclude that the situation is beyond salvation. They will scramble to get out. … The United States will suffer the kind of crisis that Europe experienced in 2010, but magnified.”
Americans are very familiar with such warnings of a surprise attack, though not from bond traders but from foreign evildoers, be they fascists, communists, or terrorists. To draw on the current parlance, it’s our myth of homeland insecurity: America is constantly at risk. Its chief mission is to protect itself from forces that would destroy it.
So let’s arm ourselves well, circle the wagons, and proceed with utmost caution. Any morning we may wake up and find our nation under attack. Any misstep might plunge us over the cliff into the abyss of national catastrophe.
Economists like Krugman can argue with the most compelling logic that a nation is not like a family. Governments don’t have to repay all their debts. They need only “ensure that debt grows more slowly than their tax base.” And a nation’s debt is largely “money we owe to ourselves.” They can point out that the current federal debt is no higher (perhaps lower) than it was in the post-World War II years, when the U.S. was beginning its greatest economic boom ever.
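Krugman’s “grow more slowly than the tax base” condition fits in one line of arithmetic. The sketch below is my own illustration of the logic, not a formula from his columns:

% Let debt D grow at annual rate d and the tax base Y at annual rate g.
% The debt burden D/Y falls whenever d < g:
\[
\frac{D(1+d)}{Y(1+g)} \;<\; \frac{D}{Y} \quad\Longleftrightarrow\quad d < g
\]
% Example: a 100% debt ratio with d = 2% and g = 4% falls to
% 1.02/1.04, or about 98.1%, after one year -- shrinking even though not a dollar is ever “repaid.”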
But the myth of homeland insecurity is a formidable foe. Like any deep-seated myth, it’s largely impervious to logic.
Most historians agree that Franklin D. Roosevelt’s warnings about German bombers attacking Kansas City were exaggerated, to say the least. So were the warnings about a Soviet nuclear “bolt out of the blue” from Eisenhower, Kennedy, and Reagan. Nevertheless, the sense of permanent insecurity they created has become a firm pillar -- perhaps the central pillar -- of American political culture.
It’s the foundation on which the Republicans build virtually all their rhetoric and policies. And they can tie the rising federal debt to traditional fears of foreign foes by citing warnings that the debt is “the most significant threat to our national security” (Admiral Mike Mullen) and puts the U.S. “at risk of squandering its global influence” (New York Times analyst David Sanger).
Of course even the most conservative Republicans still pay lip service to the American faith in progress. But even most Democrats agree that, when it comes to making policy, national security trumps every other concern. There’s a bipartisan consensus that we must always be on the alert for threats, old and new, and ready to resist them by any means necessary. That consensus is bound to keep us insecure, constantly ready to see every new development as a potential crisis, which gives a clear edge to “homeland insecurity” in the war of the myths.
Barack Obama has paid homage to the myth of homeland insecurity ever since he first won the presidency by promising to stave off an impending financial collapse. He has promoted his major policies not only in the name of hope and change but, even more often, as ways to prevent things from getting worse. Now he constantly reassures the public that he takes the idea of an impending debt crisis very seriously and is dedicated to resolving it.
Most Democrats follow the president’s lead, insisting that debt reduction is at the top of their agenda, just as Republicans insist it must be. This bipartisan consensus is a testament to the enduring dominance of the myth of homeland insecurity. It’s the power of this mythology, not any facts or logic, that deprives the pro-growth view of any serious public hearing. Every day that the debate over debt reduction dominates the headlines cements America more deeply into its long-standing dominant mythology.
Every society has the right to choose its myths. But every right has a correlated responsibility. As we grow more self-conscious about the role of mythology in political life, we also have a growing responsibility to recognize the consequences of our choices. America’s obsession with homeland security has already had grave consequences. The debate between debt-crisis and pro-growth economists gives us a chance, from an unexpected quarter, to consider whether we want to dig ourselves deeper into that hole.
You remember those Chinese hackers, the ones we were all supposed to be so terribly worried about just a few days ago? They’ve disappeared from the headlines; apparently we’re not supposed to worry about them any more, at least for now. But they’re bound to be back in the headlines sooner or later, and probably sooner. So we ought to take a close look at the story.
The joke is on the hackers, says Washington Post wonk blogger Ezra Klein. They’ve been suckered in by a great myth -- the myth that there’s some secret plan hidden somewhere in Washington, the script for everything that the American government and American corporations do. The Chinese think that if they hack enough computers, somewhere buried in that mountain of data they’ll find the master key that unlocks the plan.
The joke is that there’s no key because there’s no plan, says Klein. Everyone in Washington and in corporate America is just bumbling along, buffeted by each day’s endless compromises and unexpected twists and turns that make our system run.
I imagine Klein is describing the American system pretty accurately. But I suspect the joke is on most Americans too, including many in positions of power, because we too have been suckered in by a great myth -- a myth about China that’s a mirror image of the myth Klein says the Chinese hold about us.
All the headlines I’ve seen about the alleged hacking suggest that there’s some great monolithic monster out there -- some say it’s the Chinese army, some say the Chinese government, some just call it “China” -- doing the electronic snooping. Its tentacles are everywhere across America (though they’re especially dense inside the D.C. Beltway), scooping up vast amounts of precious data (the story would hardly matter if the data weren’t precious) and feeding it all into some superbrain that is plotting something.
Who knows what? We certainly can’t know. After all, the monster is Oriental; thus, as an age-old American tradition tells us, inscrutable.
But the Chinese monster surely knows what it’s plotting. It must have a plan. Otherwise, why go to all the trouble of doing all that hacking? And the plan must be nefarious. I mean, we’re talking about foreign spies here. Whatever they’re up to, it’s always no good. Who ever heard -- or, more likely, saw, in the movies or on television -- a story about foreign spies who didn’t pose a threat to us?
If the spies are inscrutable Orientals, who think in ways we can’t ever understand, they are all the more threatening. And if those Orientals are agents of the world’s largest nation -- a nation so vast we can scarcely comprehend its size, much less its motives -- why, that’s the most threatening image of all.
Wait, it gets worse.
I’m writing this a few days behind the headlines because I’ve been traveling. You learn a lot traveling. For example, from reading the nation’s most respected, influential newspapers I had learned that the Chinese hackers are a threat to our national security. But in the airport I learned that I had a lot more to worry about than that.
On every newsstand the message leaped out at me from Bloomberg BusinessWeek: “YES, THE CHINESE ARMY IS SPYING ON YOU” -- spelled out against a blood-red background in big bold yellow letters. Very yellow.
It’s appropriate that my destination was California. I’m writing this piece in the state that did more than any other to promote the notion of the “yellow peril” and the Chinese Exclusion Acts of the 1880s and 1890s.
In light of this sad history, I was hardly surprised that there would be so much fear of the Chinese spying on our government and corporations. But now Bloomberg tells me that the “yellow peril” is coming after little old me -- and you and you and you -- scooping up all the most personal private data that each one of us has stored on our home computers.
Who would have imagined it, even in the heyday of the “yellow peril” panic? No matter how much you fear the foreign evildoers at your doorstep, it seems, there’s always more to be afraid of.
If the Chinese are absorbing data from each and every one of us, they are obviously looking for more than just Washington’s master plan. Surely they know that nothing on personal computers in Keokuk or Kankakee can yield the mythical key. No, it’s obvious that they want everything that America has to offer, absolutely everything, wherever it is.
I mean, think about it. China keeps on expanding in every way imaginable -- just like the Borg. So, like the Borg, it must intend to absorb us all. Absorbing our precious data is merely the first step in the plan.
And while we Americans merely bumble along chaotically, the Chinese Borg must have a plan. All this hacking can’t be happening at random. Even Ezra Klein, who usually makes his living promoting healthy skepticism, assumes that everything in China is guided by some master plan.
Indeed, that’s the vital difference between the two great powers, Klein concludes, and the key to America’s superiority. Since we don’t have any great plan, when things get screwed up (as they inevitably do) we are flexible enough to get along pretty well anyway.
But the Chinese depend so heavily on “the plan” that when it gets screwed up they are stuck. In other words, the Chinese are just too regimented, as if the whole nation were one big army. As in any army, when things get snafu’d no one knows what to do next, because the underlings have never learned to think for themselves. Everyone must simply follow orders. Resistance is futile.
Which brings us back to the “China as the Borg” myth. But this Borg isn’t brave enough to confront us boldly, declaring with supreme self-confidence, “Resistance is futile.”
No, this is a Chinese Borg. So it must be not merely inscrutable but sly, devious, duplicitous. Real (read: white) Americans have always been fascinated by what they imagined went on in the alleyways of our Chinatowns, the kitchens and back rooms of our Chinese restaurants, and -- the most fascinating mythic realm of all -- the smoky, gauzy netherworld of the opium dens.
We could never know what they were doing. But whatever it was, like the deeds of foreign spies, it could bode no good for real Americans.
Now, it seems, that inscrutable “yellow peril” has morphed into a perilous yellow cyber-Borg, aiming its e-tentacles at each and every one of us. For lovers of political mythology, China is the gift that keeps on giving.
Seriously, I’d bet the farm (if I had a farm) that the joke is on us. Of course China has hackers who scoop up as much data as they can from us -- just like the American hackers who are aiming their e-tentacles at China. When the New York Times first reported that the Chinese hackers were probably working for the army (probability was all they could offer), its only source was “American intelligence officials who say they have tapped into the activity of the army unit for years.” I should hope so. That’s what we pay them, with our tax dollars, to do.
And just as our government and military analysts sift and organize the spy-collected data in endless ways, constantly arguing about what it all really means, no doubt the Chinese do the same, bumbling along in the same way with the same random, chaotic results.
It’s spy versus spy. I imagine it has been going on ever since the Sumerians faced off against the Akkadians well over four thousand years ago. It’s simply the obvious thing for great powers to do. And for all those four millennia, I suspect, every spy has tried to convince his (or her; see “Homeland”) boss that s/he’s got the master key to the enemy’s grand plan. How else can a spy hope to get promoted or land a juicier assignment?
Mad Magazine’s “Spy Versus Spy” cartoons captured the essential comedy of the game for a whole generation of youth growing up in the heyday of the cold war. No doubt, if the Sumerians or Akkadians had had a Mad Magazine, the basic joke would have been the same.
But while my generation was laughing at “Spy Versus Spy,” our elders were avidly supporting a government that was heading toward the Cuban missile crisis, when the president’s closest advisor (his brother) could only estimate the rough odds on avoiding an all-out nuclear holocaust. No joke.
So while we laugh at the funny side of the “Chinese hackers” panic, we also need to take it, and talk about it, very seriously. Understanding its deep roots in American mythology is a useful first step. Resistance to even the most deeply entrenched myth is never futile.
Dick Cheney in 2011, and the infamous Bulletin of the Atomic Scientists Doomsday Clock.
I also love those brave (fictional) CIA analysts, Maya and Carrie. They see a huge danger ahead that everyone else is blind to, and they insist on crying out a warning, regardless of the risk -- just like the biblical prophets. What’s not to love?
In fact they’ve inspired me to cry out a warning of my own. It’s not the threat of “another terrorist attack,” but the threat of America being seized once again by “war on terror” fever. I know that seems crazy, because hardly anybody worries seriously about the “terrorist” threat any more. In the last year, when pollsters asked about the single most important issue facing the nation, they usually didn’t even list “terrorism” as an option. When they did, it consistently showed up at the bottom of the list.
But Zero Dark Thirty and “Homeland” reminded me that one sector of the American populace ranks “terrorism” right up at the top, way above any other national concern, and obsesses about the “threat” night and day. It’s not just the CIA but the entire “Homeland Security Complex” -- what the Washington Post called, in 2010, “Top Secret America: A hidden world, growing beyond control.”
The Post’s article began by saying that the HS Complex is “so large, so unwieldy and so secretive that no one knows how much money it costs, how many people it employs, how many programs exist within it or exactly how many agencies do the same work.”
It went on to estimate that “some 1,271 government organizations and 1,931 private companies work on programs related to counterterrorism, homeland security and intelligence in about 10,000 locations across the United States,” with “33 building complexes for top-secret intelligence work built or under construction since September 2001” -- with total space bigger than three Pentagons -- in the DC area alone.
That’s a lot of people, spending a lot of money, focusing like a laser on the single goal of “defeating the terrorists.” And that was three years ago. Think how the HS Complex has grown since then.
As those “most important issue” polls show, there’s a huge disconnect between the HS Complex and the rest of the nation, which doesn’t think much about “the terrorist threat” at all any more -- except as an exciting theme for suspenseful movies and television shows.
We’re back in a situation much like the early and mid-1970s, when “détente” was the watchword in U.S. relations with the Soviet Union and the People’s Republic of China. The debate over Vietnam generally treated that country as an isolated hot spot, largely detached from its global cold war context. So the global cold war purred along quietly in the background of American life, little more than a plot device for the entertainment industry -- except in the huge Military-Industrial-Intelligence Complex, which devoted itself night and day to waging and worrying about the conflict between the U.S. and the whole of the “communist bloc.”
The M-I-I Complex was a kind of fuel waiting to be ignited, to put the global cold war once again at the top of the national priority list. The fuse that did the job was the election of Ronald Reagan in 1980. It re-ignited the cold war, which dominated much of the nation’s life for the next decade (and vastly inflated the federal budget).
The resurrection of the cold war was a triumph for a group of high-powered conservative and neoconservative politicos, banded together under the banner of the Committee on the Present Danger (CPD).
One of their icons, Norman Podhoretz, explained that they worked hard to counter a “national mood of self-doubt and self-disgust” triggered by the debacle in Vietnam. Americans were crippled by “failure of will” and “spiritual surrender,” the neocon writer lamented. They would no longer make the sacrifices needed to “impose their will on other countries.”
The only way to counter this national transformation, as the CPD saw it, was to revive cold war brinksmanship. A nuclear buildup, bringing increased risk of nuclear war, was not merely a price worth paying; it was a way to teach the public to accept sacrifice as the route to national and spiritual greatness.
The CPD’s diagnosis and prescription were extreme, to say the least. But many Americans did feel what President Jimmy Carter called “a crisis of confidence”; many others called it a “malaise.” And the CPD understood correctly that it was triggered, above all, by the Vietnam war -- the first war that America had ever lost -- which did so much to undermine faith in the narrative of American exceptionalism.
The 1970s teach us a vital lesson: The “foreign threat” narrative has a much better chance to prevail when large numbers of Americans are unsure that the familiar structures of public life are secure.
We had already learned that lesson in the late 1930s. The Great Depression had created so much anxiety about American life for so long that the public was ripe for a “foreign threat” narrative. In the late 1970s there was also economic anxiety, powerfully reinforced by the all-too-fresh memory of the defeat in Vietnam.
It could happen again! The fuel pile -- the Homeland Security Complex -- grows bigger every day. The neocon heirs of the CPD still have their well-honed public relations machine in high gear, eager to see that fuel explode.
In an economy that is slowly recovering but still perceived as feeble, even a minor “terrorist attack” -- or any incident that could plausibly have that label pinned on it -- would give the neocons the fuse to ignite the fuel. We might well be thrown back a decade, to a time when the “war on terror” dominated national life in a way that teenagers of today can hardly imagine.
The HS Complex, as big as it is, could grow by leaps and bounds. Presidents Eisenhower, Kennedy, and Reagan all showed that an M-I-I Complex widely seen as immense could actually grow much bigger.
There’s one other vital piece in this ticking time bomb: the weakness of any alternative narrative to interpret a “terrorist attack.” Whenever there’s sudden, unexpected violence, people will demand some narrative or other to make sense out of it. That’s why we have news media: to “give us the story.” Right now, the only story the media can even think about using is the “war on terror” tale.
It wasn’t always thus. When an attempt was made to bomb the World Trade Center in 1993, the Clinton administration treated it as a criminal attack. The media followed suit. And eventually people were convicted for the crime through due process of law.
In the weeks after the 9/11 attack, there was some effort to invoke the same “crime and justice” narrative. But it was quickly eclipsed by the story that labeled the attack an “act of war,” and that view has prevailed ever since.
As long as there is no other narrative on the scene to effectively challenge the “war” story, the “war on terror” time bomb will keep on ticking. The longer it ticks, the more likely it is to explode, plunging us back into the world that Dick Cheney once assured us would be “the new normal” forever.
Claire Danes in "Homeland."
In our home the State of the Union address was not followed by the Republican reply. We skipped Marco Rubio’s rebuttal in favor of watching a DVD of old “Homeland” episodes. We’re finally catching up on the first season of the “CIA versus terrorists” drama that everyone else has been watching and raving about for the past two years.
The incongruity of watching the SOTU and “Homeland” in the same evening was a stark reminder of how much has changed in America in just a few years. “Homeland” would have made a wholly congruous nightcap to any SOTU speech by George W. Bush.
That’s not to say Obama’s “war on terror” policies are so different from W.’s. The similarities as well as differences have been parsed at length by the pundits, and similarities there are aplenty. But the tone of American life has changed so much now that we have a “hope and change” president instead of a “war president.”
“Homeland” takes us back to the dramatic world that W. invited us into: a world where evildoers lurk unseen beneath the surface of American life, a life that is constantly (if sometimes only slightly) on edge, because no one knows for sure where and when sudden death may strike again, as it did on September 11, 2001. W. fit easily as an actor in that world. Indeed he gave himself a leading role in the drama.
We may not have been happier in that world of so recent yesteryear. But “Homeland” reminds us why so many Americans found it gripping and exciting: It seemed like a matter of life and death. That’s the stuff great theater is made of.
Barack Obama’s SOTU, like every SOTU, was meant to be great theater too. Yet there was something less than satisfying about the show. Watching “Homeland” made it clear what was missing in Obama’s show: The death-dealing bad guys were nearly invisible. The “terrorists” got a very brief mention, mostly to assure us that they could be defeated by technical means, like any other technical problem, without any compromise of our cherished American values.
The real bad guys lurking constantly between the lines of the speech were the Republicans. But they were never called out by name. And their evil -- the fact that their proposed policies would kill many more Americans than “terrorists” ever will -- was hidden so deeply between the lines, it was practically invisible. So they could hardly perform effectively as the villains in the piece.
The Republicans’ evil had to be hidden because the world that the president created in his address was such a utopian world, where everything wrong in American life is just a technical problem that can be fixed with relatively little effort. In Obama’s world evil is simply a temporary error, a lapse in clear thinking, easily corrected under the guidance of a skillful tutor.
Obama took us back to the days of Theodore Roosevelt and Woodrow Wilson, when all we had to do was to reason together. We would surely recognize the logic of his proffered solutions, he seemed to say with every breath. Then, with only the slightest application of good will, all our problems could be quickly resolved. He made it all sound so simple, so obvious.
The world of “Homeland” -- W.’s world -- is the world of Franklin D. Roosevelt (and Winston Churchill), where evil is far more than a mistake in logic. It is a permanent, intractable element of human life. We cannot reason together, because some of us are moved by an impulse to evil that defies all reason. So evil is not a problem to be solved. It is an enemy to be defeated by any means necessary -- perhaps even extra-constitutional means, though that remains a matter for debate.
Few Americans watched the SOTU and “Homeland” in the same evening. But all got a taste of this stark contrast in national narratives when they watched the evening news, where Obama had to share the headlines with an evildoer defeated in the mountains outside Los Angeles. Any TV news editor worth his or her professional salt would probably lead with the story of the dead LA ex-cop, not the SOTU. The battle of good against evil is the heart and soul of all television drama, even on the news.
Yet the utopian impulse can create great theater, too. After all, it rests on imagination and fantasy, which are the root not only of theater but of all entertainment. Utopia is only entertaining, though, if it offers a vision of a completely perfect world that can be attained some day, no matter how distant, without compromise.
Barack Obama will not give us that emotional satisfaction. He is a self-confessed disciple of the theologian Reinhold Niebuhr, who battled fiercely against the utopian political impulse -- created largely by Christians -- that flourished in the days of TR and Wilson. Niebuhr accused Christians of being “unrealistic” because they ignored the classical doctrine of original sin: evil is a permanent fact of life, which we must wrestle with forever. It’s a testament to Niebuhr’s enduring influence that being “unrealistic” and “utopian” remains the cardinal sin of American political life.
Obama danced on the edge of that sin in his SOTU. The tone he set left an unmistakable sense of utopian aspiration. Yet it remained merely a vague impression because every time he approached the edge of utopia he backed away, as he always does, for the sake of “realistic” compromise with the GOP evildoers.
The question Obama's SOTU speech poses is whether the utopian impulse can be resurrected in a nation that has been gripped for so long by the drama of good against evil, a nation that has made the war against evildoers the essence of its national narrative.
Obama himself can never be the agent of utopia’s resurrection. But John F. Kennedy was hardly a true utopian either. And his rhetoric played a role -- a major role, some historians think -- in creating the brief era of the late 1960s, when the utopian impulse flourished throughout the land once again.
Of course JFK had MLK to do the heavy utopian rhetorical lifting. Dr. King had studied Niebuhr carefully, and he too asserted the reality of evil. But he threw in his lot with the faith of the Christian utopians who were convinced that some day evil will be overcome, not by war but by the reason and good will of humanity. No one can say how long it will take. The arc of the moral universe is long. But it bends toward justice and the perfection of the beloved community.
Obama has no one with nearly the stature of MLK to offer such a message to America today. So his tantalizing hints of utopia must do their work on their own. We don’t yet know what that work will be. Just as no one in the days of JFK could predict what effect his words would have, so we cannot predict the long-term effects of Obama's turn toward utopian imagination.
Stay tuned for our next exciting episode.
Swedish B.D.S. poster. Credit: Wikimedia Commons.
Andrew Meyer wants us to believe that anyone who opposes Zionism, for whatever reason, is inherently anti-Semitic. He starts from the premise that we should focus on historical effects rather than intentions. Perhaps he thinks that restriction works to the advantage of his argument.
After all, it’s obvious that plenty of people have opposed Zionism with no anti-Semitic intent. Before World War II many Jews -- perhaps a majority of the world’s Jews, and certainly a vast number of devout Orthodox Jews -- opposed the Zionist project in principle. They surely had no anti-Semitic intentions. There are still plenty of Jews today who oppose Zionism. Some of them, especially in Israel, make a very thoughtful case that Zionism is ultimately harmful to the best interests of the Jewish people. Their intentions are obviously not anti-Semitic. So looking at intent certainly would undermine Prof. Meyer’s case.
But even if we look only at historical effects, his argument is mistaken. It really boils down to one claim: “Israel has been the single greatest impediment to institutionalized anti-Semitism in the international arena.” Without a Jewish state, he argues, “Jewish communities throughout the world” would lack “concrete protections” from anti-Semitism, and there would be “a more favorable climate for the growth and spread of anti-Semitism.”
That argument might have been convincing once upon a time. Historians will probably argue about it forever.
Today, though, there can hardly be any doubt that Israel is actually increasing anti-Semitism around the world. Every day Israel is creating more opposition, antagonism, and sometimes anger toward the Jewish state -- not because of its mere existence, but because of its palpably unjust treatment of Palestinians, its unjust (and too often violent) military occupation of Palestinian land, and its reluctance to make a just peace that would leave it living alongside a viable Palestinian state.
The growing atmosphere of worldwide criticism of Israel hardly helps erase the vestiges of anti-Semitism. On the contrary, it does more than anything else to keep anti-Semitism alive. Most critics of Israel’s policies know that this effect is unfortunate and unfair. They say that they object to the Jewish state’s treatment of Palestinians, not to Jews or Judaism, and there is no reason to doubt their sincerity.
However unfair it is, though, this historical effect of fostering anti-Semitism is understandable. The leaders of Israel’s government in recent years have insisted loudly that their state has, will always have, and must have a “Jewish” identity. As Prof. Meyer points out, there is no consensus on exactly what that means. But the general message comes across emphatically: Whatever Israel is and does, there is something uniquely Jewish about it.
What the rest of the world sees Israel doing, more than anything else, is occupying and oppressing Palestinians. So it’s easy enough -- even if illogical in the strict sense -- to conclude that military occupation and oppression are somehow essential expressions of Jewish identity. That’s bound to fuel anti-Jewish feelings.
Similarly, the leaders of Israel have always insisted that the state acts on behalf of all Jews, everywhere. Those leaders have done whatever they could to make that claim true, and they have largely succeeded. Israel is widely seen as the primary agent, and in a sense, the embodiment of the Jewish people on the world stage. So it is natural that many non-Jews would understand Israel’s actions as deeds done by the Jewish people at large. Since the most public of those deeds are morally dubious, at best, it is inevitable -- though again, illogical in the strict sense -- that many observers will have an increasingly negative view of Jews.
The process works in yet a third way: Growing numbers of Israel’s critics are persuaded that there is something inherently unjust in a state that privileges one group of people over all others. This argument is heard much more widely now than it was twenty or thirty years ago. Anyone who has watched events over those decades knows why: More and more people every year are concluding that the Jewish state is incapable of mending its ways. The facts on the ground give support to the (once again, logically erroneous) argument that a Jewish state is bound to be an oppressive state, which further fuels anti-Jewish feelings.
The points I’m making here are so well known and so widely discussed that I’m surprised Prof. Meyer ignored them. You can find some columnist worrying about them in the Israeli press nearly every day.
I’m surprised by something else. Prof. Meyer says he “stand[s] with the Palestinian people in demanding their right to statehood, and decr[ies] the injustice of the Israeli occupation.” And he defends his college’s sponsorship of a public discussion on the “boycott, divestment, sanctions” movement. All laudable sentiments.
So I wonder why he ignores the actual effect of writing an article titled “Anti-Zionism Is Anti-Semitism.” He must know that this slogan is commonly used to stifle expression of exactly the views he holds. In fact, the slogan is most often used to try to silence all criticism of Israel’s policies and actions, no matter how unjust or inimical they are to the interests of peace.
Many readers of the news, on the web or in print, never get past the headlines. So by choosing to write on this topic Prof. Meyer, no matter how unwittingly, is serving the same unjust policies he criticizes. And he is aiding, no doubt unintentionally, the suppression of the free debate that he actually wants to foster.
He notes at the end of his article that critics of Israeli policies are often perceived, by supporters of those policies, “to be evasively concealing” their true agenda. Unfortunately the same perception readily applies to anyone who writes an article titled “Anti-Zionism Is Anti-Semitism,” regardless of his intent or the content of his writing. I absolutely do not believe that Prof. Meyer had any hidden agenda in writing this article. Other readers might not be so generous. Effects are often as unfair as they are illogical. But I heartily endorse Prof. Meyer’s view that they must be taken carefully into account.
The on-again, off-again debate is on again: Does the executive branch of the United States government ever have the right to assassinate American citizens without due process of law? A brave soul, who hopefully will remain nameless, has leaked an internal Justice Department “White Paper” outlining the Obama administration's reasons for answering “Yes.” A chorus of critical voices answers, just as loudly, “No.”
But most of the critics agree with the administration and its supporters on one point: The question here is about the executive’s power in wartime.
If that is indeed the question -- a big “if” -- history offers a certain kind of answer. Lincoln, Wilson, and Franklin D. Roosevelt all pushed their constitutional authority to the limit during war -- and beyond the limit, critics in their own day and ever since have contended. Yet the overreach of these three presidents (if overreach it was) did little to tarnish their reputations.
Even their critics generally place their actions in a larger context: It’s understandable, though regrettable, that war subordinates all other concerns to the overriding goal of victory. And imagine if the United States had lost any of those wars. Where would we be now? The “White Paper” assumes much the same question as its foundation: Who would countenance a president risking the security of the United States in wartime?
So the document ignores the more basic question: Is this actually “wartime”? Is there a precise parallel between the situation this president faces and the wars his illustrious predecessors waged?
The “White Paper” itself admits that this is a different kind of war: “a non-international armed conflict.” But it ignores the difference. It acknowledges that this is not “a clash between nations.” Yet it consistently treats al-Qaeda, for all practical purposes, as if it were a nation. And it uses all the reasoning that would apply to an old-fashioned war between two nations.
This version of reality -- call it the “We’re at war again” story -- has been so dominant for so long that it’s easy to forget how it began. After the 9/11 attack, the Bush administration made a very calculated decision to declare it an act of war. There was an obvious alternative: After the botched 1993 attack on the World Trade Center, President Bill Clinton chose to treat it as a criminal act, to be addressed by the police and justice apparatus, not the military.
A decade ago, there was still some public controversy about whether the Clinton or Bush approach was the best way to proceed. But that controversy didn’t last long. The war party’s story won out and is still winning out.
Every story creates its own world, a world spawned in imagination. The “war against al-Qaeda” story lends itself very readily to fiction; its world has been depicted in innumerable movies, novels, and TV shows.
Now the “White Paper” offers a valuable confirmation that this imagined world has become the very real world of the Obama administration and the national security establishment. In many respects, it is the world in which all Americans live. The “White Paper” lets us take a good look at its mythic foundations.
In this world, al-Qaeda is not a jumble of separate, vaguely connected cells (as many experts describe it). It is a virtual nation, with a unified, well-disciplined army whose “leaders are continually planning attacks.” Their purported motives are irrelevant; at least they are never addressed in the paper. All that matters is their one and only activity in life: ceaselessly planning attacks.
To make matters worse, “the U.S. government may not be aware of all al-Qaeda plots as they are developing and thus cannot be confident that none [i.e., no attack] is about to occur.” In other words, we must live and act as if an attack were about to occur unless we have firm evidence to the contrary. And since that evidence can never be found -- How can you prove a negative? -- the threat of attack is “imminent” at every moment of every day. That’s the pivotal premise of the story.
But who or what is always about to be attacked? Here the war story’s world gets a bit murky. On the one hand, the target is clearly the entire nation; the “White Paper” repeatedly insists that the president is acting only to protect the nation from attack. On the other hand, the document insists just as often that he is acting to protect individual Americans from attack.
The two kinds of attack are treated as interchangeable. So the war story, in effect, makes every person in America an embodiment of the nation. An attack on any one, if somehow linked to al-Qaeda (or an “associated force”), is the equivalent of a whole al-Qaeda army invading our homeland.
Is any attack on an individual American, by definition, really an attack on America itself and thus an act of war? Yes, the “White Paper” assumes -- if the attack is planned and carried out by al-Qaeda (or an “associated force”). Yet it never offers any argument to substantiate this claim. There’s no need for an argument. Within the world of the war story it’s a tautology: Since al-Qaeda is, by definition, at war with us, any violent deed it or its associates commit is, by definition, an act of war. Within another story -- say, Clinton’s story of 1993 -- the same deed would be a criminal act, calling for a hugely different kind of response.
The “White Paper” occasionally mentions a third kind of attack, on U.S. “interests.” These remain undefined. But it treats any attack on our “interests” as equivalent to an armed invasion of the nation -- even if those “interests” are on the other side of the globe. In the war story, “the nation” is an expansive concept, indeed.
Those are the highlights of the war story and the world it creates. The crucial question that the “White Paper” raises is whether this is the world we want to live in. Once we recognize that this world is a product of imagination, born from one story among several that we might have told after 9/11, we also recognize that we are not forced to live in this world. It is a choice.
The ultimate results of this choice are clear enough. There are uncounted numbers of people dead. A few of them are U.S. citizens. Some (we shall never know how many) may actually be planning an attack that might kill people on U.S. soil. And some (more than we would like to imagine, perhaps) are wholly innocent “collateral damage.” Their deaths raise powerful anti-American sentiments and motivate a few among the survivors to become active planners of attack on the United States.
Growing anti-Americanism reinforces one more inevitable result of the war story: a distant, muffled, yet very real and constant drumbeat of cultural anxiety that has become part of the soundtrack of American life.
The debate about whether the executive has the right to execute U.S. citizens without due process in wartime is certainly an important one. But isn’t it rather more urgent to debate whether we want to live in this frightening imagined world of “wartime”?
The American people may collectively choose this world despite its perils. One sign: The public endorses the president’s policy of extra-judicial killing of U.S. citizens, according to polls. In fact pollsters no longer find it a controversial issue; the most recent poll I could find that asked the question was a full year ago.
Perhaps most Americans have forgotten that another story is possible. Or perhaps most prefer to be at war. The war story, and war itself, have undeniable appeal. And a “good war,” in which the enemy is absolutely evil and the only Americans who die are “bad guys,” is so much more appealing.
But if that’s what the public wants, at least it should be a conscious choice. Then, if there’s another attack on U.S. soil, we will have to acknowledge that the story we chose to tell played a role in making the attack more likely.
Of course we could choose a different story and a different world, one where police and judiciary action rather than war is the proper response to attacks on U.S. soil. Then the question about the executive’s right to kill citizens extra-judicially would simply evaporate. Wouldn’t that be a simple, elegant way to end the debate?
John Kerry embraces John McCain at his recent confirmation hearings. Via Flickr/Glyn Lowe.
At his confirmation hearing, the new Secretary of State, John Kerry, declared flatly: “Foreign policy is economic policy.” Now them is fightin’ words if they’re spoken by a scholar of U.S. foreign policy. Scholars of the “revisionist” school have been attacked, reviled, and marginalized for decades simply for saying what Kerry seemed to say: Economic motives are the main drivers of foreign policy. So when revisionists hear a top government official say it out loud, it’s like discovering gold: It’s hard evidence that their view is correct.
And, like discovering gold, it doesn’t happen very often. When U.S. government officials speak in public, they are usually careful to say that American foreign policy has one overriding aim: promoting American values and ideals around the world. Those values and ideals hold true everywhere, the official narrative has always insisted. So our foreign policy goal is to promote the good of everyone, all over the world.
It is permissible, sometimes obligatory, to add that U.S. foreign policy also aims to protect the United States against enemies. And that can readily lead to the goal of “promoting American interests.” But the official narrative assumes that the stronger America is on the world stage, the more able it is to promote its universally true values, which are the only key to world peace. So there can be no conflict between our interests and our altruistic ideals. That identity of interests and values has always been the bedrock of the official story.
It was still the bedrock on Inauguration Day, when Barack Obama proclaimed: “We will defend our people and uphold our values through strength of arms and rule of law. ... Our interests and our conscience compel us to act on behalf of those who long for freedom. ... Peace in our time requires the constant advance of those principles that our common creed describes.” The official narrative seemed alive and well.
But just three days later Senator Kerry -- a solid pillar of the foreign policy establishment -- had surprisingly little to say about values and ideals in his statement to the Senate Foreign Relations Committee. He did talk openly about “advanc[ing] America's security interests in a complicated and even dangerous world.” And he warned that “we will do what we must to prevent Iran from obtaining a nuclear weapon.”
But Kerry went out of his way to emphasize what revisionists have long seen as the most precious, closely guarded secret: Economic interests are the mainspring of foreign policy. And he treated it as if it were an obvious, ordinary observation.
The establishment press put the spotlight just where the new head of State wanted it. “Kerry Links Economics to Foreign Policy,” the New York Times headlined. Though he “outlined no grand agenda for the next four years,” the Washington Post reported, “the closest he got to a foreign policy mission statement” was the simple equation: foreign policy = economic policy.
As Kerry explained himself, he gave a whole arsenal of ammunition to the revisionist argument. He introduced his discussion of economics and foreign policy this way: “It’s often said that we can’t be strong at home if we’re not strong in the world.” Then he warned that the U.S. is at risk of losing its “leverage ... strength and prospects abroad.”
Leverage and strength for what prospects? Kerry gave several kinds of answers.
The first sounded like classic revisionist theory: “The world is competing for resources and global markets. Every day that goes by where America is uncertain in that arena, unwilling to put our best foot forward and win, unwilling to demonstrate our resolve to lead, is a day in which we weaken our nation itself.” In other words, the global economy is like a huge pie. America’s strength is defined by how big a slice we get. The goal of foreign policy is to make sure we get a bigger slice than anyone else.
Kerry’s other explanations for building American “leverage” and “strength” were less direct. “The first priority ... as we work to help other countries create order ... will be that America at last puts its own fiscal house in order.” “Order,” revisionists point out, has been a central term in American foreign policy discourse for a long time. It’s a code word for a stable capitalist system, where capitalists can safely predict that they’ll get a decent long-term return on their investments.
Kerry was warning that if capitalism can’t guarantee long-term prosperity in the U.S., foreign nations will not be so eager to accept American investments in their own land.
He also had another kind of warning: “It is hard to tell the leadership of any number of countries they must get their economic issues resolved if we don't resolve our own.” Getting “their economic issues resolved” is another coded message, one straight from the fount of common capitalist wisdom: To create the “order” that makes investment safe, many nations must cut public expenditures drastically. To make the world safer for American investors, the U.S. government must use its “leverage” and “strength” to compel other governments to make those painful cuts.
The underlying premise here is the premise of all U.S. foreign policy since at least the 1930s: America’s role in the world is to create and safeguard global “order” -- to make the world safe for capital investment, especially American investment. The U.S. is entitled, in fact obligated, to impose “order” everywhere, by any means necessary. Now that means, in most cases, imposing austerity.
But, Kerry said, demands for austerity from the U.S. won’t be credible if we have a huge budget deficit of our own. So “the first priority of business which will affect my credibility as a diplomat ... is whether America at last puts its own fiscal house in order.”
No doubt Kerry said all this to support Obama’s budget battle against the Republicans. Obama’s call (in his inaugural speech) “to act in our time” was a message to the GOP to quit their obstructionist ways and accept the centrist compromises the president is always ready to offer. The administration is trying to make the case on every front that the nation’s well-being demands it. Kerry was showing that he’ll be his boss’ loyal servant and sound appropriately urgent.
But Kerry’s eagerness to make the “Foreign policy is economic policy” case reflects more than short-term political tactics. It’s a sign that the official narrative of American foreign policy is changing, or at least is open to change. Top officials are ready to say openly what revisionists claim they’ve been saying privately, among themselves, all along (and revisionists have plenty of evidence to support that claim).
Why the shift?
The government always faces a major problem when it comes to foreign affairs: Not many Americans care much about the rest of the world, and certainly not about spreading American ideals throughout the world. Government officials have to come up with some other reason to justify their extensive involvements abroad and the tax dollars they spend on those involvements.
It’s not so hard when there is some clearly identified enemy to fight -- as long as the public thinks their tax dollars are buying American victories. Now, though, the only “victories” are pinpoint attacks on “terrorists,” and Obama wants to preserve his freedom in that fight by keeping it secret. The obedient Kerry’s single mention of “terrorism” and “drones” was to downplay their importance.
How can the whole foreign policy enterprise be justified today? At a time when public opinion focuses so single-mindedly on the economy, the answer is obvious: Just say, loud and clear, “Foreign policy is economic policy”; there’s a global economic struggle going on; we Americans need to be strong enough to win it; the only way to win it is to control economic life around the world.
And there’s no great danger for an incoming Secretary of State to say all that, nor to have it headlined in the nation’s leading newspapers. The revisionists have been so effectively silenced that their cries of “I told you so” are not likely to cause much of a ripple. So there’s no reason for the foreign policy establishment to be afraid of their criticism.
But there’s a lesson here that foreign policy revisionists might want to ponder. The stories that interpret and justify public policies -- I call them myths -- are created for political purposes. They can shift as quickly as the political winds. Sometimes those winds blow a heavy dose of truth into the myths. That’s why a myth is not a lie; it’s a mixture of truth and falsehood, with the proportions depending, in large part, on the political needs of the time.
Precisely because the political winds can shift so quickly, groups that have little influence today may find themselves with a lot more influence tomorrow. So the revisionists have good reason to store up their political resources, polish up their own myths, and pack them with as much empirical truth as they can. The golden nugget offered by John Kerry is a treasure that can serve revisionists well for all three of those purposes.
I once heard a prominent expert on contemporary Islam say that Al Qaeda is not an organized group (and this was while Osama bin Laden was still alive). It isn’t even, primarily, a group of people at all. Al Qaeda is best understood as a body of discourse, a way of talking.
How do you fight a body of discourse? With another body of discourse, of course. The United States government is doing that in all sorts of ways, spreading the gospel of democratic capitalism and the American way of life.
But how do you make a movie about a war between two bodies of discourse? If you want to win awards, pack the theaters, and turn a profit, you don’t. A good movie has to start with a mythic script. And it’s awfully hard to find the myth in a war of discourse versus discourse.
So you make a movie about a war of good guys against bad guys. That’s about as mythic as it gets. It’s the American war story that has been made in Hollywood a thousand times -- well, a thousand and one, now that we have Zero Dark Thirty. I’m finally getting around to writing about the film, after just about everyone else in the world has had their say, because I finally got around to seeing it. It turns out there was no reason to rush anyway.
After all the controversy about the torture scenes, and Kathryn Bigelow’s highly publicized claim that her film merely depicts the horrors of American behavior in the “war on terror,” letting us all make up our own minds about the moral issues, I was expecting some complexity and ambiguity -- something like what she gave us in The Hurt Locker.
Instead I got two hours and thirty-seven minutes of classic, mythic American war movie.
I suppose the difference between Bigelow’s two films is a good index of the difference between the two wars they depict, as far as public perception goes. A popular film about the Iraq war was bound to be ambiguous because, once Saddam Hussein was gone, no one could tell exactly who the enemy was. They were simply (as so many American soldiers told us on the evening news) “the bad guys.”
But as long as Osama was alive, the war against Al Qaeda was a perfectly unambiguous war. In fact it was a “good war,” because it fit so well the prototype of all “good wars”: the war against Nazi Germany. Both were waged against forces that had, without a doubt, done terrible things. And in both wars American forces also did terrible things. But American deeds were rarely called into question because the enemy’s deeds were so indisputably evil.
There is one major difference between World War II and the war against Al Qaeda: the Germans never made a significant attack on American soil. In that sense, the war against Japan is a better parallel to our current war.
But mythically (and thus cinematically) the war against Germany remains the prototype, for many reasons, no doubt. Zero Dark Thirty reminds us of one big reason: German forces were led by a single arch-villain, the man who remains for Americans the epitome of evil. Leadership in Japan was more diffuse. Since World War II, Americans have needed an enemy led by a single “Hitler figure” before they would sustain support for a war. Osama was the Hitler-est of them all.
Zero Dark Thirty also fits the WWII movie mold by giving us a superhero who wins the day. Granted, a woman who defeats the enemy by brainy manipulation of digital data is a far cry from John Wayne in the trenches. She’s a fine measure of how much American culture has changed in the last half-century or so.
Nevertheless, Zero Dark Thirty fits the WWII mold: a gripping story of one purely good person defeating one purely evil person (and an inept bureaucracy on her own side, to boot). It’s a dark, dirty job the superhero must do, even if she wears a clean white collar. But, then, as long as the evildoer is at large, commanding his forces of evil, it’s a dark, dirty world. Someone has to do the dirty work to clean up and purify this dirty world. Someone has to descend into the darkness to create a bright new light for all of us to bask in. And that someone, our national story insists, must be an American.
There’s another important parallel linking the war against Germany with the one against Al Qaeda. In both cases, neither the troops doing the fighting nor the general public knew very much at all about the beliefs, values, or ideologies that drove their enemies. They simply “knew” (that is, believed) that the arch-villain and his minions were evildoers who threatened the very existence of the United States and thus had to be stopped at all costs. That was the essence of the myth.
Zero Dark Thirty reflects that myth quite perfectly. We never get a hint of interest on the part of the American fighters in why their enemy perpetrates violence. This is Hollywood -- or perhaps I should say, this is America -- and it just doesn’t matter. As long as there is an American superhero pitted against a foreign arch-villain and our superhero wins, no questions need be asked. Perhaps that’s why all the controversy about this film has centered on the torture scenes, not on the simplistic, superficial, conventionally American triumphalism that prevents it from being a great film.
After the arch-villain is vanquished, though, there is one question to be asked -- the question that ends Zero Dark Thirty: Now that you are no longer threatened by the evildoer, “Where do you want to go?” I presume Kathryn Bigelow wants the audience to see Jessica Chastain, at that moment, as a symbol for America. Once we have defeated evil, where do we as a nation want to go?
The tear falling down Chastain’s cheek tells me that the question is supposed to make us all cry. Why? Finally, in the last seconds of a very long film, a note of ambiguity: You decide why.
I don’t have any trouble with that one. We Americans can unite so readily and act so effectively, as a nation, as long as we believe we are fighting an evil that threatens our country, or, to use Michael Sherry’s apt phrase, as long as we feel that we’re “in the shadow of war.”
But suppose we could escape from that shadow into a world that is no longer dark and dirty? Could we unite and choose a positive new direction for our nation? Our history since the 1940s suggests that we have largely forgotten how to do that.
We do fine when we are acting out our mythology of national insecurity. But if we try to think about acting out a mythology of hope and change, we don’t know how to change or even what to hope for. That is indeed worth shedding a tear for.
Barack and Michelle Obama at his second inaugural. Credit: Flickr/Adam Fagen.
There were passages in Barack Obama’s second inaugural address that sounded like a European prime minister from a Labor or Social Democratic party addressing his Parliament. Obama had a whole laundry list of progressive proposals. Some were explicit:
“Care for the vulnerable and protect people from life's worst hazards and misfortune” through “Medicare, and Medicaid, and Social Security”; “respond to the threat of climate change”; make sure that “our wives, our mothers, and daughters can earn a living equal to their efforts. … our gay brothers and sisters are treated like anyone else … no citizen is forced to wait for hours to exercise the right to vote”; “find a better way to welcome the striving, hopeful immigrants.”
Some of the progressive program was implicit:
Protect the environment with “the technology that will power new jobs and new industries” (presumably funded generously by government); “revamp our tax code” (presumably to make the rich pay more); “reform our schools and empower our citizens with the skills they need” (presumably with more public funding for education); keep “all our children … always safe from harm” (presumably through gun control laws).
Yet Obama could not actually come across, on Inauguration Day, as a progressive prime minister. The occasion has traditional rules, written and unwritten, that bind any president, no matter what his or her political views. There must be pomp and ceremony, strict protocol, splendor and grandeur. There must be patriotic praise of America, religious praise of God, and ample assurance that the two are inextricably connected.
In other words, the occasion must be a coronation, and the star of the show must act, to a considerable degree, not as a prime minister but as a king. While the particulars of the ceremony are uniquely American, its underlying structure can be traced back to royal rituals of the third millennium BCE, when a new king received his crown and scepter (typically from priests) amid the same kind of pomp and splendor.
One scholarly opinion explains these ceremonies in terms of a worldview that saw the state as an island of order surrounded by a threatening sea of chaos. The ruler and the axis connecting him to the gods were the linchpins of order. So the demise of a ruler was an immensely threatening event. The new ruler had to be installed according to an elaborately structured ritual to protect the vulnerable state from tipping over into chaos.
The new ruler’s job was the same as the old ruler’s: to continue that protection by living every moment of his royal life according to the traditionally prescribed, ritualized rules. The state was a 24/7 dramatic production -- a “theater state.” As long as the heroic lead actor performed perfectly, the order of the state (and, in most versions of the “theater state,” the world) would be preserved.
A more skeptical scholarly view holds that the real intent of the coronation ceremony was to overawe the inhabitants of the state as well as its potential enemies, to impress upon them the immense power wielded by the new ruler. The same intent motivated the daily ritual and grandeur of the royal court after the new king was installed, this theory holds. If everyone was impressed enough with the king’s power, they would obey his commands and refrain from any kind of resistance. Thus the prevailing status quo -- the existing order -- would continue undisturbed.
So both theories arrive, by different routes, at the same conclusion: The pomp and splendor symbolized a guaranteed assurance of permanent order in the face of an ever-present threat of chaos. Maintaining the status quo was the essential -- and essentially conservative -- purpose of the “theater state.”
During the presidential campaign I wrote about the debates as an example of the “theater state.” I suggested one lesson from the survival of this ancient tradition in our democracy: Americans want their president to be, in some sense, like a king, offering “the reassurance that comes from seeing and hearing the same ritualized words and behaviors, over and over again, in a well-acted political theater.”
The presidential inauguration is more obviously a direct descendant of the ancient “theater state.” It shows more clearly that the president must be both prime minister and king. No matter how progressive he may want to be in the former role, his royal obligations force him to be the guarantor of the status quo, hence essentially conservative.
When Obama concluded his inaugural address with an appeal to the citizenry to “shape the debates of our time -- not only with the votes we cast, but with the voices we lift in defense of our most ancient values and enduring ideals,” he offered a fine example of the dual role that he must play.
The call to lift voices is a pragmatic prime minister’s tactic: Mobilize public opinion in support of the ruling party, so that the opposition will see more political risk than benefit in blocking the PM’s program.
The invocation of what is “ancient” and “enduring” also has a pragmatic purpose: to win over wavering centrists to at least some parts of his program. But no matter what a president’s political calculations may be, he or she is compelled to use such language on inauguration day. It is the obligatory royal language fit for the occasion of a coronation ritual. It is inescapably conservative language, because the conservatism inherent in the role of the king is inescapable.
Barack Obama may be comfortable with that inescapably conservative element of his job, or he may be quite unhappy about it. After four years I still can’t tell. In any event, though, he is stuck with it because, in a democracy, government must give the people at least some of what they want.
Thomas Jefferson thought that his victory over the Federalists in 1800 dealt a decisive defeat to the desire for monarchy in the United States. He made his inauguration an extremely modest affair to symbolize that point. So far, at least, it seems that Jefferson was wrong.
Yet precisely because this is a democracy citizens can shape the outcome of the political process. Indeed, as Obama said, we “have the obligation to shape the debates” -- and, he might have added, the inaugurations -- “of our time.”
Camels in the Sahara near In Aménas, the site of the hostage crisis. Credit: Flickr/albatros11.
Barack Obama and his political advisors surely thought that gun control would dominate the headlines for days to come after the president announced his controversial proposals. But some armed men in a remote gas drilling site in the Sahara desert had other ideas.
The pundits love to tell us that a president who focuses on domestic policy is inevitably frustrated, because there are bound to be unexpected crises abroad that demand his, and the nation’s, attention. But there’s really nothing inevitable about it. It’s a choice that the public, and the news media who must sell their wares to the public, make.
Certainly the lives of the people at risk in the Sahara are important. It’s a tragedy when anyone is killed. But let’s face it. A handful of American lives may be lost in Algeria; maybe not. Whatever the outcome, this incident will soon disappear down the American memory hole.
In the gun control debate, on the other hand, we’re talking about a continuing threat to a huge number of Americans. Thousands of lives will surely be lost this year, and next year, and the year after that, ad infinitum, if the laws aren’t changed. Yet the gun control issue was quickly eclipsed by the public’s rapt attention to the hostage drama in the desert.
Historians may not be surprised. White Americans have been fascinated by stories of their own people being taken hostage by “bad guys” ever since the seventeenth century.
Back then, many colonists were captured by native warriors. To the native Americans it was perfectly logical: Whenever some of their people were killed by whites, they would capture -- not kill -- a roughly equivalent number of whites to replace the lost members of their community. It wasn’t about good destroying evil. It was about maintaining an approximate balance.
But the whites didn’t understand that. As most of them told the story, absolutely good (white) people were locked in an endless struggle with absolutely evil (native) people. When whites were taken captive by natives, whites typically saw it as a violation of the most basic moral rule: good should triumph over evil. When whites escaped, it was easy to explain it as an act of God, restoring the proper moral order of the universe.
That’s how Mary Rowlandson told the story of her captivity in The Sovereignty and Goodness of God (1682), a book that quickly became a best-seller and continued to be widely read for more than a century. But Rowlandson’s is only the most famous of the many so-called captivity narratives that have captivated the imaginations of white Americans ever since.
What makes these stories so compelling? Historians have made a cottage industry out of finding new answers to that question. One intriguing theory begins with a well-documented observation: Plenty of captured whites were in no rush to go home. A good number chose to “go native” and live out the rest of their lives among the Indians.
This widely known fact freaked out a lot of white people; it turned their world upside down. If the “good” voluntarily chose to blend into the “evil,” how could anyone be sure anymore where the line was that separated the two? And if that line was blurred, how could there be any moral order at all?
What these whites needed, above all, was reassurance that the moral line dividing them from the native people was absolute, impermeable, and immutable. That’s why captivity narratives were so popular (this theory goes): In these tales, the whites were always absolutely good and the natives absolutely evil. Telling and reading the stories over and over again was a way of reaffirming the simplistic moral fantasy as the true reality, which made it easier to treat the observable, empirical world as if it were not real.
Is it still going on today? The parallel is far from perfect. There’s no evidence that armed Muslim forces want to capture white Americans to populate Muslim communities. Yet white America still has an insatiable appetite for captivity narratives.
And the reasons behind that appetite may very well be much the same. Even if very few white Americans have visibly “joined Al Qaeda” (whatever, exactly, that might mean), lots of white Americans feel increasingly unsure that they can see a clear-cut, absolute line between good and evil.
The gun control debate is a fine example. While some Americans are sure that stricter gun laws would be good, and some are sure those same laws would be evil, a vast number in between aren’t quite sure of anything. So the public as a whole is in a state of moral confusion. The same is true, of course, about so many other issues.
On at least one point, though, there is an overwhelming consensus: Al Qaeda (whatever it may be) is evil. So when America is attacked by, or pitted against, Al Qaeda, America is self-evidently good.
Ditto when Americans are captured by Al Qaeda -- especially if the capture is “masterminded” by a fierce-looking, black-turbaned, one-eyed Muslim with the exotic name Mokhtar Belmokhtar, AKA “the Uncatchable.” It sounds too wicked to be true, like something “straight out of central casting,” as the Times of London said -- although in a grade-B Hollywood movie, where we would expect to find him, he would be called simply “the Evil One.”
It’s precisely the mythic quality of this story, and of all captivity narratives, that makes them so fascinating. In myth, as in Hollywood, all the world’s shades of gray can be boiled down to simple black and white.
There’s a perverse sort of advantage when good people are captured by evildoers rather than killed outright. Attacks and battles are usually short-lived affairs. The story is told, and then we’re quickly on to the next story. Nothing is as stale as yesterday’s news.
But a hostage crisis can continue for a long time. Day after day we get to see or read the captivity narrative. And each repetition offers more reassurance that, despite all our disputes and uncertainties, the struggle of good against evil goes on. So we know there are still some absolutes to provide order in our moral universe.
The theory that explains the popularity of captivity narratives also explains why the public so quickly switched its focus from gun control to the drama unfolding in the Sahara. The gun debate only reinforces the sense that no one knows any longer what’s good and what’s bad. The endless news about the hostage crisis eases that disturbing feeling and replaces it with a satisfying reassurance that, ultimately, all is still right with the world -- even if a bunch of people have to die to prove it.
(For a look at the mythic qualities of the gun control debate, see my recent post on ReligionDispatches.org.)
Martin Luther King, Jr. with President Eisenhower in 1956.
You probably know the mythic Dwight Eisenhower, the “great peacekeeper in a dangerous era,” who bravely withstood the communist threat while skillfully avoiding all-out war. The quote comes from Evan Thomas, the latest writer to make a mint by retelling the tale. It would hardly be worth noticing, except that pundits keep trotting out the mythic Ike as a model for the current president to follow.
Latest example: the Washington Post’s influential foreign affairs columnist David Ignatius, a dependable megaphone for the centrist foreign policy establishment. He’s praising Thomas’ book, Ike’s Bluff, for supposedly showing us how a great president deals with “continuing global threats … that require some way to project power.”
Thomas’ book bears the grandiose subtitle “President Eisenhower’s Secret Battle to Save the World.” Save it from what? Why, the “red menace,” of course. And now, says Ignatius, Obama must deal with al-Qaeda and Iran -- who are also, presumably, threatening to destroy the world. Eisenhower had to stop the communist “advance in Europe and around the world,” Ignatius writes. “Obama has a similar challenge with Iran.” Then he tacks on al-Qaeda as the other looming threat to our national security. It’s the myth of homeland insecurity, as clear as you’ll ever see it.
When I call this a myth, I don’t mean it’s an outright lie. Like most of the myths in American political life, it blends a measure of fact with a sizeable dose of fiction to create a narrative that expresses basic assumptions about the world and shapes government policies.
For example: In the very fluid situation created by the devastation of World War II, the U.S. government saw a chance to install its capitalist system solidly everywhere except the Soviet Union. Stalin, seeing his nation potentially encircled by an enemy, naturally did what he could to promote Soviet influence throughout Eurasia.
Eisenhower made this the stuff of myth: “Russia is definitely out to communize the world,” he wrote in his private diary. “Now we face a battle to extinction.” In 1953 Ike carried this fear-stoked exaggeration into the White House. He wrote in private letters that the Soviets were “seeking our destruction,” and his goal was to prevent “the Kremlin’s control of the entire earth.”
To achieve that goal, he was absolutely ready (though certainly not eager) to use nuclear weapons. Sorry, Evan Thomas, but Eisenhower was never bluffing. He told his National Security Council that “if the Soviets attempt to overrun Europe, we should have no recourse but to go to war.” He was equally ready to use nukes to end wars in Korea and Vietnam, he told the NSC, if he thought it necessary. In 1958 he said much the same about the standoff over Berlin.
Eisenhower understood the risks. But he summed up his view quite succinctly to the British ambassador: “He would rather be atomized than communized.” In his mythic worldview, those were both very real possibilities. However, the risk of being atomized arose only because he was approving the most rapid buildup of weapons of mass destruction in U.S. history and making sure that disarmament negotiations could never succeed.
Ike did all this because he took for granted the mythic threat that he, and so many other Americans, had created out of their own fears: the “red menace.” Driven by this image of imminent danger, he sowed all the seeds of a nuclear confrontation that could “atomize” the world. It was largely just luck that allowed him to escape the ultimate showdown.
His successor wasn’t so lucky. JFK had to taste the bitter fruit that grew from the seeds Ike planted.
Despite all this history, which is plain enough to anyone who reads the once-secret documents of the era, the mythic version of Eisenhower continues to be held up as a model that current presidents should follow.
So pundits like David Ignatius encourage Barack Obama to threaten Iran with “economic, military and political destruction if it refuses to make a deal” -- on U.S. terms, of course, which is bound to stiffen Iranian resistance. And he encourages Obama to continue using lethal drones to kill people, without knowing who they are or what their attitudes toward America might be -- which is sure to turn attitudes in the victims’ communities against America.
But all this is done in the name of “national security,” to contain supposed threats that are imagined to be as ominous as the “red menace” that once dominated America’s public imagination. What do we gain by letting our imaginations run away with us again?
Evan Thomas is right on one point: “Public terror was a price” -- the price, I would say -- that the nation paid for Eisenhower’s policies. Why do so many “foreign policy experts” want to take us back to that era of terror, or create a new incarnation of it?
The answer involves more than cynical manipulation. Those “experts” may very well be sincere when they tell us about the terrifying “global threats … that require some way to project power.” The more they discuss the “sources of insecurity” with each other at their high-level conferences and expense-account luncheons, the more they convince each other that their myth is literal fact.
The same goes for the politicians tutored by the experts. Sure, the politicians will lie to get specific policies implemented. But when they tell the tale that shapes their policies -- the story of “impending threat to our national security” -- there is no reason to assume that they are bluffing us.
That’s what I learned from reading thousands of pages of Eisenhower’s letters, diaries, and private conversations. No one can ever know what was in his mind. But in the documents there was never a hint that he was consciously purveying an invented “red menace” narrative. On the contrary, everything he said seemed to take for granted the truth of that myth.
So who knows? The pundits who equate “the Iranian threat” with “the red menace” may really believe it. Barack Obama may believe it too. Looking back to the Cold War years teaches us how dangerous it is when the “experts” and national leaders take their own myths seriously.
Of course we should debunk the falsehoods they purvey. But debunking alone doesn’t weaken the power of a myth. It takes a new narrative. That’s something to think about as we approach a unique convergence -- Inauguration Day and Martin Luther King Day on the very same day. The president, beginning his second term, is hardly likely to give us a radically new narrative. Dr. King already gave us one, many decades ago.
The New York Times has just published an exposé on Fix the Debt, “a group of business executives and retired legislators who have become Washington’s most visible and best-financed advocates for reining in the federal deficit.” It turns out that “close to half of the members of Fix the Debt’s board and steering committee have ties to companies that have engaged in lobbying on taxes and spending, often to preserve tax breaks and other special treatment.” The Times gives plenty of examples to support that charge.
I’m shocked. Shocked. Why, I wouldn’t be surprised if tomorrow’s Times reveals that there’s gambling going on in the back room at Rick’s Café Americain.
Actually, the analogy with the film Casablanca may not be a bad one. Mr. Rick certainly makes enough money from his business establishment to live quite comfortably, dress impeccably, and keep his place among the city’s elite. But we know that he’s in no way just a greedy money-grubber. He aims to be a useful citizen, providing a service that his city needs and offering it at a high level of quality.
I suspect that Erskine Bowles and Alan Simpson, who co-founded Fix the Debt to promote economic policies along the lines of their Simpson-Bowles plan, would probably say much the same about themselves. So would the other leaders of Fix the Debt who were outed by the Times.
They would make the case that they aren’t merely out to engorge their own bank accounts. Smart people who go into government or lobbying know that they are not going to get truly rich that way. They are going to help other people get truly rich, though they’ll be well-paid for their services along the way.
Nor, they’d probably claim, are they hypocrites. Though the Times article never uses that word, it’s the word that will spring to the minds of many readers. The article leaves a clear impression of men (they’re all male) who claim to be serving the public good while actually serving their own private interests.
Fix the Debt leaders might well protest that this view rests on a false dichotomy, as if one must choose between public good and private interests. On the contrary (I suspect they’d argue), the genius of capitalist democracy is to erase the conflict between those two. Our system is set up so that the only way to improve material life for everyone is to let the truly rich get even richer. Call it “a rising tide” or “trickle down.” Either way, if we limit the wealth of the truly rich we all suffer. So enriching private interests is the best way to serve the public good.
If they are historically minded, they might point out that most of the Founding Fathers saw things this way, too. Alexander Hamilton articulated this view at great length on behalf of the Federalists, who didn’t think a fledgling democracy could survive unless it had a strong central government making thoughtful arrangements for the very rich to get richer.
Jefferson and Madison disputed that view on behalf of the eighteenth-century Republicans. But they came from the wealthy, slave-owning elite of Virginia. Jefferson argued against abolishing the slavery that made him rich because it would bring the whole socio-economic system crashing down upon everyone -- rich and poor, white and black, alike. And as president he used the power of government in many ways that helped the truly rich get richer, because he saw those policies as necessary to serve the public good. Every president since has done the same thing.
There are many good arguments, both economic and moral, against the view that our society, or any democracy, must or should or really does work this way. But Simpson, Bowles, and the rest of the Fix the Debt crew probably have never taken those arguments seriously; perhaps they have never even heard them. Nor did most of the Founding Fathers. The rich and their top-flight hired hands generally live in a restricted social circle, where they meet and hear only other members of the economic elite like themselves.
As in any social circle, their conversation is based on shared premises. It constantly reinforces their shared story of how human life works. In other words, they hang out with each other and keep telling each other the same myth. They never get a chance to hear any other myths. So why shouldn’t they genuinely believe their own? I’m not saying they do believe it. I’m just suggesting they might.
It’s worth considering because, if the rest of us assume that they are merely money-grubbers, most of us may easily dismiss the Times’ story with the same mock shock that was my first reaction. Of course they are gaming the system for their own profit, we’ll say. That’s how the system works. It’s the way of the world. So we’ll have our few minutes of righteous indignation and then go on our way, reaffirming our own myth about the selfishness of the rich.
But suppose a major newspaper ran an exposé on the mythic worldview of Fix the Debt’s leaders. Suppose it was clearly explained and traced back to its roots in the worldview of the Founding Fathers. Then at least some of us would feel that we ought to start thinking about it.
Just what is wrong with their assumptions? If we don’t want our society based on those assumptions, can we reform the current system but still keep its basic structure? Or is their elite worldview the foundation of that structure? If we challenge their way of thinking, must we soon find ourselves talking about a revolution?
Back in the 1790s, the Republican attack on the Federalists raised these issues, albeit in a restricted way. Jefferson and Madison never seem to have taken a really long, hard look in the mirror. But even if they had a disturbing measure of hypocrisy, they did spark a debate about whether the United States really needed a federal government helping the truly rich get richer. Sometimes, at least, that debate was conducted on a fairly sophisticated intellectual level.
In the past year the United States has flirted with at least a few threads of that debate, and even that very limited flirtation has been very healthy for us. We should take every opportunity to continue it. If we cut it off with a curt “They all do it,” we steer ourselves toward an intellectual and political dead end, which cuts off any possibility of meaningful change. We ignore the myths of the economic elite at our peril.
As Congress and the administration went through their tortured post-election wrangling (or was it a dance?) over fiscal policy, Americans never seemed quite sure what mythic lens was best suited to viewing the proceedings.
Were we watching ourselves, all together, hurtling toward a cliff and trying desperately to avoid plunging over it? Or were we divided into two political and ideological camps, approaching a final showdown? I explored both the “cliff” and “showdown” metaphors in the run-up to the New Year’s denouement.
Now that a deal has been done and we can watch the public reaction through the news media, which mythic metaphor is the winner?
The many “Who won? Who lost?” evaluations seem to support the “showdown” view. The widespread view that this is merely round one, with more battles between the two parties sure to follow, also seems to give the nod to the “showdown” metaphor. Or perhaps, instead of a single showdown, we’ll start talking about a long, drawn-out "war."
But take a closer look. “Deal done, but Threats Remain; ‘Cliff’ deal averts economic disaster but hazards linger,” the headline article on the Washington Post website gravely warns us. USA Today titles its roundup of opinion “'Fiscal cliff' deal doesn't bode well,” and the editors of that paper conclude that the deal “only resets the stage for the next suspenseful act.” (At least they understand that their job is to turn complex economic and political problems into dramatic stories.) The editorial page editor of the New York Times sums up the common view: “The Cliff is Dead. Long Live the Cliff.”
So we’ve actually ended up with a story that blends the two dominant metaphors. It tells us that we are still heading toward, or perhaps teetering on, the brink of a disastrous cliff, precisely because more showdowns between the two major parties lie ahead.
This isn’t a myth that Americans are very familiar with. The closest parallel might be the “government shutdown” deadlock of 1995, which led to two brief suspensions of many federal services. But few people are likely to recall that as a dreadful disaster; the immediate aftermath that’s best remembered is a spike in Bill Clinton’s popularity ratings.
Franklin D. Roosevelt tried his best in 1937 to depict a political showdown as a looming economic disaster for the nation. But his public showdown was principally with the Supreme Court, and only secondarily with conservatives (especially Democrats) in Congress. He lost both fights, and though the economy continued to struggle there was no precipitous decline (in part because the Supreme Court ended up approving a number of New Deal measures that FDR feared would be struck down).
Perhaps the closest historical analogy to the doomsayers’ view of the future would be the 1850s, when intractable political struggle split the nation apart. But that was not a struggle over economic policy (at least not in the public imagination). And the result was a temporary calamity resulting in long-term benefit to the nation, as far as most Americans are concerned.
So the idea that an ongoing political war over economic policies might truly bring national disaster has few if any roots in the soil of the American mythic imagination.
Of course the idea of living on the edge of disaster has powerful roots in modern American history in the realm of foreign affairs. During the early Cold War years Americans became accustomed to living on “the brink,” as it was commonly called, of nuclear war.
In 1956 Secretary of State John Foster Dulles explained in Life magazine that “the ability to get to the verge without getting into the war is the necessary art” of Cold War diplomacy. Dulles left no doubt that he and the whole Eisenhower administration had mastered the art of brinksmanship. Historians still debate that claim vigorously.
But there’s little debate that during the Eisenhower era the myth of homeland insecurity -- the story of a nation living constantly on the brink of catastrophe, in what the president called “an age of peril” -- became “the new normal,” as a White House staffer put it. So Americans were afraid, but not terribly surprised, when Ike’s successor, John F. Kennedy, had to face the brink most directly during the Cuban missile crisis.
Nor should anyone have been surprised when the myth of homeland insecurity arose again with such power within hours after the Twin Towers fell on September 11, 2001. The sense of a nation living always on the brink, facing a constant threat of destruction, had indeed become “the new normal.” So -- again, no surprise -- Vice President Dick Cheney raised few eyebrows when he said that we’d have to get used to “the war on terror” as “the new normal” forever.
When it comes to national mythology, the lesson of the Cold War years was that domestic political showdowns come and go; they’re most commonly described as “squabbles.” But they’re all fought out under the shadow of permanent threat, in a nation teetering at the edge of extinction.
That lesson endures. So the showdown -- even if it is seen only as the first battle in a long war, the first act in a protracted drama -- will become merely a way to explain the most basic “fact,” that is to say, the winning myth: America is doomed to live on the edge of the “fiscal cliff” for a long, long time. The New Year fiscal deal has confirmed the view so many already held: Our economic plight is now “the new normal.” We might call this the new normal myth.
As long as the “cliff” myth prevails, it will carry many of the same implications as the Cold War “brink.” When you are facing catastrophe your highest priority is to protect yourself and what you already have. It’s only logical to avoid making any major changes or even thinking about any substantial innovations. They’re simply too risky for people teetering on the edge of a cliff.
It's no wonder that this “showdown” was a drama acted out by extremely cautious people taking only the smallest, most cautious steps. Nor should we expect anything else in the future, as long as the “cliff” myth prevails.
Guns and violence are “a deep illness in our society,” columnist Frank Rich opines. “There's only one other malady that was so deeply embedded into the country's DNA at birth: slavery. We know how long it took us to shake those shackles. And so ... overthrowing America's gun-worship is not a project that will be cured in a legislative session; it's a struggle that's going to take decades.”
I wonder if Rich is too pessimistic. He assumes that the gun-control issue is now where the slavery issue was in perhaps the 1820s, when the abolitionist movement was just beginning to gather steam as an organized Protestant reform effort. But that doesn’t seem a fair comparison.
There has already been a well-organized, well-publicized gun control movement in the U.S. for decades. And it has already had a brief era of great success, in the early 1990s: the Gun-Free School Zones Act in 1990 (revised 1995), the Brady Bill in 1993, and the 10-year assault-weapons ban in 1994. That era was followed by a strong and relatively successful reaction from anti-gun-control forces, leaving us now with a common but mistaken impression that most Americans have always been reactionaries on this issue.
If the analogy is to the slavery debate, it might be more accurate to think of 2012 as akin to 1852. In the preceding years pro-slavery sentiment in the South, and the pro-slavers’ political clout in Washington, had grown much stronger. Then Harriet Beecher Stowe’s epochal novel Uncle Tom’s Cabin appeared. The immensely popular book, and the many dramatizations of it that were quickly produced, gave powerful new energy to the anti-slavery movement.
Although historians are supposed to refrain from predicting the future, there is no rule against imagining hypothetical possibilities. So I’ll suggest, with lots of qualifiers, that it’s possible that the dreadful murders in Newtown might turn out to play a role in some way akin to Uncle Tom’s Cabin.
Who would have thought that Barack Obama, so deeply immersed in such delicate negotiations about taxes and budget, would run the risk of publicly advocating specific gun control measures: banning the sale of military-style assault weapons and high-capacity ammunition clips, and requiring background checks before all gun purchases? Granted, they are popular measures, as Obama himself admitted.
But there will be plenty of pushback from the National Rifle Association and other pro-gun groups, who have proven very effective in the past. So the president knows he is taking a considerable political risk.
In fact, if the 1850s is the appropriate decade for comparison, it’s a safe bet that the movement Obama has now joined will suffer losses in the near future. The anti-slavery movement was shocked by the Kansas-Nebraska Act in 1854, the ensuing battle over “bloody Kansas,” the Dred Scott decision in 1857, and the hanging of John Brown for raiding the Harper’s Ferry Arsenal in 1859 (just to name the most influential events).
Yet each of those shocks ultimately had an effect much like the shock we received when all those little children and their teachers were killed in Newtown. They redoubled the commitment of reformers to create political change, and therefore they heightened the tension between the opposing political forces, a tension that ultimately led to massive change.
So the lesson of the 1850s is that no one event is likely, by itself, to transform public attitudes and policies. But a series of events, each one profoundly shocking, can have that effect. When the first of those events occurs, no one can know for sure that it is the first of a history-changing series. That’s something we can only know in retrospect. But we can know that change does sometimes happen in a series of spasmodic leaps.
There’s one more interesting parallel to consider. Throughout the 1850s, the total abolition of slavery always remained a minority view. The history-changing events of the decade never made the abolition of slavery a broadly popular opinion. The broad wave of support, spurred by every tragic turn of events, was for “free soil”: banning the extension of slavery to places where it was not already legal.
That was clearly Abraham Lincoln’s position, the major plank on which he won the presidency. Only under fierce pressure to win the Civil War did he become “The Great Emancipator,” the prophet of total abolition.
Similarly, there is no serious talk now of a total ban on the sale and/or possession of guns in the United States. Barack Obama knows it would be political suicide to endorse such an extreme position, just as Lincoln knew in the 1850s that total abolitionism would be political suicide.
But the lesson of Lincoln’s career is that political issues and causes have a life of their own. Once you join or endorse them in even a partial way, there’s no telling where you might end up. The fates forbid that we ever have to endure anything remotely like the bloodshed of the Civil War, for any reason, including the eventual banning of guns. But even without violence history can lead us to very unexpected outcomes, sometimes in very sudden leaps, as we are learning right now.
I know it’s foolish hubris to hear about a tragedy like the school shooting in Connecticut and then immediately start writing about it. But many of us who blog do it, at least in part, as a way to deal with feelings that otherwise might overwhelm us. It’s cathartic. And it’s our wager that, in the process, we’ll say something helpful to others who are trying to make a little bit of sense out of at least some corner of the tragedy.
Convincing explanations of any kind are ultimately bound to elude us. All one can do is try to shed a little light on a little piece of the immense calamity, from one’s own particular viewpoint. I naturally think about American mythic traditions that seem relevant in this situation.
After the mass killing in an Aurora, Colorado movie theater last summer I noted a point that Washington Post wonk Ezra Klein confirms in a very useful post today: While the American public generally supports a number of specific gun control proposals, when pollsters ask about “gun control laws” in the abstract a growing number of Americans say they oppose them. And pollsters consistently find that mass killings do nothing to increase support for gun control.
Back then I suggested that “when nations, like individuals, try to go in two directions at once they get paralyzed. That’s where we are on the politics of gun control.” I added that the paralysis makes us ever more frightened and craving safety. The traditional American source of safety is a gun -- or two, or three, or more. I concluded that “the root of the problem is our dedication to the fantasy of absolute safety and security. The sooner we recognize that as our national fantasy and stop arming ourselves to the teeth in pursuit of it, the safer we all will be.”
At the time I did not know that the killer had been in treatment with a very competent psychiatrist. I merely assumed that it’s mentally or emotionally disturbed people with guns who kill people, at least on such a mass scale. We still don’t know anything about the killer in the Connecticut school. But again that assumption seems to be a rather safe one.
In other words, I start with the premise that the opponents of gun control are half right. Guns don’t kill people, as they like to say. But the other half of the truth is the part they won’t say: Mentally or emotionally disturbed people with guns kill people.
And now I’m thinking about the connection between mental/emotional disturbance and the widespread resistance to the idea of “gun control,” which I assume comes from the mythic tradition that equates guns with absolute safety.
I’ve been working with a group in my community trying to promote public support for mental health treatment. It has made me very aware of the profound reluctance we see all around us (even in a very liberal and wealthy county like mine) to treat mental/emotional disturbance as a communal problem.
To say the same thing from the other side: When we talk about mentally or emotionally disturbed individuals, our society puts the emphasis on “individuals.” Without really thinking about it, most of us assume that we’re dealing with peculiar cases, each one caused by some unique set of problems encased in one individual’s brain.
We just don’t have many cultural resources at all to think about mental/emotional disturbance as a societal problem. Oh, there are shelves full of books in university libraries that can teach us to see it that way. But that academic perspective has not percolated through to our shared public myths. We still tend, as a society, rather reflexively to see troubled people as individual “weirdos,” unique outliers from the norm.
And our natural inclination, most of the time, is to stay as far away from them as we can -- unless they are family members or otherwise connected to us in ways we couldn’t escape even if we wanted to. Then we try our best to get help for them. And we usually discover that the resources our society provides are far too meager to give them the help they really need -- precisely because, as a society, we don’t think of such disturbances as a collective problem. So we don’t even think about, much less provide the resources for, collective solutions.
I suspect this pattern has its deepest roots in a tradition that was pervasive through the late nineteenth century and still affects us deeply: viewing mental/emotional disturbance through the lens of religious and spiritual language. I’ve spoken with ministers who are trying hard to bring their fellow clergy into fruitful conversation with mental health professionals. It’s an uphill struggle, they say, in part because there are still many clergy who assume that personal prayer and spiritual renewal is the only appropriate treatment.
What we have here, to some degree that’s impossible to quantify, is a living legacy of the days when mental and emotional disturbance was interpreted as a sign of sin. (“Evil visited this community today,” said Connecticut Governor Dan Malloy, as if the tragedy were caused by some distant, utterly alien metaphysical force.) Just as sin was seen to be the responsibility of the individual, so mental/emotional disturbance is still seen to be, if not the individual’s responsibility, at least an individual problem.
The proud American tradition of individualism is also, I suspect, at the root of the popular resistance to gun control. Discrete gun control measures gain popularity because most people think that they will apply only to others. Things like background checks and no guns for felons -- or the mentally ill -- don’t apply to me, the average respondent in a poll assumes. But gun control in general means that I may no longer have the right to defend myself, my family, and my home.
The curious fact (which I noted in my post last summer and Klein confirms) is that the actual number of American households with guns has declined fairly steeply in the last forty years. So the objection to gun control laws doesn’t come only from people who have guns and want to hold on to them (though they are the largest portion of the naysayers). It also comes from people who imagine that they might some day feel the need for a gun to protect themselves. They don’t want their individual freedom abridged.
So here is the picture we end up with: an image of a nation where at least half the people (or more, depending on the poll) assert their individual rights by opposing gun control laws, while uncounted millions are walking around with serious disturbances locked up inside them -- disturbances that occasionally burst out with horrific consequences. It’s a picture made up of 300-plus million separate individuals.
Most of us see it that way because we don’t have the cultural traditions -- the myths, I’d say -- that would let us see both gun ownership and mental/emotional disturbance as societal facts, as manifestations of what the community as a whole is doing.
So we go on letting individuals arm themselves to protect their individual rights and freedom, or so the myth tells us. (Illinois just became the 50th state to allow citizens to carry concealed guns.) But we tragically underfund and ignore societal programs to help the mentally/emotionally disturbed, because we simply don’t see any relationship between them and the rest of us, or so the myth tells us.
In such an individualistic nation, the recipe for absolute safety seems simple enough: Give everyone the freedom to carry a concealed gun, and stay as far away as possible from those “weirdos.” We’ve just seen, in a Connecticut schoolhouse, what that recipe produces.