MythicAmerica explores the mythic dimension of American political culture, past, present, and future. The blogger, Ira Chernus, is Professor of Religious Studies at the University of Colorado at Boulder and author of Apocalypse Management: Eisenhower and the Discourse of National Insecurity.
To receive periodic email summaries of the blog, send an email to firstname.lastname@example.org, with “Update” in the subject line. You can communicate directly with Ira at the same address.
Guns and violence are “a deep illness in our society,” columnist Frank Rich opines. “There's only one other malady that was so deeply embedded into the country's DNA at birth: slavery. We know how long it took us to shake those shackles. And so ... overthrowing America's gun-worship is not a project that will be cured in a legislative session; it's a struggle that's going to take decades.”
I wonder if Rich is too pessimistic. He assumes that the gun-control issue is now where the slavery issue was in perhaps the 1820s, when the abolitionist movement was just beginning to gather steam as an organized Protestant reform effort. But that doesn’t seem a fair comparison.
There has already been a well-organized, well-publicized gun control movement in the U.S. for decades. And it has already had a brief era of great success, in the early 1990s: the Gun-Free School Zones Act in 1990 (revised 1995), the Brady Bill in 1993, and the 10-year assault-weapons ban in 1994. That era was followed by a strong and relatively successful reaction from anti-gun-control forces, leaving us now with a common but mistaken impression that most Americans have always been reactionaries on this issue.
If the analogy is to the slavery debate, it might be more accurate to think of 2012 as akin to 1852. In the preceding years pro-slavery sentiment in the South, and the pro-slavers’ political clout in Washington, had grown much stronger. Then Harriet Beecher Stowe’s epochal novel Uncle Tom’s Cabin appeared. The immensely popular book, and the many dramatizations of it that were quickly produced, gave powerful new energy to the anti-slavery movement.
Although historians are supposed to refrain from predicting the future, there is no rule against imagining hypothetical possibilities. So I’ll suggest, with lots of qualifiers, that it’s possible that the dreadful murders in Newtown might turn out to play a role in some way akin to Uncle Tom’s Cabin.
Who would have thought that Barack Obama, so deeply immersed in such delicate negotiations over taxes and the budget, would run the risk of publicly advocating specific gun control measures: banning the sale of military-style assault weapons and high-capacity ammunition clips, and requiring background checks for all gun purchases? Granted, they are popular measures, as Obama himself admitted.
But there will be plenty of pushback from the National Rifle Association and other pro-gun groups, who have proven very effective in the past. So the president knows he is taking a considerable political risk.
In fact, if the 1850s is the appropriate decade for comparison, it’s a safe bet that the movement Obama has now joined will suffer losses in the near future. The anti-slavery movement was shocked by the Kansas-Nebraska Act in 1854, the ensuing battle over “bloody Kansas,” the Dred Scott decision in 1857, and the hanging of John Brown for raiding the federal arsenal at Harpers Ferry in 1859 (just to name the most influential events).
Yet each of those shocks ultimately had a similar effect to the shock we received when all those little children and their teachers were killed in Newtown. They redoubled the commitment of reformers to create political change, and therefore they heightened the tension between the opposing political forces, a tension that ultimately led to massive change.
So the lesson of the 1850s is that no one event is likely, by itself, to transform public attitudes and policies. But a series of events, each one profoundly shocking, can have that effect. When the first of those events occurs, no one can know for sure that it is the first of a history-changing series. That’s something we can only know in retrospect. But we can know that change does sometimes happen in a series of spasmodic leaps.
There’s one more interesting parallel to consider. Throughout the 1850s, the total abolition of slavery always remained a minority view. The history-changing events of the decade never made the abolition of slavery a broadly popular opinion. The broad wave of support, spurred by every tragic turn of events, was for “free soil”: banning the extension of slavery to places where it was not already legal.
That was clearly Abraham Lincoln’s position, the major plank on which he won the presidency. Only under fierce pressure to win the Civil War did he become “The Great Emancipator,” the prophet of total abolition.
Similarly, there is no serious talk now of a total ban on the sale and/or possession of guns in the United States. Barack Obama knows it would be political suicide to endorse such an extreme position, just as Lincoln knew in the 1850s that total abolitionism would be political suicide.
But the lesson of Lincoln’s career is that political issues and causes have a life of their own. Once you join or endorse them in even a partial way, there’s no telling where you might end up. The fates forbid that we ever have to endure anything remotely like the bloodshed of the Civil War, for any reason, including the eventual banning of guns. But even without violence history can lead us to very unexpected outcomes, sometimes in very sudden leaps, as we are learning right now.
I know it’s foolish hubris to hear about a tragedy like the school shooting in Connecticut and then immediately start writing about it. But many of us who blog do it, at least in part, as a way to deal with feelings that otherwise might overwhelm us. It’s cathartic. And it’s our wager that, in the process, we’ll say something helpful to others who are trying to make a little bit of sense out of at least some corner of the tragedy.
Convincing explanations of any kind are ultimately bound to elude us. All one can do is try to shed a little light on a little piece of the immense calamity, from one’s own particular viewpoint. I naturally think about American mythic traditions that seem relevant in this situation.
After the mass killing in an Aurora, Colorado movie theater last summer I noted a point that Washington Post wonk Ezra Klein confirms in a very useful post today: While the American public generally supports a number of specific gun control proposals, when pollsters ask about “gun control laws” in the abstract a growing number of Americans say they oppose them. And pollsters consistently find that mass killings do nothing to increase support for gun control.
Back then I suggested that “when nations, like individuals, try to go in two directions at once they get paralyzed. That’s where we are on the politics of gun control.” I added that the paralysis makes us ever more frightened and craving safety. The traditional American source of safety is a gun -- or two, or three, or more. I concluded that “the root of the problem is our dedication to the fantasy of absolute safety and security. The sooner we recognize that as our national fantasy and stop arming ourselves to the teeth in pursuit of it, the safer we all will be.”
At the time I did not know that the killer had been in treatment with a very competent psychiatrist. I merely assumed that it’s mentally or emotionally disturbed people with guns who kill people, at least on such a mass scale. We still don’t know anything about the killer in the Connecticut school. But again that assumption seems to be a rather safe one.
In other words, I start with the premise that the opponents of gun control are half right. Guns don’t kill people, as they like to say. But the other half of the truth is the part they won’t say: Mentally or emotionally disturbed people with guns kill people.
And now I’m thinking about the connection between mental/emotional disturbance and the widespread resistance to the idea of “gun control,” which I assume comes from the mythic tradition that equates guns with absolute safety.
I’ve been working with a group in my community trying to promote public support for mental health treatment. It has made me very aware of the profound reluctance we see all around us (even in a very liberal and wealthy county like mine) to treat mental/emotional disturbance as a communal problem.
To say the same thing from the other side: When we talk about mentally or emotionally disturbed individuals, our society puts the emphasis on “individuals.” Without really thinking about it, most of us assume that we’re dealing with peculiar cases, each one caused by some unique set of problems encased in one individual’s brain.
We just don’t have many cultural resources at all to think about mental/emotional disturbance as a societal problem. Oh, there are shelves full of books in university libraries that can teach us to see it that way. But that academic perspective has not percolated through to our shared public myths. We still tend, as a society, rather reflexively to see troubled people as individual “weirdos,” unique outliers from the norm.
And our natural inclination, most of the time, is to stay as far away from them as we can -- unless they are family members or otherwise connected to us in ways we couldn’t escape even if we wanted to. Then we try our best to get help for them. And we usually discover that the resources our society provides are far too meager to give them the help they really need -- precisely because, as a society, we don’t think of such disturbances as a collective problem. So we don’t even think about, much less provide the resources for, collective solutions.
I suspect this pattern has its deepest roots in a tradition that was pervasive through the late nineteenth century and still affects us deeply: viewing mental/emotional disturbance through the lens of religious and spiritual language. I’ve spoken with ministers who are trying hard to bring their fellow clergy into fruitful conversation with mental health professionals. It’s an uphill struggle, they say, in part because there are still many clergy who assume that personal prayer and spiritual renewal are the only appropriate treatment.
What we have here, to some degree that’s impossible to quantify, is a living legacy of the days when mental and emotional disturbance was interpreted as a sign of sin. (“Evil visited this community today,” said Connecticut Governor Dan Malloy, as if the tragedy were caused by some distant, utterly alien metaphysical force.) Just as sin was seen to be the responsibility of the individual, so mental/emotional disturbance is still seen to be, if not the individual’s responsibility, at least an individual problem.
The proud American tradition of individualism is also, I suspect, at the root of the popular resistance to gun control. Discrete gun control measures gain popularity because most people think that they will apply only to others. Things like background checks and no guns for felons -- or the mentally ill -- don’t apply to me, the average respondent in a poll assumes. But gun control in general means that I may no longer have the right to defend myself, my family, and my home.
The curious fact (which I noted in my post last summer and Klein confirms) is that the actual number of American households with guns has declined fairly steeply in the last forty years. So the objection to gun control laws doesn’t come only from people who have guns and want to hold on to them (though they are the largest portion of the naysayers). It also comes from people who imagine that they might some day feel the need for a gun to protect themselves. They don’t want their individual freedom abridged.
So here is the picture we end up with: an image of a nation where at least half the people (or more, depending on the poll) assert their individual rights by opposing gun control laws, while uncounted millions are walking around with serious disturbances locked up inside them -- disturbances that occasionally burst out with horrific consequences. It’s a picture made up of 300-plus million separate individuals.
Most of us see it that way because we don’t have the cultural traditions -- the myths, I’d say -- that would let us see both gun ownership and mental/emotional disturbance as societal facts, as manifestations of what the community as a whole is doing.
So we go on letting individuals arm themselves to protect their individual rights and freedom, or so the myth tells us. (Illinois just became the 50th state to allow citizens to carry concealed guns.) But we tragically underfund and ignore societal programs to help the mentally/emotionally disturbed, because we simply don’t see any relationship between them and the rest of us, or so the myth tells us.
In such an individualistic nation, the recipe for absolute safety seems simple enough: Give everyone the freedom to carry a concealed gun, and stay as far away as possible from those “weirdos.” We’ve just seen, in a Connecticut schoolhouse, what that recipe produces.
Solidarity poster from Poland in 1989 -- an effective use of the "showdown" myth in politics. Credit: Wiki Commons.
Progressive groups are trying to rally their troops to stop any cuts to Medicare, Medicaid, and Social Security. They may wish they could turn out crowds large and noisy enough to make a media splash, the way the Tea Party did a couple of years ago. But their troops are all volunteers, and as far as I can tell not enough of them are showing up for duty to make that media splash.
Barack Obama and his ax-wielding budget aides will draw the obvious conclusion: Most people say they oppose cuts to the big three “entitlements.” But they don’t care strongly enough to make any noise about it. Mostly what they want is to stop hearing about the dangers of the “fiscal cliff.”
So Democrats can make cuts to the big three, satisfy the Republicans, end the “fiscal cliff” crisis, and pay a very small political price. In fact the Dems will probably come out with a higher rating in the polls because they’ll show that they can “make Washington work.”
That’s probably what’s going to happen in the next few weeks, unless some progressive crowds get out there with Tea-Party-like enthusiasm and start screaming “No! Stop!”
Why aren’t they out there yet? One reason, I’ve suggested, is that progressives have not challenged the metaphor that everyone uses to describe the situation: We’re headed for a “cliff.” Every metaphor tells a story. And the stories we tell shape the way we view things, which in turn determines the policies we’ll adopt or reject.
The story of the “cliff” tells us that apocalyptic peril looms ahead. We’re all in this together, and if we take one more step in the wrong direction we’re doomed. But we don’t have any consensus on which direction is the right one. Most people, facing that kind of threat, are afraid to take a step in any direction. So they just stand still, cling to the status quo, and turn more conservative.
Recently I learned that there are some progressives who understand the power of metaphor. I met some folks who are organizing to save Medicaid. They certainly want Medicare and Social Security protected too. And they’re not talking about any “cliff.” They are talking about the “fiscal showdown.”
All of a sudden the whole situation looked different to me. It’s not all of us together rushing toward a precipice, trying frantically to figure out where to direct our collective steps, constantly bumping into each other -- and sometimes trampling each other -- in our panic. If it were, we’d have good reason to feel paralyzed, afraid to move at all.
No, the “showdown” metaphor gives us two clearly defined groups -- good guys and bad guys -- facing each other in a fight to the finish. We each get to choose which side is good and which is bad. But once we’ve made the choice, we get to stand with the good guys and join in the fight. We get to take action.
Once the good guys defeat the bad guys, the people who have been blocking progress toward a better life for all are gone. The way is clear to make all sorts of improvements for our society and everyone in it.
Sure, for progressives that’s a fantasy. Even if the Republicans go down to terrible defeat in this round of negotiations (which is hardly likely, given their majority in the House), they’ll bounce right back and start trying to force some other horrible new policies on us.
But imagine if all the headlines were about the “fiscal showdown,” not the “fiscal cliff.” “Showdown” is an energizing fantasy. It creates a feeling that we can eventually “clean up this town, make it a decent place where fine folks will want to raise their families.” I think I heard that in a movie or two, or actually a few dozen.
The film history of the “showdown” -- with its familiar mantra, “draw, podner” -- reminds us that this metaphor is classic Americana. The good guy is the all-American kid. Whatever virtues he represents are, by definition, all-American virtues. And he’s expected to win an unconditional victory over the bad guy. At the OK Corral or anywhere else, the “showdown” has a fine patriotic pedigree.
If progressives go out into the street for a “fiscal showdown,” they’re acting out a traditional American drama. In a strange way that makes them more appealing to the rest of the public, even to the most conservative among us.
On the other hand, if we are hurtling toward the cliff the best we can hope for is to avoid disaster at the last minute. The only film prototype I can think of is James Dean as the Rebel Without a Cause. That’s hardly an appealing image if progressives hope to get their message beyond their already rebellious circles.
Those of us who are committed to nonviolence may not feel very comfortable with the traditional American “showdown” metaphor, since it’s so loaded with overtones of violent death. But we don’t shrink from confrontation any more than Gandhi or Dr. King did. The “showdown” we want isn’t between two groups of people. It’s between two sets of policies, each with its underlying values and mythic narratives.
When we support more funding for Medicare, Medicaid, Social Security, and all the government’s other human service programs, we are going out to fight for a society where we are all interconnected; all threads in a single garment of destiny; each caring deeply for and feeling responsible for the well-being of all others. We are fighting against a rampant, uncaring individualism built on greed and selfishness.
That’s what this fight is really about. And when you bring it down to that level of basic values, it’s hard to see how anyone can advocate compromise. Because if greedy individualism wins, we all lose -- even the richest among us, though they don’t know it yet. So the only way to avoid sending our society over the moral as well as fiscal cliff is to make sure progressives win this showdown.
And here in America, the traditional place for a showdown is in the street, out in public, where everyone can see the victory of right over wrong.
Dear Rush Limbaugh,
The night President Obama was re-elected you went to bed thinking that Mitt Romney “put forth a great vision of traditional America, and it was rejected.” So “we’ve lost the country.” You explained to your audience that the voters had chosen a “Santa Claus” government over hard work as the way to get their needs met.
Well, now that Santa is finishing up the last toys and getting the reindeer ready to fly, I want to bring you a season’s greeting full of good cheer. I want to cheer you up by telling you about the Christmas card I just got from the Obamas. It should ease your fears that your country, the one you call “traditional America,” is disappearing.
All the Obamas signed the card, even their little dog Bo (who added his pawprint). In fact Bo is the star of the card; he’s the only one who got his picture on it. There he is romping through the snow on the Obamas’ lawn. Hey, Rush, what could be more traditionally American than that?
And then read the message inside: “This season, may your home be filled with family, friends, and the joy of the holidays.” That’s it. No government coming into your home to spy on you -- or to give away stuff. In fact, no stuff at all. And no fat guy in a red suit to bring stuff. Just a home filled with family, friends, and Christmas joy.
(Yeah, I know it says “holidays.” But seriously, when did you ever see a picture of a little dog romping in the snow as a symbol of Hanukkah, or of anything associated with Muslim culture? We are obviously talking Christmas here.)
Why do you suppose I got this card? I don’t know. The only reason I can imagine is that I’m on some list of people who volunteered for Obama during the campaign. It says it was “not authorized by any candidate or candidate’s committee.” It was “paid for by the Democratic National Committee, www.democrats.org” (which means no government money was spent on the card, so don’t jump to any nasty conclusions).
But you know as well as I that the Democrats are trying to hold on to all of us who volunteered, so that when the time comes they can mobilize us in whatever political fight they need us for. I mean, you should see all the emails they still send me.
The thing is, I didn’t do very much for the campaign. There must be tens of thousands of people, maybe hundreds of thousands, who did as much as I did. You’ve heard about the size of the Obama “ground game,” I bet. And they must all be getting the same card.
Now think about it, Rush. (This should really dry your tears.) The Democrats made a Christmas card to send to this huge list of people who support the Dems so solidly that they’ll give a few volunteer hours. These are all the people that you think are taking away your country, rejecting “traditional America.” The Dems surely hired some pretty high-priced PR professionals to figure out exactly what should go on that card -- what would make all of us who get it feel so good that we’ll want to volunteer even more.
And what did they come up with? Santa giving away stuff to a diverse rainbow coalition of greedy Americans? An inter-faith gathering, complete with atheists, celebrating a neutralized “seasonal observance”? A gay couple sitting down to Christmas dinner with their multi-racial children?
No. Not even a white working-class couple sitting down to Christmas dinner with their blond-haired, blue-eyed children. Just a little dog in the snow and a “home, family, friends, coded-Christmas” greeting.
But that’s not all, Rush. It gets better. The picture shows the dog, wearing a scarf no less, in front of a grand, immense mansion. His head is held up straight and high, aligned perfectly with the stately pillars of the White House, as if he were marching in a military parade. And it’s all framed in a thin line of gold. Open it up and the message is embossed in gold, beneath the seal of the president, rendered in fine detail in the same gold.
Why, when I hold this in my hands I feel like I’ve been magically transported to Romneyland. Come to think of it, suppose Mitt had won and the Republicans sent a Christmas card to all the volunteers from his campaign. What would be different?
Well, they might leave out the dog, because that would remind people of the “tied to the top of the car” story. But surely they would have found some equally traditional Christmas-y picture full of snow, and done it in equally elegant style, with the same visual allusion to the martial dignity of America. Beyond that, only the names would be changed.
So apparently the Democrats’ best PR pros think that an elegant Romneyesque vision of “traditional America,” filled with gold, will warm the hearts of Dem loyalists. What do you make of that, Rush?
I take it as a coded message, not merely that Christmas is still the top-dog holiday around here, but that your idea (I call it your myth) of “traditional America” is very much alive and still packs an emotional wallop.
Yes, your myth took something of a hit this last election day. But it wasn’t such a serious blow. Last I looked, your guy got over 47 percent of the votes and my guy less than 51 percent. My guy did a couple of points better four years ago. But there was a congressional election in between where we got slaughtered by “traditional America.”
I bet the wizards who plot strategy and make Christmas cards for the Democratic National Committee remember that slaughter very vividly and aren’t nearly so sure as you are that you’ve lost your country. At least, they want us Dem activists to know that we had still better give lip service to “traditional America.”
I suspect it’s more than that, though. I suspect that at least the Christmas-y piece of the “traditional America” myth is still meaningful in some (perhaps subliminal) way to a lot of dyed-in-the-wool Democrats. They don’t think Christmas is about Santa handing out stuff -- and no Dems I know (which is a lot) think that government is about playing Santa, handing out stuff.
But they do have some sentimental attachment to the Norman Rockwell version of America and all the values it represents. They are even impressed (though they might hate to admit it) by the elegance of gold.
So cheer up, Rush. You’ve got the old American myth on your side of the political fence. And old myths die hard -- so hard, apparently, that even a lot of us on the opposite side of the fence are still hooked into your “traditional America.”
But here’s the best news: The deepest message of this card from the Obamas is that they love a lot of old American traditions, and they assume plenty of us Dem loyalists do too. We’re all patriots, on your side and ours. We all love the same country and want the best for it, even if we have different ways of getting there.
Merry Christmas, Rush!
How the Dems could win the fiscal cliff debate: mobilize for the moral equivalent of war. Credit: Flickr/Library of Congress/StockMonkey.com/HNN staff.
Where’s that surge of public outrage that’s supposed to force the Republicans to surrender in the “fiscal cliff” negotiations? The Democrats are still waiting for it ... and waiting ... and waiting, while they teeter on the edge of the cliff.
The Dems are so busy scrutinizing the polls, they forgot to notice the impact of the little word “cliff.” Sure, it’s just a metaphor. But every metaphor tells a story. And the stories we tell (or, more commonly, take for granted, without ever spelling them out) shape the way we view things, which in turn determines the policies we’ll adopt or reject and the way we’ll live our lives.
Any story about a “cliff” is simple: We are safe now, with our feet planted firmly on solid ground. The whole broad earth supports us. But if we take one more step in the direction we’re currently heading it will be an apocalyptic step. Suddenly we’ll be plunging down through the abyss toward certain destruction, helpless to save ourselves. If we step in any other direction we will remain securely on solid ground; we’ll escape the apocalypse.
The story of the “fiscal cliff” is more complicated because the public is getting so much conflicting advice about which direction is safe and which is the truly dangerous one. When you are standing on the edge of the precipice, with so many voices yelling “Go this way!” -- “No, that way!” -- “No, the other way!” -- what’s the sensible thing to do? Don’t move at all. At least that way you know you are safe.
And sensible reasoning is reinforced by emotion. When we’re confused and in mortal danger our “fight or flight” response can easily get paralyzed. We freeze; play dead. It’s a primal response, the psychologists say, from deep inside the reptilian brain.
When people are too afraid to move, they see all images of change as images of danger. Inertia carries them on in the direction they’ve been going. It seems like the safest direction because it requires no new decisions. Conserving the status quo feels like the most comforting path.
In short, when apocalypse looms and it’s not clear how to prevent it, people are likely to become more conservative. So if the Democrats want dynamic movement -- a surge of public support for innovative new policies to reduce economic inequality -- “cliff” may be exactly the wrong metaphor.
“Cliff” may also be the wrong metaphor if you want a story that actually fits the facts, as two reports in the New York Times explain: “America’s fiscal condition will be altered without a deal between President Obama and the Republicans in Congress. But not radically so, and in many cases not immediately.”
“Policy and economic analysts … said the term ‘fiscal hill’ or ‘fiscal slope’ might be more apt: the effect would be powerful but gradual, and in some cases, reversible.” “The slope would likely be relatively modest at first,” according to Chad Stone, the chief economist at the Center on Budget and Policy Priorities.
So why do the Dems ignore more appropriate metaphors and go along with the popular metaphor of the “fiscal cliff”?
For the same reason Republicans embrace the “cliff” image, says Washington Post wonk Ezra Klein: “Legislators from both parties have concluded that crises are the only impetus to get anything -- and thus the opportunity to get everything -- done.”
That may well be true in the back rooms of DC, where there’s little sense of urgency and some confidence that a final deal will surely be cut. But outside the beltway, crisis is more likely to breed conservatism.
Except, perhaps, when we go to war. Historian Michael Sherry has shown that the most effective impetus to get anything done in American political life is to convince the public that we’re living “in the shadow of war.” Then we have a feeling of apocalyptic crisis, since Americans have always tended to talk about their wars in apocalyptic terms, as if the only alternative to victory were the demise of the nation.
But when war breaks out we also have a clear consensus on how to respond. We don’t freeze. We band together and mobilize to fight back.
The enemy need not be a foreign foe. Sherry offered copious examples of domestic societal problems framed as wars as far back as the 1930s, when Franklin D. Roosevelt often proclaimed that fighting the Great Depression was much the same as fighting the Germans in World War I. During FDR’s first term, there was widespread agreement that the New Deal was the best way to resist the enemy of a broken economy. So the nation mobilized to fight back.
However, the New Deal teaches another lesson about the “war” metaphor: It triggers a dynamic common effort for apocalyptic victory -- at first. But war also breeds apocalyptic fear, which sooner or later creates a more conservative mood, at least on the domestic policy front. That was clear by the middle of FDR’s second term. Both world wars, the Korean War, the Vietnam War, and the post-9/11 response all produced similarly conservative reactions on domestic issues.
In any case, this isn’t the ‘30s redux. The “fiscal cliff” is not a war metaphor. The only “war” triggered by our current economic problems is the one between the Democrats and Republicans about what to do as we teeter on the “cliff.” So talk of a “fiscal cliff” doesn’t unite the nation and set it moving in a clear direction, as war metaphors do. The political warfare only heightens the confusion and, therefore, the conservative impulse.
It’s worth wondering how the Dems would have fared if they had refused the “cliff” metaphor and opted instead for “war.” If we can have wars on cancer or poverty, for example, why not a similar war against “special privileges” or “to save the middle class”?
Those obviously metaphorical “wars” on the domestic front don’t usually generate apocalyptic fear the way actual military conflict does. Perhaps the “war” metaphor might have mobilized the kind of support the Democrats had hoped for.
We’ll never know. For better or worse the Democrats are content to leave us, and themselves, hanging on the edge of a “cliff.”
Just when we thought it was safe for Americans to go out in a democratizing Middle East ... Well, I guess we stopped thinking that a while ago. But now a lead story on the front page of the New York Times makes it official. Far from boosting our security, the Arab Spring has given us more to be afraid of.
Gone are the days when all we had to worry about was fanatical Shi’ite Islam. Now a new Sunni “axis” is emerging, the Times informs us -- using a word that should send chills up the spine of anyone who knows anything about World War II -- with Egypt, Turkey, and Qatar playing the role once filled by Germany, Japan, and Italy.
All three Mideast nations are governed by Sunni Muslim parties. So are Libya and Tunisia. More ominously, according to the Times, Hamas is allying with the “axis.” And if the Syrian rebels win their civil war, they’ll take Syria out of the Iranian orbit and into the new “axis” too.
The result “could be a weaker Iran.” After years of warning us about a “new cold war” with a possibly nuclear-armed Iran, you’d think the mass media would be celebrating.
But no. The Times merely warns us that we have to shift our anxiety to a new target. Why? The answer is a mother lode of precious material for students of American political mythology.
These Sunnis, reporter Neil MacFarquhar explains, “promote a radical religious-based ideology that has fueled anti-Western sentiment around the region.” That’s a good example of how exaggerated facts create the emotional punch so essential to myth.
Yes, there’s a religiously-based ideology fueling anti-Western sentiment. “Radical” makes it sound inherently dangerous. But it’s hardly radical or dangerous to those who hold it. It’s perfectly sensible to them.
And it’s a huge stretch to say that government leaders in Egypt and Turkey “promote” this ideology. They are trying to harness and lead it. But at the same time they are trying to restrain it as they navigate tricky political waters, where they depend heavily on secular forces for their economic and political well-being.
More soberly, MacFarquhar writes that “the new reality could be ... a far more religiously conservative Middle East.” Yes, it could be. But it might not be. There’s no way to know.
The American journalist shows a sharper understanding of the response in his homeland, focusing precisely on this uncertainty of the future. “The shifts seem to leave the United States somewhat dazed.” “The United States” here means that tiny fraction of one percent of the American population who make or directly influence foreign policy. On the international stage, where they act out the drama of geopolitics, they represent the entire nation.
They are dazed because “what will emerge from all the ferment remains obscure. ... Confusion reigns in terms of knowing how to deal with this new paradigm, one that could well create societies infused with religious ideology that Americans find difficult to accept.” In this case, “Americans” probably does mean a majority of the whole population.
Why should Americans find it difficult to accept the religious choices of people on the other side of the world? Why does it even matter if Americans accept them? MacFarquhar’s answer is simple and surprisingly candid: “The old leaders Washington relied on to enforce its will, like President Hosni Mubarak of Egypt, are gone or at least eclipsed. ... The new reality could be ... [a] Middle East that is less beholden to the United States.”
In case you’re one of those liberals who doesn’t think the U.S. should be enforcing its will on independent foreign nations, the next sentence should bring you around: “Already, Islamists have been empowered in Egypt, Libya and Tunisia, while Syria’s opposition is being led by Sunni insurgents, including a growing number identified as jihadists, some identified as sympathizing with Al Qaeda.”
It’s impressive to watch guilt by association in action: Islamists are linked to Syrian Sunnis, who are linked to insurgents (an inherently danger-packed word), who are linked to jihadists (an even scarier word), who are linked to sympathizers with Al Qaeda (the scariest word of all, of course).
By the logic of association -- a basic principle of mythic thinking -- the conclusion is obvious: If the U.S. can no longer enforce its will and keep Mideast nations beholden to us, we’re on the way to rule by Al Qaeda. Are you worried now??
Despite the allusion to World War II in the loaded word “axis,” this all reminds me more of the era right after the war. Many American policymakers were somewhat dazed and confused by the political ferment, especially in Europe, that was making the future obscure.
But some told a simple story that made sense of it all: The U.S. had such preponderant power that total global control seemed within our reach. Nothing less should satisfy. However, the Soviet Union and other political actors wouldn’t just roll over and submit. The principle of guilt by association proved that they must all be communists controlled by Stalin. He was causing all the ferment, promoting a radical ideology that fueled anti-American sentiment.
It was a paradigm foreign to the American way, the story went; wherever it took hold, nations would no longer be beholden to the U.S., and we would no longer be able to enforce our will. Nor could we control, or even predict, the future. Anyone even remotely associated with anyone remotely associated with a communist shared blame for this frightening chaos. All of them became enemies who had to be destroyed.
Those enemies might arise anywhere, which meant danger lurked everywhere. So America became an embattled fortress and a watchtower of constant vigilance. By the late ‘40s this narrative reigned supreme.
The result was not merely four decades of cold war, but a firmly entrenched mythology of homeland insecurity that still persists and spawns new fears. Global control remains the impossible dream. Since there is always someone frustrating it, there is always a new enemy springing up. And so we are told yet again that we must be always on guard, always afraid, always ready for the next “new cold war.”
We aren’t anywhere near that in our relationships with Sunni-led nations -- at least not yet. But in a myth-soaked foreign policy discourse, “the good guys” can become “the bad guys” awfully fast, as we learned in a very few years right after World War II. I guess it is a good idea to be on our guard.
About three score and a couple of years ago my sister was a research librarian in Hollywood, working for an outfit that dug up information needed by moviemakers. One day she called me and said, “You’ve got a PhD in the history of Judaism. So what are the facts about the lost ark, the one that was in the Jerusalem Temple in biblical times?” “There are no facts,” I quickly replied. “It’s all just legend. Why do you want to know, anyway?”
“Steven Spielberg is making a movie about the lost ark, and he wants us to get him the facts.” “A movie about the lost ark?”, I asked incredulously. “Is he crazy? Does he think anyone is going to pay money to see that?”
Obviously, I may know something about history but not much at all about the movies. I suppose that alone might disqualify me from making any comment on Spielberg’s latest epic, Lincoln.
But when America’s greatest living mythmaker takes on America’s most mythicized president, how can the author of a blog called MythicAmerica remain silent? If it’s not my obligation to say something, at least it’s an irresistible temptation.
What places Lincoln above all presidents in our national memory is his image as The Great Emancipator, a larger-than-life man led by a crystal clear and unwavering moral vision on the transcendent moral issue of American history. In mythic terms, the mere fact that America could produce such a leader is powerful evidence of a clear moral vision at the heart of America, a vision that all Americans can draw from and thus share in, at least vicariously. If that moral vision can be combined, in this one person, with skillful use of our democratic system to put the vision into practice, so much the better.
But recent historians have created at least a hint of a different myth, in which Lincoln is larger than life because he so skillfully manipulated the system in pursuit of some lesser goal -- saving the Union not for a greater moral purpose, but merely as an end in itself; or perhaps, even worse, merely being a winner for the sake of being a winner, in both war and politics.
I expected the film to explore this issue, to take a stand on it, to tell us what the Lincoln myth for our generation should be. Spielberg’s choice to focus on the Thirteenth Amendment seemed well suited to the task. The key scenes would be those in which Lincoln came to his decision about pressing for immediate passage. That would reveal just what kind of mythic figure the director (and screenwriter Tony Kushner) wanted us to see.
Watching the film, I quickly found myself frustrated because that question was sidestepped, or at best made rather secondary. Lincoln’s decision-making process had been concluded before the time frame of the film even began. We are introduced to his firm decision in the form of a dream.
My frustration was heightened by the rather wooden way the political-historical facts were discussed. The dialogue was so fragmentary and rapid fire that it could hardly be considered a thoughtful, much less thought-provoking, treatment of the issues in question. The historian in me couldn’t figure out quite what to make of it all.
Scenes of personal interaction -- among Lincoln, his wife, his sons, their servants, minor functionaries, and soldiers -- relieved the tension because they meant nothing as history. They were simply superb cinema, and I could indulge completely in enjoying them as such.
Then at a certain point it struck me that I was missing the point of the movie: It was all simply superb cinema. If I let myself, I could be sucked into the story and carried along by it, as I suspect most of the audience was (except the guy sitting next to my wife, who fell asleep). Once I allowed myself to suspend disbelief and treat what I called the political scenes on the same level as what I called the personal scenes, it was a truly glorious piece of theater, a spectacle from the Hollywood “dream factory” at its best. How appropriate that we meet the Thirteenth Amendment first in a dream.
The tension between historical fact and pure theater was reinforced right after the movie by two incidents. As the credits rolled, a woman sitting near me told a friend about a high school American history teacher who was giving his students extra credit for seeing the movie. Maybe he wanted them to think about how history is turned into mythic spectacle. But I doubt it. Since the filmmakers emphasized so strongly their debt to historian Doris Kearns Goodwin, and the credits included thanks to so many other historians, there’s an understandably widespread (though unfortunate) view that this is a fine way to learn real history.
When I got home and glanced at my email, I found that a friend had sent out a piece by the New Yorker’s film critic, David Denby. His conclusion sums up the very ahistorical quality of the film. It’s strange to call a movie “momentous,” he says, because great movies typically suggest their larger meanings only through implication. But “Lincoln” is momentous because the message is so direct: “Spielberg and Kushner marched straight down the center of national memory ... and they got it right.”
What they “got right,” of course, was not the facts of history; no doubt they got plenty of those facts right, but that misses the point. What they “got right” was the path that leads down the center of national memory. Since national memory is mythic and need not be checked by facts, that path can always appear to aim at, and be guided by, America’s crystal clear and unwavering moral vision, so that it runs straight and true through the twists and turns of messy democracy.
Spielberg is obviously in love with this traditional story of America’s journey along the path of moral truth. (See Saving Private Ryan, Amistad, and his video game, Medal of Honor.) Now his immense technical gifts have allowed him to create his most impressive pageant of America marching down that path, headed by its greatest leader, as interpreted by its greatest mythmaker.
Of course it’s not just Spielberg. Some scholars believe that Americans, as a people, are more likely than many others to see their history as a morality tale because so many Americans have taken the Bible as a sort of code book to decipher the meaning of our historical events.
David Denby raises this theory at the outset of his review, quoting Lincoln’s law partner, William Herndon: Lincoln was “the noblest and loveliest character since Jesus Christ ... I believe that Lincoln was God’s chosen one.” Denby goes on to note that the popular image of Lincoln still includes “attributes both human and semi-divine ... which combine elements of the Old and New Testaments.”
As for the New, he might have noted the obvious: In the end, Lincoln is martyred for having cleansed his people of their sin. Denby also could have pointed to the sequence in which Lincoln reminds his son that the president is the all-powerful ruler (at least as far as the army is concerned), but then gives his son up to the risk of death in that army, where so many soldiers died to wash away the sins of the whole nation -- a sort of “God the son becomes God the father” sequence. Is it too much to add that the exquisite lighting of the film, especially in the interior shots, creates an aura of the holy spirit hovering over everything the great man says and does?
Denby offers only one Old Testament reference: the sequence in which Lincoln talks of his “awesome power,” and demands that his aides get the last two votes to pass the Thirteenth Amendment. “It’s Lincoln’s only moment of majesty in office ... Any thought of Jesus disappears. This is an Old Testament figure, wrathful and demanding.”
But there’s a deeper Old Testament dimension to the lead character, which Spielberg spotlighted by closing the film with a flashback to the second inaugural address: “The Almighty has His own purposes.” Both the “offense” of slavery and “this mighty scourge of war” to punish that offense may be among those purposes. Yet “as was said three thousand years ago, so still it must be said, ‘The judgments of the Lord are true and righteous altogether.’"
The New Testament is a relevant prototype for a story about God’s martyred chosen one. But since the film is really about a nation’s memory, the Old Testament is the more relevant prototype. The Old is the story of a whole nation’s historical struggles with offense and punishment, embedded in a thick web of political complexities, but all guided by an omnipotent moral hand toward a transcendent goal.
It may be most rewarding to watch Lincoln as a biblical epic, ranked alongside films like The Ten Commandments and The Greatest Story Ever Told as one of the best American films of that genre. Lincoln reminded me why so much of the Bible is such fine literature: Once we are grabbed and swept away by a great story, crafted by great storytellers, a careful analysis of the historical facts no longer seems so important, and certainly not nearly so interesting.
After the last credits roll and the last reviews are read, though, we are left wondering what it means for a nation to continue remembering its own history as if that history were a Bible story.
IDF brass in a briefing about the conflict in Gaza, November 17, 2012. Credit: Flickr.
Ask most Americans why Israel went to war in Gaza again and they’ll give you a simple answer: Palestinians were shooting rockets into Israel, and, as President Obama said, “there’s no country on Earth that would tolerate missiles raining down on its citizens from outside its borders.”
To name those rockets as the root cause of the war is like saying my fever caused my flu. But why shouldn’t the public identify a symptom as the cause of the conflict? They hear and read the same misleading explanation in their news media over and over again. So they see no reason to dig any deeper.
Historians, of course, will dig deeper. They’ll be suspicious of explanations of any war that go back no further than the last few days, or even the last few months. A barrage of rockets may have been the “precipitating” event, as Obama put it, with what must have been a carefully chosen word. It can equally be said that Israel's assassination of a high-ranking Hamas official involved in negotiating a truce was the precipitating spark.
But the fuel has been building up for two centuries.
I’m not talking about the wrong-headed cliché, “Oh, those Jews and Arabs. They’ve hated each other for centuries. They’ll go on fighting forever.” The long history of Jewish-Arab relations runs the gamut from bitter enmity to tolerant co-existence to cordial friendship. When Jews and Arabs meet, anything is possible.
Yet the historical circumstances of any particular meeting set limits to the possibilities. Since the nineteenth century, the overwhelming historical circumstance has been a passionate embrace, on both sides, of modern secular nationalism.
I emphasize “secular” to dismiss the other common but wrong-headed cliché, “It’s a religious war, and those never end.” Religion is the tail that may occasionally wag the dog in Jewish-Arab relations. But the dog -- the beast itself -- is the core innovation of nineteenth-century nationalism: one’s personal identity, worth, and dignity come from full membership in a nation-state.
By the end of the nineteenth century, most Jews who met Arabs were Zionists. Zionism was, and still is, the Jewish form of modern secular nationalism. Nineteenth-century Zionist writing is rife with expressions of anger over the indignities suffered by Jews in the preceding centuries -- but even more with expressions of shame, implying (and often stating outright) that Jews have themselves to blame, that they allowed themselves to become powerless victims. In the canon of modern nationalism, that is perhaps the gravest of sins.
Zionism was, above all, an effort to use modern nationalism to prove that Jews could achieve personal worth and dignity only by escaping the shameful sin of powerlessness. To that end, Zionists had to enact a script in which they confronted and (unlike their ancestors) successfully overcame enemies who were persecuting them for no other reason than being Jewish. That was the Zionists’ way of proving their right to have a proud, self-respecting nation-state, entitled to an equal place alongside all the modern nation-states.
But here was the Catch-22: Zionists could never feel like a proud nation unless they were actively dispelling the pall of the shameful Jewish past. So they had to be constantly enacting their script, in which innocent Jews struggle to overcome oppressive enemies.
The need for constant enemies produced a Jewish myth of constant insecurity, which shaped the Zionist view of history at every step. (I make this case in much more detail in my essay “The Myth of Israel’s Insecurity.”)
Of course the script required some real people to play the role of the anti-Semitic enemy. Before 1947, when the British ruled Palestine, they played that role, along with the Palestinian Arabs. Once the state of Israel was born, a long (but narrowing) list of actors played the role: the Arabs, the Nasserites, the Palestinians, the PLO, and now Hamas.
Certainly not all Israelis view the world through the myth of insecurity. But so many do that no successful Israeli political leader has dared (or perhaps wanted to) question it. So the myth became the guiding light of policy.
In the case of Gaza, the myth dictates that Hamas must be treated as an irrational gang of anti-Semites determined to destroy Israel. All the evidence to the contrary (including the most recent CNN interview with the head of Hamas) must be dismissed as merely the devious lies one would expect from such a diabolical crew.
More specifically, the myth dictates that Hamas must be smuggling into Gaza the weapons it needs to mount an all-out assault on Israel. So it makes perfect sense, from the Israeli perspective, to demand that Gaza be blockaded, to prevent Hamas from getting those weapons -- even though that means Gazans also can’t get food, medicine, building materials, and other necessities of life.
Some Israelis may find that an unfortunate side effect of the blockade. Others may see it as the main effect, like the prominent Israeli official who said that the point of the blockade is “to put the Palestinians on a diet, but not to make them die of hunger."
In either case, the blockade has the effect of proving Israel’s power over an enemy, which in turn proves (according to the mythic script) that the Jewish nation can hold its head up high, that Jews need no longer feel ashamed and blame themselves for powerlessness.
Thus the blockade has continued, provoking the only kind of resistance that Gazans have managed to come up with: sporadic rocket fire into Israel. Those who fire the rockets have said repeatedly that they will cease when the blockade ceases. But Israel’s nationalism demands that it dismiss these promises as deceit.
Of course those who fire the rockets know that their weapons are far too weak to influence Israeli policy directly. Perhaps they harbor some theories of indirect influence. In any case, they appear to be moved by the same nineteenth-century nationalist values that shaped Zionism: To remain passive, to accept and exhibit powerlessness, would be a mortal blow to their sense of dignity and self-respect. Rather than risk this gravest sin, they must show Israel, and the whole world, whatever power they have. So they risk the terrible retaliation that Israel periodically mounts.
Ultimately, then, it is the legacy of nineteenth-century nationalism that has kept the two sides locked in this ongoing conflict. Perhaps right now we are seeing the glimmer of a new kind of nationalism taking over, where pragmatic self-interest gets precedence over old-fashioned notions of national pride. But we should not underestimate the reach of the long shadow of the nineteenth century.
On Election Day we learned who will be president for the next four years. In the days after Election Day we learned something almost as important: the story that will be told about the election of 2012. The popular story of any election takes on a life of its own, and it can shape the political landscape for years to come.
We can now safely project the winner of this year’s election story contest: Republicans self-destructed by moving too far to the right on issues that matter to women (especially unmarried women), newly empowered Latinos, and still empowered African-Americans.
Among liberal pollsters this pro-Obama coalition (plus the under-30s) is often called “the rising American electorate” (RAE). They are the future, the story goes. The Republicans must face that fact, make the necessary changes, or get ready to become history. Race, ethnicity, and gender are destiny.
But I wouldn’t write off the Republicans so fast. If the story is told this particular way it can actually work to the GOP’s advantage. The 2012 election may become a turning point in our political history, as the story makes it out to be, only if class is added to race, ethnicity, and gender as a fundamental element in the plot.
I came to that conclusion by looking at some numbers that have largely been left out of the popular story.
First there is the most crucial and most often ignored number: seven million. That’s the drop in the number of white voters between 2008 and 2012. Seven million white voters just didn’t show up this year. The big question is whether they will show up four years from now or, just as importantly, two years from now.
In 2014, 20 Senate seats now held by Democrats will be up for grabs, 11 of them in states where Dems are vulnerable. Republicans will have 13 seats up, but only one in a state where a Dem might win. So a large turnout of Republican voters could easily give the GOP control of the Senate.
For the Democrats to retake the House in 2014, they must hold on to all the new seats they won this year -- all in swing districts -- and win at least 18 more, eight of them in GOP-leaning districts. A strong showing of Republican voters would prevent that and ensure that the GOP gets an even larger majority in the House.
As this year’s exit polls show, Republican success depends on a high percentage of white male voters. And there’s one thing that is sure to bring lots of white men out to the polls: the currently popular story that emphasizes race, ethnicity, and gender.
It’s already being translated into language that white conservative men understand all too well: Latinos are teaming up with blacks and liberal (code for “loose”) women to take over the country. They’re the reason we are losing the America we once knew and loved. Rush Limbaugh told his millions of listeners the day after Election Day, “I went to bed last night thinking we've lost the country.”
But two years from now Limbaugh will be telling those millions that it’s time for patriots (read: whites) to take back their country. And they will try mightily, simply by showing up at the polls. Ditto for the dittoheads four years from now.
So for those of us who fear this vision of the future, it’s a good idea to look for another story about this year’s election that fits the facts but can blunt the boomerang effect of the “race, ethnicity, and gender” narrative. Fortunately, it’s staring us right in the face.
Pick up the exit poll and look at the category labeled “Family Income.” (The best breakdown is on the Fox News site, but it’s the same poll all the media used.) You’ll see a strikingly simple tale: the more money you make, the more likely you were to vote for Mitt Romney. Families making under $30K went 63% for Obama, and $30-50K families went 57%. Among $50-100K families Romney got 52%, and among $100-200K families his share rose to 54%.
$50K, the median family income, is the great political divide. Voters below the median gave Obama 60% of their votes, and thus his victory. And some of them were white men and married women above age 30.
The RAE made up 48% of the voters, and two-thirds of them went for Obama. So 32% of the electorate were pro-Obama RAE voters. But Obama got a shade over 50% of the votes. So some 18% of voters were not part of the RAE yet opted for Obama. Some were folks with graduate degrees, most of them no doubt above the median income. But that leaves the decisive swing voters: several million white men and married women below the median income who voted for Obama.
So the election wasn’t just about racial, ethnic, or gender politics. It was also about the economy, stupid. As Paul Krugman wrote, “the big numbers came from groups unified by economic fear. … While single women and members of minority groups are more insecure at any given point of time than married whites, insecurity is on the rise for everyone, driven by changes in the economy.”
Yet the story of class -- which fits the exit polls as well as the story of race, ethnicity, and gender -- got virtually no hearing in the mass media.
It’s always been taboo in America to talk about class. The myth that “we’re all middle class” has been among the most powerful of all our national myths. Both candidates this year knew that very well, which is why they often sounded so silly as they fought to see who could mention the sacred words “middle class” most often. Barack Obama never talked about helping the poor, only about helping people who aren’t yet in the middle class achieve that normative status.
Historically, Americans have been able to avoid talking about the glaring class divides and tensions in their midst by focusing on the equally glaring divides and tensions surrounding race. It’s almost a cliché among historians to say that, while other nations have dealt so often with class conflict, we’ve dealt constantly with race conflict.
The growing salience of Latinos complicates matters a bit because thoughtful people (and the U.S. Census Bureau) know that Latinos are an ethnic group composed of many racial identities. But many (most?) white Americans see Latinos merely as “brown-skinned people,” making it easy to assume that Latinos, like African-Americans, are a dark-skinned race. So in practical political terms Latinos become part of the story of race as a substitute for class in public discourse.
Feminist historians would be quick to add that gender conflict has been central to public discourse, along with race conflict, throughout American history.
So the popular story of Election Day 2012 reflects a long-standing pattern, unique to the United States, of avoiding talk of class in favor of talk about race, ethnicity, and gender.
The Democrats are just now beginning to talk about the divide between the rich and the rest of us. It’s another big step to talk about the differences between those above and those below the median income -- including the decisive political difference.
But if Democrats don’t take that step soon they risk another major defeat in 2014 and perhaps 2016. Then all the benefits of re-electing Barack Obama could easily slip down the political drain.
On the other hand, imagine this as the Democrats’ story of the 2012 election: The winning coalition was a rainbow of folks under median income who saw clearly where their bread was buttered, and it wasn’t on Wall Street or in the corporate offices of Bain Capital.
Then the central question of the elections of 2012 and 2014 might become, “Do you want a government dedicated solely to increasing the wealth and cutting the taxes of the rich while slashing the vital government services we all depend on?,” rather than, “Do you want those blacks, Latinos, and liberal women to take over the country?”
Making class a central issue could get at least some whites, especially men, in the lower income brackets to think of their vote in a rather different light -- not as revenge against the people who are “taking away our country,” but as a chance to continue a move toward the economic justice they deserve as a reward for all their years of hard work.
David Petraeus and Paula Broadwell in 2011. Credit: Flickr/U.S. Navy.
Robert Rubin, former secretary of the Treasury, writes in the New York Times: “Now that the election is over, Washington’s attention is consumed by the looming combination of automatic spending cuts and tax increases known as ‘the fiscal cliff.’”
“Consumed”? Excuse me, but I just checked the websites of the Times, the Washington Post, USA Today, CNN, Fox News, CBS, NBC, and ABC. Every one of them had the same lead story -- and it was not “the fiscal cliff.”
By now, of course, you know what it was. Everybody knows: THE SCANDAL WIDENS!
If Robert Rubin had written that some people in Washington are giving some attention to “the looming ‘fiscal cliff,’” he might have been correct. In Washington they’re sort of forced to deal with such wonkish stuff, at least part of the time.
But outside Washington the “fiscal cliff” must be so far eclipsed by THE SCANDAL that hardly anyone can see the “cliff” at all, much less see it looming ominously just ahead. Even in Washington, the news sources suggest, the “cliff” is taking a distinct back seat to THE SCANDAL.
The news media are once again showing their depressing penchant for sensationalism. But there’s no point in complaining. It would be as useful as complaining about the weather. Like Hurricane Sandy, THE SCANDAL will dominate the headlines until it runs its natural course and plays itself out.
If you want to know why, try this little thought experiment. Imagine that you are a Hollywood screenwriter hoping to pen the next box-office blockbuster. You’ve been offered two very different projects.
One is a film about the president and Congressional leaders negotiating to avoid a financial catastrophe. The other is about the nation’s two most prominent generals, one of them head of the CIA, caught in some mysterious secret relationships with two attractive younger women, both married, one a wealthy socialite and the other a Harvard-trained expert on terrorism.
No-brainer, right? That second project sounds like something that could only happen in a Hollywood movie, not in real life -- something manufactured in “the dream factory,” full of larger-than-life characters freighted with complex symbolic meanings, doing things that pack a powerful emotional punch.
In short, like any good movie, THE SCANDAL has all the qualities we associate with myth. Which is precisely why it has eclipsed what may be the most important political negotiations in decades.
Of course it’s the very real newsroom editors, not some hypothetical Hollywood writer, who are faced with the choice. Their job is to deliver audiences to advertisers. And what audiences want from their news is not so much accurate facts or penetrating logical analyses as gripping tales, the kind that would make good movies. So that’s what the editors give them. Why do you think we call them news “stories”?
No doubt it’s significant that this particular story is loaded with sex appeal. I’ve read plenty of Freud (once even taught a course on him), so I could offer some opinions on why sex sells. But I’ll demur.
The larger and more important point is the power of narrative to shape our perceptions of public events. (I was going to say “public affairs,” but that seems a poor choice of words here.) Indeed, it may be fair to say that events don’t become public -- at least don’t take on public significance -- until they are represented in narrative form. And the more mythic those narratives are, the more public attention they get.
In this case, as so often, it’s all most unfortunate. THE SCANDAL will soon be forgotten, as most scandals are, and have no lasting impact on the nation. But the negotiations to avoid going over “the cliff” will have a huge and lasting impact on all of us. The fate of Medicare and Medicaid, and perhaps Social Security too, hangs in the balance.
There’s an organized movement to stop President Obama from agreeing to cuts in those entitlement programs. That movement might have some success if it can muster broad public support. Will the public ever know about it? I wonder. It is getting a bit of news coverage. It’s even featured on the WaPo website -- buried beneath six (6) stories about THE SCANDAL!
As I said, there’s no use complaining about it. But THE SCANDAL can serve as a useful reminder that we can’t understand American public life -- and certainly not American political life -- without giving serious attention to its mythic dimension.
I suppose what those opponents of cuts to entitlements need now is a good myth. Something about “Grandma,” perhaps? Remember those fictional “death panels”? But, as THE SCANDAL reminds us, mythic tales can be full of empirically true facts.
An empirically true, but emotionally powerful, story about what will happen to “Grandma” if her Medicare is cut might be just the thing right now -- once THE SCANDAL fades from the front page of public memory. Let’s just hope it fades before “Grandma,” and all of us, go over “the cliff.”
Credit: Flickr/Wiki Commons/HNN staff.
As the presidential race neared the finish line, I occasionally tried to resist my obsession with today’s politics by opening Peter Onuf’s Jefferson’s Empire. The more I read, though, the more I realized that studying Jefferson doesn’t take us out of the present at all. It merely reminds us that, as Faulkner said, the past isn’t even past.
Onuf explains that Jefferson’s vision of America was profoundly shaped by his understanding of the British empire, where all power and wealth flowed from the periphery (especially the colonies) to the center, the great metropolis of London and its royal court. Jefferson insisted that the United States of America must be the opposite: a vast empire with no metropolitan center and thus no periphery to be oppressed by the center.
This view became the framework for Jefferson’s understanding of American nationalism and thus (like so much else in Jefferson’s thought) a basic staple of the American political narrative for future generations.
At every moment of crisis, Onuf writes, Americans have repeated Jefferson’s essential revolutionary gesture. They have understood -- “(or imagined)” he adds, in a crucial parenthetical remark -- that “they confronted powerful domestic enemies” ensconced in the metropolis “who were prepared to sacrifice the common good for their own selfish advantage. Thus even as the memory of the Revolution evoked images of transcendent brotherhood and union -- the apotheosis of empire -- it also taught young patriots to question the patriotism of their opponents and to mobilize against them.”
Jefferson is, of course, the holy grail of every generation of American political speakers. All want to prove that they are his genuine representative, worthy to bear and pass on his legacy.
So it’s not surprising that, in his victory speech, Barack Obama evoked powerful Jeffersonian images of transcendent brotherhood and union. “We rise or fall together as one nation and as one people. ... What makes America exceptional are the bonds that hold together the most diverse nation on Earth, the belief that our destiny is shared,” he proclaimed.
But he embedded his reconciliatory words within a veiled warning that there are still domestic enemies to be confronted: “By itself, the recognition that we have common hopes and dreams won't end all the gridlock.”
And throughout his speech he made it clear how to identify the enemies. They’re the ones who resist all the “common hopes and dreams” he named: better schools, new technologies, health care for all, equality for racial minorities and gays and the disabled, “new jobs and new opportunities and new security for the middle class”; in short, the whole agenda of policy goals for which he advocates government action and spending.
Where would the money to fund these improvements come from? And who would oppose them? Obama didn’t have to spell out the answers. Having spent months demanding higher taxes from the rich, and attacking Republicans who promise lower taxes, he could assume that everyone got the message clearly enough.
Obama did not question the patriotism of the rich whose special interest lies in resisting higher taxes, nor of the Republicans who carry their banner in Congress. But when he rejected the “wishful idealism that allows us to ... shirk from a fight,” he was clearly mobilizing his political troops to do battle against them. In good Jeffersonian fashion, he clearly implied that all patriotic Americans would rally to his call.
The most immediate battleground is the showdown over the looming “fiscal cliff.” Enter another major contender for the title of “true Jeffersonian” in 2012: John Boehner.
In a press conference just hours after Obama’s victory, the speaker of the House of Representatives sounded his own clarion call for transcendent brotherhood and union: Voters “gave us a mandate to work together to do the best thing for our country. ... Let's challenge ourselves to find the common ground that has eluded us ... and do the right thing together for our country.”
But Boehner, too, could scarcely conceal his warning about domestic enemies who imperil the common good for their selfish interests. “The greatest challenge of all [is] a massive [federal] debt.” And “the entitlement programs are the primary drivers of our debt.”
Boehner knew it would be impolitic for the losing party to spell out the obvious implication: The enemies are all those recipients of Social Security, Medicare, and Medicaid who refuse to take the cuts that true patriots would eagerly accept.
(Over at Fox News, Bill O’Reilly didn’t hesitate to say it out loud. “It’s not a traditional America anymore,” he lamented. There’s a new majority made up of people who “want stuff.”)
Boehner went beyond Obama by identifying the good guys as well as the bad guys. No less than seven times he lauded small businesses -- the “rock of our economy” -- and demanded that they be protected from tax hikes.
This praise of independent entrepreneurs gave Boehner another point in the competition for the title of “true Jeffersonian.” Jefferson assumed that the vast majority of patriotic Americans would be independent yeoman farmers, the most common form of small businessmen in his day.
Boehner scored an even bigger Jeffersonian point, though, when he warned against “government taking a larger share of what the American people earn.” Here was the familiar heart of the GOP's Jeffersonian message, the evil of the metropolis and especially the royal court: “Feeding the growth of government through higher tax rates won’t help us solve the problem. ... A ‘balanced’ approach isn’t balanced if it’s done in the old Washington way of raising taxes now, and ultimately failing to cut spending in the future.”
But don’t count Obama out in this “true Jeffersonian” contest. He’s proven that he’s every bit as much a “comeback kid” as Bill Clinton. And in this case his path to victory is clear, though not easy. He has to explain to the American people that the new form of empire, which Jefferson did so much to create, only managed to produce part of the change that TJ expected.
It did largely eliminate the old imperial system, in which rulers housed in the metropolis reaped direct financial gain from their political control. No one gets rich simply by being president or speaker of the House. Top-flight politicians can almost always make far more money by using their skills in the private sector, where the real wealth is.
Real wealth still flows in huge waves to the metropolis, of course. But it’s not the same metropolis as the seat of political power. To put it bluntly, Washington and New York (and Chicago, Los Angeles, San Francisco, Houston, and Dallas) are separate metropoles; the political and economic centers are no longer the same. That’s the piece of the picture Jefferson did not foresee.
This means that, theoretically, the government in Washington can be the true agent of the common good, the benefactor of all, while the masses remain oppressed by the other metropoles, where wealthy and powerful domestic enemies sacrifice the common good for their own selfish advantage.
Obama could use his impressive rhetorical gift to make the case that this theoretical possibility has become the actual reality. Then he could call true patriots to mobilize in support of the political metropolis against the selfish enemies of the nation, who live so lavishly in the economic metropoles.
Obama could add that the economic metropoles have reorganized our economic life so that Jefferson’s vision of a land full of small businessmen can no longer match the reality. But, he could explain, Jefferson’s praise of the yeoman farmer need not be seen as praise for the family or household as an independent economic enterprise.
Rather, Jefferson was making a compelling argument that everyone in society benefits when each household has a firm and dependable foundation of economic sufficiency. In our day this comes much more often from earned wages and benefits than from labor in one’s own fields. But it is still the task of the political metropolis to ensure that the land is filled with economically secure households, despite all the opposition from the economic metropoles.
If Obama makes these rhetorical moves he can defeat Boehner and the Republicans in the contest for the title of “true Jeffersonian.” More importantly, he can update our understanding of America as the Jeffersonian “empire of liberty” and make it relevant for the twenty-first century. And, in the process, he just might take control of policymaking in the political metropolis, too.
I’ve waited eagerly for the day after Election Day, to see what the story of Election 2012 would be. Every presidential winner has a story attached to his name. Sometimes the story is not so memorable. (What was the day-after-victory story of Jimmy Carter or George H.W. Bush?). Often, though, the story told about an election outlives the direct influence of the president whose name is attached to it:
1960: John F. Kennedy: Youth and vigor can meet any challenge.
1968 and 1972: Richard Nixon: Law and order stem the tumult of “the ‘60s.”
1980 and 1984: Ronald Reagan: It’s morning in America as we shrink big government.
2004: George W. Bush: America must win the war on terror.
What about 2008, when the name of Barack Obama was indelibly linked to the words “hope and change”? Had Obama lost in 2012, his story probably would have been as forgotten as Carter’s or Bush 41’s.
But given Obama’s victory, the jury is still out, awaiting the verdict of history yet to be written.
In his bid for reelection, the president intentionally avoided any emphasis on the “hope and change” narrative. Focus groups showed that voters had “lowered their expectations, and they responded better when Obama appeared to have lowered his expectations, too,” as Ezra Klein reports.
Yet many observers, listening to the president’s 2012 victory speech, thought they heard powerful echoes of the “hope and change” story returning.
This time, though, the story is thicker because it’s linked to two themes that dominated Obama’s campaign rhetoric. One is the burned-once caution Klein notes, which showed up clearly in the victory speech: “As it has for more than two centuries, progress will come in fits and starts” because the work of self-government is always “hard and frustrating.”
The other new theme is economic inequality, the demand that the rich should pay a little bit more so that the middle class can survive, expand, and perhaps even thrive again. Nearly a year ago the president signaled that this would be the leitmotif of his campaign, in a speech in Osawatomie, Kansas.
As the campaign went on and the focus groups held sway, that theme was blurred by a host of others which looked like winners among crucial niche groups in crucial states. But the original leitmotif never disappeared.
It came back in a Washington Post story just days before the election, surely planted by the Obama campaign, that the president would demand higher taxes on the rich in the post-election bargaining as we approach the “fiscal cliff.”
And it came back in the victory speech, too. The president coupled “reducing our deficit” with “reforming our tax code.” He promised to “continue to fight for new jobs and new opportunities and new security for the middle class,” to “keep the promise of our founding,” that “you can make it here in America if you're willing to try.”
Obama put these economic promises in the same broader ideological context he had used since Osawatomie: “We are an American family, and we rise or fall together as one nation and as one people. ... What makes America exceptional are the bonds that hold together the most diverse nation on Earth, the belief that our destiny is shared. ... This country only works when we accept certain obligations to one another and to future generations.”
So “hope and change” now has a more specific meaning: Struggling against entrenched opposition to force the rich to act, at least a little bit, as if they had an obligation to care about the economic well-being of the rest of us.
Despite the Obama campaign’s efforts to blur and soften this narrative, there was no way -- and is no way -- that it can be separated from the man. It remains the most obvious demarcation between the president and his challenger, who was widely perceived as the embodiment of the wealthy and their power and privileges.
This difference in narratives does not explain Obama’s victory. None of the influential voices in the mass media are saying that. Indeed, so many different explanations are being offered for Obama’s victory that no single story will emerge as “the story” of the day after Election Day, 2012. But history can attach a narrative to a president even if it does not judge that narrative to be the key to his electoral success.
History can also judge the narrative immensely successful regardless of the president’s policies. No one should expect Obama to make a serious dent in the power and privileges of the rich. His first term shows no evidence that he wants to do more than symbolically chip away at the edges of that power and privilege.
Yet symbolism is an immensely powerful force. Kennedy’s youthful vigor never solved his greatest challenge, Vietnam. But it helped give rise to the youth culture of the ‘60s. Nixon could not turn back all the changes the youth culture initiated. But his theme of “law and order” blunted the truly radical power of the ‘60s and paved the way for Reaganism.
Reagan didn’t really shrink government. But he created a mythology that government is the problem, which still reigns in the House of Representatives today. George W. Bush’s war on terror led to fiascos in Iraq and Afghanistan. But drones still kill innocent civilians by the scores because political reality demands that any president must “defeat terrorists.”
Regardless of his policies, if Obama is widely seen four years from now as a successful president, his story of shared destiny translated into greater economic equality will be remembered as the true meaning of “hope and change.” And it will take on a life of its own, exerting a powerful influence on American political life long after Barack Obama leaves the White House.
Credit: HNN staff.
“Who lost Libya?” Mitt Romney has not asked the question exactly that way. Neither has Paul Ryan, nor any prominent Republican politician or commentator, as far as I know. But anyone familiar with the history of U.S. foreign policy since the 1940s can hardly avoid hearing that question, between the lines, in the GOP assault on the Obama administration’s handling of the September 11 killings in Benghazi.
The “Who lost … ?” pattern first emerged after the communist revolution transformed mainland China in 1949. Republicans angrily demanded, “Who lost China?” The taste of omnipotence coming out of World War II was still fresh in Americans’ mouths. It seemed like the U.S. had such immense power, we could control just about everything that happened everywhere outside the Soviet Union and its eastern European bloc.
The Democrats boasted about that apparent omnipotence. Secretary of State Dean Acheson crowed that the U.S. was “the locomotive at the head of mankind ... the rest of the world is the caboose.” The Democrats assumed that claiming credit for achieving such power could only redound to their political advantage.
Then suddenly the Chinese revolution made it seem like a big “red” chunk of the caboose had come loose and was careening out of control. Given the widespread premise that the U.S. controlled the entire “free world,” it was impossible for many Americans to believe that the Chinese had the power, on their own, to release themselves from America’s grasp.
The only logical way to explain it was to assume that someone within the U.S. government had consciously let China go. Someone had committed treachery. It must have been an inside job.
The Republicans saw this explanation as a great chance to neutralize the points the Democrats had scored on foreign policy throughout the 1940s. They insisted that the traitorous villains had to be inside Acheson’s State Department.
The political dynamite was defused in June 1950, when Truman sent U.S. troops -- eventually several hundred thousand of them -- to fight the communists in Korea. That was hardly his main motive, but it was a welcome political side effect.
However, the “Who lost China?” debate had long-lasting effects. Apart from the ensuing purge of the best Asia experts from the State Department (which paved the way for the disastrous U.S. involvement in Vietnam), the debate had a major impact on the narrative of U.S. foreign policy for years to come.
It reinforced the assumption of American omnipotence. To argue seriously about “Who lost China?” implied that we once “had” China, as a sort of possession, and had let it slip from our grasp.
To chalk it up to internal treachery was not merely consistent with the image of U.S. omnipotence; it actually reinforced the image. Now, the story went, just as the U.S. government could hold on to nations at its will, so it could let them go, even though that would always be a mistake.
And the Democrats’ response to the charges -- ramping up the Cold War in Korea and elsewhere -- further reinforced the idea that the U.S. ought to aim, at least, at total control of the “free world.” The Democrats had to say that to reassure a nervous public. The obvious fact that other nations act independently could hardly get a fair hearing.
Nevertheless, the reassuring implications of the debate were offset by a more frightening one. Though we were still holding on to the rest of the “free world,” the “loss” of China showed how fragile our hold was. If we weren’t hyper-vigilant, who knew what country we might lose next. At any time the “dike” might burst (as Dwight Eisenhower warned his National Security Council as it discussed Vietnam in 1954) and the “red tide” would flood our own homeland.
The reassurance and the fear actually reinforced each other. The more Americans worried about “losing” some other nation, the more they reinforced the premise that the “free world” was indeed a possession under our control. And the more we “had,” the more we had to “lose.” So our global control would always be threatened, it seemed. But the bipartisan narrative agreed that strong, wise, patriotic leaders should be able to keep the “dike” firm and hold on to the “free world” forever.
This myth of homeland insecurity became the fundamental myth of American foreign affairs. And Democrats were haunted by the shadow of the “Who lost China?” question. They were constantly on the defensive, vulnerable to GOP charges of being weak on security. Only in the late 1950s and early 1960s did they successfully fend off those charges.
Although the Cold War ended, the myth and its specter of permanent peril endured. Once the “Iron Curtain” fell, the whole world came to look like a possession that we were supposed to control. Every nation was ours to lose. As Colin Powell, chair of the Joint Chiefs of Staff, put it in the early ‘90s, “the real threat is the unknown, the uncertain.” The U.S. needed “the ability to respond to the crisis nobody expected, nobody told us about, the contingency that suddenly pops up at 2:00 in the morning.”
During the Democratic primary contest of 2008, some copywriter for the Hillary Clinton campaign advanced the danger hour to 3:00 am. But the impact of that famous “phone call” ad showed that the myth of homeland insecurity, institutionalized during the Cold War years, was still as powerful as ever.
In early September, 2008, Barack Obama was falling behind in the polls; his campaign based on “hope and change” was stumbling. Then suddenly a new peril appeared on the scene: an impending collapse of the economic system that threatened to flood the nation with disaster. Obama was judged most able to fend off that peril and he surged ahead.
But Obama and his political strategists knew that, as Democrats, they would always be open to charges of being “weak” on security issues. No doubt many factors moved the president to adopt a national security policy that in many ways resembled his predecessor’s. But the need to guard his right political flank was surely one of those factors.
Like the Democrats of the late ‘40s, Obama’s 2012 campaign team expected to score lots of political points by crowing about American domination -- in this case, domination of a splintering, Osama bin Laden-less Al Qaeda. Once again, though, calling attention to homeland security issues put the Democrats in a precarious political position. Having intentionally created an impression of a strong U.S. hand controlling events around the world, they were vulnerable to any event that called their total control into question. On September 11, 2012, in Benghazi, that event arrived.
Credit: Wikimedia Commons/HNN staff.
Today I posted a long article on Truthout.org titled "What's Still the Matter With Kansas -- and With the Democrats?" The title refers to a popular 2004 book by Thomas Frank, exploring the puzzle of why so many people of middling economic means vote for Republicans whose policies so clearly favor the rich and do little to help them. Frank, a Kansan himself, chose Kansas as the place to study a large number of voters who vote against their economic self-interest.
In my article I use "Kansas" as a symbol for all those voters. I argue that Democrats are losing this key demographic group, and maybe this election, because they're unwilling to support values issues dear to the heart of “Kansans” that they could very plausibly endorse.
One little piece of that article may be of special interest to historians. I note another book on the same topic, Red State Religion, by another native Kansan, the eminent sociologist of American religion Robert Wuthnow. He stresses the powerful spirit of community you will find among these Republican voters of Kansas. He also traces the history of that spirit being expressed both in religious communities and in electoral politics. There’s a rich tradition of many “Kansans” voting Democratic for decades, in the nineteenth and early twentieth centuries, when populism and progressivism overlapped in so many ways. Back then, lots of “Kansans” understood the invaluable role of government.
Now, their descendants will still often bend over backwards to help you out when you need it -- as long as they judge you deserving. But, crucially, they insist on reserving that right to judge for themselves. They won’t let any government bureaucrat do it.
Why not? Wuthnow traces the distrust of the federal government back to 1938, when Franklin D. Roosevelt failed to follow through on the promises he’d made in the 1936 campaign. As my Truthout article shows, that’s far too simplistic an explanation. It’s only one factor, and probably not a major factor, in understanding the “Kansas” of today.
Still, it’s an interesting point. Wuthnow does make a strong case for the late ‘30s as the crucial point at which “Kansas” began to support the conservative drive to shrink government.
But he neglects to explore the complexities of that turning point. FDR did not intentionally forsake “Kansas.” He made a couple of bad strategic blunders: insisting on his court-packing plan after it was obviously bound to fail, and campaigning in the 1938 primaries against some stalwart conservative Democrats running for re-election to Congress, who won re-nomination and re-election anyway.
As a result, FDR lost a lot of political capital in Congress. For that and lots of other reasons, Congress became more conservative and blocked progressive measures that FDR probably would have been happy to sign into law.
So FDR was blamed for failures that were mostly caused by an obstructive Congress. Of course back in those days a president was allowed to blame Congress, loud and clear, for obstructing progressive measures that he would have approved.
Today that seems to be pretty much taboo. Barack Obama, who suffered much the same fate as the second-term Franklin Roosevelt, has put very little effort into pinning the blame on the Republicans in Congress. If he tried to make that a major issue, he would be pilloried by the press as a whiner and a weasel, trying to avoid taking responsibility.
I’m not sure why that change in media perspective has happened. But it’s certainly worth noticing.
Credit: Wikimedia Commons.
The prominent psychologist Steven Pinker has a long piece on the New York Times website, trying to explain why Republicans do so well in the South and the West but not in the rest of the country. It seems that it all comes down to how different regions have, historically, dealt with the eternal threat of societal anarchy. Harvard media stars rush in where careful historians usually fear to tread, or at best tread very lightly.
There are plenty of holes in Pinker’s speculative framework big enough to drive most any vehicle you can think of through. For starters, if the North is indeed historically accustomed to counting on government to tame anarchy, as he argues, how to explain the Republican strength in New Hampshire, or in the non-urbanized areas of northern Ohio, Indiana, and Illinois? And if the West (which one assumes includes the “red” Great Plains states) is so accustomed to rejecting government as the tamer of anarchy, how to explain the great political success of Progressivism and farmer-labor coalitions in those states in the days of William Jennings Bryan?
If Pinker’s whole edifice is taken seriously, it quickly dies the death of a thousand qualifications.
But rather than subject it to such a slow, painful death by analyzing it in detail, I’d rather look at the part of the article that has some persuasive power. That means setting aside all the speculation about the history of geographical regions and looking at politics in terms of personal decisions. What makes some people choose a candidate who sees a prominent role for government in society, while others choose a candidate who wants to limit and weaken government's role?
Pinker is an expert on the history of the long-term decline in human violence. So his focus, naturally, is on how people deal with violence and the prospect of it being inflicted upon them.
He links the small-government view to the culture of honor, where individuals -- mostly men -- decide for themselves when they have been offended and how to punish the offenders. They keep “the safeguarding of their personal safety” as their own private prerogative. At best, they cede that power to “their own civilizing forces of churches, families and temperance,” created largely by women.
Those who would allow government a much larger role “are extensions of Europe and continued the government-driven civilizing process that had been gathering momentum since the Middle Ages.” They are especially extensions of “the Age of Reason and the Enlightenment, [when] governments were forced to implement democratic procedures, humanitarian reforms and the protection of human rights.”
If there’s any truth in this speculation, it suggests that less-government advocates live in a social world where collective institutions for curbing violence appear to be relatively weaker and less dependable, compared with the social world of more-government advocates. (Note that I say social, not geographical, world. Two next-door neighbors can -- and from the campaign yard signs I see in my town, often do -- live in totally different social worlds.)
Explaining how and why those different social worlds arose is like explaining the weather: The causal factors are way too complicated, with far too many variables, to be modeled completely on even the most sophisticated computers. The best we can hope for are partial explanations, depending on what particular questions are asked. Historians and social scientists should certainly keep on vigorously pursuing those questions. But they should not hope for the kind of simple, all-encompassing explanation that Pinker offers here.
However, like the weather, the effects of different social worlds can be understood with a lot more certainty than the causes. People who feel relatively less protected from offense and violence, for whatever reasons, are more likely to feel more vulnerable, to see the world as a more threatening place and other people as sources of threat. So they are more likely to draw upon the mythology of homeland insecurity to make sense out of their experience -- a mythology based on the premise that we Americans will always face some serious threat to our very existence.
People who feel relatively safer from offense and violence are more likely to feel more protected, to see the world as a place where people can cooperate because others are not such sources of threat. So they are more likely to draw upon the mythology of hope and change to make sense out of their experience -- a mythology that says people can work together to make a better community for all, using government as their collective agent.
In the current presidential election we might seem to have a direct head-to-head competition between the two social worlds, with the two locked in a virtual tie. But things are more complicated. The number one apostle of the mythology of hope and change, Barack Obama, states bluntly that “the first role of the federal government is to keep the American people safe.” That’s “homeland insecurity” at its best.
Perhaps he is simply trying to appeal to the less-government advocates, so he can peel off enough of their votes to eke out a victory. If so, it’s good evidence of how strong the mythology of homeland insecurity is.
But I think this is better evidence of how closely the two great mythologies are intertwined. Pinker’s “red state vs. blue state” kind of analysis is popular for the same reason athletic contests of all kinds are so popular. We want to see two clearly defined sides fight it out and, in the end, have a clear-cut winner and loser.
But it doesn’t match the reality of American life. No one feels absolutely threatened or absolutely secure. Like Pinker’s “red state” and “blue state” personalities, these absolutes are ideal types, useful only for theoretical purposes.
In fact, all of us live somewhere on a spectrum between those two theoretical constructs. All of us feel some degree of threat and vulnerability, and some degree of safety and protection. So all of us are drawn to both of the great mythologies. How we vote will depend largely on the particular mix of the two within our minds and our autonomic nervous systems.
It’s not surprising, then, to see both of the major party presidential candidates drawing on both of the mythologies and blending them together. Each does it in his own way, gesturing somewhat more toward one end or the other of the spectrum. But both recognize that the crucial swing voters are in the middle of the spectrum, with their sense of vulnerability and their sense of protection balanced in roughly equal measure.
That seems to sum up the state of the union in the autumn of 2012. No one can yet predict which way the balance will tip by Election Day.
For the as-yet-undecided, it’s worth remembering that even the smallest gesture toward one end or the other of the spectrum is a self-fulfilling prophecy. Those who act as if the institutions that protect us are relatively weak end up weakening the institutions that protect us, so that ultimately we are all in fact more vulnerable. And that’s true no matter where we live.
Tammy Baldwin in 2010. Credit: Flickr/Center for American Progress.
In case anyone doubts the power of myth and symbol in American politics: In the dead-heat race for the Senate in Wisconsin, one issue now towers over all others, the Washington Post reports. It’s not health care or education or energy or immigration. No, it’s Democratic Congresswoman Tammy Baldwin’s 2006 vote against a purely symbolic bill to continue recognizing September 11 as a national day of remembrance and mourning.
Baldwin voted against the bill because it included a clause endorsing the Patriot Act and a host of other post-9/11 legislation, which few people had read completely and even fewer understood thoroughly.
But an ad by Baldwin’s opponent, former Wisconsin governor and secretary of health and human services Tommy Thompson, conveniently omits that explanation and all the symbolic recognitions of 9/11 that Baldwin did vote for. Instead, the ad features military personnel and veterans charging that Baldwin dishonors the victims of 9/11, disgraces the flag, slaps every one of America’s military personnel in the face, puts the nation’s security “in jeopardy,” leads us down “a very dangerous path,” and doesn’t care about America’s children. All this from one symbolic vote -- and in 30 seconds.
As a piece of political advertising, it has impressive production values and certainly tugs at plenty of voters’ heartstrings. But only one thing sets it apart from many other such slick ads: It has now made Baldwin’s no vote six years ago the pivotal issue in the far-too-close-to-call contest, according to WaPo reporter Aaron Blake.
It would take an entire book to unpack all of the symbolic and mythic narratives crammed into those thirty seconds. I won’t even try to outline the table of contents of that book here. I simply want to note what a huge role pure symbolism can play in what we think of as the very real world of power politics, as if “symbolic” or “mythic” and “real” were somehow opposites.
But if we define “real” as whatever makes a difference in the world, then in Wisconsin in 2012, at least, the mythic symbolism is the dominant reality. If a politician as liberal as the fifty-year-old Baldwin enters the Senate, she might well be there for three decades or more, moving up to committee chairs, wielding significant influence, and thereby nudging the Senate at least a bit further to the left. If she’s kept out by the emotional impact of this ad and this issue, the future of the Senate will be at least a little different for decades to come.
Moreover, Wisconsin is still very much a toss-up in the presidential race. Voters’ feelings for or against Baldwin are sure to influence the fate of Wisconsin’s ten critical electoral votes.
In this context it’s also worth recalling a former senator from Wisconsin, named Joseph McCarthy. Talk about myth and symbolism becoming political reality!
All this is a useful reminder that myths and symbols are political realities, deserving the same careful attention we give to any other political reality.
McGovern vs. Nixon campaign pamphlet, 1972. Credit: Pennsylvania AFL-CIO.
George McGovern was the first presidential candidate I actively campaigned for. Like many baby boomers, I stood on the street corner handing out “Vote for McGovern” handbills. The fifty-year-old Democrat was so unlike other politicians that we gave him a special exception to our first commandment: Never trust anyone over thirty.
Under thirty? Sure. We knew we could trust each other. Or so we thought.
But the day before George McGovern died, I stumbled across a little-known fact that took me back those forty years and made me wonder whether my trust was misplaced.
Assuming that we can trust the data compiled by American National Election Studies, it seems that on Election Day 1972, of my fellow under-thirty, baby-boomer voters, only 47 percent marked their ballots for McGovern; 53 percent voted for Richard Nixon.
There are at least two good lessons here: First, our knowledge of the body politic depends largely on who we hang out with. We tend to assume too easily that the people we know in our own demographic groups (age, gender, race, whatever) represent the entirety of those demographics. I suppose we ought to get around more, talk to more people who are like us demographically but not politically.
The other lesson is that the common wisdom handed down as history is often not borne out by the facts. I suppose we ought to do more empirical research and less parroting of the common wisdom (of which, in this case, I was guilty all these years).
By coincidence, on the day George McGovern died I learned another fact about that 1972 election: women voted overwhelmingly for Nixon, in virtually the same numbers as men. And there was no gender gap at all in 1976. Since 1980, though, Republican presidential candidates have done far better among men and Democrats far better among women. It looks like the same pattern will repeat again this Election Day.
When I mentioned this to my wife, she asked an obvious question that I’ve rarely if ever seen discussed in all the fevered analysis of the polls: In presidential elections, do more women vote, or more men, or is it roughly equal?
Since I had the American National Election Studies website up on my computer, it was easy to get an answer: In ’72 and ’76, women voters far outnumbered men. It didn’t matter much then, since there was virtually no gender gap.
But since 1980, women have continued to outnumber men by nearly as much. On average, roughly 54 percent of voters have been women. To repeat: In all those elections, women have voted Democratic in significantly higher numbers than men. So if the vote had been evenly split between the two genders, the Republicans would have done significantly better.
In 2000, for example, Al Gore won the women’s vote by 11 percent. George W. Bush got the men by 9 percent. But 56 percent of the voters were women. Had it been 50-50, Bush would have won easily and the Supreme Court would have been spared its worst embarrassment in living memory.
As far as I can tell, since 1972 (when the stats I have on the gender gap begin), there’s no case where the preponderance of women was the decisive factor; i.e., where a 50-50 gender turnout would have swung the election to the other candidate.
But 2012 could be a first. As close as this election is, and with the gender gap as large as ever, if the pattern of women outnumbering men by about 8 points continues, Barack Obama might well gain re-election solely due to the women’s vote.
The larger point here is that, since 1980, the presidential vote has not accurately reflected the political views of the population at large (assuming that the gender split in the overall population is roughly 50-50, which is roughly the case in the U.S.). With so many more women voting, the electorate has trended a bit more Democratic than the whole body politic. In other words, the presidential election results have led us to think that the American people were a bit more liberal than they really were.
In the same way, the mythic tale of George McGovern and the “youth vote” led us to think that the baby-boomers of “the ‘60s” were a bit more liberal than they really were.
Considering what hard times liberals have often faced since 1972, it’s a bitter pill for those of us on the left to learn that the reality has been even worse than we thought.
It’s sad that we no longer have George McGovern with us. He was such a fine model of the committed liberal who keeps on speaking up for what he (or, more likely, she) believes in, regardless of how chilly the political climate may be.
Credit: Flickr/Obama for America.
Did you think the second presidential debate was too nasty, that it was sad to see the two lead actors portray such a polarized image of American politics? The third performer up on the stage, moderator Candy Crowley, didn’t think so.
“They were talking to their bases who want to see them stand up to each other,” Crowley said on CNN after the debate. “They were so good being at each other’s face, and I thought this was a debate, so I let it go. … It was so good.”
The woman with the only front row seat didn’t seem to be interested in the content of the candidates’ arguments, much less their logical coherence. She cared about the show. And as long as they were at each other’s face, “it was so good.”
A long-time TV professional, who has made television her life, naturally judges the debate by the same criteria she would use to judge any other television show. And appropriately so, since the debate is above all television entertainment.
That’s why when debate season rolls around I always turn to TV critics, like the New York Times’ David Carr. What struck Carr most about the first debate was not anything about the content. It was the extraordinary size of the audience -- over 70 million -- “breaking a 32-year-old record in viewership.” (And there was every likelihood that the second debate would score even higher.) Only the Super Bowl drew more viewers -- a TV show where we don’t merely hope, but know with certainty, that the performers will be at each other’s face.
“Credit live event television,” Carr wrote, “the last remaining civic common in an atomized world. While ratings for almost everything on television have sunk, big spectacles that hold some promise of spontaneity -- N.F.L. games, the Olympics and various singing competitions -- continue to thrive.” And, of course, so do the presidential debates, as long as the race is close enough that the big prize is at stake.
Carr quotes Jeff Zucker, former chief executive of NBC Universal: “Television is about drama, whether it is the Olympics, the Super Bowl, or ‘Homeland,’ and these debates have provided incredibly great drama. It just proves the adage that if you put on a good show, and both of these debates have been very good television, the audiences are going to be there.”
Carr and Zucker didn’t say it, but they know as well as Candy Crowley what makes great drama that draws big audiences: conflict, characters standing up to each other and being at each other’s faces.
Crowley and Carr were merely two of the thousands of journalists and commentators, not only on TV but in every news medium, who all read from the same prescribed text: It’s fundamentally about performance. Obama lost the first debate because of his poor performance. In fact, he lost mostly because of his performance when he wasn’t speaking. So the content of his words could not have played much role at all in his loss.
That’s why everyone was focused on Obama’s performance in the second debate. And he played it pitch perfect. When Romney spoke, Obama showed no scorn or disinterest or boredom. He was all ears, apparently paying attention with the appropriately neutral face. But when it was his turn to speak, he was at Romney’s face -- certainly not all the time, but enough to make it the biggest news event of the night.
Romney gave as good as he got, though -- letting the New York Times website headline (happily, I trust), “Rivals Bring Bare Fists to Rematch.”
Media professionals don’t really care who won, as long as they get a good conflict-packed show. Having one candidate declared the surprise, clear-cut winner, as in the first debate, is a bonus; it makes the show even better.
Most voters will agree it was a good debate. It offered enough conflict to create a good drama, which is always entertaining.
But the voters care about more than just production values and being entertained. They have a much more urgent question than “Was it a good show?” As Maureen Dowd put it, “Every election has the same narrative: Can the strong father protect the house from invaders?” That’s the question the voters ask about each candidate -- consciously or unconsciously -- as they watch the two perform.
That’s bound to be the crucial question in a nation whose political life is shaped so much by the myth of homeland insecurity -- a myth that says invaders are always outside, threatening to burst through the door and destroy us if our leaders don’t have fists strong enough to keep them out.
There’s no common agreement about who the invaders are. Indeed, one way to understand American political discourse is to see it as a debate about the name of the truly threatening invader. Is it the rich who thrive in an unregulated, runaway, overly free market? Or is it the government, imposing too much taxation and too much regulation? Or perhaps the terrorists? Or maybe it doesn’t matter so much who, exactly, the invaders are.
The crucial question is which candidate is strong enough to keep out the invaders, whoever they may be.
Oh, perhaps you thought the crucial question had something to do with the economy, since you’ve been told that about a zillion times. Consider this:
In CNN’s instant (but “scientific”) poll of second debate watchers, well over 55% said Romney would be the better president when it comes to boosting the economy and lowering the deficit. But the same group awarded Obama a victory in the debate by the sizeable margin of 46% to 39%.
Obama lost the first debate, the media consensus agrees, because he simply did not look strong enough to protect the house. In the second debate, he was warned, he had either to look strong enough or to expect defeat on Election Day. He certainly got the message, proved himself up to the task, and took home the blue ribbon.
But Romney did a creditable job of performing the role of strong father, too. So he’s not out of the race by any means. It will continue to be close unless one or the other candidate shows a moment of major weakness.
Whoever wins, though, this debate will stand as evidence that many voters are looking for both good entertainment and that strong father to soothe their insecurities. Perhaps they are looking for good entertainment mostly because it, too, soothes their insecurities.
Romney as Pagliacci -- acting out the theater state. Credit: Flickr/HNN staff.
There’s an old theory that people perform religious rituals as a way of acting out their sacred myths. Scholars of religion don’t take this old theory very seriously any more. It’s far too simplistic and misses too many aspects of the meaning and function of ritual. Sometimes, though, this theory still sheds interesting light on rituals. It’s especially useful when a ritual does pretty obviously act out a myth and the people performing the ritual tell you that they are reenacting one of their myths.
A fine example is the Christian ritual of Eucharist: eating the body and drinking the blood of Christ. In the Gospel story of the Last Supper, Jesus explicitly tells his disciples to keep on eating bread and drinking wine after he is gone, because those consumables are his body and blood. When you ask Christians who believe that the consumables literally become the body and blood why they are doing the ritual, they’ll tell you that they are obeying Jesus’s command and doing exactly what the disciples did. They are acting out their sacred myth.
Christians, when I call the Gospel story a myth, please don’t be offended. I don’t mean it’s a lie. A myth is a narrative that people tell to express their most basic views about what the world is like and how they should live in it. The myth serves that purpose whether it’s totally false, totally true, or (as is usually the case) some mixture of the two. So it’s perfectly possible that every word in the Gospels tells us what actually, literally happened in the life of Jesus of Nazareth. The Gospels would still be Christians’ mythology.
Fact-checking the myth is irrelevant to its role in the lives of the people who tell it. They do not judge it by whether it can be proven factually true. Rather, it shapes their view of truth; it tells them what they can accept as factually true and what they must consider false. So they act out their myth in a ritual to reinforce their commitment to truth as the myth teaches them to see it -- or so the old theory goes.
It’s worthwhile dusting off that old theory in this election season, which presents us with an interesting twist: What happens when fact-checking itself becomes a ritual? I don’t have quantitative data, but it seems to me that we have much more fact-checking in this presidential election than in any election before. Fact-checkers seem to be all over the place.
And the mass news media promote their fact-checking as a major part of their campaign coverage. They treat it as something their audience really wants. Since they are in business to make money, presumably they do have quantitative data; presumably they’ve done market research that shows they can increase their ratings or readership with all that fact-checking.
Why is fact-checking so popular? The traditional American view of democracy has a ready answer: The people know that, to be responsible voters, they must know the facts. How else can they judge which party’s policies are best for the nation? And they must know whether the candidates are leveling with them. We want a president who is a straight-shooter, not one who will deceive us for his or her own political gain.
There’s a complex myth of democracy packed into that little story. There’s a basic premise: Democracy can work because we humans are rational animals. We are built to be fact-checkers; we all have the capacity to separate true facts from lies. And once we have true facts, we know how to analyze them logically to come to reasonable conclusions. If that weren’t true, democracy would be a foolish experiment, indeed.
But, the myth goes on to say, a capacity is useless unless it is developed through training. That’s why democracy demands universal access to education. How much education is a matter of debate; other democracies tend to set the bar higher than we Americans. The basic concept is the same in all democracies, though: Only educated people can be responsible citizens because only the educated have actualized their potential for fact-checking and rational thinking.
Many of the reformers who promoted universal public education in the nineteenth century (for boys at least; some weren’t sure about girls’ capacity for reasoning) were motivated by that myth. Of course capitalism also drove education reform; the industrial revolution created a demand for more educated workers, just as the high-tech revolution has in our own time. But a genuine commitment to the mythic vision of democracy played a significant role back then. (We’re probably too close to evaluate how much of a role it plays in moves toward expanding educational opportunity today.)
The myth of democracy says that citizens must be educated enough to know which policies are best for their community. But good citizens must also bring their rationality into the polling booth. They must know which candidates promote and implement the right policies. They must know whether incumbents have done so, and whether challengers might do better. That means they must have honesty from their leaders and transparency from their government.
Hence, the need for fact-checkers at every step on the campaign trail. It’s only logical.
Except that there’s no evidence all the fact-checking has any measurable impact on the voters’ choices.
As soon as the first presidential debate ended, many Obama supporters were quite gleeful. Mitt Romney had made so many demonstrably false statements, and denied his own positions so often, that it seemed like a bonanza for the Democrats. They duly set about broadcasting that bonanza, falsehood and deception by falsehood and deception.
And look what they got for their efforts.
Even the prominent pro-Obama intellectual Robert Reich, a master of progressive ideas, opens and closes his “Memo to the President” for the next debate with advice about performance style. Though Reich offers plenty of ideas too, he knows that in the outcome of the first debate ideas mattered hardly any more than facts did. Romney won on style points alone.
The “theater state” is a performance art. Every candidate is judged, above all, on their performance. Good theatrical performers know how to create satisfying illusory images of truth. It’s one of their highest skills. Mitt Romney proved that in the first debate. The big question, all the mass media reports tell us, is whether Barack Obama can prove equal to the task in the second debate.
Michael Scherer’s conclusion to his perceptive Time cover story on fact-checking is quite on the money:
When the final book is written on this campaign, one-sided deception will still have played a central role. As it stands, the very notions of fact and truth are employed in American politics as much to distort as to reveal. And until the voting public demands something else, not just from the politicians they oppose but also from the ones they support, there is little reason to suspect that will change.
But why should the voting public demand something else? They’ve already got this enormous stage in the political theater packed to the rafters with fact-checkers. The fact-checkers are performing their duly appointed role in the drama, just as the candidates are. The fact-checkers, too, are seasoned performers skilled in the art of creating satisfying illusory images of truth.
Above all, they create the illusion that American democracy is alive and well because the public is apparently being informed of the facts and the veracity of each candidate is apparently being carefully evaluated and widely reported. Fact-checking, then, is the ritual enactment of our myth of democracy. As long as the myth keeps getting acted out, we can trust that it is alive and well.
There has been growing suspicion over the years about whether democracy really is alive and well in this postmodern world, where signs are increasingly detached from the reality they claim to signify. The ritual of fact-checking eases the anxiety about the state of our democracy in this “theater state.” That, I submit, is why fact-checking is so popular.
Are you worried about the looming “fiscal cliff”? Well if it’s your only worry about the American economy, you’re not worried nearly enough. There are plenty of other economic cliffs out there, just waiting for you.
That’s the lead story on the front page of this past Sunday’s Washington Post. “Even if Washington somehow finds a way to avoid the fiscal cliff -- the automatic tax hikes and federal spending cuts that threaten to plunge the nation back into a recession --” Zachary A. Goldfarb warns us, “the economy could suffer a stiff blow next year.”
Tax hikes and spending cuts could take billions of dollars out of the economy. But if we extend tax cuts and cancel spending cuts, we’ll increase the federal debt, bringing new and unpredictable economic suffering. So we’re trapped.
There’s no glimmer of good news to counter the gloom and doom brought to you by the WaPo. There’s only an overwhelmed Congress and administration, forced to grapple with an impending economic apocalypse. We’re likely to go over one cliff or another, it seems -- no matter who wins the election.
Indeed, from this article you wouldn’t even know that there is an election coming up. The apocalyptic threat is treated as a fact of life that transcends politics.
We’ve seen endless news reports and opinion pieces for a long time now telling us that this is “the new normal.” It doesn’t always mean that we’re doomed to go totally over the cliff. But it always means something pretty disastrous compared to the promise of endlessly growing prosperity, which was, until recently, taken for granted in our shared national story.
A permanent possibility of disaster is nothing new in the American story, though. What’s new is to find it in the domestic, economic arena. When it comes to foreign affairs, Americans are accustomed to living with apocalyptic danger as the norm, expecting their government to manage the threat at best, but never to extinguish it.
We first learned this fear-ridden way of life back in the 1950s. Of course then the threat was “the reds.” The Eisenhower administration created the foreign policy that Ike’s successors followed throughout the cold war, the policy I call “apocalypse management.”
Eisenhower warned publicly that we were “not in a moment of peril, but an age of peril.” In an internal White House memo, a staffer described it as “the new normal.” After the 9/11 attack, Dick Cheney used the same phrase to describe the supposedly endless “war on terror.”
Now, the WaPo suggests, permanent fear is still the new normal. The only difference is that the peril comes from the economy within.
Over at the Sunday front page of the nation’s other most influential newspaper, the New York Times, the horizon is just a tad brighter. There’s a big color photo of Donna’s Diner in Elyria, Ohio, with the dawn’s early light barely relieving the gloom of night. The headline reads: “At the Corner of Hope and Worry: A Small Café, and a Small City, are Put to the Test by a Tough Economy.”
Below is a photo of Donna, the proprietor, holding her hands in an obviously prayerful gesture, with anxiety etched on her face. It looks like there’s still a chance that Donna, an iconic ordinary American, will somehow avoid the cliff, pass the test, and make it through these tough times -- if she has enough hope and faith, the photo suggests. But no one can say for sure.
Put these lead stories from the nation’s two most prominent newspapers together and you get a complicated narrative: As we head toward a domestic apocalypse, there’s not much the government can do about it. The politicians will try their best to manage this “new normal.” But they are so hopelessly tangled in their internal contradictions, we can’t count on them for anything. We would do better to put our hope in the faith and resilience of ordinary Americans, people just like you and me. That sounds like a very Republican message.
When it comes to foreign policy, presidents of both parties have offered much more than that when they pledged to protect the American people from “the red menace” and "the terrorists." They never said it was up to the people themselves to keep the nation safe. They promised that the government would do the job.
Democrats traditionally made the same promise when economic apocalypse loomed. William Jennings Bryan famously preached that the Democrats would save the people from being crucified on a cross of gold. Franklin D. Roosevelt asserted that, if Congress failed to halt the Great Depression, he would ask for “broad Executive power to wage a war against the emergency, as great as the power that would be given to me if we were in fact invaded by a foreign foe.” That was the biggest applause line of his first inaugural address.
In the same address Roosevelt summed up the traditional Democratic view of how ordinary Americans respond to crisis. He insisted “as a first consideration, upon the interdependence of the various elements in all parts of the United States -- a recognition of the old and permanently important manifestation of the American spirit of the pioneer.” FDR knew that the pioneers were no rugged individualists. They built their communities by working together, using government as their agent.
Barack Obama seemed to be building his campaign on the same kind of message, until he lost his narrative way. He still offers more from government than Romney, to be sure. But now, as the race tightens and he knows he’ll have to work with another Republican House, he seems to focus more on the power of “ordinary people.” In his closing remarks in the first debate, all he could offer from government was to “channel” the “genius, grit, and determination” of the American people.
If that is the political narrative of the future -- if the alternative that Democrats once offered is so muted in our national conversation -- then today’s “new normal” is something new indeed. And the real looming tragedy is the way it diminishes the possibilities for a better future that we all could enjoy if, like true pioneers, we expected -- and elected -- government leaders to serve our common interests.