Trump's Executive Order Censors Free Speech on College Campuses

 

In 1961, a historian at the University of Pittsburgh named Robert G. Colodny was called before the House Un-American Activities Committee. Colodny was just one of HUAC’s many targets, a list which included screenwriters like Dalton Trumbo and playwrights such as Arthur Miller. HUAC remained a fearsome and fundamentally anti-democratic means of intimidation and often professional ruin even after the height of the McCarthy era’s Red baiting. The professor drew suspicion after he innocuously referred to Cuban “agrarian reforms” in the Pittsburgh Press, which was enough for a local state representative to label Colodny a communist sympathizer. Shortly after, Congress and then the university itself launched investigations. This, it should be said, is what an attack on academic freedom looks like. 

Part of what contributed to the professor’s new-found notoriety was that Colodny had been among those idealists and visionaries, including writers like Ernest Hemingway and George Orwell, who enlisted themselves in the army of the democratically elected government of Republican Spain, which in the late 1930s was threatened and ultimately defeated by the fascist forces of the future dictator Francisco Franco. They were often tarred as “prematurely anti-fascist,” with historian Adam Hochschild explaining in Spain in Our Hearts: Americans in the Spanish Civil War 1936-1939 that for those fighters the conflict was “seen as a moral and political touchstone, a world war in embryo, in a Europe shadowed by the rapid ascent of fascism.” Franco received aid and assistance from Mussolini and Hitler, with the Luftwaffe’s brutal destruction of the Basque city of Guernica indeed a prelude to the coming horror of the bloodiest war in human history. Women and men like Colodny, who served in the international brigades, correctly believed that right-wing nationalism and international fascism should be countered on the battlefields of Spain. As Orwell would write in his 1938 account Homage to Catalonia, “I recognized it immediately as a state of affairs worth fighting for.” From 1937 until the following year, Colodny fought in a battalion of volunteers known as the Abraham Lincoln Brigade, the first integrated unit of American soldiers, part of the international brigades whose leftist volunteers, drawn from over fifty countries, fought against the Spanish fascists. The future professor sustained a gunshot wound above his right eye which left him partially paralyzed and blind. Despite his injuries, he’d later serve in the American armed forces, going on to receive a doctorate in history at the University of California at Berkeley, where he specialized in the philosophy of science.

Such were the vagaries of a fascinating, if unassuming, professional career until Colodny was called to account for his anti-fascist record. After his congressional testimony, the University of Pittsburgh was under pressure to terminate Colodny’s appointment, but after six months of investigation the university concluded that the professor’s political opinions and service didn’t constitute a reason for dismissal. Pitt’s Chancellor Edward H. Litchfield wrote in his conclusion to the investigation, in a statement that deserves to be the canonical statement on academic freedom, that a university “embraces and supports the society in which it operates, but it knows no established doctrines, accepts no ordained patterns of behavior, acknowledges no truth as given. Were it otherwise, the university would be unworthy of the role which our society has assigned it.”

It is as moving and apt an encapsulation of the free inquiry that lies at the heart of American higher education as any that’s ever been written, and one that as of today is under serious threat from the machinations of the Trump administration. On March 21st Trump signed an executive order with the anodyne designation of “Improving Free Inquiry, Transparency, and Accountability at Colleges and Universities,” a declaration that by name alone would be easy to assume is congruent with Litchfield’s idealistic argument of half a century ago. But the order’s language, which claims that we must “encourage institutions to appropriately account” for free inquiry in their “administration of student life and to avoid creating environments that stifle competing perspectives,” lacks not just Litchfield’s poetry, but indeed means the exact opposite of that earlier defense. Trump’s order, fulfilling a promise to his right-wing supporters and their long-standing obsession with a perceived liberal bias in the academy, exists not to promote inquiry, but to stifle it; not to expand perspectives, but rather to limit them; not to encourage free speech, but to censor it. 

Trump’s order germinated out of the debate surrounding the scheduling of fascist speakers at universities. Today’s order can arguably be traced back to an incident at Colodny’s alma mater of Berkeley, which incidentally was also the birthplace of the Free Speech Movement of the 1960s. In 2017 violent confrontations between political groups, none of whom were affiliated with the university, led to the cancelling of one event due to security concerns. Importantly, the university had approved this speaker’s visit, and the speaker was to be paid with student activity funds. At no point was the speaker censored or oppressed, despite his abhorrent views. 

With his characteristic grammar, punctuation, orthography and enthusiasm for capitalization, the president tweeted on February 2, 2017 that “If U.C. Berkeley does not allow free speech and practices violence on innocent people with a different point of view – NO FEDERAL FUNDS?” Tallying the inaccuracies in a Donald J. Trump statement is a bit like searching for sand at the beach, but it should go without saying that neither Berkeley faculty nor its administration had enacted “violence on innocent people.” Rather, a hitherto invited right-wing speaker arrived with his own retinue of supporters who were countered by community groups not affiliated with the university itself, and unsurprisingly hate speech generated hate. 

The language of the March 21st executive order is nebulous, but seems to imply that colleges and universities will lose federal funds if they choose not to host certain speakers. This is, as should be obvious, the opposite of free speech. A university has every right to decide who will speak on its campus, and the community certainly has the right to object to certain speakers, who are normally paid from the budget generated by student activity fees. It’s unclear if such a federal order will be consistently applied, so that an evangelical college would be required to invite pro-choice speakers, or a Christian university would have to pay visiting atheist lecturers, but I’ll let you guess what the intent of the proclamation most likely is. 

Colodny’s brother-in-arms George Orwell would probably have something astute to say about the manner in which the Trump administration has commandeered the language of free speech so as to subvert free speech. At the very least you have to appreciate the smug arrogance of it. Such an executive order, which is red meat to Trump’s base, is the culmination of two generations of neurotic, anxious, right-wing fretting about apparent liberal infiltration of colleges and universities. While it’s true that faculty, depending on discipline, tend to vote liberal, you’re as likely to find a genuine Marxist among university professors as you are to find an ethical member of the Trump administration itself. Furthermore, this concern over “political diversity” is only raised when conservatives feel threatened, and academe is simply the one small corner of society not completely dominated by the right. Ask yourself what insecurity encourages those who dominate the executive branch, dozens of state governments, business, and increasingly the judiciary to continually fulminate about academe, Hollywood, and the media.

You’ll note that the concern over the perceived lack of political diversity among faculty normally begins and ends at the social sciences and humanities, though more recently the natural sciences have also been attacked for daring to challenge the conservative ideological orthodoxy on issues such as climate change. Conservatives aren’t concerned about a lack of diversity among business faculty, or even more importantly among the trustees of colleges and universities, where every higher education worker knows that the real power is concentrated. For that matter, there is no equivalent hand-wringing about political diversity on corporate boards, though perhaps a socialist sitting in on a board meeting at the Bank of America could have done us all some good in 2008. Nobody in the Republican Party seems terribly concerned that other professions which hew to the right, be they law enforcement officers or investment bankers, don’t have a “diversity” of political opinions represented in their ranks. 

That’s because today’s order obviously has nothing to actually do with free inquiry and diversity, but rather intends to put a stranglehold on it. Terry Hartle, the senior vice president for government and public affairs at the American Council on Education, said in a speech that “As always in the current environment, irony does come into play. This is an administration that stifles the views of its own research scientists if they are counter to the political views of the administration… And the president vigorously attacks people like Colin Kaepernick.” 

It’s impossible to interpret much of what the administration does without an awareness of its finely honed sense of sadistic irony and mocking sarcasm. In such a context, Thursday’s executive order, whose full ramifications remain unclear, is far from a defense of free inquiry but rather a sop to those like right-wing activist David Horowitz, director of his own self-named and so-called “Freedom Center,” or the administrators of the website Professor Watchlist, maintained by the conservative group Turning Point USA. Trump’s executive order is an attempt to return us to the era in which Colodny could be fired for his progressive views, an age of blacklists and loyalty oaths. 

Anyone who attends a college, has children enrolled, or works in higher education is amply aware that the state of the American university is troubled. The recent admissions scandal whereby wealthy parents simply paid their children’s way into elite institutions (as cynically unsurprising as this may be) only underscores the malignancies which define too much of post-secondary education in America today. College is too expensive, too exclusionary, and its resources are misallocated. The academic job market is punishing, and does not serve the graduate students who aspire to professorial jobs. Undergraduates take on obscene amounts of debt, and the often-inflated reputation of the Ivy League and a handful of other institutions still sets too much of the tenor of American social, political, and cultural life. But none of these problems are because the university is too “liberal.” To the contrary, American higher education could stand to move a lot more to the left in terms of admissions and employment. If anything, the current crisis in higher education is most closely related to the imposition of a certain business mentality upon institutions whose goal was never to be the accumulation of profit for its own sake. 

Because despite its contradictions, American higher education has historically remained the envy of the world. There is a reason that international students clamor for a spot at an American college. Since the emergence of the American research university in the 19th century, higher education has been at the forefront of research and innovation. Even more importantly, democratizing legislation such as the GI Bill and affirmative action transformed American universities into the greatest engine of upward class mobility in human history. It’s not a coincidence that conservative attacks on higher education occurred right at the moment when it became available to the largest number of people, but the nature of these most recent attacks, making federal funding contingent on which right-wing agitator receives a hefty speaker’s fee, could have a chilling effect on education. 

Sociologist Jonathan R. Cole writes in The Great American University that our system of higher education has been “able to produce a very high proportion of the most important fundamental knowledge and practical research discoveries in the world.” By intervening in the details of who is invited to speak on a college campus (which is of course separate from censorship), the federal government threatens the independence and innovation of higher education, imposing an ideologically approved straitjacket upon that which has historically been our great laboratory of democracy. Colodny wrote that the goal of higher education was to ensure that “some traditional holder of power feels the tempest of new and renewing ideas.” The man who currently occupies the Oval Office can’t abide either of those things, and so he’d rather burn it all down than spend a moment being threatened by institutions that actually enshrine free inquiry. The gross obscenity is that he’s self-righteously claiming the mantle of that same free inquiry to do it. 

 

 

What is Antisemitism?

Steve Hochstadt teaches at Illinois College and blogs for HNN.

 

 

Antisemitism is alive and well these days. In Europe and America, the number of antisemitic incidents is increasing every year, according to those who try to keep track.

 

News about antisemitism has recently wandered from the streets and the internet into the halls of Congress. The presence of two newly elected young Muslim women in the House, who openly advocate for Palestinians against Israel, has upset the strongly pro-Israel consensus that has dominated American politics for decades. Accusations of antisemitism are especially directed at Ilhan Omar from Minneapolis, who has used language that is reminiscent of traditional antisemitic themes in her criticism of Israeli policies. Her case demonstrates that it can be difficult to distinguish between unacceptable antisemitism and political criticism of the Jewish government of Israel and its supporters.

 

Some incidents seem to be easy to label as antisemitic: for example, when a large group of young people physically attacked Jewish women while they were praying. Many women were injured, including the female rabbi leading the prayers. The attackers carried signs assailing the women’s religious beliefs, and the press reported that the women “were shoved, scratched, spit on and verbally abused”.

 

An obvious case of antisemitism? No, because the attackers were ultra-Orthodox Jewish girls and boys, bussed to the Western Wall in Jerusalem in order to attack the non-Orthodox Women of the Wall, who were violating misogynist Orthodox traditions about who can pray at the Wall. This incident fulfills every possible definition of antisemitism. For example, the International Holocaust Remembrance Alliance offers the following description of public acts that are antisemitic: “Calling for, aiding, or justifying the killing or harming of Jews in the name of a radical ideology or an extremist view of religion.” The ultra-Orthodox leaders who encouraged the assault would argue that they were protecting, not attacking Judaism, and that the Women of the Wall were not really Jewish anyway.

 

Acts of antisemitism are political acts. Accusations of antisemitism are likewise political acts, deployed in the service of the political interests of the accusers. Many, perhaps most accusations of antisemitism are made in good faith for the purpose of calling attention to real religious prejudice. But such accusations are often made for less honest political purposes.

 

The Republicans in Congress who demand that Democrats denounce Ilhan Omar are cynically using the accusation of antisemitism for political gain. Many Republicans have themselves made statements or employed political advertisements that are clearly antisemitic. The rest have stood by in silence while their colleagues and their President made antisemitic statements. But they saw political advantage in attacking a Democrat as antisemitic.

 

Supporters of the Israeli government’s policies against Palestinians routinely accuse their critics of antisemitism as a means of drawing attention away from Israeli policies and diverting it to the accusers’ motives. Sometimes critics of Israel are at least partially motivated by antisemitism. But the use of this rhetorical tactic also often leads to absurdity: Jews who do not approve of the continued occupation of land in the West Bank or the discrimination against Palestinians in Israel are accused of being “self-hating Jews”.

 

This linking of antisemitism and criticism of Israeli policy has worked well to shield the Israeli government from reasonable scrutiny of its policies. In fact, there is no necessary connection between the two. Criticism of current Israeli policy is voiced by many Jews and Jewish organizations, both religious and secular.

 

Supporters of the idea of boycotting Israeli businesses as protest against Israeli treatment of Palestinians, the so-called BDS movement, are sometimes assumed to be antisemitic and thus worthy of attack by extremists. But the pro-Israel but also pro-peace Washington Jewish organization J-Street argues that “Efforts to exclude BDS Movement supporters from public forums and to ban them from conversations are misguided and doomed to fail.” I don’t remember that any of the supporters of boycotting and divesting from South Africa because of its racial policies were called anti-white.

 

Those who advocate a “one-state solution” to the conflict between Israel and the Palestinians are sometimes accused by conservatives of being antisemitic, with the argument that this one state will inevitably eventually have a majority of Muslims. The Washington Examiner calls this equivalent to the “gradual genocide of the Jewish people”.

 

The absurdity of equating anti-Zionism with antisemitism is personified by the denunciations of Zionism and the existence of Israel by the Orthodox Satmar, one of the largest Hasidic groups in the world.

 

On the other side, the most vociferous American supporters of Prime Minister Netanyahu’s government have been evangelical Christians. Although they claim to be the best friends of Israel, the religious basis of right-wing evangelical Christianity is the antisemitic assertion that Jews will burn in hell forever, if we do not give up our religion. Robert Jeffress, the pastor of First Baptist Church in Dallas, who spoke at President Trump’s private inaugural prayer service, has frequently said that Jews, and all other non-Christians, will go to hell. The San Antonio televangelist John C. Hagee, who was invited by Trump to give the closing benediction at the opening of the new American Embassy in Jerusalem, has preached that the Holocaust was divine providence, because God sent Hitler to help Jews get to the promised land. Eastern European nationalists, who often employ antisemitic tropes to appeal to voters, are also among the most vociferous supporters of Netanyahu and Israel.

 

Political calculations have muddied our understanding of antisemitism. Supporters of the most right-wing Israeli policies include many people who don’t like Jews. Hatreds which belonged together in the days of the KKK may now be separated among right-wing white supremacists.

 

But no matter what they say, purveyors of racial prejudice and defenders of white privilege are in fact enemies of the long-term interests of Jews all over the world, who can only find a safe haven in democratic equality.

Mike Pence Says the US Has Been "A Force For Good in the Middle East" for "nearly 200 years"; Here's How Historians Responded

Allen Mikaelian is a DC-based editor and writer. He received his history PhD from American University and served as editor of the American Historical Association’s magazine, Perspectives on History. The Political Uses of the Past Project collects and checks statements by elected and appointed officials. This is the first installment of what will hopefully become a regular feature of the project. Read more about the project here. Contact the editor of the project here.

Vice President Pence: "For nearly 200 years, stretching back to our Treaty of Amity and Commerce with Oman, the United States has been a force for good in the Middle East"

For nearly 200 years, stretching back to our Treaty of Amity and Commerce with Oman, the United States has been a force for good in the Middle East. Previous administrations in my country too often underestimated the danger that radical Islamic terrorism posed to the American people, our homeland, our allies, and our partners. Their inaction saw the terrorist attacks from the U.S.S. Cole; to September 11th; to the expansion of ISIS across Syria and Iraq — reaching all the way to the suburbs of Baghdad. But as the world has witnessed over the past two years, under President Trump, those days are over. —Vice President Michael Pence, Remarks, Warsaw Ministerial Working Luncheon, February 14, 2019

Historians say...

Eight historians responded to our request for comment; their full statements and recommended sources are on the Political Uses of the Past page.

The vice president starts with the 1833 treaty with Oman, and so shall we, even though it’s an odd place to start. As Will Hanley of Florida State University noted in his reaction to Pence’s claim, the treaty itself is a piece of routine boilerplate, not so different “from dozens of other 1830s agreements between Middle East authorities and representatives of American and European states.” But there was at least one innovation, as Hanley explains: “The Sultan of Muscat inserted a clause saying that he, rather than the US, would cover the costs of lodging distressed American sailors. A more accurate statement [by Pence] on this evidence would be ‘For nearly 200 years, stretching back to our Treaty of Amity and Commerce with Oman, representatives of the United States have pursued standardized agreements in the Middle East and enjoyed meals that we haven't paid for.’”

Vice President Pence made this broad statement at a ministerial meeting on terrorism, but his mind was primarily on Iran. His intent was to draw a contrast between the United States and Iran, with the former being a “force for good” in the region and the latter being a perpetrator of continual violence. But by going back to 1833 to reference a routine and fairly boring trade agreement with a minor kingdom, he appears to be grasping at straws.

If Pence was looking for good done by the United States in the Middle East, he could have asked some of the historians who reacted to his statement. He may have learned from Joel Beinin how “American missionaries established some of the leading universities in the Middle East: The American University of Beirut, The American University in Cairo and Robert College in Istanbul. The Medical School of AUB is among the best in the region.” He may have been interested to hear from Indira Falk Gesink that "after World War I, most of those polled in the regions surrounding Syria wanted the US as their mandatory power (if they wanted any)." He may have learned from Lior Sternfeld how the United States has sponsored “schools, universities, and orphanages” and took a stand against its European allies and Israel during the Suez Crisis of 1956.

But if he had asked and had learned about these efforts, he would also have learned from Professor Beinin that many of the missionaries who established these schools went to work for the CIA in the postwar period, “so even the very best thing that Americans have done in the Middle East since the early 19th century was corrupted by government efforts to exert power over the region in order to control its oil.” And Pence would have also had to hear Professor Sternfeld tell about the 1953 coup in Iran that cemented a brutal regime in place for the next quarter-century and how, as described by Professor Gesink, "from that point on, US actions in the Middle East were guided by demand for oil and anti-Communist containment." Finally, he would have had to hear about how much that 1953 coup has to do with our relations with Iran now.

Historians who replied to our request for comment could not find much “force for good” in the historical record. Instead, they find “death, displacement, and destruction” (Ziad Abu-Rish), support for “the most ruthless and brutal dictators at every turn” and the “most fanatical and chauvinistic nationalist and religious forces at every turn” (Mark Le Vine), “intense and destructive interventions … characterized by public deception, confusion, and mixed motives” (Michael Provence), "a moral compromise with authoritarianism"  (Indira Falk Gesink), and actions that have “contributed to breakdowns in security, widespread violence, and humanitarian disaster” (Dale Stahl).

Homage to the Shah after coup d'état, 5 September 1953, The Guardian - Unseen images of the 1953 Iran coup.

Three historians below recommend The Coup: 1953, The CIA, and The Roots of Modern U.S.-Iranian Relations by Ervand Abrahamian, and this book is incredibly pertinent today. Previous historical accounts and justifications by 1950s policymakers made the coup all about Mosaddegh’s unwillingness to compromise or said it was all about winning the Cold War. Abrahamian instead shows that it was about oil, or, more specifically, “the repercussions that oil nationalization could have on such faraway places as Indonesia and South America, not to mention the rest of the Persian Gulf.” And for this, Iran and the Middle East got, courtesy of the United States, the brutal Mohammad Reza Shah. The shah crushed the democratic opposition, filling his jails with thousands of political prisoners, and left “a gaping political vacuum—one filled eventually by the Islamic movement.” And so here we are.

Mike Pence’s incredibly blinkered statement can be viewed as an extreme counterpoint to the right-wing view of Obama’s Cairo speech, in which the president mildly acknowledged that the US had not always been on the side of right in the Middle East, and that its history of actions has come back to haunt us all. Such things, it seems, must not be spoken in the muscular Trump administration, even if it means abandoning an understanding that might actually be useful. “For me as an historian,” Mark Le Vine notes below, “perhaps the worst part the history of US foreign policy in the region is precisely that scholars have for so long done everything possible to inform politicians, the media and the public about the realities there. Largely to no avail.” Indeed, Mike Pence here appears intent on utterly blocking out history and historical thinking, even as he dreams of a long and glorious past.

Browse and download sources recommended by the historians below from our Zotero library, or try our in-browser library.

 

Ziad Abu-Rish, Assistant Professor of History at Ohio University

I'm only going to tackle the "force for good" claim, without getting into the claims about Trump compared to his predecessors or the notion of "radical Islamic terrorism." Let's give Vice President Pence a chance at being correct... Read more

Joel Beinin, Donald J. McLachlan Professor of History and Professor of Middle East History, Emeritus, Stanford University

American missionaries established some of the leading universities in the Middle East: The American University of Beirut, The American University in Cairo and Robert College in Istanbul. The Medical School of AUB is among the best in the region... Read more

Indira Falk Gesink, Baldwin Wallace University

I think this is a much more complicated question than is generally acknowledged. On the one hand, some American private citizens have had long-lasting positive impact—for example the founding of educational institutions such as Robert College, the American University in Beirut (originally the Syrian Protestant College), and the American University in Cairo. At that time, the US generally was viewed positively in the region. ... Read more

Will Hanley, Florida State University

It's not possible to use historical evidence to support a black-and-white statement like "The United States has been a force for good in the Middle East." Even if it were possible, the slim 1833 treaty between the US and the Sultan of Muscat is meager evidence. ... Read more

Mark Andrew Le Vine, Professor of Modern Middle Eastern History, UC Irvine

This statement is ridiculous even by the standards of the Trump administration. The US has been among the most damaging forces in the Middle East for the last three quarters of a century. ... Read more

Michael Provence, Professor of Modern Middle Eastern History, University of California, San Diego

The United States had no role in the Middle East before 1945, apart from private business and educational initiatives. Within a couple years of 1945, the US tilted toward Israel in its first war, began overthrowing democratic Middle Eastern governments, and propping up pliant dictators. ... Read more

Dale Stahl, Assistant Professor of History, University of Colorado Denver

I see this statement as "more or less false" because there are clear examples where the United States has not had a positive influence in the Middle East. One needn't reflect very far back into that "nearly 200 years" of history to know that this is so. ... Read more

Lior Sternfeld, Penn State University

While the US had some moments where it was a force for good, with projects like schools, universities, and orphanages, it was also a source for instability in cases like the 1953 coup against Mosaddegh that overturned the course not just of Iran but of the region in its entirety. Read more

 

Roundup Top 10!  

The New Zealand Shooting and the Great-Man Theory of Misery

by Jelani Cobb

Most of the men who committed these recent acts of terror composed manifestos. A sense of history turning on the fulcrum of a single man’s actions is a theme within them.

 

Nazis Have Always Been Trolls

by Adam Serwer

Historically, they rely on murderous insincerity and the unwillingness of liberal societies to see them for what they are.

 

 

The first time the U.S. considered drafting women — 75 years ago

by Pamela D. Toler

As legislative debate about drafting women in 1945 shows, if the military need is great enough, women will be drafted no matter how uncomfortable lawmakers are with the prospect.

 

 

Poor criminal defendants need better legal counsel to achieve a just society

by Connie Hassett-Walker

Why we must fulfill the promise of a famous Supreme Court decision to truly achieve criminal justice reform.

 

 

Native children benefit from knowing their heritage. Why attack a system that helps them?

by Bob Ferguson and Fawn Sharp

For 40 years, the Indian Child Welfare Act has protected the best interests of Native children and helped preserve the integrity of tribal nations across the United States.

 

 

The Story of the Dionne Quintuplets Is a Cautionary Tale for the Age of ‘Kidfluencers’

by Shelley Wood

The pitfalls and payoffs of advertising directly to children have consumed psychologists, pediatricians, marketers and anxious parents for the better part of a century.

 

 

Citizenship in the Age of Trump

by Karen J. Greenberg

Death By a Thousand Cuts

 

 

When bad actors twist history, historians take to Twitter. That’s a good thing.

by Waitman Wade Beorn

Engaging with the public isn’t pedantry; it’s direct engagement.

 

 

Americans don’t believe in meritocracy — they believe in fake-it-ocracy

by Niall Ferguson

This illegal “side door” into college came into existence because the back door of a fat donation — like the $2.5 million paid by Jared Kushner’s father to Harvard — isn’t 100 percent reliable.


 

Who’s the snowflake? We tenured professors, that’s who

by Anita Bernstein

Our freedom to say what we want is not only tolerated but celebrated.

Andy Warhol: A Lot More than Soup Cans

A month ago, I watched a television program that covered, briefly, the art of pop icon Andy Warhol, he of all the Campbell’s Soup cans. The narrator said that Warhol had passed into history and that young people today probably had no idea who he was.

I was startled. Young people did not know who the thin man with the white hair was, the man who hung out with Liz Taylor, Liza Minnelli, dress designer Halston and the Jaggers? The man who painted the famous Mao portrait? Truman Capote’s buddy?

I’m a professor, so the next day I asked my classes, 25 students in each, if they knew who Andy Warhol was. I didn’t say artist or painter Andy Warhol, just Andy Warhol.

The hands shot into the air. About 95% of them knew who he was.

Andy Warhol will never pass from the scene. That is proven, conclusively, in the largest exhibit of his work in generations at the Whitney Museum, in New York, Andy Warhol – From A to B and Back Again. It is a marvelous and exciting tribute to his work and is attracting huge crowds.

The crowds are not art aficionados from the 1960s, either, but young women with baby carriages, high school student groups, young couples and foreign tourists. Warhol was an international celebrity and a celebrity superstar in addition to being a memorable artist, and, these crowds indicate, always will be remembered.

“Modern art history is full of trailblazers whose impact dims over time,” said Curator Scott Rothkopf. “But Warhol is that extremely rare case of an artist whose legacy grows only more potent and lasting. His inescapable example continues to inspire, awe and even vex new generations of artists and audiences with each passing year.”

Another curator, Donna De Salvo, said the originally avant-garde Warhol has become part of mainstream art. “Warhol produced images that are now so familiar that it’s easy to forget just how unsettling and even shocking they were when they debuted,” she said.

Warhol really became famous not so much because of his new age art, but because of his celebrity. He was friends with many of the biggest entertainment stars in the world, was a fixture at the legendary New York nightclub Studio 54, palled around with fashion designer Halston, drank wine with Liza Minnelli and lunched with Liz Taylor. He was almost murdered in 1968 when an irate actress from his film studio, the Factory, shot him several times. The shooting made front page news all over the world. He was a central character in the movie Factory Girl, about Edie Sedgwick, one of his Factory actresses.

Everybody recognized him instantly since he wore those thick glasses and had that mop top of dyed white hair. That fame was why people paid so much attention to his often-bizarre work. Some said that the quiet boy from Pittsburgh, who fell in love with Shirley Temple as a kid, created a unique persona for himself that worked well.

The Warhol exhibit, a real achievement in cultural history, occupies all of the fifth floor at the Whitney plus additional galleries on the first and third floors. The best way to start is on the first floor, in the gallery of his oversized portraits. They are mounted in long rows across the walls of the room and they introduce you to Andy the celebrity and Andy the artist at the same time. The portraits also tell you a lot about show business and art history in the 1960s and ‘70s. There are lots of famous people on the walls here, like Liza Minnelli, Dennis Hopper, soccer star Pele, socialite Jane Holzer and Halston, but lots of people you never heard of, too. 

The third floor houses wall after wall of his famous “Cow Wallpaper,” adorned with hundreds of similar heads of a brown cow. It is eye-opening and hilarious. 

Another room has a stack of his popular blue and white Brillo pad boxes and a wall full of S & H Green Stamps (remember them?)

There are his paintings of magazine covers and lots of newspaper front pages (an eerie one about a 1962 Air France plane crash).

You learn a lot about his personal life. As an example, as a young man he became a fan of Truman Capote, the author of Breakfast at Tiffany’s, and called him every single day. 

There are drawings of celebrities’ shoes to show how they represented their personalities. Christine Jorgensen was one of the first modern openly transgender women, so she has shoes that don’t match each other.

Unknown to most, he loved to do paintings of comic strip characters. Two in the exhibit, of Superman and Dick Tracy, in blazing bright colors, were displayed in a New York City department store window. 

What makes the exhibit so enjoyable at the Whitney Museum, recently opened on Gansevoort Street near the Hudson River, is the way the curators use its space. Unlike most museum exhibits, where everything is scrunched together, the curators used the large, high-ceilinged rooms wisely, putting the 350 Warhol pieces, especially the very large ones (some are thirty feet wide), alone on the pristine white walls so they jump off the wall at you. You go around one corner and there is Elvis Presley as a gunslinger in four separate portraits firing one of his six-guns. Next to him is Marlon Brando in a leather jacket and on his motorcycle.

There are weird walls of photos, such as the most-wanted criminals he drew from photos in a New York State booklet, “13 Most Wanted Men.” There is a series of his copies of Leonardo da Vinci’s Mona Lisa and then his copy of his copy.

He did inspirational photos and silkscreens. A woman named Ethel Scull arrived at his studio one day for what she thought would be a traditional portrait. Instead, Warhol took her to Times Square and had her sit for dozens of photos in the cheapie photo booths there, where all the going-steady high school kids went. The result – a sensational wall of photos of her in different giddy and seductive poses. Brilliant.

There are photos of Jackie Kennedy Onassis. One set shows her smiling on the morning before her husband’s murder and then, in the next strip, somber at the President’s funeral.  There is a wall full of his famous photo of Marilyn Monroe. There is a world famous, mammoth, and I mean mammoth, portrait of China’s Chairman Mao. One wall is filled with his fabled Campbell’s soup can paintings and another with his Coca Cola works.

Sprinkled among all of these paintings are real life photos and videos of Warhol at work.

There is a large television set on the third floor on which you see a truly bizarre video of Warhol simply eating a cheeseburger for lunch (he’s going to get sick eating so fast!).

Warhol was also a well-known avant-garde filmmaker and the museum is presenting dozens of his 16mm movies in a film festival in its third-floor theater. Some of these star the famous Edie Sedgwick, who appeared in many of his films and died tragically of a drug overdose.

Andy Warhol, who died at the age of 58 during a minor operation, led a simple middle-class existence until he arrived in New York. He was born in 1928 in Pittsburgh, graduated from Carnegie Mellon University there and then went to New York, where he became well-known. He began his career as a commercial artist, earning money for drawings for magazine ads (dozens of them are in the show).

He became famous for his portraits of Campbell’s Soup cans. He painted them because as a kid his family was so poor that he and his brothers had Campbell’s Soup for lunch every day. Warhol said he had Campbell’s soup for lunch every day for 20 years. He also saw the soup can as a window into America. He was right.

The exhibit is a large open window on American history and culture in the 1960s and ‘70s and how the outlandish Warhol starred in it and, with his genius, changed it.

Andy Warhol not remembered? Hardly.

The exhibit runs through March 31.

 

Parents, College, Money, and the American Dream

The University of Southern California, one of the schools mixed up in the college admissions scandal. 

 

The front-page news about the college admissions bribery arrests has people talking about social class, fairness, status anxiety, helicopter parenting, and whether an expensive education can translate into a lifetime of wealth and happiness.  None of this is new.  In writing about the history of babies in the 20th century United States, I discovered early 20th century baby books distributed by banks and insurance companies prompting parents to save for college. At a time when less than 20 percent of Americans completed high school and far fewer went on to higher education, financial institutions told parents to start saving for college.  

 

Insurance companies, banks, and savings and loan firms enticed customers by encouraging parental hopes and dreams.  Just as manufacturers of disinfectants, patent remedies, and infant foods turned to baby books to advertise the products parents could buy to keep babies healthy, financial firms sold their services as ways for making babies wealthy and wise--in the future. For all kinds of companies, playing to parental anxieties and aspirations became the means of expanding their clientele. 

 

Consider this example. In 1915 an Equitable Life Assurance baby book advertisement in the Book of Baby Mine began, "Say, Dad, what about my college education?" At the time, high school graduation rates hovered below 13 percent and college attendance and graduation was much lower. Nevertheless, parents looked to the future with great hopes for their offspring. In 1919 the United States Children's Bureau conducted an investigation of infant mortality in the city of Brockton, Massachusetts. An immigrant Italian mother interviewed for the study reported she was saving to send all four of her young children to college. Clearly, in reaching out with a save-for-college message, financial firms were capitalizing on a common but mostly unrealized dream and helping to reinforce the message that college was a pathway to success. 

 

Banks promoted thrift by reaching out to customers via motion pictures, newspaper advertisements, and programs in schools collecting small deposits from children. Competition for savers grew as the number of banks doubled between 1910 and 1913. Accounts for babies soon became part of banks' advertising strategy. Savings and loans and banks gave away baby books with perforated deposit slips, slots for coins, or simply included pages for listing deposits into the baby's bank account. The 1926 Baby's Bank and Record Book even included a section on college savings estimating a future scholar would need $1000--a figure it derived, the advertisement explained, from the University of Pennsylvania catalog. In addition to citing this source, the ad included a helpful chart showing that saving $1 a week would, with compounding interest, yield $1065.72 in fifteen years.  

 

 

The Great Depression wiped out many of the banks and small insurance companies holding the savings of infants, children, and adults, thus erasing the hopes of many who had dreamed their child would obtain a college education. However, as children withdrew from the workforce because of the lack of job opportunities and New Deal laws limited their employment, high school completion rates grew to 40 percent by 1935. As scholars have pointed out, G.I. benefits after World War II (the Servicemen’s Readjustment Act) and the National Defense Education Act of 1958 both led to big increases in college attendance thanks to the financial support they provided. 

 

What changed in the wake of enhanced federal financial support was not the desire for one's children to acquire more education, but the numbers of young people able to go to college. A quick look through baby books from the first half of the twentieth century shows the "go to college" message being sent and received well before government dollars came in to the picture. Banks and insurance companies knew what customers dreamed of for their offspring and they made it the centerpiece of some of their advertising. Today, the vast majority of students and families still save up and borrow to afford higher education. And, of course, financial firms still promote themselves as critical resources for fulfilling this dream. What just might surprise us is how, for over a century, banks and insurance companies have been delivering this message, aware of what parents thought about when they gazed at their new babies and thought about their futures.

The Antigay "Traditional Plan" and the History of the Methodist Church

 

Recently in St. Louis, the United Methodist General Conference strengthened its ban on same-sex marriage, doubled down on its prohibition against ordaining gay clergy, and sent an unmistakable message to its LGBTQ brethren. Advocates of this so-called “Traditional Plan” believe that Biblical teachings against homosexuality are unambiguous, the teachings of the Bible are eternal and unchanging, and that the church should not bow to political pressure when evaluating their stance on major issues. Advocates of LGBTQ inclusion look to other Biblical precedents, insisting that the Sermon on the Mount and Christ’s injunction to love one’s neighbor as oneself are the only moral guideposts they need when considering same-sex marriage, gay ordinations, and the place of LGBTQ members within the life of the church. Both sides are equally sure that God is on their side. 

 

Although conservatives in the church claim to be protecting the Bible from what they see as a dangerous faction hellbent on watering down Christian teachings, the Traditional Plan was not about defending the entire breadth of what the scriptures have to say regarding marriage and relationships, which would have included commentary on a range of practices. Proponents of the measure rejected proposals to add language condemning adultery, divorce, and polygamy into the resolution and focused entirely on homosexuality. 

 

At this moment, the future of the United Methodist Church is unclear, but feelings are raw and schism is likely. The St. Louis Conference was not routine, but it is not without historical precedent. It is ironic that the church’s antigay coalition described its plan as “traditional,” because public schisms over social and political issues are deeply woven into the fabric of Methodist history. Whether it was pressure from Prohibitionists which ultimately convinced Methodists to ban fermented wine during communion or democratic rhetoric in the Jacksonian era which coincided with a schismatic wave of anticlericalism, Methodists have never shied away from pressing issues throughout their long history. 

 

But until this month, slavery was the only other issue that seriously threatened the unity of American Methodism. Just as Methodists today look to the Bible for guidance on LGBTQ inclusion, their nineteenth century forebears turned to scripture when considering the morality of slavery. For instance, southern Methodists pointed to Romans 13:1-2, in which Paul instructs, “Let every soul be subject unto the higher powers. For there is no power but of God: the powers that be are ordained of God. Whosoever therefore resisteth the power, resisteth the ordinance of God: and they that resist shall receive to themselves damnation.” This convinced them that their slave-based hierarchy was divinely ordained and that it was sin to resist the clearly-evident will of God. Furthermore, proslavery theologians argued that if slavery was as grave a sin as their adversaries in the North claimed, there would be some kind of explicit Biblical injunction against the practice. But there are no such teachings in the Bible and slavery was omnipresent in both the Old and New Testaments. For them, the truth of the Bible never changed, the institution of slavery had always existed, and would continue in perpetuity. 

 

Opponents of slavery looked to the Sermon on the Mount and the Golden Rule when arguing that it was a sin for Christians to degrade fellow humans as chattel. Like today, both sides were equally sure that God was on their side. 

 

Tension reached a boiling point at the General Conference of 1844 in New York City. According to church law, bishops could not own slaves, but Bishop James O. Andrew of Georgia had inherited slaves through his wife. Antislavery forces wanted to suspend him from the ministry for violating church law. Southern delegates passionately defended Andrew and the institution of slavery while northerners argued that it was dangerous to vest religious authority in somebody who willingly participated in such a corrupt system. 

 

When neither side showed a willingness to back down, the church split into separate denominations. The new southern branch of Methodism rested on a cornerstone of slavery and white supremacy. The schism opened the door to legal wrangling over church property that embittered both sides for generations and which ultimately had nothing to do with morality and everything to do with wealth and power. The churches reunited in 1939, after a ninety-five-year estrangement.

 

Today, it might seem shocking that Christians would look to the Bible to defend slavery, but advocates were defending more than the institution; they genuinely felt that they were protecting the integrity of the Bible against an attempt by radicals in their own church to dilute the scriptures. This month, proponents of the Traditional Plan tapped into a Methodist tradition that should give them serious pause. Meanwhile, legal challenges over property might again become the ultimate arena for power plays masquerading as morality. 

 

The historical lessons of 1844 are abundantly clear. Methodists have always been committed to faith, prayer, charity, and walking with God, but they can still turn their backs on their brethren and struggle to live up to the highest ideals of Christianity and even their own motto, “open hearts, open minds, and open doors.” 

The Death of Appeasement: the 80th Anniversary of the Invasion of Prague

Hitler entering Prague Castle, 15 March, 1939.

 

A turning point in the history of international relations is an event that significantly alters the existing course of international relations and has a long-lasting, considerable effect on it. A turning point may not necessarily be the trigger of a significant change in international relations, but rather part of the underlying cause leading to it.

The turning point in the history of international relations in the 1930s occurred in 1939. However, it was not the outbreak of World War II in September of that year, but rather the invasion by German troops of what remained of Czechoslovakia in March 1939, following the Munich Agreement of 1938, that represented that turning point. 

It was a significant landmark as it showed even the most enthusiastic supporters of the appeasement policy towards Nazi Germany that German ambitions went far beyond the supposed rectification of the wrongs done to Germany by the Versailles settlement of 1919, following World War I (or the Great War, as it was then known).

The appeasement policy pursued by Britain and France was founded on the premise that Germany was maltreated by the victors of World War I, and that German grievances had some justification and could be accommodated in order to prevent the outbreak of a major European war.

Employing the rhetoric of the parliamentary democracies, Adolf Hitler and the Nazi regime argued that the German people had the right to collective self-determination. Consequently, the prohibition included in the Versailles settlement according to which Austria should remain a separate state and not be allowed to be part of a larger German state was deemed to be unjust. After all, shouldn’t the Austrian people ‘freely’ decide whether they wish to live in a separate sovereign entity or be incorporated into Germany?

By the same token, and following the same logic, Germany argued that the German inhabitants of the Sudeten region in Czechoslovakia should have the right of self-determination. The fact that the area concerned was an integral part of the sovereign territory of Czechoslovakia (incidentally the only truly parliamentary democracy in Central and Eastern Europe) only enhanced the case put forward by Germany. The German population in the Sudeten region was being treated harshly by the government in Prague, claimed Hitler and the Nazi propaganda machine. 

Thus, the case for national self-determination was no less valid as regards the German minority in Czechoslovakia than it was in the case of Austria. The more Germany accused Czechoslovakia of supposedly inhuman treatment of its German inhabitants, the more Britain and France feared the prospect of a general European war. Why risk such a war if applying the right of collective self-determination could actually avert it? 

Of course, Britain’s Prime Minister, Neville Chamberlain, thought of that conflict as “a quarrel in a far-away country between people of whom we know nothing.” He didn’t seem to be particularly concerned with the fate of the German minority in Czechoslovakia or, indeed, with their right of self-determination. His policy, though, was based on the assumption that German grievances in this matter were related to the supposed wrongs inflicted on Germany and the German people following World War I.

To be sure, the real issue was not the quarrel to which Chamberlain referred to, but rather the ambitions of the Nazi regime. After all, even Chamberlain himself asked Hitler whether he had any further territorial demands, beyond the Sudeten region; to which the German leader replied in the negative. Indeed, Hitler stressed cynically, albeit in a rhetorical way, that he was not interested in adding Czechs and Slovaks to the German Reich. 

Once German troops entered Prague in March 1939, the whole conceptual edifice justifying German demands collapsed. There was no way anyone could logically justify the German move by resorting to the supposed evils imposed upon Germany by the Versailles settlement. The cherished principle of collective self-determination could apply now in reverse: it was the Czech people who were denied their right of self-determination. 

In a sense, it could be argued that the day German forces occupied Prague was the day that the anti-appeasers in Britain, led by Winston Churchill, turned from a cornered minority into a solid majority. It was also the watershed that altered once and for all the character of international relations in Europe: the time for pretense was over. The sense of remorse over the post-World War I settlement and the widely-acceptable principle of national self-determination could no longer justify any policy aimed at preventing war with Nazi Germany. 

March 1939 was the real turning point. September 1939 was to be the climax.

Why is the Mona Lisa not the Mona Lisa?

 

One of the great riddles of modern times is why a 500-year-old portrait of a Florentine housewife, of no rank or title, is today the most famous painting in the world. But the mystery around that portrait has now deepened. New findings suggest that Leonardo da Vinci’s most celebrated work may not depict M(ad)onna Lisa, wife of the silk merchant Francesco del Giocondo, at all. For example, art experts have long believed that the painting dates from Leonardo’s late period, given its heavy sfumato or dark, “smoky” treatment. But a handwritten note, discovered in the library of the University of Heidelberg, has revealed that Leonardo painted the work as early as 1503, at least a decade before Leonardo’s late period. What’s more, a 16th century biographer, Giorgio Vasari, claims that Leonardo worked on it for only four years and then left it unfinished. That’s a bigger problem, because if anything, the Mona Lisa in the Louvre is as polished as any Leonardo work would ever get. And to top it off, Leonardo himself once told a visitor in 1517 that the portrait was made for Giuliano de’ Medici, brother of Pope Leo X, under whose patronage Leonardo lived in Rome between 1513 and 1516.

 

Why is this important? The simple answer is: we believe these sources don’t agree because they are not talking about the same painting. In other words, there must be two versions of the Mona Lisa, both painted by da Vinci. Would this be a radical theory? Not at all. We know, for example, that Leonardo painted two versions of The Virgin of the Rocks, now in the Louvre and the National Gallery in London, just as he painted two versions of The Virgin of the Yarnwinder, now in private collections. In many of these works, Leonardo allowed collaborators to “fill in” the less important parts, such as the background and the clothing, while he concentrated on the critical stuff: the faces and hands. This process served two purposes: it allowed Leonardo’s assistants to learn the master’s technique, and it increased the number of works that could be sold. Leonardo worked very slowly and only produced 18 paintings that we know of, which is a problem when you have a large workshop and many mouths to feed.

 

 

But if Leonardo painted two Mona Lisas, where is the other one? Diligent research into a number of possible candidates, notably by Salvatore Lorusso and Andrea Natali at the University of Bologna, has produced a winner: a portrait known as the Isleworth Mona Lisa, long believed to be a copy. Close inspection reveals that unlike all other 16th-century versions of the Mona Lisa, the Isleworth is not a copy at all. In key respects, including its size, its composition, its landscape, and its support (it was painted on canvas, rather than wood), it is strikingly different from the Louvre Mona Lisa. What’s more, it depicts Lisa del Giocondo as a beautiful young woman. That makes sense, because in 1503 Lisa was just 24 years old, quite unlike the matronly figure who stares at us from behind her bulletproof glass at the Louvre. And here is the clincher: the Isleworth is unfinished, just as Vasari wrote. Most of the background never progressed beyond the gesso, the underpaint.

 

So why has no one else looked more closely at the Isleworth version? As we argue in our book, the answer is that for much of the past 500 years, she was not on public view. First acquired by an 18th-century English nobleman, the work was discovered in 1913 by a British art dealer, Hugh Blaker, during a visit to a stately home in Somerset. The Isleworth then spent much of the 20th century in vaults, in part to protect her from two world wars. In the 1960s, she was purchased by a collector, Henry Pulitzer, who soon thereafter locked her up in another vault, where she was discovered in 2012. And now that’s going to change, because there are plans to put her on exhibit in the land of her birth later this year—as part of the commemoration of the 500th anniversary of Leonardo’s death in 1519. That’s when all the world will be able to see her in all her glory—as the true likeness of a lovely young mother who would remain on Leonardo’s mind for the rest of his life.

 

That brings us to the final question: if the Isleworth Mona Lisa is indeed the earlier version, then who is the older woman in the Louvre? Is she a later version of Lisa del Giocondo, painted in middle age? But why would Giuliano de’ Medici, a powerful aristocrat who took several mistresses, want a likeness of the wife of a silk merchant? Could the Louvre Mona Lisa depict one of his paramours instead? A number of candidates come to mind, but the solution that seems obvious to us is also the simplest: the Louvre Mona Lisa is no longer the likeness of a particular person. While Leonardo used the same “cartoon,” the preparatory drawing, from the earlier version, he was no longer interested in painting an actual portrait. Instead, he wanted to create an idealized image of motherhood and the mysterious forces of creation, as evidenced by the primordial landscape in the background. Remember, Leonardo was taken away from his mother, Caterina, when he was only five years old. That’s why Leonardo was so struck by Lisa, because in 1503 she was close in age to his last memory of his mother. And that’s why, for the Louvre painting, Leonardo chose Lisa again—not to paint her likeness, but to capture the eternal grace and mystery of womanhood.

 

Don't miss the TV special based on the author's research: The Search for the Mona Lisa, narrated by Morgan Freeman and produced by Pantheon Studios. Timed to coincide with the 500th anniversary of Leonardo’s death on May 2, 2019, the film shows that the portrait has become the center of a swirling controversy. If she is not the Mona Lisa, then who is she? Why did Leonardo da Vinci paint her? Using newly discovered evidence, and featuring Italian star Alexandro Demcenko as Leonardo, the film is a thriller-like pursuit of the real identity of the most famous portrait in the world.

London at War, 1941, and a New Alice in Wonderland

LONDON, ENGLAND: The German Luftwaffe staged yet another bombing raid on London last night. Prime Minister Winston Churchill said the German Air Force, sneaking through the Royal Air Force defenses in the sky, dropped more than 1,100 bombs on the southeastern area of London. 122 people were killed and 42 injured in the raid, yet another strike on England by Hitler’s Germany.

Below ground, hiding with tens of thousands of others in London’s underground train stations, was Alice Spencer, a teenager whose boyfriend, Alfred, is slowly dying of tuberculosis on a nearby cot. She cares for him in the crowded station while reading her favorite book, Lewis Carroll’s classic Alice’s Adventures in Wonderland. There is a series of explosions and the lights go dim. Suddenly, without warning, Alice, looking for an escape from the grimness of the scene and the poor health of her love, tumbles down an underground rabbit hole, pulling some of the other people in the train station with her. They all find themselves in Wonderland, populated by the Queen of Hearts, Mr. Caterpillar, the White Rabbit and others, a thoroughly zany place to be while a World War rages above them.

This is the very unusual start of Alice By Heart, a new musical with book by Steven Sater and Jessie Nelson and music and lyrics by Duncan Sheik and Sater, that just opened at the new MCC Theater at 511 W. 52nd Street in New York. It starts slowly and it takes a while to figure out who is who and what the authors are trying to do with the 154-year-old Carroll classic story of the girl who chased a rabbit and fell head first down a hole into Wonderland. About twenty minutes into the musical, though, everything starts to click. What follows is a highly amusing, thoroughly enjoyable and at times brilliant show, a unique and highly creative re-invention of Carroll’s timeless story.

In the play, the British Alice goes through many of the same adventures as the Alice in the book, cavorting with the Caterpillar, chasing the White Rabbit and shaking her head at the antics of the Mad Hatter. She is put on trial by the Queen of Hearts, a delightful character who must have the loudest and scariest screech in the world. She could wake up people in Brazil with that screech. At the end of the Wonderland trip, she returns to the shelter to try to save her boyfriend.

What makes Alice by Heart succeed is the stellar skill of its ensemble cast. Each person not only acts out his or her role, but pays careful attention to the movements of the others. The two stars of the show are Molly Gordon as Alice and Colton Ryan as Alfred. The fine cast also includes Kim Blanck, Noah Galvin, Grace McLean, Catherine Ricafort, Heath Saunders, and Wesley Taylor, among others.

 Jessie Nelson does a fine job of directing the play. The choreography, by Rick and Jeff Kuperman, is very impressive. 

The musical has some wonderful special effects. As everybody knows, the Queen is always asking for people to decapitate Alice (“Off with her head!”). They do that in the show. As Alice sings, you see her shadow on a large white sheet. After a few seconds her shadow is rather neatly beheaded. In another scene one actress douses Alice with a magical dust that slowly hovers in the air. People in gas masks run amuck. At the end of the play, there is a wonderful musical number about the Mock Turtle in which soldiers sing and dance in a wondrous mass. There are people popping out from underneath a woman’s skirt, a caterpillar who grows in size as people jump on and off of him. The magnificently dressed Mad Hatter leaps onto and off of a table.

The musical does have its flaws. It has a slow and dreary start. The music is OK, but after a while some songs sound just like one another. There is also too much bouncing back and forth between the underground station and Wonderland.

The writers should also have included more history on the “Blitz” bombing of London by the Luftwaffe during the war. My father was a GI in London in WW II and he told me really scary stories about hiding out in the underground during the bombings. The playwrights should have given audiences a better picture of that. The British use of their subways as bomb shelters was fascinating. During the height of the bombings the Luftwaffe struck just about every night of the week. The subways held roughly 150,000 men, women and children, who got on line at 4 p.m. to secure a spot in the underground train shelters. The shelters were run by the Red Cross, the Salvation Army and other charitable groups.  Concerts, films, plays and books donated by local libraries were used as entertainment for the residents.

The shelters were not completely safe. Direct bomb hits wiped out some, killing hundreds. 173 people were trampled to death in a panic that followed a woman’s fall down the stairs in the Bethnal Green subway station.

Lewis Carroll’s two books, Alice’s Adventures in Wonderland (1865) and Through the Looking Glass (1871), generated forty movies and television shows and, of course, the rock and roll hit “White Rabbit,” performed by Grace Slick and the Jefferson Airplane in the late 1960s.

Despite these small complaints, Alice by Heart is a smart, nifty show about London in World War II and yet another colorful tale of the Mad Hatter and Wonderland.

Can you really spend a better evening than chasing a white rabbit trotting through the forest with a pocket watch in his hand?

PRODUCTION: The play is produced by the MCC Theater. Set Design: Edward Pierce, Costumes: Paloma Young, Lighting: Bradley King, Sound: Dan Moses Schreier. The play is directed by Jessie Nelson and choreographed by Rick and Jeff Kuperman. It runs through April 7.

Why the U.S. Bombed Auschwitz, But Didn't Save the Jews

Hungarian Jewish women and children after their arrival at Auschwitz.

 

Seventy-five years ago this week—on March 19, 1944—German troops marched into Hungary. The country’s 800,000 Jews, the last major Jewish community to have eluded the raging Holocaust, now lay within Hitler’s grasp.

The railway lines and bridges across which Hungary’s Jews would be deported to the Auschwitz death camp in Poland were within range of American bombers. So were the gas chambers and crematoria in the camp itself. The refusal of the Roosevelt administration to drop bombs on those targets is one of the most well-known and troubling chapters in the history of international responses to the Holocaust.

What few realize today is that U.S. planes actually did bomb part of Auschwitz—but they left the mass-murder machinery, and the railways leading to it, untouched. Why?

The same week that the Germans occupied Hungary, two Jewish prisoners in  Auschwitz were in the final stages of plotting to escape, something that only a tiny handful of inmates had ever accomplished. Their goal was to alert the Free World that the gas chambers of Auschwitz were being readied for the Jews of Hungary. They hoped these revelations would prompt the Allies to intervene.

On April 7, 1944, Rudolf Vrba, 19, and Alfred Wetzler, 25, slipped away from their slave labor battalion and hid in a hollowed-out woodpile near the edge of the camp. On the advice of a Soviet POW, the fugitives sprinkled the area with tobacco and gasoline, which confused the German dogs that were used to search for them.

After three days, Vrba and Wetzler emerged from their hiding place and began an eleven-day, 80-mile trek to neighboring Slovakia. There they met with Jewish leaders and dictated a 30-page report that came to be known as the "Auschwitz Protocols." It included details of the mass-murder process, maps that pinpointed the location of the gas chambers and crematoria, and warnings of the impending slaughter of Hungary's Jews. "One million Hungarian [Jews] are going to die," Vrba told them. "Auschwitz is ready for them. But if you tell them now, they will rebel. They will never go to the ovens." 

 

What FDR Knew, and When

The fate of Hungarian Jewry unfolded before the eyes of the world. Unlike previous phases of the Holocaust, which the Germans partially succeeded in hiding from the international community, what happened in Hungary was no secret.

A common refrain among defenders of President Franklin D. Roosevelt’s response to the Holocaust is the claim that he and his administration learned about the deportations from Hungary too late to do much about it. For example, a recent essay in The Daily Beast, by journalist Jack Schwartz, claimed that “The Allies learned of the Hungarian deportations and their lethal destination in late June”—that is, not until five weeks after the deportations commenced. 

But in fact, Washington knew what was coming. At a March 24, 1944, press conference, FDR, after first discussing Philippine independence, farm machinery shipments, and war crimes in Asia, acknowledged that Hungary’s Jews “are now threatened with annihilation” because the Germans were planning “the deportation of Jews to their death in Poland.” The president blurred the issue by coupling it with a remark about the danger that “Norwegians and French” might be deported “to their death in Germany,” but the key point is clear: If we wonder “what did they know, and when did they know it,” the answer with regard to Hungary is that the Roosevelt administration knew plenty, and knew it early.

The Holocaust in Hungary was widely reported, and often in timely fashion, by the American news media (although it was not given the prominence it deserved). For example, on May 10, nine days before the deportations to Auschwitz began, the New York Times quoted a European diplomat warning that the Germans were preparing “huge gas chambers in which the one million Hungarian Jews are to be exterminated in the same fashion as were the Jews of Poland.” 

Likewise, on May 18, the Times reported that “a program of mass extermination of Jews in Hungary” was underway, with the first 80,000 “sent to murder camps in Poland.” The notion that the Roosevelt administration only learned about all this in “late June” is preposterous.

 

Appeals for Bombing

Meanwhile, copies of the Auschwitz escapees’ report reached rescue activists in Slovakia and Switzerland. Those activists then authored an appeal to the Roosevelt administration to bomb “vital sections of these [railway] lines, especially bridges” between Hungary and Auschwitz, “as the only possible means of slowing down or stopping future deportations.” The plea reached Washington in June.

Numerous similar appeals for bombing the gas chambers, or the rail lines and bridges leading to them, were sent to U.S. officials by American Jewish organizations throughout the spring, summer, and fall of 1944.

Assistant Secretary of War John McCloy was designated to reply to the requests. He wrote that the bombing idea was "impracticable" because it would require "diversion of considerable air support essential to the success of our forces now engaged in decisive operations." He also claimed the War Department's position was based on "a study" of the issue. But no evidence of such a study has ever been found by researchers. 

In reality, McCloy's position was based on the Roosevelt administration’s standing policy that military resources should not be used for "rescuing victims of enemy oppression."

The aforementioned Daily Beast article claimed that the administration’s rejection of the bombing requests “reflected military reality as perceived by a defense establishment with stretched resources trying to meet the diverse demands of an all-encompassing war.”

That’s nonsense. The “military reality” was that at the same time McCloy was saying Auschwitz could not be bombed, Auschwitz was being bombed. Not the part of Auschwitz where the gas chambers and crematoria were situated, but rather the part where slave laborers were working in German oil factories.

On August 20, a squadron of 127 U.S. bombers, accompanied by 100 Mustangs piloted by the all-African American unit known as the Tuskegee Airmen, struck the factories, less than five miles from the part of the camp where the mass-murder machinery was located.

 

What Elie Wiesel Saw

Future Nobel Laureate Elie Wiesel, then age 16, was a slave laborer in that section of the huge Auschwitz complex. He was an eyewitness to the August 20 bombing raid. Many years later, in his best-selling book ‘Night’, Wiesel wrote: “If a bomb had fallen on the blocks [the prisoners’ barracks], it alone would have claimed hundreds of victims on the spot. But we were no longer afraid of death; at any rate, not of that death. Every bomb that exploded filled us with joy and gave us new confidence in life. The raid lasted over an hour. If it could only have lasted ten times ten hours!” 

There were additional Allied bombing raids on the Auschwitz oil factories throughout the autumn. American and British planes also flew over Auschwitz in August and September, when they air-dropped supplies to the Polish Home Army forces that were fighting the Germans in Warsaw. They flew that route twenty-two times, yet not once were they given the order to drop a few bombs on the death camp or its transportation routes.

Adding insult to inaccuracy, Jack Schwartz claimed (in The Daily Beast) that “in Palestine, the Jewish Agency [the Jewish community’s self-governing body] overwhelmingly opposed the bombing [of Auschwitz] on the grounds that it would likely take Jewish lives,” and “American Jewish leaders were equally divided over the issue, which led to recriminations during and after the war.”

Wrong, and wrong again. The minutes of Jewish Agency leadership meetings show they opposed bombing for a period of barely two weeks, and even then only because they mistakenly thought Auschwitz was a labor camp. Then they received the Vrba-Wetzler “Auschwitz Protocols,” revealing the true nature of the camp. At that point, Jewish Agency representatives in Washington, London, Cairo, Geneva, Budapest and Jerusalem repeatedly lobbied U.S., British and Soviet officials to bomb Auschwitz and the routes leading to it.

As for American Jewish leaders, a grand total of one of them urged the Allies to use ground troops against Auschwitz instead of air raids. By contrast, pleas in support of bombing were made in Washington by multiple representatives of the World Jewish Congress, Agudath Israel, the Labor Zionists of America, and the Emergency Committee to Save the Jewish People of Europe (the Bergson Group). Calls for bombing also appeared in the columns of a number of American Jewish newspapers and magazines at the time.

 

Motives for Rejection

Now we come to the vexing question of why the Roosevelt administration rejected the bombing requests.

The explanation that the administration gave at the time—that bombing Auschwitz or the railways would require diverting bombers from battle zones—was clearly false, since we know that U.S. bombers did bomb other targets within the Auschwitz complex (the oil factories).

A second argument has been made by some FDR apologists: that bombing was a bad idea because some of the Auschwitz inmates would have been killed. But that does not hold up, either—first, because that was not the reason given for the rejections at the time; and second, because it fails to explain why the administration refused to bomb the railway lines and bridges, which would not have involved any risk to civilians.

So what, then, was the real reason for the administration’s rejection?

In all likelihood, it was the result of several factors. One was old-fashioned antisemitism. The antisemitic sentiments rife among senior officials of the State Department and War Department have been amply documented. What about the White House? Jack Schwartz, in The Daily Beast, mocked any suggestion that President Roosevelt harbored antisemitic feelings, pointing out that he “surrounded himself with Jewish advisers” and “staffed the New Deal…with Jewish activists.” In other words, some of FDR’s best friends were Jewish.

A more informed perspective would consider Roosevelt’s actual statements on the subject. For example, as a member of the Harvard board of governors, he helped impose a quota on admitting Jewish students so they would not be “disproportionate,” as he put it. He called a questionable tax maneuver by the owners of the New York Times in 1937 “a dirty Jewish trick.” He said in 1938 that the behavior of “the Jewish grain dealer and the Jewish shoe dealer” was to blame for antisemitism in Poland. 

FDR continued to make such remarks (behind closed doors) in the 1940s. He complained to his cabinet in 1941 that there were “too many Jews among federal employees in Oregon” (which he had recently visited). In 1942, he used the slur “kikes” in reference to Jewish Communists. At the Casablanca conference in 1943, he said strict limits should be imposed on North African Jews entering professions, in order to “eliminate the specific and understandable complaints which the Germans bore towards the Jews in Germany, namely, that while they represented a small part of the population, over fifty percent of the lawyers, doctors, school teachers, college professors, etc, in Germany, were Jews.” 

Do such statements reflect antisemitism? Or when it comes to assessing antisemitism, should there be one standard for revered former presidents and a different standard for everyone else?

Another factor in the decision not to bomb Auschwitz was a practical consideration: rescuing Jews meant the Allies would be stuck with a lot of Jewish refugees on their hands. At one point during the war, a senior State Department official warned his colleagues that any U.S. action to rescue Jewish refugees was “likely to bring about new pressure for an asylum in the Western hemisphere.” Another official characterized Jewish refugees as a “burden and curse,” and he worried about the “danger” that the Germans “might agree to turn over to the United States and to Great Britain a large number of Jewish refugees.”

This is not to say that antisemitism and the fear of pressure to admit refugees were the decisive factors. More likely they served to buttress or reinforce the main factor, which was the overall mindset in the administration that America had no national interest or moral obligation to pursue humanitarian objectives abroad. 

This attitude was articulated, most notably, in the War Department’s internal decision, in early 1944, that it would not utilize any military resources “for the purpose of rescuing victims of enemy oppression unless such rescues are the direct result of military operations conducted with the objective of defeating the armed forces of the enemy.”

Bombing bridges and railway lines over which both deported Jews and German troops were transported could have qualified as necessary for military purposes. But not when the prevailing attitude in the White House and other government agencies was one of hardheartedness when it came to the Jews, reinforced by antisemitism and nativism. 

History Will Clash With History in the 2020 Election

 

Americans love predicting the future based on historical precedent. “No president has ever been reelected when the unemployment rate was above X percent,” “Missouri has predicted the winner for the last so-and-so many years,” etc. 

As we look to the 2020 presidential election, three different historical precedents each suggest a different outcome. In other words, “history clashes with history” and, at the end of it all, the historical record will change. 

First, only twice in American history have three consecutive presidents each won and completed two terms of office. The first period is that of Thomas Jefferson, James Madison, and James Monroe from 1801-1825. Then, from 1993-2017, Bill Clinton, George W. Bush, and Barack Obama all served two terms.

Never before has a fourth president in a row won two terms. If Donald Trump goes on to win a second term and finishes that term in January 2025, it will be unprecedented. This precedent suggests it is unlikely that Trump will win. 

A second historical trend, however, indicates that it is extremely unlikely a Republican would lose the presidency to a Democrat. A political party has lost control of the White House after serving one term just once in the past 31 election cycles. The norm has been that a political party serves at least eight years before being ousted. For example, Democrats from 1853-1861 (Franklin Pierce, James Buchanan), Democrats from 1913-1921 (Woodrow Wilson), Republicans from 1953-1961 (Dwight D. Eisenhower), Democrats from 1961-1969 (John F. Kennedy, Lyndon B. Johnson), and Republicans from 1969-1977 (Richard Nixon, Gerald Ford). Also, parties have held control of the White House for 12 years, as from 1789-1801 (Federalists George Washington and John Adams), 1829-1841 (Democrats Andrew Jackson, Martin Van Buren), 1921-1933 (Republicans Warren G. Harding, Calvin Coolidge, and Herbert Hoover), and 1981-1993 (Republicans Ronald Reagan and George H. W. Bush). We also had 16 years of the same party from 1897-1913 (Republicans William McKinley, Theodore Roosevelt, William Howard Taft), 20 years of the same party from 1933-1953 (Democrats Franklin D. Roosevelt and Harry Truman), and 24 years of the same party from 1801-1825 (Democratic-Republicans Thomas Jefferson, James Madison, James Monroe) and 1861-1885 (Republicans Abraham Lincoln, Andrew Johnson, Ulysses S. Grant, Rutherford B. Hayes, James A. Garfield, and Chester Alan Arthur).

The only times a party controlled the White House for just one term were 1841-1845 (William Henry Harrison, John Tyler), 1845-1849 (James K. Polk), 1849-1853 (Zachary Taylor, Millard Fillmore), 1885-1889 (Grover Cleveland), 1889-1893 (Benjamin Harrison), 1893-1897 (Grover Cleveland), and 1977-1981 (Jimmy Carter). So, a party has lost control of the presidency after only one presidential term only once in the past 31 election cycles. If a Democrat wins in 2020, therefore, it would go against historical precedent. 

The third historical trend is that most presidents who served just one term faced a significant challenge from within their party prior to the general election. Every president since 1900 who has lost reelection, with one exception, did so after his party nomination was challenged. This was the case for William Howard Taft in 1912 (challenged by Theodore Roosevelt), Gerald Ford in 1976 (challenged by Ronald Reagan), Jimmy Carter in 1980 (challenged by Ted Kennedy and Jerry Brown), and George H. W. Bush in 1992 (challenged by Pat Buchanan). The one exception is Herbert Hoover in 1932. 

The ongoing investigations into Donald Trump’s presidential campaign, his administration, and every other aspect of his life and business activities, coupled with the likelihood of an economic slowdown, make it harder to think Trump will win reelection. A serious challenge from within his party seems unlikely at this time, but it would certainly not be unprecedented. 

Regardless of what happens, history will clash with history, and history in some fashion will be changed as a result.

The Cult of Trump

 

I am a ferociously independent  and passionately moderate voter. But in 2016 I voted a straight party ticket for the Democrats. It was the only way - however meek - to send a message to the Republican Party: You gave us Trump, shame on you, now get rid of him.  

But Republican legislators, who have much to lose in the long run, cave in to the Trump base, fearing they might be “primaried” and lose their seats, and give the disaster that is the Trump presidency a free pass. They are wrong morally and politically to do so. What happened to the Republican Party? Where is it hiding? Wherever you are, come back, we need you badly.  

Part of the problem is that many Republicans confuse Donald Trump with a Republican. He is not a party president, and barely a party member. He is, instead, the leader of a cult. This transformation was noted a few months back by then-Republican Senator Bob Corker of Tennessee, who openly asked if his party was “becoming a cultish thing.” Donald Trump does not represent traditional Republican values, policy positions, or core principles such as limited government, individual freedom, respect for democratic norms, freedom of expression and press, active and strong democratic alliances, a pro-democracy foreign policy, and the dignity of public service. No, this is not your daddy’s Republican Party; it is Trump’s personal cult. 

As the Republicans morph from national party to cult of personality, it might be useful to reflect on the role cults have played in American politics. Ordinarily, when we think of cults we think of religious cults, but there have been political cults in America as well (see Dennis Tourish and Tim Wohlforth’s ON THE EDGE: POLITICAL CULTS RIGHT AND LEFT, 2000). Not surprisingly, cults have had a very short shelf-life in the United States. Largely pragmatic and non-ideological, Americans have been suspicious of extremism and narrow politics. Part of the reason for the failure of cults to catch on in the U.S. can be seen in Alexis de Tocqueville’s 1835 reminder in DEMOCRACY IN AMERICA that “the spirit of association” exerts a powerful influence in the United States, and that such associations, often cross-cutting ideologically and politically, help produce a more moderate and perhaps even a more tolerant political atmosphere.  

Parties in the U.S. have succeeded by being “big tents” that are somewhat inclusive, large, and centrist (center-right for the Republicans, center-left for the Democrats). By contrast, cults are narrow, extreme, and exclusive. Traditionally, parties served as gatekeepers, keeping radical extremists at the fringes and not allowing them to capture the party. In Europe, the rise of fascism in the 20th century was seen by many as a cult movement, and in North Korea today, some see the 60-year rule of the Kim family as a three-generation “cult of personality.” But in the United States, such takeovers by cults have been largely unknown.  

What are the key characteristics of a cult, and how closely does Trump fit the bill? Cults blindly and mindlessly follow a charismatic leader. Donald Trump recognizes this element in his base with such sayings as “I could stand in the middle of Fifth Avenue and shoot somebody and I wouldn’t lose any voters.” Indeed, such is the blind loyalty to Trump that he is probably right. Cults worship their leader. Even if the leader says things like “John McCain isn’t a hero,” or trashes Gold Star families, his base applauds and follows their Pied Piper wherever he wants to take them. 

The cult leader’s word is gold to his followers. And so, his base turns a blind eye to the thousands (yes, thousands) of lies he tells. Cults have their ritualistic chants. “Lock her up, lock her up” and “Build the wall” are shouted at Trump rallies from coast to coast. Cult leaders claim to be on a “special mission” and, as Trump says, “Only I can do it.” Cults have insiders, and the rest of the world is an outsider to be berated and hated. Cult leaders are not accountable, and thus Trump says he will not release his tax returns as previous presidents have routinely done. Cults believe the ends justify the means, and thus the President bullies, demeans, and calls others ugly names, which seems to the rest of us undignified and unpresidential, but to the Trump cult is fully justified (my mother would have washed my mouth out with soap if I had said such things). 

Former cult members write scathing exposés of their terrible lives within the cult. After only two years in office, a spate of Trump administration tell-all books are coming out describing the horrors of working for such a monstrous boss. Cults have a persecution complex. How many times does Trump tweet out messages condemning Saturday Night Live for its impersonations of the President? How many times does Trump blame the press for his failings? Cults engage in group-think. Cults kowtow to the leader’s every whim. They show disdain for non-members. Cults are paranoid. How many tweets does one have to read to see that our President thinks the media, our allies, college professors, and authors are out to get him? Cults control the information members receive. And of course, President Trump calls the media “the enemy of the people,” not to be believed, and tells his base to “not believe what you see and hear, believe me.” Cults tolerate, even celebrate, the inappropriate and egregious behavior of their leader. And so, Trump is not to blame for the strippers and Playboy models with whom he may have had relationships while still married. Boys will be boys, or demeaning references to women are just “locker room talk.” I could go on.

What to do? The Republicans gave us Trump; they should now clean up their mess. Someone must take him on in the primaries. Republicans need to return to their core values and principles and not be pet poodles for Donald Trump’s excesses. Our system works best when we have two strong parties that vie for power but can come together at times for the good of the nation. Caving in to the cult leader is not politics, it is party suicide. If the Republican Party is to survive into and beyond the next decade, it must wrest control of the party from the Trump cult. If not, it deserves to be electorally defeated and to collapse into the dust bin of history.  

Donald Trump did not create the conditions that allowed for his rise. Global events, easily witnessed in the increasingly fractious politics of Europe, are challenging liberal democracies with a brand of illiberal democracy that threatens rule-of-law systems across the globe. Donald Trump is the American manifestation of this dangerous drift. He is riding a wave he did not create but has been masterful at exploiting. But just as the United States resisted the temptations of Communism and Fascism in the 1930s and 40s and instead committed to a rule-of-law system for our country, we need now to recommit to liberal democracy and the rule of law in our age. 

Where will we be when this long national nightmare is over? Will our democracy be stronger? Our society more just and equal? Our politics more civil and our language more compassionate? Will we move toward what our better angels would have us do, or will we follow the cult of leadership toward a politics of fear and division? Come home, Republicans. We need you.

The History of International Women’s Day and the Ongoing Fight for Gender Equality

Steve Hochstadt teaches at Illinois College and blogs for HNN.

 

Theresa Serber Malkiel

 

Last Friday, March 8, was International Women’s Day. You might not have known that, since little notice is given to this date in the US, even though American women initiated it. Here in Berlin, one could not help but be aware of this special day, because the city government had declared it a holiday, and everything was closed except restaurants and museums.

 

A “National Women’s Day” was first declared by women in the Socialist Party of America for February 28, 1909. It was proposed by Theresa Serber Malkiel (1874-1949), whose story exemplifies the history of the uneasy connection between leftist politics and women’s rights in Europe and America, and the continued relevance of a “women’s day”.

 

As part of the emigration of 2 million Jews from the increasingly antisemitic Russian Empire between 1881 and the beginning of World War I, the Serber family moved from Ukraine to New York in 1891. Theresa went to work in a garment factory. At age 18, she organized the Woman’s Infant Cloak Maker’s Union of New York, mostly Jewish women workers, and became its president. Like many trade unionists, she gradually came to believe that socialism was the only path to liberation for workers and for women. The next year, she led her union into the Socialist Labor Party, the first socialist party in the US. Angered at the authoritarian tendencies of the SLP leader, Daniel De Leon, she and others joined with Midwestern socialists Eugene Debs and Victor Berger to form the Socialist Party of America in 1901.

 

At that time, both in the US and in Europe, socialists were the only political group to openly advocate women’s equality. In contrast to suffragists, socialists argued that gaining the vote was only the first step in creating an egalitarian society. But Theresa Serber almost immediately attacked the tendency of socialist men to say they favored gender equality, but to do nothing to bring it about, even within their own ranks. She formed separate women’s organizations to reach out to women workers and discuss their particular issues. She denounced the relegation of women in the Party to traditional women’s roles: women were “tired of their positions as official cake-bakers and money-collectors.” In 1909 she published an essay, “Where Do We Stand on the Woman Question?” criticizing her socialist “brothers” for their attitude toward female colleagues: “they discourage her activity and are utterly listless towards the outcome of her struggle.”

 

That year, Serber was elected to the new Women’s National Committee of the Socialist Party, and she promoted the idea of a “National Women’s Day” on February 28. In 1910, she published “The Diary of a Shirtwaist Worker”, a novel about the 3-month strike by about 20,000 mostly Jewish women factory workers in New York, the largest strike by women to that point in American history, which won better pay and shorter hours.

 

In 1910, German socialist women at the International Socialist Women's Conference in Copenhagen proposed creating an annual Women’s Day to promote equal rights. By 1914, March 8 was established as the day for demonstrations across Europe and America. The importance of this event grew when a women’s strike on March 8, 1917, in St. Petersburg began the Russian Revolution.

 

Women won the vote across Europe and America over the next few years: Russia 1917, Germany 1918, United States 1920, England 1928, although many individual American states had already given women the vote. Some nations moved slowly toward women’s suffrage: France and Italy only granted women voting rights in 1945.

 

But as socialist women had argued for decades, neither one celebratory day a year nor the right to vote brought equal rights. March 8 was declared a national holiday in many communist countries, but women continued to occupy secondary social, economic and political roles. Even after feminists in the US began in the 1960s to use the day to protest their continued subordinate status and the United Nations declared International Women’s Day in 1975, equality was still far away.

 

The socialist origins of a day devoted to women’s rights exemplify the long-lasting political controversy over gender equality. The idea of equal rights was heretical for conservatives: a German poster calling for the vote for women on March 8, 1914, was banned by the Emperor’s government. Issues of equal rights continue to be marked by partisan political division in the US. The Lilly Ledbetter Fair Pay Act was passed in 2009, supported by Democrats in the House 247-5 and in the Senate 56-0, and opposed by Republicans 172-3 in the House and 36-5 in the Senate. Democrats support the #MeToo movement and Republicans mock it. The Republican Party itself, as represented in Congress, is overwhelmingly male: 93% in the House and 85% in the Senate. Democrats are represented by a more equal, but not yet equal, gender division: about 62-38 male in both chambers.

 

The same differences exist in Germany, but with more women overall. From left to right, the percentages of women delegates in the Bundestag, the federal legislature, are: Left 54%, Greens 58%, Social Democrats 43%, Free Democrats 24%, Christian Democrats 21%, and right-wing Alternative for Germany 11%.

 

A major point of discussion in German politics is the introduction of a gender quota system to ensure equal representation in legislative assemblies. The Left Party proposed in November a law that would raise the proportion of women in the Bundestag, but it was voted down by a majority led by the Christian Democrats and Free Democrats. The far-right Alternative for Germany was most vehemently against any effort to raise the proportion of women.

 

In the state of Brandenburg, ruled by a leftist coalition of Social Democrats, Greens and the Left Party, the Parity Law, the first German law requiring all parties to put forward equal numbers of men and women on their candidate lists beginning in 2020, was passed this January.

 

The Social Democrats in Berlin proposed at the end of 2018 that March 8 should be a new holiday, and this was passed in January with the support of the Left and Greens. A coalition of activists used the March 8 holiday as a Kampftag, day of struggle, including a demonstration of about 10,000 people. Their demands included that abortion be fully legalized, pay be equalized, and more action be taken against sexism in daily life, especially violence against women.

 

International Women’s Day serves to highlight the remaining gender inequality in our society. The #MeToo movement exemplifies the much more vigorous public discussion of how to keep moving toward equality and the need for significant behavioral changes for both men and women to make that possible.

 

The goal is to make International Women’s Day superfluous.

Roundup Top 10!  

 

The government effort to make FOIA “as bad as possible”

by Nate Jones

The Department of Justice's historical effort to weaken the Freedom of Information Act and why Congress must strengthen the law.

 

Why the College-Admissions Scandal Is So Absurd

by Alia Wong

For the parents charged in a new FBI investigation, crime was a cheaper and simpler way to get their kids into elite schools than the typical advantages wealthy applicants receive.

 

 

The real history of women wouldn’t look quite so nice on a tote bag

by Laurie Penny

“Empowerment” has always been more palatable and easier to sell than the idea of women simply taking power, and it’s more cheerful than the reality that plenty of women’s history has been defined as much by frustration and pain as by perky self-actualization.

 

 

How presidential empathy can improve politics

by Jeremi Suri

The legacy of FDR and his fireside chats.

 

 

The Black Gun Owner Next Door

by Tiya Miles

I’m an African-American historian and, on most issues, decidedly liberal. Could I rethink my anti-gun stance?

 

 

America’s Long History of Hysteria about Women’s Veils: Jeanine Pirro and Ilhan Omar

by Juan Cole

In fact, nothing is more American historically than veiling and debates on veiling.

 

 

The Case for Reparations

by David Brooks

A slow convert to the cause.

 

 

What's Behind the Lamentations Over History?

by James Grossman

Max Boot’s questions imply change over time. But we can’t know why something has declined if we don’t know what conditions have changed.

 

 

Getting the Right History vs. Getting the History Right

by L.D. Burnett

An excellent summary of recent popular critiques of historians--and a rebuttal.

 

 

Woodrow Wilson and ‘the Ugliest of Treacheries’

by Erez Manela

After World War I, America was supposed to lead the fight against colonialism. What happened?

Mennonite Values

Menno Simons

I claim to be "genetically Mennonite." Of course, since Mennonites are a religious group, not a racial/ethnic group, the claim is oxymoronic. Nevertheless, I mean it seriously as well as tongue-in-cheek. It does turn out, I think, that all Loewens in the world, at least all I have ever met, are of Mennonite origin. Add anything -- "Loewenberg," "Loewenstein" -- and it's Jewish. Subtract -- "Loewe," "Lowe" -- and it's likely Jewish but not always. But "Loewen," ironically meaning "lions," is usually Mennonite. 

Mennonites are followers of the Protestant minister Menno Simons, who lived in Holland 1496-1561. Mennonites were the first group in the Western World to come out against slavery and against war. Particularly that last stand -- against military service -- has caused them centuries of hardship and grief. 

"Old Order Mennonites" are also called Amish, and they famously forbear modern technology. Most "regular" Mennonites look like everyone else. "My" Mennonites, in Mountain Lake, MN, were good farmers, among the first to electrify. Besides, my dad stopped being a Mennonite and a believer when he was about 24. I was born when he was 39. So I was definitely "regular." Indeed, I grew up Presbyterian, since that church was closest to my house, and since Mom was a Christian. 

Nevertheless, my sister and I recently talked with each other about these matters, and we agreed that some Mennonite values seeped into our upbringing. We both seem to favor the underdog, for example. We both have worked for social justice. We are not impressed by mansions or BMWs. Today I am happy to choose my Mennonite heritage, if not religiously, well, then, as a statement of my values.

In particular, on the last page of the coffee-table book, In the Fullness of Time: 150 Years of Mennonite Sojourn in Russia, by Walter Quiring and Helen Bartel (3rd edition, 1974), are nine lines. Perhaps they are by Menno Simons; I have asked Mennonite scholars but they do not know. They sum up Mennonite values for me. I am particularly taken with the two words "we hope." What a modest claim! We hope that the good and the mild will have the power. Surely they ought to! 

Whose is the Earth?

Whose is the Earth? The toiler's.
Who rules the earth? The wise one.
Who has the might?
Only the good, we hope, and mild.
Vengeance and fury devour themselves.
The peaceful abide and save.
Only the wisest shall be our guide.
The chain does men no honour
and even less the sword.  

At the end of my life, I publish these lines thinking that they may come to be meaningful to you. You can claim them just as well as I can! You don't have to be genetically Mennonite to do so! Remember, genetically Mennonite is a contradiction anyway. You don't even have to attend a Mennonite church. (I go Unitarian. But that's another story.) "The chain does men no honor and even less the sword." 

What Historians Are Saying About the College Admissions Scandal

 

The Most Alarming Argument in Jill Lepore's These Truths

 

“Hiroshima marked the beginning of a new and differently unstable political era, in which technological change wildly outpaced the human capacity for moral reckoning.” We find these words near the beginning of “The Machine (1946-2016),” the last part (some 270 pages) of Jill Lepore’s lengthy and highly-praised These Truths: A History of the United States. The rest of this section provides little hope that the outpacing she writes of is narrowing. This failure of ours is what is most alarming about these years.

Lepore’s survey of our post-WWII years addresses computing developments, polling, and political polarization. UNIVAC, the Universal Automatic Computer, was first revealed to the public in 1951. Along with subsequent computing, it helped turn “people into consumers whose habits could be tracked and whose spending could be calculated, and even predicted.” It also wreaked political “havoc, splitting the electorate into so many atoms,” and it contributed to newer forms of alienated labor.

Lepore thinks that conservatives took over the Republican Party in the late 1970s and early 1980s and gained a “technological advantage” over Democrats that “would last for a long time.” In this same period, corporations increasingly used computers to conduct their own polls, the accuracy of which Lepore often questions. By the 1990s, conservatives were increasingly using “targeted political messaging through emerging technologies” and were contributing to “a more atomized and enraged electorate.”

Although collapsing communist regimes and the end of the Cold War, culminating in the disintegration of the USSR in 1991, boosted Americans’ confidence in the future, Lepore believes “they were unable to imagine the revolution in information technology that would resist regulation and undermine efforts to establish a new political order.”

Despite the early Republican advantage in information technology, its impact on Democrats was also great. In the 1990s, Silicon Valley entrepreneurs and other professionals came to dominate the party, which deemphasized concerns of blue-collar workers, as it “stumbled like a drunken man, delirious with technological utopianism.” In February 1996, in what “would prove a lasting and terrible legacy of his presidency,” Bill Clinton signed the Telecommunications Act.  By deregulating the communications industry, it greatly reduced antimonopoly stipulations, permitted media companies to consolidate, and prohibited “regulation of the Internet with catastrophic consequences.”

Despite claims that the Internet helped democratize political life, Lepore thinks that social media, expanded by smartphones, “provided a breeding ground for fanaticism, authoritarianism, and nihilism.” She writes of how the alt-right used web sites like Breitbart to spread its influence and how the Internet was “easily manipulated, not least by foreign agents. . . . Its unintended economic and political consequences were often dire.” The Internet also contributed to widening economic inequalities and a more “disconnected and distraught” world.

Beginning in the 1990s the concept of innovation “gradually emerged as an all-purpose replacement” for progress. The newer goal was more concerned with profit than any moral improvement, and it was often perceived as “disruptive innovation.” One of its proponents was Mark Zuckerberg, who in 2004 founded Facebook. Lepore quotes him as saying, “Unless you are breaking stuff, you aren’t moving fast enough.”

Newspapers were one of the casualties of this disruption. Compared to them, Internet information was “uneven, unreliable,” and often unrestrained by any type of editing and fact-checking. The Internet left news-seekers “brutally constrained,” and “blogging, posting, and tweeting, artifacts of a new culture of narcissism,” became commonplace. So too did Internet-related companies that feed people only what they wanted to see and hear. Further, social media, “exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right. . . . The ties to timeless truths that held the nation together, faded to ethereal invisibility.”

During the twenty-first century political polarization accelerated as the Internet enabled people “to live in their own realities.” Lepore quotes conservative talk-radio host Rush Limbaugh as saying in 2009 that “science has been corrupted” and “the media has been corrupted for a long time. Academia has been corrupted. None of what they do is real. It’s all lies!” Instead the “conservative establishment” warned audiences away from any media outlets except those that reinforced right-wing views. Such polarization also affected people’s ability to deal with our most pressing global problem—climate change—because, as Limbaugh believed, the science of the “alarmists” could not be trusted.

Although one can argue that Lepore pays insufficient attention to all the plusses of technological change, her main point that our moral advances have failed to keep pace with technological developments is irrefutable. One can further argue that many of our main problems today, such as climate change, nuclear buildups, cybersecurity, growing economic inequality, and the Trump presidency, are related to our inability to relate wisely to our technological changes. The popularity of our tweeting president, for example, was greatly boosted by his starring role in the twenty-first-century reality TV show The Apprentice.     

More than four decades ago economist and environmentalist E. F. Schumacher bemoaned that “whatever becomes technologically possible . . . must be done. Society must adapt itself to it. The question whether or not it does any good is ruled out.” A decade ago I concluded that “it was indeed evident how difficult it was for people’s prudence, wisdom, and morality to keep pace with technological change.” More recently, I updated this perspective by citing the brilliant and humane neurologist Oliver Sacks, who shortly before his death in 2015 stated that people were developing “no immunity to the seductions of digital life” and that “what we are seeing—and bringing on ourselves—resembles a neurological catastrophe on a gigantic scale.” 

Undoubtedly, how to ensure the use of digital and other technology to improve the common good is a tough problem. One place to look is to futurists. Psychologist Tom Lombardo is one of the wisest ones. He recognizes that “the overriding goal” of technology has often been “to make money . . . without much consideration given to other possible values or consequences,” but in his 800-page Future Consciousness: The Path to Purposeful Evolution he details a path by which we can evolve toward a more noble way of managing technology: by developing “a core set of character virtues, most notably and centrally wisdom.” 

Another source of wisdom regarding technology is from religious and philosophical thinkers. In the 1970s Schumacher in his chapter on “Buddhist Economics” in Small Is Beautiful sketched out a way wholly different than in the West for looking at technology and economics. More recently, in an encyclical on climate change—which the nonbeliever neurologist Sacks referred to as “remarkable”—Pope Francis devoted many pages to technology and acknowledged that at present it “tends to absorb everything into its ironclad logic.” But in opposition to our present “technocratic paradigm” he called for a “bold cultural revolution” based on noble values and goals. 

Finally, on a history site such as HNN it is appropriate to ask, “Does history give us any hope that such a ‘bold cultural revolution’ can occur?” Can our approach to technology change from that of the dominant Western one of the last few centuries? Despite indicating our many post-WWII failures to cope wisely with technological change, Lepore does provide examples of movements and individuals that changed our history’s trajectory. 

She writes of the Second Great Awakening, a religious revival movement that swept over the USA in the 1820s and 1830s. It increased church membership from one out of ten Americans to eight out of ten. She recalls the long struggle to end U.S. slavery, from Benjamin Franklin, whose “last public act was to urge abolition,” to Frederick Douglass, whose writings helped inspire Lincoln and continue to inspire Lepore. She notes that after the money-grubbing Gilded Age, the Progressive Era emerged, and that “much that was vital” in it grew out of the Social Gospel movement, which “argued that fighting inequality produced by industrialism was an obligation of Christians.” 

She recounts the many battles for civil rights from the Civil Rights Act of 1866, through Martin Luther King’s efforts and the Civil Rights Act of 1964, to the contested battles for the rights of blacks, women, immigrants, and LGBTs during the Trump presidency. She also details Franklin Roosevelt’s New Deal, whose scope was “remarkable” in combatting the Great Depression, when “nearly five in ten white families and nine in ten black families endured poverty,” and during which President Herbert Hoover argued against government relief, believing it would plunge the nation “into socialism and collectivism.” 

One significant historical change that Lepore pays scant attention to is the end of the Cold War, noting simply, “by 1992, more than four decades after it began, the Cold War, unimaginably, was over.” That “unimaginable” ending, however, was due to individuals (like Soviet leader Mikhail Gorbachev and Ronald Reagan) who acted in unexpected ways to carry out steps that other individuals (like activist Andrei Sakharov and other protesters) and movements had long been demanding. In other parts of the world, leaders like Gandhi and Nelson Mandela also produced results, such as nonviolent resistance and the end of apartheid, that changed history’s trajectory. 

As discouraging as post-WWII efforts to manage technology wisely have been, there may be, paradoxically, glimmers of hope emerging from our present dire climate-change situation. In a recent New York Times op-ed, “Time to Panic,” we read that “we’re at a point where alarmism and catastrophic thinking are valuable, for several reasons.” One is that “politics, suddenly, is on fire with climate change.” Just as the catastrophe  of the Great Depression led to the imaginative New Deal, so too the present climate-change crisis might soon alarm us enough to spark new actions and ways of interacting with our planet—and with technology in general.    

What I’m Reading: An Interview With Civil War Historian Anne Sarah Rubin

 

Anne Sarah Rubin is a Professor of History at the University of Maryland, Baltimore County, where she teaches courses on the Civil War, American South, and the Nineteenth Century United States. She is also the Associate Director of the Imaging Research Center. Find her at her website.

 

What books are you reading now? 

I'm working on a project about starvation in the Civil War and Reconstruction South, so I just finished Amy Murrell Taylor's book Embattled Freedom: Journeys through the Civil War's Slave Refugee Camps. I'm also working on a digital project about African Americans in early republic Baltimore, so I am digging into Martha Jones' Birthright Citizens: A History of Race and Rights in Antebellum America; I can see using this in my Civil War course next fall. Finally, I always read something for fun before bed, often true crime or fiction. I'm in the middle of The Feather Thief: Beauty, Obsession, and the Natural History Heist of the Century by Kirk Wallace Johnson and next up is Washington Black by Esi Edugyan.

 

What is your favorite history book?

It's hard to choose just one! Edmund Morgan's American Slavery, American Freedom was mind-blowing when I first read it as an undergrad, thought-provoking as a graduate student, and a pleasure to teach with. I think Seth Rockman's Scraping By is so impressive in illuminating the lives of people who we thought we couldn't find. Finally, I think Thavolia Glymph's Out of the House of Bondage is such a powerful work, and one that I use every year in my Civil War class.

 

Why did you choose history as your career?

I always loved history and thinking about the past – my parents took us to all different historical sites like Williamsburg and Old Sturbridge Village when my brother and I were kids. Then I took AP US History as a high school junior, taught by Eric Rothschild and Ted Morse (more on them below) and I realized that you could make a career out of figuring out the past – why did people do things? What was it like? So I am the rare person who chose a career at 16 and stuck to the plan!

 

What qualities do you need to be a historian?

Curiosity is number one—you need to want to know what the past was like and to find all sorts of aspects of the past interesting. Beyond that you need to love to read, and be willing to write, and rewrite, even if you don't love the process. I think that historians also need to be tenacious. It takes a long time to do good research, which is often tedious. And good writing also takes a lot of time. You just need to keep plugging away at a project.

 

Who was your favorite history teacher?

Eric Rothschild at Scarsdale High School in Scarsdale, NY. He taught my AP US history class (along with Ted Morse) and showed us how much fun history could be. He used lots of primary sources, sang to us, showed political cartoons and popular art. He also took our class to the AHA and OAH annual meetings, so we could see the wide variety of work that historians did. Eric loved teaching, and his joy was infectious.

 

What is your most memorable or rewarding teaching experience?

In 2015 I taught a class called Replaying the Past, about using video games to teach history. My students (a mix of advanced undergrads and MA students) were the clients for students in UMBC's video game design major, and together we built an educational game about the Pratt Street Riots in Baltimore in April, 1861. The premise is that you are a fox running around the city collecting documents about the riot. Then you read the documents to put them in chronological order. Besides working with game designers, my students also built their own board games and interactive digital fiction. It was interesting to see them think through using the same corpus of research in different ways and for different audiences, and I learned a lot about gaming and digital history myself.

 

What are your hopes for history as a discipline?

Honestly, I just hope that it survives as a distinct scholarly practice. Historians learn to think logically and systematically, to analyze arguments, and to organize reams of evidence into their own arguments. It’s a habit of mind. I'm all for working across the disciplines—I am the associate director of a digital media lab at UMBC and work every day with programmers, artists, geographers, etc.—but the value that I bring to the table is my historical thinking. I worry sometimes that there are too many majors being offered at universities, and that History is falling out of fashion.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

I have some 19th century editions of books like Albion Tourgée's A Fool's Errand, but I wouldn't call myself a collector. I collect kitschy Civil War items: Playmobile Civil War soldiers and funny magnets for example. My favorite one is an antique handmade cookie-cutter shaped like Abraham Lincoln. Sometimes I make Lincoln cookies for my students.

 

What have you found most rewarding and most frustrating about your career? 

Teaching is rewarding, especially at a place like UMBC where I can have a student in multiple classes over the years and watch her or him grow intellectually. Working on books is incredibly frustrating, because it’s time-consuming and difficult, but also tremendously rewarding. To hold your own book is to have achieved something permanent in an ephemeral world.

 

How has the study of history changed in the course of your career?

The profession, and therefore the work that people produce, has become much more diverse. As a Civil War historian, it’s been a pleasure to see more and more women and people of color come into the field and make it their own.

 

What is your favorite history-related saying? Have you come up with your own?

I don't have a favorite one for history. But I often tell students and colleagues that the best paper/dissertation/book is a finished one! I also always tell students that the people in the past were not better or worse than we are today—sort of an antidote to the idea of a "greatest generation." 

 

What are you doing next?

I'm working on a project about starvation in the Civil War South, from the start of the war through the famine of 1867. I'm trying to use culinary history to get at what people were really eating, from the perspectives of elite whites, poor whites, and African Americans, particularly those who ran away during the war. I also want to explore the different groups and agencies providing relief to blacks and whites after the war. Right now, I am still in the research phase, and it hasn't yet come together for me. But it will. Because the best book is a finished book.

Hitler's Own Maginot Line

 

"Monsieur Maginot built a fortified line," noted the German justice inspector Friedrich Kellner in his diary in June 1940, just after Hitler's army burst through the French fortifications. If France really expected to keep its neighbor from storming its territory, he added sardonically, it should have covered every foot of its border with "dynamite, deep pits of hydrochloric acid, traps of the largest extent, trip-wire guns, electrically charged fences -- and with thousands of heavy caliber cannons playing a major role." Water wells in the fields would have to be poisoned, and the more persistent intruders greeted by "machine gunners in swiveling steel towers." The humiliating failure of the vaunted Maginot Line thrilled the vast majority of Germans, who adored their Führer and supported his ruthless agenda. "The foolish people are intoxicated by the victories," wrote Kellner. He had opposed the Nazis from the beginning. A political organizer for the Social Democrats, he had campaigned against them during the ill-fated Weimar Republic. When Hitler came to power, Kellner moved his family to the small town of Laubach where he found a position as courthouse manager. On September 1, 1939, when the Nazi war machine introduced the world to blitzkrieg, Kellner began to record Nazi crimes and the German people's complicity in Hitler's program of terror. He occasionally forgot himself and gave voice to his feelings and was written up as a "bad influence" by the local Nazi leader. The SS placed him and his wife, Pauline, under surveillance, and only Kellner's position in the courthouse kept them from arbitrary arrest by the Gestapo.  If Adolf Hitler's megalomania had been satisfied after he avenged Germany's WWI defeat by overcoming the Maginot Line, he might not have had to create one of his own. But two years later Hitler was anxiously constructing a line of defense to secure his ill-gotten gains. He promised that his "Atlantic Wall" -- thousands of fortifications of concrete and steel stretching over 1,500 miles -- would keep out the hordes. Should anyone miraculously make it through the Wall, declared a boastful Hitler to laughter and thunderous applause from his audience, "It will only be a matter of luck if he remains nine hours."  Joseph Goebbels, the Minister of Propaganda, contributed with special presentations in the papers and over the radio, calling the Wall "the most enormous protective cordon of all times." In weekly newsreels, a narrator described the panoramas of the fortifications: "We admire the many and confounding systems of the trenches, crossbars and barricades, of the bunkers and machine gun emplacements, the threatening, elongated muzzles of the super-heavy cannons." To those who might dare to test the Wall, the narrator added, "We understand the nervous mood on the British island, where they hope in vain to grate the German people's nerves by shouting about invasion." "The Atlantic Wall is presently a favorite topic in the press and on the radio for rousing the people," wrote Kellner. "But it ranks along with the Maginot Line that made France believe it was safe." He was unimpressed when Hitler assigned Germany's most popular military leader to command the project. "General Field Marshal Rommel, the Great Retreater, who at one time had 'the gates of Egypt' in his hand, is being brought out of mothballs," said Kellner. "He is the darling of the propaganda machine, and his blemished fame will fill the forgetful ones with hope."

 

A photo of Friedrich Kellner's diary, courtesy of the author.

Kellner was certain neither Rommel nor his battalions nor the fabled Atlantic Wall would repel what was coming. The invaders might not even bother with the Wall, he suggested. There were other borders they could test. If they entered Germany from elsewhere, the Wall would prove a farce. "Besides," Kellner wrote, "they can fly over it." In any case, Kellner was tired of the boasting and bluster in Germany, and the procrastination of the Allies, and he wrote, "One would like to proclaim, 'Let us finally see the action.' Then the proof will be furnished whether the legendary Atlantic Wall is to be breached or not."

That day of action arrived three weeks later, on June 6, 1944. "Finally!" Kellner wrote in capital letters at the top of that diary entry. "With the utmost inner excitement we heard the announcement today landings were made on the northern French coast." The Wall fell, and rightly so in the diarist's mind. "The human community does not end at border crossings artificially placed within a country," said the Social Democrat Friedrich Kellner, "but has to embrace everything that carries a human face."

 

For more on Friedrich Kellner, read My Opposition.

Lessons from Studying Corporate History and the Case of IBM

 

There are usually three audiences for a long-lived corporation’s history: industry analysts trying to figure out if the firm will die off soon (and if clients should sell off stock); senior business management studying how to keep their companies successful for years, if not decades; and business and economic historians. The first two constituencies often turn quietly to historians for these lessons. For example, Harvard’s business historian Alfred D. Chandler, Jr. was a welcome guest in classrooms and boardrooms, and Louis Galambos and Geoffrey Jones are followed by many businesses. All three historians spent decades studying the role of large multinational corporations.

 

IBM fits into that galaxy of firms; the body of literature about it generated by historians, economists, journalists, employees, and industry experts over the last half century is extensive. For nearly a century, other multinational companies looked to see what IBM was doing and drew lessons from its experience. It is no accident, therefore, that when I wrote IBM: The Rise and Fall and Reinvention of a Global Icon I emphasized IBM as an icon. My study of the company’s 130-year history is the first such comprehensive book to appear in over 15 years and the first to be written by an historian since Robert Sobel did so in 1981.

 

Looking at IBM through the lens of a large information ecosystem highlighted two findings. First, from the end of the 1920s until the late 1980s, anyone who wanted to understand how to use data processing—IT in today’s language—turned to IBM and the many thousands of experts, user communities, and industry associations that it populated and often dominated with its own perspectives. That practice put IBM in the middle of much of what happened with computing in all societies for most of the twentieth century and facilitated its success and growth. Second, that ecosystem’s existence led to the realization that IBM was less the monolithic institution that both it and its historians had long touted and accepted.

 

I argue that IBM was a collection of different factions (think product lines and divisions) that collaborated, competed with each other, and shared a common culture. It proved useful to think of IBM much the way one would think about a midsized American city of several hundred thousand people. Its mayor (the CEO at IBM) had to deal with factions and neighborhoods scattered across scores of countries, persuading them to do his bidding, while employees worked to succeed within the economies of so many countries. That perspective is applicable to such other enterprises as GM, Ford, Phillips, Shell, and Exxon/Mobil. All have long histories, multiple product lines, and subgroups competing for resources. 

 

 

If IBM is a city that shares a common language (in this case English) and a set of clearly articulated and accepted values (corporate culture) in the middle of an information ecosystem, a new perspective on IBM and other firms emerges. The story becomes more complicated, of course, as we move from simple discussions of organization and strategy to the more sociological views advocated by business historians over the past three decades. But it also allows us to explore old themes in new ways, such as Chandler’s fixation on unified strategies now explained as evolving dynamic strategic intents, or Philip Scranton’s advocacy of looking at innovation and structures in smaller units. Pro- and anti-Chandlerian views that dominated much of business historiography over the past 15 years become less relevant.

 

The history of IBM supports the emerging notion that information ecosystems and their underlying information infrastructures lead to useful methods for studying such large organizations at both macro and micro levels, from the grand strategy of “The Corporation” to how a sales office or factory worked, linking together both types of history in ways that reflect the realities memoirists described. In IBM’s case, we have the benefit of several dozen memoirs, a well-stocked corporate archive, and a vast literature on the daily affairs of the company to support research. When viewed through the lens of ecosystems and infrastructures, we see a new IBM.

         

Briefly summed up, IBM was in constant churn for 130 years, always challenged by technological complexities, effective and dangerous competitors, antitrust litigation that followed its waves of success three times, rivalries among executives promoting their personal careers, customers dependent on IBM, others who wanted new products or solutions to problems, and always the angst of possibly going out of business, as was feared in the 1910s, the early 1920s, during the Great Depression of the 1930s, in the early 1990s, and as an analyst periodically hints today. But it did not go out of business. No longer can one view IBM just as a slick, monolithic, well-run operation. It had feet of clay, like other large organizations.

         

Why did IBM succeed? First, IBM developed a corporate culture aligned with the business it was in—the processing of data—and tuned it to meet the realities of every era. Second, because of the technological churn that was always its reality, it was constantly willing to retire old products and bring in new ones, to replace old employee skills with new ones, and to alter long-standing operational practices. Each represented a painful transformation that put business prospects at risk and cost some employees their careers while opening up opportunities for others, always against the background of growing customer dependence on data processing, from punch card tabulating equipment through computers, then PCs, then the Internet, and now ubiquitous computing appearing in all corners of modern life.  

 

For many decades what happened at IBM mirrored events in scores of other similar enterprises, and with so many now appearing around the world its experiences have much to teach each audience. Nowhere does this seem more urgently true than for such firms as Amazon, Google, Facebook, Microsoft, and Apple. It is no accident that the last two firms have formed various business relations with IBM, Microsoft since the early 1980s and Apple in the last decade.

         

Saving Lives in the Crimean War: “They Are My Sons”

 

Mary Seacole was a Creole, born in Jamaica to a Jamaican woman who had married a Scottish soldier. Her mother, a nurse, ran a boarding house for invalid soldiers there. Mary lived in Jamaica until she was 39 and then, widowed, traveled to Panama, where she opened up a hotel. She developed numerous medical skills over the years and when the Crimean War broke out in 1853, raced to England in an attempt to join Florence Nightingale and her cadre of 39 nurses to care for the wounded British troops in the war.

She was rejected by Nightingale (Mary said it was because she was black) and then, with a friend, she traveled to Crimea, set up a small hotel/hospital behind the battle lines at Balaclava, and went to work as a nurse, caring for the soldiers, who appreciated everything she did for them. She also risked her life treating soldiers on the battlefield. A British journalist covered her and she became famous all over the world.

Her story is told well, and with a considerable amount of emotion, by Jackie Sibblies Drury in the play Marys Seacole, which opened recently at Lincoln Center’s Claire Tow Theater in New York.

You realize before you sit down and stare at the very contemporary set of a hospital waiting room that this is a very unusual play. It is a hard-hitting drama that hammers away at your senses, full of vivid skits and scenes. Characters hurtle back and forth through time between Jamaica of 2019 and Crimea of the late 1850s. It depicts Mary’s life, and her struggle in the war, very well and tells yet another true story of a brave woman in history.

The hospital set is used as a theater in itself in this story of Jamaica in the 1840s and Crimea in the 1850s. Added on to the war story are numerous contemporary events, such as a dress rehearsal for a mass shooting exercise, Mary’s disputes with her dead mother’s ghost and the anguish of a pregnant woman. The last half of the play is set in 1850s Crimea and covers Mary’s heroism there and more visits from her mom. If you pay careful attention to the rather chaotic plot, it all makes surreal sense, but if you don’t pay attention you can get a bit lost.

Director Lileana Blain-Cruz does a fine job of telling the 1853 tale and the contemporary story at the same time. She also is skilled at keeping the pace of the play moving along at a brisk speed even though she works hard, with her actresses in this all-female play, to build deep, rich characters slowly.

She gets good work from Quincy Tyler Bernstine as a wonderful Mary, who is a tornado of emotions at times. The director also gets superlative work from Karen Kandel as Duppy Mary, her mom’s ghost. Others in the talented ensemble cast include Gabby Beans as Mamie, Lucy Taylor as May, Marceline Hugot as Merry and Ismenia Mendes as Miriam.

There is a point, about two thirds through the play, when the ensemble goes through a torrent of activity, people running madly across the stage, arguing and defending themselves. There is, well, a lot of high-pitched screaming. Fans of Super Bowl winners have not screamed this loud. It is a magical moment.

Lincoln Center has printed a short history of Mary’s life (the Marys in the title refer to the different Marys in the story) that is slid into the program so that you know her history and that of the world in the 1850s. It is helpful. 

A problem with the play is that you learn absolutely nothing about the Crimean War. You don’t even learn who won the conflict (Alfred Lord Tennyson’s immortal poem “The Charge of the Light Brigade” was written about that war).

The war, which lasted until 1856, began over the rights of Christian visitors to the Holy Land, at the time part of the Ottoman Empire. It pitted Russia against the Ottomans, France, the British and Sardinia. Russia lost. A play about a nurse in the Crimean War must have more information on the war. It should also tell more about Mary’s life as a nurse there, especially her visits to the battlefield while gunfire was exchanged. People do not know much about that war, and more information on stage would be valuable. 

The play also starts slowly, mopes for a while and has too many people with clipped Jamaican accents. These are just minor complaints, though. After about half an hour the magic of the playwright and director takes hold and Marys Seacole soars.

PRODUCTION: The play is produced by Lincoln Center. Scenic Design: Mariana Sanchez, Costumes: Kaye Voyce, Lighting: Ilyoun Chang, Sound: Palmer Hefferan. The play is directed by Lileana Blain-Cruz. It runs through April 17.

Robinson Crusoe’s Wall

 

The question of building a wall on the southern border of the United States has been elevated to a national emergency, despite the objections of national security experts who say that a wall, however long or high, will not secure the border. Opponents of the wall also say that its cost, not only in funds previously allocated for other projects but also in the erosion of our democratic system of government, is not worth the benefit that may be obtained. Yet there is a persistent resolve at the highest levels of government and among a significant number of voters that we must have a wall, no matter what. Perhaps we can look to history, especially literary and cultural history, to explain this passionate desire to build a wall.

Robinson Crusoe, a novel written by Daniel Defoe in 1719, has much to tell us about walls. In the first century after its publication, Robinson Crusoe became a popular book for children because of the salutary model it was thought to provide for young Englishmen and women. One feature that remained in every edition, no matter how much abridged, was Crusoe’s wall. Thousands of young readers from Britain, America, and many other nations learned from Robinson Crusoe that building a wall was the first step toward creating a civilized domestic space and eventually extending it into an empire.

In the central episode of the novel, the mariner Robinson Crusoe is washed ashore on a deserted island in the Caribbean Sea. To protect himself against wild animals, he builds a wall across the mouth of a cave, made of posts set in the earth and backed with layers of turf two feet thick. The wall, a half-circle twenty-four yards in length, takes him fourteen weeks of strenuous labor to build, “but I thought I should never be perfectly secure ’till this Wall was finish’d.”

The completion of the wall enables Crusoe to improve his circumstances, both material and spiritual. He builds a domestic space; he plants crops and raises goats; he reads his Bible and gives thanks for his salvation. But all this security is swept away one day when he discovers a single unfamiliar footprint in the sand. He retreats in terror behind his wall, where he remains for several months. At last he emerges and builds a second wall, “thickened with Pieces of Timber, old Cables, and every Thing I could think of, to make it strong.” This new wall, fortified with seven gunports, can only be climbed with a series of ladders which Crusoe can take down when he is inside his cave. Upon finishing this second wall, he rejoices again in his new sense of security: “I was many a weary Month a finishing [it], and yet never thought my self safe till it was done.”

The object of Crusoe’s terror is the indigenous people of the northeastern coast of South America, then known as Caribs, who visit the island occasionally to celebrate their victories and sacrifice their prisoners. In Defoe’s day it was commonly believed that the Caribs were cannibals, though that point is contested now. Crusoe’s providential rescue of one of their victims, whom he names Friday, allows him to initiate a military-style campaign against the Caribs. Driven by fear and anger, Crusoe attacks the visitors deliberately, not reflecting at first that in doing so “I should be at length no less a Murtherer than they were in being Man-eaters, and perhaps much more so.” Crusoe and Friday kill or wound twenty-one of the “Savages,” whom he implicitly blames for their own deaths.

Crusoe’s last exploit on the island involves assisting in the re-capture of a ship whose crew has mutinied. Instead of hanging the mutineers, Crusoe proposes that they should be marooned on the island, where they may repent and reform themselves as he has done. In effect, he converts his island paradise into a penal colony, leaving the mutineers his cave, his tools, and his wall upon his return to England. On a subsequent visit to the island, years later, he finds that the mutineers have quarreled, fought amongst themselves, and lost their island to the Spaniards. The ending suggests what happens when the lesson of the wall and the purpose it serves are not understood. 

Defoe’s Robinson Crusoe suggests that there is nothing better than a wall to provide a sense of security, so long as the wall is a symbol of civic order and domesticity. If those values are lacking, the wall is no better than a penitentiary, locking up disorderly elements within. Before extending the wall on the southern border, we need to decide whether its purpose is to strengthen a civil society, or to perpetuate division and discord on both sides of the wall. If we merely lock ourselves up in our own fear and anger, we are likely to repeat the conclusion of Robinson Crusoe.

 

 

How the Allies Won on D-Day

 

The following is an excerpt from Soldier, Sailor, Frogman, Spy, Airman, Gangster, Kill or Die: How the Allies Won D-Day

 

George Lane [a Hungarian living under a pseudonym who signed up for the elite British-led X-Troop that consisted of foreign nationals whose countries had been overrun by the Nazis] viewed his life in much the same way as a professional gambler might view a game of poker: something to be played with a steady nerve, a dash of courage and a willingness to win or lose everything in the process. 

 

His addiction to risk had driven him to join the commandos; it had also led him to volunteer for a perilous undercover mission codenamed Operation Tarbrush X. In the second week of May 1944, Lane was to smuggle himself into Nazi-occupied France using the cover of darkness to paddle ashore in a black rubber dinghy. His task was to investigate a new type of mine that the Germans were believed to be installing on the Normandy beaches.

 

Operation Tarbrush X was scheduled for 17 May, when a new moon promised near-total darkness. Lane selected a sapper named Roy Wooldridge to help him photograph the mines, while two officers, Sergeant Bluff and Corporal King, would remain at the shoreline with the dinghy. All four were fearless and highly trained. All four were confident of success. 

 

The mission got off to a flying start. The men were ferried across the Channel in the motor torpedo boat and then transferred to the black rubber dinghy. They paddled themselves ashore and landed undetected at exactly 1.40 a.m. The elements were on their side. The rain was lashing down in liquid sheets and a stiff onshore squall was flinging freezing spray across the beach. For the German sentries patrolling the coast, visibility was little better than zero.

 

The four commandos now separated, as planned. Bluff and King remained with the dinghy, while Lane and Wooldridge crawled up the wet sand. They found the newly installed mines just a few hundred yards along the beach and Lane pulled out his infrared camera. But as he snapped his first photograph, the camera emitted a sharp flash. The reaction was immediate. ‘A challenging shout in German rang out and within about ten seconds it was followed by a scream which sounded as if somebody had been knifed.’ Soon after, three gunshots ricocheted across the beach.

It was the signal for a firework display unlike any other. The Germans triggered starshells and Very lights (two different types of flare) to illuminate the entire stretch of beach and then began firing wildly into the driving rain, unable to determine where the intruders were hiding. 

 

Lane and Wooldridge scraped themselves deeper into the sand as they tried to avoid the bullets, but they remained desperately exposed and found themselves caught in a ferocious gun battle. Two enemy patrols had opened fire and it soon became apparent that they were shooting at each other. ‘We might have laughed,’ noted Lane after the incident, ‘if we had felt a bit safer.’

 

It was almost 3 a.m. by the time the gunfight ended and the German flashlights were finally snapped off. Sergeant Bluff and Corporal King were convinced that Lane and Wooldridge were dead, but they left the dinghy for their erstwhile comrades and prepared themselves for a long and exhausting swim back to the motor torpedo launch. They eventually clambered aboard, bedraggled and freezing, and were taken back to England. They would get their cooked breakfast after all. 

George Lane and Roy Wooldridge faced a rather less appetizing breakfast. They flashed signals out to sea, hoping to attract the motor torpedo boat, and then flashed a continuous red light in the hope of attracting attention. But there was never any response. As they belly-crawled along the shoreline, wondering what to do, they stumbled across the little dinghy. Lane checked his watch. It was an hour before dawn, precious little time to get away, and the Atlantic gale was whipping the sea into a frenzy of crests and troughs. It was not the best weather to be crossing the English Channel in a dinghy the size of a bathtub. 

 

‘Shivering in our wet clothes, we tried to keep our spirits up by talking about the possibility of a Catalina flying boat being sent out to find us and take us home.’ Wooldridge glanced at his watch and wryly remarked that it was the date on which he was meant to have been going off on his honeymoon. Lane laughed at the absurdity of it all. ‘There he was, poor bugger, with me in a dinghy.’

 

Any hopes of being rescued by a flying boat were dealt a heavy blow in the hour before dawn. As the coastal town of Cayeux-sur-Mer slowly receded into the distance, Lane suddenly noticed a dot in the sea that was growing larger by the second. It was a German motor launch and it was approaching at high speed. He and Wooldridge immediately ditched their most incriminating equipment, including the camera, but kept their pistols and ammunition. Lane was considering a bold plan of action: ‘shooting our way out, overpowering the crew and pinching their boat’. But as their German pursuers began circling the dinghy, Lane was left in no doubt that the game was up. ‘We found four or five Schmeisser machine guns pointed at us menacingly.’ The two of them threw their pistols into the sea and ‘with a rather theatrical gesture, put up our hands’.

They were immediately arrested and taken back to Cayeux-sur-Mer, zigzagging a careful passage through the tidal waters. Lane swallowed hard. Only now did it dawn on him that he had paddled the dinghy through the middle of a huge minefield without even realizing it was there. ‘It was an incredible bit of luck that we weren’t blown to bits.’ 

 

The two men feared for their lives. They were separated on landing and Lane was manhandled into a windowless cellar, ‘very damp and cold’. His clothes were drenched and his teeth were chattering because of the chill. He was also in need of sustenance, for he had not eaten since leaving England. 

 

It was not long before an officer from the Gestapo paid him a visit. ‘Of course you know we’ll have to shoot you,’ he was told, ‘because you are obviously a saboteur and we have very strict orders to shoot all saboteurs and commandos.’ Lane feigned defiance, telling his interrogators that killing him would be a very bad idea. The officer merely scowled. ‘What were you doing?’

 

Lane and Wooldridge had cut the commando and parachute badges from their battledress while still at sea, aware that such badges would condemn them to a swift execution. They had also agreed on a story to explain their predicament. But such precautions proved in vain. The German interrogator examined Lane’s battledress and told him that he ‘could see where the badges had been’. Lane felt his first frisson of fear. ‘They knew we were commandos.’ 

To read more, check out the book!

 

Three Unexpected Deaths That Shaped Presidential History

 

As I’ve written before, random circumstances often shape history. Unexpected tragedy in the lives of three political leaders—John F. Kennedy, Jr., Paul Wellstone, and Joe Biden—further demonstrate the profound impact of chance on political history. 

John F. Kennedy Jr. was the “golden lad": the most famous offspring of John F. Kennedy, he was a public celebrity who published “George” magazine. He was rumored to be planning to run for the open US Senate seat in New York vacated by Senator Daniel Patrick Moynihan in 2000.  With the famous Kennedy name behind him, plus his good looks and winning personality, it seemed as if he was likely to announce his campaign and even win. Then, he and his wife were suddenly killed in a small plane crash off Martha’s Vineyard, Massachusetts on the evening of July 16, 1999.  

First Lady Hillary Rodham Clinton won the seat of Senator Moynihan in 2000.  Hillary Clinton sought the Presidency in 2008, and when that candidacy failed, she served as Secretary of State to President Barack Obama. She ran again in 2016, winning the popular vote, but losing the Electoral College to Donald Trump.  Without the Senate seat, Clinton would have had no opportunity to seek the Presidency twice. Some believe JFK, Jr would have defeated Clinton in the Democratic primary, gone on to win the Senate seat, and could have sought the Presidency in 2008, potentially derailing Barack Obama’s campaign.  This is, of course, speculation, but it's also a reasonable possibility. 

Paul Wellstone was a Minnesota Senator representing the Democratic Farmer Labor Party from 1991 until his tragic death in another small plane crash on October 25, 2002. His wife and daughter also died in the crash, just two weeks before the 2002 election. Wellstone was a rising progressive star in the Democratic Party and a likely candidate for the Presidential nomination in 2004.  In the 2004 presidential election, Democratic Presidential nominee John Kerry lost to George W. Bush in large part because Kerry lost the state of Ohio. When one considers that Wellstone was from an agricultural state similar to Ohio, one can speculate that Wellstone might have drawn a lot more interest and support than Kerry or the early “flash in the pan” candidate, former Vermont Governor Howard Dean. If Wellstone had won, he would have been the first Jewish President.

Finally, Joe Biden’s political career was altered by personal tragedy. Biden has had the longest career of anyone who ran for the Presidency.  He ran in 1988, but was forced out by accusations of plagiarism and suffered an aneurysm shortly after. He again ran in 2008, but he could not compete with the stardom of Barack Obama and Hillary Clinton. His 36 years in the Senate and 8 years as an actively engaged Vice President made him a leading contender in the lead-up to the 2016 election until his beloved son, Beau Biden, died of cancer in 2015. This led to his decision not to run for President in 2016 and instead Hillary Clinton and Bernie Sanders battled it out. 

The fact that Clinton lost the states of Pennsylvania, Michigan and Wisconsin by small official margins, and that the white working class of those states went to Trump, made many feel Democrats could have won if Biden had been the nominee. There is no certainty, of course, that Biden would have overcome both Hillary Clinton and Bernie Sanders and won the Democratic nomination in 2016, but many think that his known appeal to the white working class over his career would have helped him win the states Clinton lost. Whether Biden will decide to run in 2020, and whether he could accomplish what might have been in 2016, is of course open to speculation.

While there are no guarantees that John F. Kennedy, Jr., Paul Wellstone, and now Joe Biden would have been elected President, it is certainly interesting to think how the future of America in the early 21st century might have been very different.

A Historian Reflects on a Long Life

 

A lifelong historian, beginning with a spring 1936 term paper on Franklin D. Roosevelt’s rise to the Presidency, Vaughn Davis Bornet frequently contributes to HNN. At 101 years old, he offers advice on living a long life, with a little history along the way, in this personal essay.  

 

Recently I saw the movie South Pacific all the way through.  The Frenchman in exile has decided that the rest of his life (half is gone) has to be with Mitzi Gaynor, the Navy Nurse.  His two children are not enough, and neither is his ample estate.  He will live long if and only if he has  loving companionship! Yes!  

Beth Bornet was with me for 68 Anniversaries, beginning after our wedding in late 1944.  It is the place to start on the subject of  seeking and getting Longevity beyond any doubt.  After all, she planned and prepared every meal of those at Home for over half of my life, satisfied yearnings, selected and vetoed our guests, handled children.

So it is out of the question for me to start explaining/bragging over my oh-so-sensible choices on “how to live.”  After all, my partner in Life is the prime consideration as I ask myself:  “Why did you live so long?”  To you I say, “Choose and preserve Companionship, to guarantee a long and healthy life.”  (Anyway, that is my educated opinion.)

Next, I do think that Being Active is important. Those who lived long whilst drifting slowly and quietly and inactively through all those years do have a point. It can’t be denied.  But I do think I’m right as rain in urging activity, avoiding “just sitting there.”

Yes, I’m a Navy veteran, over 5 years active; 18 busy reserve.  Not sure the difference this participation has made, overall; did get help with medical bills, without doubt.

Move about, talk, play, be doing something all or most of the time.  (That doesn’t mean I am hostile to Rest, Relaxation, and quiet Reflection now and then. But it does seem that friendly companions around most of the time has to be related to wanting to Stay Alive and quite possibly to succeeding!)

Minding the business of your communities is related to longevity.  Join things. Meddle.  Speak up! Be alive and show it.  Make a difference.  Each of us, in his and her way, certainly did.  No detailed proof necessary.

I am going to waste little time on urging No Smoking.  It is now so obvious.  When I changed seats at Rotary in the early ‘60s to move away from second hand smoke I had no idea how important was each and every move.

This body of mine looks about the way it should, but really it is better.  I took real, no kidding, real barbell weightlifting deadly seriously in spring 1941, as we worriers thought war was around the corner for my age group.  I learned and did the snatch, press, clean and jerk, and dead lift.  A few years earlier I had been a high school tennis player of ability (Duke offered a scholarship--refused), and I won five victories in a row as an intramural baseball pitcher; I adored swimming.

With a few exceptions, it was one drink before dinner, and that was it.  In diet: lots of fresh and salt water fish; crab; poached eggs; meat—but no fetish; salads and vegetables “if I must.” But:  in later life, bananas, and lots of grapefruit shipped in from Texas and/or Florida, and even at the store in bottles out of season.

After a real, no kidding heart infarction (attack) in 1977 there was pretty long companionable walking in our Ashland, Oregon hills. There was always a large dog to demand that we go:  rain or snow! In late retirement, I sought out  a two-way exercise machine in our place; I have long given it a daily 20 minute workout, tension on full, with both arms and legs involved.  It’s totally routine, expected.

I confess there has been one of three different heart pills only once a day for over thirty years, prescribed by “the Best.” Since I was inheritor of an oddity, I long since had a pacemaker installed in my right shoulder; I am on my third as I write.  They keep heartbeats above 60, night and day.

Anything else?  I’m in bed, mostly asleep, 10:30 PM to 7:30 AM, nightly.  Little change.  I live, eat, and behave normally—I do think. I was in a terrible one-car auto wreck at over 70 mph in an earlier decade.  Then I was the victim of a freak accident that broke open my left femur its whole length—forcing a form of incarceration for nearly four months.

But: I had a 30 ft. above ground swimming pool for over 30 years and gave it the use it was sold for, relying on a battery of solar panels to extend the season.  Great!

For all I know I could be deceased tonight or next week.  I am, after all, 101 and four months.  Ambulatory, I am unsteady; have had eyelid operations; hearing OK but unsharp just a bit. Skin not what it was. Still have nice cuttable hair on my head.  No fingerprints left!  Feet learning numbness; let’s wish them well.  Care to trade?  More sensitive to pepper than you.  Use potassium instead of salt; fake sugar in packets; on advice of one with a Ph.D. in “Nutrition” at the VA, eat minimum bread.  Fast typist, but lean on the keyboard now and then and swear under my breath. 

Why in the world did I write this article instead of doing something else?  Well, around me where I reside are a number of older males and females who are starting to push 100 just a bit,  and some are well over that.  Somebody may want to read it! And my sense of obligation is well developed, especially when I am pretty sure I can extend their life of chatting and musing maybe more than a few months!

LONG LIFE MAY BE GOTTEN, that is, if you want to have it.  This lifetime Research Historian wishes you well, and does hope you get what you want out of the rest of your hopefully happy life.

 

What Historians Are Tweeting: The Women Historians Who Inspire on International Women's Day

Dr. Sarah Bond (@SarahEBond) asked "How did your female mentor make a difference in your career?" Here are some of the responses. 

Click inside the image below and scroll to see tweets.

 

Can Artists Remake Society?

In Grey (1919) by Kandinsky, exhibited at the 19th State Exhibition, Moscow, 1920

 

One hundred years ago today, fighting raged in the streets of Berlin. Kaiser Wilhelm II had abdicated in November 1918, and a new socialist government, led by reform-minded members of the German Socialist Party (SPD), had declared a democratic republic. Thousands of workers and sailors, dissatisfied with the moderate stance of the SPD leaders, demanded more radical policies, and revolted in Berlin in January. Not yet demobilized soldiers, the so-called Freikorps, fresh from defeat in World War I, were employed by the new government to destroy the revolt. The Freikorps killed hundreds of workers and assassinated two leaders of the newly founded German Communist Party, Rosa Luxemburg and Karl Liebknecht.

 

The radicals tried again in March to overthrow the SPD government – they called a general strike on March 3, which developed into renewed street fighting. Again the much more heavily armed Freikorps were dispatched by the government to put down the revolt. Gustav Noske, the new Defense Minister, issued a fateful order: “Any individual bearing arms against government troops will be summarily shot.” The ruthless Freikorps, led by extreme conservative officers who hated any manifestation of workers’ power, including the SPD government, hardly needed any encouragement. With few losses, they killed over a thousand workers. When several hundred unarmed sailors demanded back pay at a government office on March 11, twenty-nine were selected out and murdered.

 

Berlin was relatively quiet for a year. On March 12, 1920, the Freikorps, sporting swastikas on their helmets, and other right-wing military formations marched on Berlin in an attempt to create an authoritarian government. Military leaders on the government side refused to fire on fellow soldiers. The SPD government had to flee and a group of extreme conservatives declared themselves rulers of Germany. Adolf Hitler flew into Berlin to support the coup. Across Germany, army commanders and bureaucrats fell into line. This attempt to end the life of the new German democracy finally brought all leftist parties together in a call for a general strike, in which millions of workers paralyzed the country as protest against the so-called Kapp putsch. After four days, the putsch collapsed and the SPD government returned to Berlin.

 

The conspirators were treated leniently in comparison to the leftist rebels. Kapp and the other leaders were allowed to leave the country. Most participants were given amnesty. The Freikorps were eventually dissolved and many of their members later joined the Nazi Party. 

 

Its violent birth severely weakened the first German democracy, the Weimar Republic. The far left continued to advocate revolution. The far right was never reconciled to democracy and used violence against its representatives. The Nazi Party, while never gaining a majority among voters, was tolerated and supported by business and military leaders and conservative politicians, and was able to overthrow Weimar democracy bloodlessly in January 1933, and later murder 96 members of the German parliament, the Reichstag.

 

The city of Berlin is now commemorating the hundredth anniversary of the revolution of 1918-1919 with a broad palette of museum exhibitions, educational events, discussions, and tours under the title “100 Years of Revolution – Berlin 1918-19”.

 

One of the most striking changes triggered by the November Revolution in Germany, and more generally the revolutions in eastern Europe provoked by the Russian Revolution, was the conquest of the art world by a radically new conception of the nature of visual expression. The political revolution encouraged and was welcomed by young German artists, who sought to overthrow the traditional reliance of visual artists on more or less realistic representations of the material world. Calling themselves the Novembergruppe, an “association of radical fine artists”, they, like their colleagues in the new Soviet Union, rejected most accepted artistic conventions and called for a radical rethinking of what art meant. Breaking out of the stultifying traditionalism of German official art, the Novembergruppe offered artistic “freedom for every pulse”. But their ambitions went beyond aesthetics to seek the “closest possible mingling of the people and art”. “We regard it as our principal duty to dedicate our best energies to the moral construction of this young, free Germany.”

 

Among the many “pulses” that the Novembergruppe promoted was a rejection of all forms of artistic realism in favor of pure abstraction. Following the lead of Russian innovators like Kazimir Malevich, the painters Wassily Kandinsky, Otto Freundlich, Walter Dexel and others created non-objective works of color and form. They invited the Dutch abstractionist Piet Mondrian and the Russian Lazar El Lissitzky to exhibit with them in Berlin.

 

Also exhibiting more recognizably political works challenging the German economic, military, and religious elite, the Novembergruppe caused outrage in the early 1920s. By the later 1920s, they had achieved astounding success. Their paintings, sculptures, and architectural drawings became accepted and copied. The innovative artists of the 1920s revolutionized our conceptions of the nature of art. In nearly every cultural field, forms of creative expression which had been deemed distasteful, even repulsive, by the cultural elite became first acceptable and then dominant. Without the innovations of the 1920s, it is not possible to understand contemporary music, painting, or architecture.

 

Yet the broader ambitions of the German cultural radicals of the 1920s fell flat. Their radical ideas had little appeal to broader masses of the population, who still sought traditional forms of beautiful art. Art did not transform life. Their radical politics had restricted appeal. After 1933, the Nazis exploited popular preference for traditional art to categorize the Novembergruppe as “degenerate”.

 

In modern society, we are used to political art. Artists often express political beliefs through artistic creations as a means of influencing popular opinion. Some are individually successful, such as Margaret Atwood’s The Handmaid’s Tale or Norman Rockwell’s painting about school integration “The Problem We All Live With”. But a collective ambition to remake society through art has been absent since the idealism of Russian and German artists of the 1920s ended in disaster in the 1930s. The social vision of the Bauhaus has been subsumed in capitalist commercialism at IKEA. The Novembergruppe’s radical manifestoes are now museum pieces on display at the Berlinische Galerie for 10 Euros.

Writer Amiri Baraka and the Endless Controversies

 

LeRoi Jones (he later changed his name to Amiri Baraka) was one of the most controversial playwrights and poets in American history. He wrote the plays Dutchman, Black Mass, Home on the Range and the Police, among others, dozens of books of poems, plus fiction and non-fiction works. He gave hundreds of talks on writing and Civil Rights across the country and befriended Martin Luther King Jr., Jesse Jackson, Andrew Young and numerous other African-American leaders of the 1960s and 1970s. He was famous for his stand on black nationalism, but infamous, too, for that stand and his anti-Semitic essays and comments. Many people loved him and many people hated him. He was a lightning rod of emotions for Americans during a turbulent era.

His story is now being told in the play Looking for LeRoy, by Larry Muhammad, at the New Federal Theater’s Castillo Theater at 543 W. 42d Street, in New York, where it opened Saturday. It is a searing, strident and alarming play about a racial rabble rouser and yet, at the same time, a very deep, warm and rich play about a man looking back on his life and talking about what he did and what he did not do.

Playwright Muhammad smartly wrote it as a two-man play, using the character of Baraka and adding a fictitious young intern, Taj, who serves as both friend and enemy of the playwright, letting Baraka bask in his glory at some points and forcing him to confront severe charges leveled against him over the years by both black and white audiences at others.

Director Petronia Paley has done a superb job of taking a tight two-man play and working it so that you see a whole nation of trouble and a landscape of characters alongside Amiri and Taj at the same time.

Baraka, who died in 2014, was a very complicated man. He was a playwright, poet, speaker and in many ways, the conscience of black America. Kim Sullivan, a highly skilled actor, plays him exactly that way. In some scenes he smiles and nods his head knowingly while discussing some well-remembered moments in his life and at others he flies into a rage, stomping about the stage as he remembers other, not-so-fine moments.

His counterpart, Taj, is played with enormous energy by the talented Tyler Fauntleroy, who is an emotional whirlwind on stage. The battles between the writer and intern are wondrous to behold in the gorgeous set by Chris Cumberbatch that serves as Baraka’s apartment.

The play is split into two sections. In the first, the young, headstrong intern, who had met Baraka several times years ago, sees him as a celebrity. In the second, Taj, angrier as his internship goes on, becomes Baraka’s enemy and grills him as if he were a one-man Congressional committee.

The first section is good, but, like many plays, it starts off very slowly. As the two men spar over Baraka’s view of the theater, and race, though, it starts to sizzle. The second section is even better because the intern confronts Baraka on not only his work, but his life. Many people charged that as Baraka moved more into politics his writing suffered and Taj hammers him on that. He continually sticks intellectual pins into him and the playwright winces.

The play has its problems. It starts slowly and never really explains just how famous Amiri Baraka was (sort of like Spike Lee today). Playwright Muhammad also holds his fire and does not present the very outspoken Baraka, despised by so many, until the last twenty minutes of the play, when some of the playwright’s withering language is used. That portrait of the playwright should have come much sooner.

There are also only too-brief references to important things. Kenneth Gibson’s successful drive to become Newark, New Jersey’s first black mayor, aided substantially by Baraka’s enthusiastic support, is glossed over in a few seconds, as are the presidency of Barack Obama and the playwright’s lengthy duels with members of the Jewish faith (he was fired as New Jersey’s poet laureate over anti-Semitic writings).

These are small criticisms, though. Looking for LeRoy is an impressive look at a provocative writer and electric speaker.

Amiri Baraka would love this play if he were around to see it, although he probably would have insisted on a few more protest posters, louder microphones and a few books about his friend Malcolm X on his coffee table.

 

PRODUCTION: The play is produced by Woodie King Jr.’s New Federal Theater. Sets: Chris Cumberbatch, Costumes: Kathy Roberson, Lighting: Antoinette Tynes, Sound: Bill Toles. The play was directed by Petronia Paley. It runs through March 31.

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171404 https://historynewsnetwork.org/article/171404 0
Life during Wartime 485: “Fatal Embrace"

Previous installments are archived at http://www.joshbrownnyc.com/ldw.htm

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/blog/154190 https://historynewsnetwork.org/blog/154190 0
Roundup Top 10!  

 

These women were denied veteran status for decades. Congress can’t overlook them again.

by Elizabeth Cobbs

Sens. Jon Tester (D-Mont.) and Marsha Blackburn (R-Tenn.) now propose to honor the women of the Signal Corps with the Congressional Gold Medal. 

 

The Rise of the Pedantic Professor

by Sam Fallon

When academic self-regard becomes an intellectual style.

 

 

The toxic legacy of the Korean War

by Mary L. Dudziak

The conflict upended the constitutional balance. It has been cited by presidents ever since.

 

 

Women in Ancient Rome Didn’t Have Equal Rights. They Still Changed History

by Barry Strauss

If we look hard at the history, we discover some women who made their mark, either working within their prescribed gender roles as wives, lovers, mothers, sisters or daughters, or exercising so much political, religious or, even in a few cases, military power that they smashed those roles altogether and struck out on their own. 

 

 

The History of Sexism in the Southern Baptist Church

by Susan M. Shaw

Recent media reports have revealed decades of abuse by Southern Baptist pastors. Here is the history behind the reports.

 

 

Barack Obama’s Presidential Library Is Making a Mockery of Transparency

by Anthony Clark

The leader of the “most transparent administration in history” has been anything but transparent when it comes to plans for his presidential center.

 

 

Grant’s First Tomb

by Jamelle Bouie

Ulysses S. Grant, inaugurated as president 150 years ago today, missed a chance to reconstruct the South economically as well as politically.

 

 

The Island That Changed History

by Sergey Radchenko

A 1969 border clash between Moscow and Beijing pushed the two apart, and opened the door for Nixon to go to China.

 

 

Policing black Americans is a long-standing, and ugly, American tradition

by Vanessa Holden and Edward E. Baptist

A new database of all the fugitive slave ads from U.S. and colonial history reveals how white Americans trained and incentivized themselves to police black Americans’ movements.

 

 

Michael Cohen’s testimony exposed a direct parallel between Trump and Watergate

by Shane O'Sullivan

Payoffs kept Watergate hidden, but eventually whistleblowers like Cohen flipped.

 

 

Five Reasons Why Republicans Won’t Abandon Trump Like They Ditched Nixon

by Ed Kilgore

There are five reasons a broader Republican backlash like the one that helped push Nixon out of office won’t happen if Mueller’s suggestions of law-breaking are limited to obstruction of justice.

 

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171451 https://historynewsnetwork.org/article/171451 0
The English Diggers, the "Commons," and the Green New Deal

 

In April of 1649 a group of radicals, some veterans of the recent English civil wars, occupied a bit of common grass on a plot known as St. George’s Hill in Surrey and began to grow vegetables. These radicals called themselves “True Levellers” to distinguish themselves from another, more moderate political faction, but everybody else called them “Diggers” in reference both to a scriptural passage and to their literal activity on the hill. In a spirit of communal fellowship, they invited the people of Surrey, then suffering under exorbitant food prices, to “come in and help them, and promise them meat, drink, and clothes.”

Gerard Winstanley was the major theorist of the group, and in his manifesto The True Levellers Standard Advanced he advocated a form of public ownership of land, seeing the idea of a commons not as radical but rather as a restitution. In Winstanley’s understanding the commons were a feature of English rights that had been violated by privatization, as enclosures began to partition off formerly collective lands, which were now owned by individual aristocrats and noble families. The result, since the end of the fifteenth century, had been increasing inequity, with the landless poor often having no space on which to graze their animals. There was an explicitly ecological gloss to Digger politics, with Winstanley claiming that “true freedom lies where a man receives his nourishment and preservation, and that is in the use of the earth.” The dream of the commons as exemplified by the Diggers has something to say in our current moment, as we face not just rising prices on vegetables, but indeed the possibility of complete ecological collapse.

Critics of supply-side economics point to the Reagan and Thatcher revolutions of the 1980s as the moment when the traditional social contract, which held that a democratically organized state had a responsibility of care towards collective rights, began to fray. This analysis isn’t wrong: the conservatives of that decade attacked the welfare state in favor of privatization, a shell-game redefinition of “freedom” whereby the undemocratic allocation of resources and power was shifted to the few, who’ve demonstrated a perilously uncaring stewardship of the environment. But I’d argue that there has long been a Manichean struggle between an understanding of democratic control of resources and a certain aristocratic libertarianism. The “reforms” of the latter go back far in our history, with thinkers like Winstanley understanding what’s lost when the commons are turned over to the control of fewer and more powerful people. The results of that ignoble experiment now threaten to end life on earth as we know it.

If it seems as if there has been an increasing attack on the idea of the commons by the neo-liberal forces of privatization, then perhaps we should draw some succor from the English historian Christopher Hill, who noted in his classic The World Turned Upside Down: Radical Ideas During the English Revolution that the “Diggers have something to say to twentieth-century socialists,” and perhaps to twenty-first-century socialists as well. In 2019, I would argue, the idea of the “commons” as a space of collective ownership, responsibility, engagement, and possibility must be a metaphor that the left draws from rhetorically, wrenching it free from the realms of theory and philosophy and using it to more fully define a concept of freedom that is true for the largest possible number of humans.

A good idea never really dies. Even the right-leaning Economist, in an article entitled “The Rise of Millennial Socialism,” admits that the newly resurgent left in both Great Britain and the United States’ Democratic Party has “formed an incisive critique of what has gone wrong in Western societies.” Partially this has been by recourse to traditional labor politics, but as The Economist notes, it’s only on the left that there has been any credible attempt to address the apocalyptic threat of climate change, the right either burying their heads in the sand or engaging in irrational and unempirical denialism. Part of the new socialist movement’s environmental approach is a return to how the Diggers understood stewardship of the land, so that in policy proposals like Representative Alexandria Ocasio-Cortez and Senator Edward Markey’s Green New Deal we arguably have the emergence of a new ethic that could be called the “people’s right to the natural commons.” As Jedediah Britton-Purdy wrote in The New York Times, “In the 21st century, environmental policy is economic policy.”

Just as the Diggers hoped to return the commons to its traditional understanding, so too do today’s eco-socialists see this as a fruitful moment in which to expand the definition of freedom as meaning “something more than the capitalist’s freedom to invest or the consumer’s freedom to buy,” as the authors of a recent Jacobin article on the Green New Deal write. Kate Aronoff, Alyssa Battistoni, Daniel Aldana Cohen, and Theo Riofrancos write that for too long the “Right has claimed the language of freedom. But their vision of freedom as your right as an individual to do whatever you want – so long as you can pay for it – is a recipe for disaster in the twenty-first century, when it’s clearer than ever that all our fates are bound up together.” In opposition, the authors argue that the Green New Deal presents the freedoms of a commonwealth, the “freedom to enjoy life, to be creative, to produce and delight in communal luxuries.” I’d add a freedom of access to our collectively owned environment, including its climate, its land, its oceans. As Woody Guthrie sang with patriotic fervor, “This land is your land, and this land is my land.”

Increasingly the mass of people has come to understand that the exorbitant wealth of the upper fraction of the 1% signals something more than mere luxury: it signals the transfer of undemocratically manifested political power and the ability to do with the earth and its climate whatever they want, even if the entire ecosystem hangs in the balance. By contrast, eco-socialism requires a return to the Diggers’ promise, not the abolition of private property, but an equitable say in how the resources which belong to the common treasury of all people of the earth should be allocated. Why should executives at Exxon have any ability to decide that they’re alright with apocalypse just because it helps their shareholders? Remember that we’re all shareholders of the earth. Far more pragmatic to consider the potential of what philosophers Michael Hardt and Antonio Negri write about in Commonwealth, when they argue that an embrace of the commons is a rejection of “nihilism,” for in turning away from an apocalyptic capitalist economics we can instead imagine “opening up the multitude’s process of productivity and creativity that can revolutionize our world and institute a shared commonwealth.”

The Diggers’ nascent eco-socialist revolution may have failed, but a golden thread still connects them to their own past and to our current moment. Such beliefs about the commons were held by participants in the Peasants’ Rebellion of the fourteenth century, and similar ideas about a collective right to some part of the environment can be seen everywhere from the commons at the center of many colonial New England towns to the environmental progressivism of President Theodore Roosevelt and the establishment of national parks. A collective right to a natural commons, whereby we once again reaffirm the interdependent and communal ownership of the earth, sounds like a radical idea, but a shift towards understanding our environmental crisis in this manner might be the spiritual change required to fully grapple with climate change.

At St. George’s Hill, Winstanley didn’t understand the occupation as anarchic, but rather as conservative in the truest sense of that abused word, as the root of the word “conservation” and as a return to a “merry old England.” As historian Peter Linebaugh explains, the commons has “always been local. It depends on custom, memory, and oral transmission for the maintenance of its norms rather than law, police, and media.” For the Diggers, it was nascent capitalism which was truly radical; they instead advocated a return to what they defined as the natural state of things. Half a century before the occupation at St. George’s Hill, the anonymous author of a broadsheet ballad of 1607 wrote that “The law locks up the man or woman/Who steals the goose from off the common/But lets the greater villain loose/Who steals the common from the goose.” The Diggers’ rhetoric has even earlier precursors, their politics recalling a rhyming couplet of the Lollard priest John Ball, who helped lead the Peasants’ Rebellion of 1381 and who used to sing a song asking where aristocrats could possibly have been in a state of nature, for “When Adam delved and Eve span, /Who was then the gentleman?”

In The Century of Revolution: 1603-1714, Hill writes that “Freedom is not abstract. It is the right of certain men to do certain things.” We think of “freedom” as an innate quality – and it is – but we have let those with a very limited definition of the word run roughshod over our environment, where freedom has been defined as the right of a very small number of men to make momentous and detrimental changes to a world that should be our shared commonwealth. Of course, proposals like the Green New Deal are upsetting to elites who have long profited from their narrow definition of freedom as merely the freedom to exploit the earth. Surrey nobles were also less than pleased by the presence of hundreds of radicals encamped on St. George’s Hill, and by August of 1649 the Diggers had lost a court case advocating for their squatter’s rights, so they voluntarily abandoned the plot before the threat of violence would have forced them to do so. Linebaugh writes that the “commons is invisible until it is lost.” Today St. George’s Hill is the site of an exclusive gated community and a tennis club. Maybe it’s time to tear down some of those enclosures again?

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171387 https://historynewsnetwork.org/article/171387 0
Genocide Denial In Bosnia: A Danger to Europe and to the World

The Srebrenica Genocide Memorial in Potočari

 

On July 11, 1995, Ratko Mladic and his Bosnian Serb (VRS) units arrived in the sleepy eastern Bosnian city of Srebrenica. Designated a “safe zone” by the United Nations, the area had drawn civilians from neighboring cities and municipalities in hopes of salvation and safety. That day, over 8,000 young Bosniak men and boys were brutally executed by Mladic’s troops in the biggest single massacre of the Bosnian genocide. It was an event unseen in Europe since the murderous Holocaust campaigns carried out by Hitler’s Nazi regime. Today, this event, the details around it, and the nature of the killings have become political fuel for nationalist politics in Bosnia and Herzegovina. Despite the annual exhumation of new mass graves, genocide denial has once again raised its ugly head, just as it did in the aftermath of World War II.

 

Despite thousands of testimonies, photographs, video evidence and overwhelming physical evidence in the form of mass graves, Bosnian Serb and Serbian politicians such as Milorad Dodik (currently a member of the Presidency of Bosnia and Herzegovina) continue to question and deny that a genocide took place in Srebrenica and the wider Bosnia and Herzegovina. These are by no means passive remarks but rather a targeted campaign of denial. The latest iteration of this heinous and destabilizing action is Republika Srpska’s (one of the political entities created under the Dayton Agreement) so-called “truth investigation” into the Srebrenica genocide. The implications could not be any more clear: a rise in nationalist fervor and fascistic political ideologies (the same ones that fueled the last wars in the Balkans), historical revisionism, political instability, and, perhaps most worrying, a return to the denial of human rights, truth, and reconciliation in the country and this precarious part of Europe.

 

Misinformation campaigns are nothing new. Nazi authorities and their co-conspirators denied the killing of over 6,000,000 Jews during the war, and many who were sympathetic to their cause continued to do so afterwards. This did not stop at passively dismissing or denying the Holocaust, but ramped up through targeted campaigns of misinformation. Nazi propaganda dehumanized Jews and cultivated support for their mass murder before, during, and after the war. Sympathizers such as historian Harry Elmer Barnes actively promoted reports denying the existence of Nazi death camps and even published literature on the topic. Neo-Nazi “think tanks” (akin to RS’s investigative body) opened old wounds by downplaying the death count or actively denying the existence of a well-planned, full-fledged campaign of extermination.

 

Dodik and authorities in the Republika Srpska seem to have taken a page out of this playbook. During the war, mass graves were routinely covered up and concealed, and the bodies of victims moved. Today, this makes identifying the victims very difficult, since remains are significantly commingled; one victim’s remains, for example, were found at two separate locations over 30 km away from each other. The disinformation and deceit did not stop with the halting of hostilities. Serb nationalist politicians and their supporters routinely downplay the genocide or dismiss it outright, refusing to accept blame or to begin a process of reconciliation. They are aggressively pursuing a policy of genocide denial and introducing unsubstantiated doubt in an effort to destabilize the country and, further, to deny the humanity of the victims of the genocide. In 2004, the Appeals Chamber of the International Criminal Tribunal for the former Yugoslavia (ICTY), located in The Hague, ruled that the massacre in Srebrenica constituted genocide, which is a crime under international law. This ruling was upheld in 2007 by the International Court of Justice (ICJ). These rulings matter little to nationalist leaders such as Dodik and those of his ilk. Ultimately, they have very little respect for international bodies, considering them nothing more than attack dogs set against the Bosnian Serb people. Their tools of the trade have been misinformation campaigns, propaganda, and political investigations. What they fail to understand is that genocide denial has further societal implications. The distrust and feelings of enmity in Bosnia cannot subside without the truth being taken seriously and authorities formally apologizing and undertaking actions to prevent similar atrocities from ever happening again.

 

Ultimately, why is this so important? The same dehumanizing philosophy which fed the ethnic and religious tropes that led to genocide is back, perhaps stronger than ever. The denial of history and the truth has become normalized in many parts of the world, sometimes through masked efforts at legitimacy. In this moment it is especially important for scholars, journalists, and other professionals to stand up for the truth and demand a platform that overshadows lies and misinformation. Historical revisionism threatens not just the sense of justice for families in Bosnia, but the democratic process in the region. If Europe is indeed serious about protecting democracy and individual rights, it needs to respond to attacks on the truth first.

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171382 https://historynewsnetwork.org/article/171382 0
The Entangled History of “America First” and “The American Dream"

“America First.” 

This simple two-word phrase, which had lain dormant for decades, was suddenly placed front and center in the 2016 Presidential campaign of Donald Trump.

For the Trump campaign, unconcerned about historic meanings or previous connotations, “America First” in 2016 meant higher tariffs, protecting Midwestern manufacturing jobs, trashing NAFTA and turning away from NATO and other longstanding global alliances. 

As Sarah Churchwell explains in her new book, “Behold America,” Trump adopted the phrase as a wedge issue, designed to polarize Americans along racial and geographic lines and peel off blue-collar Democratic voters.

There is a great historical irony in his provocative use of the phrase, because “America First” had first entered the national political dialogue in 1915 when a progressive Democrat, President Woodrow Wilson, sought to unify the country during the horrifying first years of World War I.

Wilson, running for re-election in early 1915, used the phrase to justify non-intervention in the bloody conflict. In a major political address, he advocated a carefully calibrated American neutrality. 

Wilson proclaimed that “Our whole duty for the present…is summed up in the motto: America First. Let us think of America before we think of Europe, in order that America may be fit to be Europe’s friend when the day of tested friendship comes.” 

He added that by remaining neutral, and thinking of “America First,” the nation was not being ignorant, or self-centered.  Instead, American neutrality meant “sympathy for mankind. It is fairness, it is good will at bottom. It is impartiality of spirit and judgment.”

So how did we get from that benevolent meaning of “America First” to its use as a provocative threat by candidate Donald Trump? 

Churchwell, professor of American Literature at the University of London, unravels the complicated history behind “America First” and the equally problematic phrase, “the American Dream,” in her new book Behold America.

Churchwell reports that the first written use of the phrase came in 1884, when an Oakland, California newspaper ran “America First and Always” in the headline above a report on a looming trade war with the British Empire.  It fell into disuse until President Wilson resurrected it in his 1915 re-election campaign.

In April 1917, America declared war on Germany and the concept of a “fair-minded” neutrality vanished. After the World War ended and the Versailles Peace Treaty was negotiated in 1919, the phrase “America First” continued to be used, but with new meanings.

For example, Warren Harding used the slogan “Prosper America First” in his successful 1920 campaign. He called Wilson’s proposed League of Nations treaty a “supreme blunder.”  One newspaper, in endorsing him, cited the fact that Harding would usher in “an era of nationalism, instead of internationalism.”

According to Churchwell, the massive Harding victory (he won 60 per cent of the vote) “legitimized” the phrase for many Americans. It was soon adopted by anti-immigrant and anti-Catholic groups including the newly resurgent Ku Klux Klan.  For these groups, “America First” meant White supremacy and returning the nation to its “Anglo Saxon” or “Nordic” origins, and restricting immigration from Italy and Eastern Europe. 

In 1924, President Calvin Coolidge signed the Johnson-Reed Immigration Act, which fulfilled the hopes of those who saw “America First” as an invocation of racial supremacy. The new act severely restricted immigration from southern and eastern Europe by imposing quotas based on the 1890 census. It also, in effect, banned immigrants from China and Japan.

In the 1930s, as the nation descended into the Great Depression, references to “America First” rapidly declined.  In 1940, however, the phrase roared back into national prominence with the founding of the America First Committee (AFC), which chose aviator Charles Lindbergh as its spokesman.

The AFC, funded by wealthy businessmen and run primarily by Ivy League law students, was launched nationwide shortly after Germany conquered France. The committee claimed 800,000 dues-paying members within its first year. The AFC vehemently opposed American entry into World War II and directly attacked President Roosevelt’s aid to Britain. Charles Lindbergh, speaking at rallies across the country, suggested that American Jews were behind the effort to support Britain because they were angry at Germany for its vicious anti-Semitic policies.  

On December 7, 1941, Japan bombed Pearl Harbor and America entered the war the next day. The America First Committee collapsed overnight.  

While the majority of “Behold America” is devoted to exploring the tangled evolution of “America First,” Churchwell also examines the changing meaning of “the American Dream.” Trump, of course, famously declared “The American Dream” is dead in his campaign speeches, blaming the loss of upward mobility on unfair competition by China and a “flood” of immigrants.   

As Churchwell noted in a recent interview with Smithsonian magazine, “the American Dream” has always been about economic success, but 100 years ago “the phrase meant the opposite of what it does now.” It was a “dream of equality, justice and democracy,” not just a vision of a large house full of expensive possessions.  

The author lamented that “the American dream isn’t dead…we just have no idea what it means anymore.”

“Behold America” is extensively researched and generally well written, guiding the reader through a century of political dialogue. However, it is a one-dimensional work dependent on newspaper articles, editorials and letters to the editor.  It is an etymological study of two specific political phrases, rather than a broader look at America’s self-image. 

Churchwell bases her research exclusively on print sources, citing hundreds of newspaper articles and a handful of novels. She only mentions one movie: D. W. Griffith’s “Birth of a Nation” (in 1915 it became the first film ever shown in the White House). She also completely ignores theater, music and radio, despite the fact that broadcasts reached millions of Americans in their homes. She skips over President Roosevelt’s “Fireside Chats” on radio and the widely popular weekly political commentary (often openly anti-Semitic) of Detroit’s Father Coughlin.

She also ignores the motion picture industry. Hollywood had a major influence on American perceptions of opportunity and social justice. One only has to think of movies like The Grapes of Wrath, Mr. Smith Goes to Washington or It’s a Wonderful Life to realize how they shaped depictions of the American Dream.

Behold America brushes aside the impact of these newer, influential media.  Perhaps Churchwell’s reliance on print sources is due to her background as a professor of Literature. Her previous books include Careless People: Murder, Mayhem, and the Invention of the Great Gatsby.

In the introduction to Behold America, Churchwell notes that “We risk misreading our own moment if we don’t know the historical meanings of expressions we resuscitate or perpetuate.”

This is certainly true, and despite the book’s narrow focus, readers interested in American politics will find that it offers important new context on the contested meanings of “America First” and “The American Dream.”

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171383 https://historynewsnetwork.org/article/171383 0
What I’m Reading: An Interview With Public Historian Amanda Higgins

 

Amanda Higgins is a public history administrator, working outside the academy. She often describes herself as academic-adjacent. Her work is closely aligned with academic pursuits and she loves talking with students, especially graduate students who are thinking about careers outside the academy. A scholar of 20th-century Kentucky and American history, Amanda’s understanding of the not-so-distant past helps her to connect to and build lasting relationships with people across the commonwealth. She also oversees such outreach activities as oral history efforts, the Kentucky Historical Markers program, Kentucky History Awards and the Local History Trust Fund. She holds a Ph.D. in American history. She can be reached on Twitter at @Doc_Higgs.

 

 

What books are you reading now?

 

Beyond the texts that I pull to support ongoing projects related to my public history work, I’m reading: Jeanne Theoharis, A More Beautiful and Terrible History: The Uses and Misuses of Civil Rights History; Kevin M. Kruse and Julian E. Zelizer, Fault Lines: A History of the United States since 1974; and Rebecca Traister, Good and Mad: The Revolutionary Power of Women’s Anger.

 

Reading these books alongside each other is like taking a graduate seminar in contemporary history. I do miss the lively discussion, but the books speak so nicely to each other. 

 

What is your favorite history book?

 

I struggle with the idea of a favorite, because I am always bouncing between projects, eras, and interpretations. One day I’ll be working on historical marker text about indigenous Kentucky and the next my own research rooted in the twentieth century. The best history books, in my mind at least, weave complex interpretations with compelling narrative. 

 

Books like Hasan Kwame Jeffries, Bloody Lowndes, Timothy B. Tyson, Radio Free Dixie, and Donna Murch, Living for the City helped me frame my dissertation project and were models for the best parts of my work. 

 

In Kentucky history, my friend Patrick A. Lewis creates arguments with clauses and structure that I deeply admire. I wish I was even half the writer of my advisor, Tracy A. Campbell, who makes the most mundane details compelling. So, that doesn’t answer the question, but it does name check some of the folks I recommend that others should read! 

 

Why did you choose history as your career?

 

I am nosy by nature and ask many, many questions. I grew up in a home full of books and was encouraged to read anything I wanted at a young age. I loved stories about people, especially people who weren’t like me. I started college as a journalism major, but did not enjoy my introductory class. I gravitated toward my history courses because I enjoyed reading, identifying and engaging with arguments, and digging for information. I thought I’d turn the history major into a law degree, but in the fall semester of my senior year I took a US Legal History course and a Constitutional Law course. I hated the law parts of the classes and loved the policy and implications of the laws—how the laws affected peoples, unintended consequences of rulings, precedent and challenges—and skipped the LSAT. I took the GRE, went to graduate school, and became a historian because I had more questions and wanted to do history. 

 

What qualities do you need to be a historian?

 

Endless curiosity and a dogged determination to find answers. 

 

Who was your favorite history teacher?

 

I’ve been very fortunate throughout my life to be surrounded by incredible educators. My seventh grade social studies teacher was the first teacher who showed me that history was more than names and dates. She tied history to relevant, contemporary topics and encouraged us to be independent and critical thinkers. 

 

What is your most memorable or rewarding teaching experience?

 

In the penultimate year of my doctoral program I was the primary instructor in a course called “the World at War.” The course subject isn’t my favorite or my specialty, but I had a promising student who decided she wanted to be a historian that semester. She was majoring in business or some “sensible” career path that pleased her parents, but she didn’t like those courses. History made her mind race, helped her to understand her world, and ignited a passion in her. We talked through the arguments for and against majoring in history, how she could “sell” the change to her parents, and what her future may look like. She became a history major, graduated with honors, earned a Master’s in public history and is doing a fantastic job as the second in command at a small museum. 

 

In helping her think about what her future could be, I also articulated what I wanted for myself. She helped me much more than I helped her, by asking questions about my career goals and skills. Her continued success brings me so much joy!

 

What are your hopes for history as a discipline?

 

That we get over ourselves and invite folks into the process of history. The best historians are removing the layer between the finished thing and the work to get to that finished project. History is powerful and it matters deeply for a healthy and engaged citizenship, but as historians, we’re not always good at or comfortable with showing our work. To steal a line from my advisor, we hide behind—or even in—our footnotes. We should stop doing that. 

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I don’t own rare books, but I get to work amongst them every day at the Kentucky Historical Society. 

 

My home is full of mid-century bourbon decanters that my partner and I salvaged from my grandfather’s bar after he passed. Jim Beam used to (maybe still does?) put out a collectible decanter every year. They are such fun little pieces of Americana. Living and working in Kentucky means you’re never far from bourbon and I do enjoy historical ephemera from the industry.

 

What have you found most rewarding and most frustrating about your career?

 

The most rewarding parts of my job are the colleagues and friends who I get to collaborate with on my many projects. Seeing friends and collaborators succeed, helping connect a good idea with the right person to make sure that idea becomes a project, and championing the good history work I get to be a part of everyday sustains me through self-doubt or bad moments.

 

The frustrating parts are not anything unique or noteworthy. I like my job. I’m proud of the choices I made to get where I am and willing to take most of the frustrations that come with working in an institution to do work worth doing. 

 

How has the study of history changed in the course of your career?

 

My career is quite young. I’ve only been at this professionally for about five years now. Still, I am so impressed by the research fellows who come through the Kentucky Historical Society’s program. Their projects are inventive and inspiring. The way many of the fellows are using court records to build digital projects, or ARC-GIS mapping to illustrate the networks of enslaved labor, or material culture to understand the lived experiences of working class families is incredible. 

 

The other thing I’m really excited about is the way folks are thinking about projects that span multiple formats. The people I interact with on a daily basis aren’t thinking that the monograph is the only outlet for publication or that the monograph is the last the project will see. They’re (we’re) proposing multifaceted projects—monographs, and scholarly essays, but also public programming, exhibitions, digital tools and games, and experiential learning classes. By democratizing the end product, we’re making the most cutting edge, relevant, and impactful history more available, especially to folks who aren’t as likely to pick up or engage with a scholarly monograph. 

 

What is your favorite history-related saying? Have you come up with your own?

 

“For history, as nearly no one seems to know, is not merely something to be read. And it does not refer merely, or even principally, to the past. On the contrary, the great force of history comes from the fact that we carry it within us, are unconsciously controlled by it in many ways, and history is literally present in all that we do. It could scarcely be otherwise, since it is to history that we owe our frames of reference, our identities, and our aspirations.”—James Baldwin, “The White Man’s Guilt,” in Ebony Aug. 1965 (pg. 47).

 

I haven’t come up with one myself, at least not anything worth repeating and definitely not alongside James Baldwin!

 

What are you doing next?

 

I’m currently managing a three-year IMLS-funded diversity and inclusion initiative at KHS, working on the planning stages of a new exhibition, a new research project I’m not ready to put into the world yet, and collaborating with a number of public history projects throughout Kentucky. 

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171180 https://historynewsnetwork.org/article/171180 0
Will the U.S. Government Abide by the International Law It Created in Venezuela?

 

The Trump administration’s campaign to topple the government of Venezuela raises the issue of whether the U.S. government is willing to adhere to the same rules of behavior it expects other nations to follow.

During the nineteenth and early twentieth centuries, U.S. foreign policy was characterized by repeated acts of U.S. military intervention in Latin American nations.  But it began to shift in the late 1920s, as what became known as the Good Neighbor Policy was formulated.  Starting in 1933, the U.S. government, responding to Latin American nations’ complaints about U.S. meddling in their internal affairs, used the occasion of Pan-American conferences to proclaim a nonintervention policy.  This policy was reiterated by the Organization of American States (OAS), founded in 1948 and headquartered in Washington, DC.

Article 19 of the OAS Charter states clearly:  “No State or group of States has the right to intervene, directly or indirectly, for any reason whatever, in the internal or external affairs of any other State.”  To be sure, the Charter, in Article 2(b), declares that one of the essential purposes of the OAS is “to promote and consolidate representative democracy.” But this section continues, in the same sentence, to note that such activity should be conducted “with due respect for the principle of nonintervention.”  The U.S. government, of course, is an active member of the OAS and voted to approve the Charter.  It is also legally bound by the Charter, which is part of international law.

The United Nations Charter, also formulated by the U.S. government and part of international law, includes its own nonintervention obligation.  Attempting to outlaw international aggression, the UN Charter declares, in Article 2(4), that “all Members shall refrain in their international relations from the threat or use of force against the territorial integrity or political independence of any state, or in any other manner inconsistent with the Purposes of the United Nations.”  Although this wording is vaguer than the OAS Charter’s condemnation of all kinds of intervention, in 1965 the UN General Assembly adopted an official resolution that tightened things up by proclaiming:  “No State has the right to intervene, directly or indirectly for any reason whatever, in the internal or external affairs of any other State.”

Unfortunately, the U.S. government has violated these principles of international law many times in the past―toppling or attempting to topple numerous governments.  And the results often have failed to live up to grandiose promises and expectations. Just look at the outcome of U.S. regime change operations during recent decades in Iran, Guatemala, Cuba, Chile, Cambodia, Haiti, Panama, Nicaragua, Afghanistan, Iraq, Libya, Syria, and numerous other nations.

Of course, there are things worth criticizing in Venezuela, as there are in many other countries―including the United States.  Consequently, a substantial majority of OAS nations voted in January 2019 for a resolution that rejected the legitimacy of Nicolas Maduro’s new term as president, claiming that the May 2018 electoral process lacked “the participation of all Venezuelan political actors,” failed “to comply with international standards,” and lacked “the necessary guarantees for a free, fair, transparent, and democratic process.” 

Nonetheless, the January 2019 OAS resolution did not call for outside intervention but, rather, for “a national dialogue with the participation of all Venezuelan political actors and stakeholders” to secure “national reconciliation,” “a new electoral process,” and a peaceful resolution to “the current crisis in that country.”  In addition, nonintervention and a process of reconciliation between Venezuela’s sharply polarized political factions have been called for by the government of Mexico and by the Pope.

This policy of reconciliation is far from the one promoted by the U.S. government.  In a speech to a frenzied crowd in Miami on February 18, Donald Trump once again demanded the resignation of Maduro and the installation as Venezuelan president of Juan Guaidó, the unelected but self-proclaimed president Trump favors. “We seek a peaceful transition to power,” Trump said.  “But all options are on the table.” 

Such intervention in Venezuela’s internal affairs, including the implicit threat of U.S. military invasion, seems likely to lead to massive bloodshed in that country, the destabilization of Latin America, and―at the least―the further erosion of the international law the U.S. government claims to uphold. 

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171384 https://historynewsnetwork.org/article/171384 0
Remembering Audrey Hepburn's Best Role

 

As International Women's Day (March 8) comes right after the Academy Awards, it's worth remembering a past winner for best actress. For she almost never got the chance to achieve that dream.

As a young girl, this future actress was living in Nazi-occupied Holland during World War II. As the war continued so did food shortages. It became a famine known as the Hunger Winter of 1944-45. People were starving to death. Some children never got to realize their dreams because their lives were lost to malnutrition.

This future Academy Award winning actress almost became one of a lost generation. Her name was Audrey Hepburn. She thankfully survived and so did others thanks to food aid.

The Allied Forces airlifted food into the Netherlands (Holland) near the end of the war after reaching agreement with the German forces. Truck convoys would soon follow upon liberation. The Supreme Allied Commander, Dwight Eisenhower, had organized a relief plan so food stocks were available to move in so people could eat again.

Audrey Hepburn would go on to become a star, winning the Best Actress Academy Award in 1954 for Roman Holiday. Her best role though was later in life becoming an ambassador for UNICEF, the United Nations agency which fights child hunger and disease. Audrey's own experience living in famine moved her to help children facing that same plight.

After visiting famine-ravaged Ethiopia in 1988, Audrey said “there is a moral obligation for those who have, to give to those who have nothing."  If we remember any of Audrey's lines it should be that one. There are many children today, like Audrey, who are living in areas threatened by famine.

Yemen, South Sudan, the Central African Republic, Syria, Afghanistan, Haiti, Mali and so many other nations are facing major hunger emergencies. There are future doctors, scientists, researchers, teachers, writers, farmers and even actresses among this population. But they may not get the chance to realize their potential if they are lost to hunger.

Even children who survive famine may become stunted for life. Audrey herself may have suffered some lifelong health issues from being malnourished during the war.

For the sake of humanity, we have to save these children. In a world where food is abundant no child should go hungry and lose their future.

Audrey thought her role as ambassador was to educate the world about the nightmares of famine. She knew people were good and would help once they realized something terrible was happening. As she told the Christian Science Monitor in 1992, “the world is full, I’ve discovered, of kind people. And I've also discovered once they know, they give, they help. It’s not knowing that holds them up.”

This is especially true because many people who can help are far away from the hunger emergencies. The starvation in civil war-torn Yemen or South Sudan is not often seen in media coverage. They need ambassadors, maybe you, to change that. You could educate others, for example, that 70 percent of Yemen’s population lives in hunger and relief agencies are short on funds to help them.

We need to support relief agencies like UNICEF, the World Food Program, Save the Children, Catholic Relief Services, Mercy Corps and others fighting hunger. These charities on the frontlines of famine are desperately short on funds. Far more resources are put into military expenditures than into feeding the hungry.

We should increase funds for the U.S. Food for Peace program, which was started by Eisenhower. The McGovern-Dole school lunch program, which feeds children in developing countries, should also see more funding.

We need to step up our diplomatic efforts to resolve conflicts that are causing so much hunger. We need to fortify peace agreements with food aid.

Global conflict, hunger and displacement are at the highest levels since the World War II era. We still have time to save many children who are suffering. As Audrey reminds us, we have a moral imperative to take action and save lives.

 

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171403 https://historynewsnetwork.org/article/171403 0
Is Bernie Sanders Actually A Democratic Socialist?

 

In his “State of the Union” address, President Trump railed against “socialism,” taking aim at rising left-wingers in the Democratic Party, such as Bernie Sanders and Alexandria Ocasio-Cortez, who often identify themselves as “democratic socialists.” But are their policies actually socialist? According to Webster’s Dictionary, socialism is “a system or condition of society in which the means of production are owned and controlled by the state,” whereas social welfare is “organized public or private social services for the assistance of disadvantaged groups.” Social welfare was in fact an antidote to socialism, initiated by one of the most conservative politicians in world history, Otto von Bismarck, the first Chancellor of newly unified Germany. 

 

When Germany was unified in 1871, Europe had already witnessed a steady increase of socialist-communist influence. “A spectre is haunting Europe – the spectre of communism,” Karl Marx declared in 1848 while revolutions were spreading all over the continent. In the 1870s, the Social Democratic Party rose quickly in Germany, and Bismarck called those socialists “this country’s rats” and “enemies bent on pillage and murder.” Bismarck tied the socialist party to the attempted assassination of William I and banned the party. At the same time, he led the legislative effort to establish a social welfare system in order to reduce the appeal of radical socialism/communism to the working class and to increase commoners’ loyalty to the German state. Three important laws laid the foundation of the German social welfare system: the Health Insurance of Workers Law of 1883, the Accident Insurance Law of 1884, and the Old Age and Invalidity Insurance Law of 1889. Thus, the German social welfare system, arguably the first of its kind in human history, was created as an antidote to undermine socialist/communist radicalism in politics.

 

The hatred of Bismarck and many other establishment figures for socialism/communism was based on the ideology’s doctrine of abolishing private ownership via class struggle and “proletarian dictatorship.” Given what happened in the Soviet Union under Stalin, or in China under Mao, when millions of landlords and business owners were killed and their properties were confiscated, Bismarck’s harsh words against radicals bent on “pillage and murder” may not seem too far off the mark. However, neither Bismarck nor any other politician could have foreseen the split of socialist/communist parties during the First World War. Many radical leaders in the Second International abandoned Karl Marx’s call for proletarian international solidarity against their own “bourgeois” national governments, and they became “national socialists.” At the same time, Lenin split with those “national socialists” and formed the Third Communist International (Comintern). He condemned his fellow socialists as “revisionists” who betrayed Marx’s famous saying: “the working class has no fatherland.” He went on to wage revolution against the Russian “bourgeois” state and succeeded. 

 

The split of the socialist camp in Europe had profound social and political consequences, including bloodshed. A case in point was Mussolini in Italy, who betrayed his father’s socialist beliefs and enlisted to fight in the war. Later, his storm troopers fought in the streets against their former comrades, who seemed to be loyal to Moscow rather than to Rome. One of Mussolini’s admirers was Hitler, who later named his party the National Socialist party (Nazi). The elite establishments in both Italy and Germany faced a tough choice between Marxist socialists determined to wage class struggle against private ownership and national socialists who wanted to make their nation “great again.” The nationalists proved more popular and gained more seats in parliaments than their former comrades. The rest was history; Mussolini and Hitler gained power in their respective countries. 

 

In the US, the influence of socialism was on the rise during the “Gilded Age” because of unregulated and rapid capitalist development. But Eugene Debs’ Socialist Party did not succeed by any means, thanks to the “antidote” provided by progressive reformers, who were largely middle-class professionals. They tried to prevent social revolution of the European type; they did not want to see the tragedies of St. Petersburg or of the Paris Commune occur in the New World. Their strategy was to educate the public to push for legislation on behalf of the public interest, especially the poor and marginalized, and against the greedy instincts of the corporate world. Consequently, we have workmen’s compensation laws and federal regulatory agencies such as the FDA in place to make the unregulated capitalist market economy behave more rationally, and to prevent the accumulation of private “wealth against common wealth.” 

 

Thanks in part to the effectiveness of the “antidote,” socialism has never been influential in America. Therefore, the public is much less informed about the nature and history of socialism than its European counterparts. Some political hacks have taken advantage of this knowledge gap to accuse someone they disliked, such as Obama, of being a “socialist,” or to label “Obamacare” a piece of “socialist legislation.” People who would buy what these political hacks are selling need to know the basic definition of socialism, which is an ideology advocating “public,” “collective,” or “common” ownership of productive means against private ownership. In practice, socialism became quite popular in Western European countries such as the UK, and even in Canada, where many “Crown Corporations” were owned and controlled by the government. The problem was that they were all losing money and dependent on taxpayers’ support to stay alive. During the 1980s, the so-called Reagan-Thatcher decade of conservatism, these socialist enterprises were nearly all privatized. In the 1990s, Tony Blair led the Labour Party to get rid of the “common ownership” clause in its constitution, moving it from the left to the center. That allowed the Labour Party to win election after election.

 

At the same time, Bill Clinton moved the Democratic Party to the center and won election twice. In the era of Trump, some elected Democratic officials again seem to be moving decidedly toward the left, and some openly call themselves “democratic socialists.” Given the history of socialism in this country and around the world, is this really a winning strategy? If you are in favor of “Medicare for all,” you don’t have to call yourself a “democratic socialist,” because Medicare is not exactly an enterprise of “productive means,” as are US Steel or ExxonMobil. This is especially true if you argue that health care is a human right for everyone. Of course, you should call yourself a socialist if you really believe in the abolition of private ownership and social control of productive means by the state/government. For the sake of history, please tell your voters what you really believe, and don’t just throw out a label without a precise definition.

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171380 https://historynewsnetwork.org/article/171380 0
Battlefield Tourism At Ypres and How Public History Sites Change

 

Ypres or ‘Wipers’ (the soldiers’ slang name) in Belgium has been a magnet for battlefield tourists since 1918. The first wave of (mainly Anglophone) visitors in the 1920s thought of themselves as ‘pilgrims’ rather than ‘tourists’, but the boundaries between commemoration and commerce were as fluid then as they are now. After a brief slump in the 1970s, the tourists have been returning to West Flanders in ever greater numbers, especially since the 1990s. 

Visiting Ypres, or Ieper to use its modern name, is an amazing experience. First, there is the sheer wonder of wandering around a seemingly historic city which, on closer inspection, proves to be of very recent completion. Then, there is the impressive scale of the massive Cloth Hall, the great medieval trading market which attracted merchants from across Europe. But that, too, proves to be a bit of a curiosity when stared at, as the very smooth, sharply cut stone merges with the pock-marked, scarred and worn pillars along the ground floor. Next to the Cloth Hall is a soaring medieval cathedral, but step inside and it feels so new you almost expect it to squeak as it comes out of the shrink-wrap. Finally, there is the Menin Gate, a huge memorial to the British and Commonwealth missing of ‘the salient’. Tucked into the ramparts, the Menin Gate almost leaps out on the visitor walking along the street from the central square (the Grote Markt). 

It is the Menin Gate that provides the key to the rest of the mystery, for it commemorates the fact that this charming West Flanders city witnessed some of the most intense and prolonged fighting on the Western Front between 1914 and 1918. Inscribed on the Gate are the names of some 56,000 soldiers who have been missing since the war. During that fighting Ypres was reduced to rubble and ashes, only to rise again in replica form. And that is an underlying theme of our work: the recycling, rebuilding, and reconstruction of images, stories, and histories of Ypres, which stands alongside the physical construction of memorials, monuments and cemeteries in a reconstructed landscape. It is about construction and reconstruction; the encoding and reinterpreting of a major historical event within its original space, and how the battlefield of Ypres could be brought home. 

Battlefield tourism (or ‘pilgrimage’) has been at the heart of this often deeply emotional process of bringing the war home since 1918. For the British the city was indeed a site of pilgrimage, as summed up in the title of an early guidebook, Ypres: Holy Ground of British Arms. The British thought the ground was made sacred by the bloody sacrifice of countless soldiers from the British Empire. Running alongside this reverential mode was that of the tourist, as British people sought to buy souvenirs and have home comforts. By the mid-twenties Ypres businesses were appealing directly to the British. ‘If you want a cup of good strong English tea have at the Tea Room’, was one café’s advertisement.

What the Baedeker guidebook noted in 1930 still holds true today, namely that Ypres had ‘acquired a certain English air’. Today’s visitor, walking the streets of the ‘medieval’ town, attending the daily remembrance ceremony at the Menin Gate or visiting a war cemetery on the outskirts of the city, is unlikely to bump into a German battlefield tourist; even at the German war cemetery at Langemarck (some six miles to the north-east of the city), anglophone visitors greatly outnumber their German counterparts. This was not always the case. ‘Langemarck’ had once occupied a special place in the German memory of the Great War. It was the site where war volunteers had allegedly marched into death singing ‘Deutschland über alles’ in November 1914. The memory of their noble ‘self-sacrifice’ became a rallying cry for the political right during the 1920s and 1930s. When the Wehrmacht overran Belgium in May 1940, this was celebrated as a ‘Second Langemarck’. After 1945, following a hiatus of several years, German veterans of the First World War returned to the city for the 50th anniversary commemorations, often at the invitation of a city keen to foster a spirit of reconciliation. With the passing away of the veterans during the 1970s, the presence of German visitors in and around Ypres declined dramatically – precisely at the moment when British battlefield tourism started to increase again. Most German veterans went to their graves with their war stories, and only a negligible number of families toured the battlefields in the hope of recapturing something of the war experiences of their grandfathers. Flanders – the mere mention of the name could still send shock waves through the generation of survivors in the 1950s – faded from German collective memory and largely disappeared from tourist itineraries. The new generation, it seems, no longer felt a deep emotional connection to the war dead and the landscape of the Ypres salient.

Today’s Ieper still has thousands of British visitors, with tourism as important to the economy of the city as it was in the twenties. But, in addition to the British, the Australians, Canadians, New Zealanders and also Americans are now coming in even greater numbers, as well as people from many other nations fascinated and intrigued by meeting the last great eyewitness left of the Great War: the landscape. Modern Ieper is a world forged and shaped in the furnace of a conflict that ended one hundred years ago this November.

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171381 https://historynewsnetwork.org/article/171381 0
What FDR Can Teach Us About Congress and National Emergencies

Franklin D. Roosevelt, Sam Rayburn, and Alben Barkley

 

Seventy-five years ago this week, there was a serious conflict between President Franklin Roosevelt and Congress. The United States was at war, indisputably a national emergency. Today we face a serious conflict between President Donald Trump and Congress. President Trump has declared a national emergency in order to spend monies appropriated by Congress for other purposes on a wall between the United States and Mexico. Only Trump’s supporters, a minority of the country, see an emergency. If Trump is not stopped, we will have taken a serious step toward authoritarian government. We may draw some lessons from the conflict between Roosevelt and Congress in 1944 that are helpful today.

As a follow-up to his call for an Economic Bill of Rights in his January 11, 1944, State of the Union address, Roosevelt had proposed to raise $10.5 billion for the prosecution of the war and domestic needs. The resulting Revenue Act raised only $2.1 billion and included tax cuts and new benefits for bondholders and the airline, lumber, and natural gas industries. On February 22, 1944, Roosevelt issued a veto message, charging that the measure enacted by Congress was “not a tax bill but a tax relief bill providing relief not for the needy but for the greedy.” Although Roosevelt was right in his criticism, the reaction on Capitol Hill was outrage. 

The next morning, Senate Majority Leader Alben Barkley of Kentucky, hitherto a close supporter of the president, charged that the president had made a “calculated and deliberate assault upon the legislative integrity of every Member of Congress.” Barkley announced that he would resign as majority leader because he needed to maintain “the approval of my own conscience and my own self-respect.” He concluded his speech to a packed Senate with a call for action: “if the Congress of the United States has any self-respect yet left, it will override the veto of the President and enact this tax bill into law, his objections to the contrary notwithstanding.” The Senate gave Barkley a standing ovation. The House overrode the president’s veto later that day. The Senate did so the following day, February 24, 1944. 

Instead of persisting in his criticism of the Congress, Roosevelt sent Barkley a conciliatory message, hoping that he would not resign and that “your colleagues will not accept your resignation; but if they do, I sincerely hope that they will immediately and unanimously re-elect you.” Shortly before the Senate override of the veto, Barkley resigned and was unanimously reelected majority leader. 

An immensely popular president who had been elected three times and would be reelected to a fourth term in less than nine months’ time, Roosevelt nevertheless knew he needed to work with Congress and respect its authority. Although the media at the time characterized Barkley’s resignation and the Congressional override of Roosevelt’s veto as a crisis, the rift was quickly healed and Roosevelt and the Congress continued to work together on the war emergency. The controversy, it’s true, meant that Roosevelt no longer considered Barkley a potential running mate in 1944. That was a sacrifice Barkley was willing to make. He later served as vice president under Harry Truman.

What lessons may we draw today from this controversy? To maintain its role as the holder of “All legislative Powers” of our constitutional government, the Congress should vote to cancel Trump’s emergency declaration, as provided by the National Emergencies Act of 1976, and then vote to override his likely veto. Senate Majority Leader Mitch McConnell should follow the example of fellow Kentuckian Alben Barkley and support such legislation and an override of a presidential veto. Because the stakes are the survival of representative government, a grass-roots movement should make its voice heard to stop the imposition of authoritarian government.

 

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171377 https://historynewsnetwork.org/article/171377 0
Roundup Top 10!  

Michael Cohen revealed Trump doesn’t understand America’s racist past

by Elizabeth A. Herbin-Triant

Ghettos exist because of housing discrimination.

 

I wrote about the waning popularity of history at universities. Historians weren’t happy.

by Max Boot

Americans are in vital need of the instruction that historians can provide. Instead of responding defensively to criticism, historians would be better advised to think about what all of us — I include myself — can do to counter the abysmal ignorance that has made so many people susceptible to a demagogue like Donald Trump.

 

 

Black women led the charge against R. Kelly. They’re part of a long tradition.

by Danielle McGuire

Why has it taken more than 20 years and testimony by about 50 accusers to get to this moment?

 

 

Revisiting The American Nazi Supporters of "A Night at the Garden"

by Margaret Talbot

One advantage to living through Trumpism is that it has compelled a reckoning with aspects of our country’s past that, for a long time, many Americans preferred not to acknowledge.

 

 

What journalists miss when they ignore history

by Kathryn Palmer

Media historian Earnest Perry explains why journalists should put more history in the headlines.

 

 

The Academy Is Unstable and Degrading. Historians Should Take Over the Government, Instead.

by Daniel Bessner

Were Mills and Chomsky correct to assume that radical intellectuals could have little effect on U.S. policy?

 

 

Beyond Slavery and the Civil Rights Movement: Teachers Should Be Integrating Black History Into Their Lessons

by Melinda D. Anderson

Much of what students learn about black people’s distinct American story is hit-or-miss.

 

 

2020 Will See a Monumental Clash Over America’s Place in the World

by Stephen Wertheim

Is it time for the U.S. to confront other great powers — or to retreat?

 

 

Stop calling Trump “medieval.” It’s an insult to the Middle Ages.

by Eric Weiskott

It’s not only ahistorical. It obscures uniquely modern evils.

 

 

Obama Makes It Harder to See the Arc of History Bend

by John Gans

My old boss’ post-presidential center is a missed opportunity.


 

The important way the 2008 crisis was worse than the Great Depression

by Matt O'Brien

The 2008 crisis is still with us to this very day even though it officially ended almost a decade ago.

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171402 https://historynewsnetwork.org/article/171402 0
Levittown, PA and The “Northern Promised Land That Wasn't”

 

Racially incendiary comments and actions by the Trump administration—including the president’s attacks on undocumented immigrants and his demand to build a wall to keep out Latino refugees from Central and South America—have escalated racial and ethnic tension in the United States. The overt and covert racism that helped elect Donald Trump in 2016 is a deep-rooted phenomenon in both the Southern and Northern United States. 

 

In a 1960 essay in Esquire magazine, James Baldwin wrote that when Blacks migrated North during and after World War II, “they do not escape Jim Crow: they merely encounter another, not-less-deadly variety.” Some believed Northern racism, while covert, was worse than racism in the South because “At least there, you haven't got to play any guessing games!” Rosa Parks, who moved from Alabama to Detroit in 1957, described that city as the “northern promised land that wasn't.”

 

Northern racism, while often less violent than that of the South, was endemic. Palisades Amusement Park in Fort Lee, New Jersey barred Blacks from using its swimming pool into the 1950s. In Connecticut, a law permitting local and private control over Long Island Sound beaches kept the beaches racially segregated until the 1970s.

 

In February 1964, civil rights activists in New York City led a one-day school boycott to highlight their demands to desegregate the city's schools and improve the conditions in schools predominantly attended by Black and Puerto Rican students. Almost half a million children stayed home from school. White New Yorkers opposed the boycott. In a September 1964 New York Times poll, a majority of White New Yorkers “believed the Negro civil rights movement had gone too far.”

 

In Levittown, Pennsylvania, Northern racism escalated to the hateful and more overt levels of Southern white behavior towards Blacks. Taking advantage of federally subsidized mortgages for World War II veterans, Levitt & Sons built suburban Levittowns in New York, Pennsylvania, and New Jersey that barred African Americans. Clause 25 of the original lease for the Levitt houses stated that the houses could not “be used or occupied by any person other than members of the Caucasian race.” Although the company was forced to remove the clause after the 1948 Shelley v. Kraemer Supreme Court case, the Levitts continued to use extra-legal means to keep the developments racially segregated. William J. Levitt denied having any anti-Black bias, but claimed, “I have come to know that if we sell one house to a Negro family, then 90 to 95 per cent of our white customers will not buy into the community. That is their attitude, not ours.”

 

In 1957, Levittown’s racial policy finally boiled over into Southern-style racial violence. Levittown, Pennsylvania, located about a half hour northeast of Philadelphia in southern Bucks County, boasted 15,500 ranch-type dwellings. Escalating racial attacks on a Black family that purchased a home in the community were documented in a series of New York Times articles from August 16 through December 10. The articles also reported on efforts by some community members to welcome the new family.

 

On August 15, 1957, Pennsylvania Governor George Leader ordered a State Police detachment to Levittown “to cope with an outbreak of bad feeling over the arrival of the first Negro homeowner.” For three consecutive nights, a mob of 200-350 white “men, women and children” stoned a house in the Dogwood Hollow sub-division purchased by an African American family. The group smashed a window, and at least six of the rioters were arrested. The Levittown protests were led by a group called the Levittown Betterment Association, which identified its main purpose as “restoring of our entire white community.”

 

William Myers Jr., who purchased the Levittown, PA home, was born in York, Pa., and completed a year at the Hampton Institute in Virginia before he was drafted into the military in 1943. After the war Myers returned to Hampton and completed his degree. He planned to earn an engineering degree at the Drexel Institute in Philadelphia while working as a laboratory technician for the C. V. Hill Company in Trenton, New Jersey. His wife, Daisy Myers, was a graduate of Virginia Union College in Richmond and did graduate work at New York University. In an interview with the New York Times, Myers explained that the “tumult” took him by surprise. Before purchasing the house he had talked with several of his neighbors-to-be in Levittown’s Dogwood Hollow section and their reactions were cordial, sometimes even warm. “I expected some trouble, but I never thought it would be so bad.”

 

William and Daisy Myers

 

Despite the attacks on his new home, prior to and after the family moved in, Myers announced that he, his wife Daisy, and their three children would not be deterred by the hostilities. “We are church-going, respectable people. We just want a nice neighborhood in which to raise our family and enjoy life.” Myers, who paid $12,150 for the three-bedroom house, reported he received an offer of $15,000 to resell it, but declined the offer because “we didn’t buy a home in Levittown to make money.”

 

On August 20, the Levittown protests turned violent. “Club-swinging troopers of the state police and local township” dispersed a crowd of approximately 300 people near the Myers house after a police sergeant was knocked unconscious by a thrown rock. According to reports, the crowds now mostly consisted of teenage boys, so the police instituted a 9 P.M. curfew for “youths under 16 not accompanied by a parent.”

 

Racially charged incidents continued into September. On September 6, a five-foot cross was burned on the front lawn of Lewis Wechsler, who lived next door to the Myers. Wechsler had helped the Myers family move into their home and “stood by them when crowds staged protests on the streets last month.” Three weeks later, on September 25, police discovered the letters “K.K.K.” smeared in red paint, eighteen inches high, on the wall of the Wechsler house, and a large anti-integration poster on the Myers lawn depicting a white woman kneeling in fear before a Negro man. William Myers also reported to the police that they had been receiving “obscene phone calls” and anonymous letters that “have the earmarks of the Ku Klux Klan.”

 

On September 24, police placed a round-the-clock guard on the Myers house, with extra night-time patrols in the area, “because a vacant house next to the Myers was being used as a gathering place for residents who objected to their living in the community.” Eldred Williams, a Levittown resident and spokesman for the Levittown Betterment Association, claimed he was the caretaker for the property and “attached an American and a Confederate flag” to the house. The “social club” was eventually shuttered by court order.

 

The incidents continued in October when “anti-Negro leaflets” were placed on car windshields during a local parent-teacher meeting. The leaflets, carrying no identification, depicted Negroes dancing with white girls and carried the legend “Wake Up, America, for Your Country’s Sake.” In December, the Commonwealth of Pennsylvania finally asked the Bucks County Common Pleas Court to grant a permanent injunction against Levittown residents who were continuing to harass the Myers family, and the incidents subsided.

 

The good news from Levittown was that not every white resident and local organization was racist. A local committee opposed to “violence and racial bias” was formed. The Myers family was welcomed by Lewis Wechsler and a group called the Dogwood Hollow Neighbors, who invited William and Daisy Myers to join. The Reverend Ray Harwick, minister of the Levittown Evangelical and Reformed Church, and Rabbi William Fierverker of the Levittown Jewish Center, strongly supported the Myers and rallied their congregants to welcome them. As tension escalated in the Dogwood Hollow section of Levittown, the Bristol Township Board of Commissioners unanimously adopted a declaration pledging protection to all persons and their property “regardless of race, religion, or color.” At the same time, school officials announced that they had hired two African American teachers to work at local schools, although they denied the hiring had anything to do with efforts to integrate the community. 

 

The Myers family also received support from the American Friends Service Committee, representatives of the National Jewish Labor Committee, and regional officials of the United Steel Workers union. Their lawyer, Samuel Snipes, was a Quaker peace activist who had worked for the United Nations in Germany after World War II helping to relocate refugees. In a 1999 interview, Snipes recalled how, while he was standing outside the Myers’ house, several hundred protesters called him a “betrayer of white people” and shouted “thirty pieces of silver.” He believed this taunt was meant to compare his support for the Myers to the actions of Judas Iscariot.

 

American suburbs still often reflect segregation patterns established in the 1940s and 1950s. As of the 2010 census, Levittown, Pennsylvania remained 87.7% White, with 5.1% of residents of Hispanic or Latino ancestry, only 3.6% Black or African American, and 3.8% other. One good sign for the future is that each of the electoral districts that make up Levittown, Pennsylvania voted Democratic in the 2016 presidential election by a margin of at least 10 points.

 

 

 

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171333 https://historynewsnetwork.org/article/171333 0
What Can the Humanities Teach Us About Big Data?

 

Like many Americans, I have a love-hate relationship with technology: I inwardly cringe when my preschooler clamors for screen-time with our iPad instead of storytime with a book. Our municipalities, our government, our insurers, and even the vendors of books are awash with technology as well. At a recent hackathon, the expert from the local transit authority confessed that with logs of accidents, and data about the wealth and race of inhabitants, they have too much data to inform decision-making.  

 

Understanding how the humanities have traditionally approached big problems can inform how experts in data science can model meaningful conclusions with the same skillful concern for answering questions through serious inquiry. Humanists, after all, are experts at probing the largest questions of our species. One example might be mastering what philosophers have said about topics like justice or gender since Aristotle, unpacking the values behind those concepts, and coming to a new understanding of how those ideas are changing in our own day. The traditional role of the humanities is to elevate the ambitions of human beings, asking what it means to be a citizen, an heir to the legacies of learning on many continents, or an individual with the capacity of dissent.

 

Now more than ever, it is important for those who work with big data to train in the questions of the humanities – and for those in the humanities to make clear the relevance of their tools of critical thinking to data scientists. The values of the humanities are the values of treating those questions – and many smaller ones – through skillful scholarship. 

 

The particular skills of humanities scholarship take many forms, but they all agree in emphasizing serious engagement with texts and their contexts. They ask about the nature of the evidence at hand, the values that govern the inquiry, and the many ways of modeling those concepts. These skills, among other things, allow scholars to produce a strong consensus about truth where it is found while simultaneously making room for dissent about issues of interpretation, identity and meaning. Skillful interpretation of the data allows scholars to agree about the facts (for example, which manuscripts are the authentic production of a particular medieval scribe), while establishing room for dissent about the interpretation of those facts (for example, characterizing the perspective of Biblical literalism versus historical interpretation). 

 

I recently proposed the concept of “Critical Search” as a general model for how humanistic values translate into the world of data.  Critical Search has three major components that mirror how traditional humanists have approached big questions in the past: seeding a query, winnowing, and guided reading. 

 

Traditionally, humanists begin to unpack a category like “justice” by consulting the canons of the past (which is not to say uncritically accepting the values of the past). They “seed” their research by beginning with a review of learned writing on the topic, carefully choosing particular texts whose categories resonate with them, just as a gardener carefully selects the seeds to plant in the ground.  

 

Modeling this process has a lot to offer studies in big data. Much like the gardener, critical thinkers need to carefully choose their keywords, categories, and sets of documents to “seed” research in the field of big data.  When working with data, defining “justice” or even “gender” requires being clear about which definition one uses.  The choices need to be made explicit and self-reflexive because they have strong effects downstream.  They need to be documented in order to make the query replicable.  
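To make the idea of an explicit, replicable seed concrete, here is a minimal sketch in Python of what such documentation might look like. Everything in it is hypothetical: the field names, the corpus, and the example values are illustrative assumptions rather than a record of any actual project.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SeedQuery:
    """A self-documenting 'seed' for a critical search.

    Every field records an explicit choice so the query can be
    criticized, revised, and replicated later.
    """
    concept: str                 # the humanistic category under study, e.g. "justice"
    keywords: List[str]          # the researcher's chosen seed terms
    corpus: str                  # which document collection is being searched
    date_range: Tuple[int, int]  # temporal scope of the inquiry
    algorithm: str               # the distillation method to be applied
    rationale: str               # why these seeds were chosen

# A purely hypothetical example; none of these values come from a real study.
seed = SeedQuery(
    concept="justice",
    keywords=["justice", "equity", "fairness", "rights"],
    corpus="nineteenth-century parliamentary debates (hypothetical)",
    date_range=(1830, 1900),
    algorithm="keyword frequency, to be compared against topic modeling",
    rationale="Terms drawn from a preliminary reading of canonical texts.",
)
print(seed)
```

Recording the rationale alongside the query itself is one way of keeping the choices explicit and self-reflexive, as the paragraph above recommends.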

 

In the era of big data, much of the work of “seeding” is done through the choice of algorithm – whether machine learning, divergence measures, or topic modeling, for example, is used to distill the findings of the data. From the humanities perspective, it isn’t enough to simply perform a search based on an algorithm; the algorithm itself has biases, which will redound through the search process.  Only by comparing the insights produced by different algorithms do we get insight into how a particular tool biases the result.

 

A second step in the model, “winnowing,” explains the work typically done by scholars as they read widely, gaining information about context, and following the insights of pattern recognition, discourse, or critical theory to foreground particular test cases. This step is usually interpretive, which means that there is no objectively “right” answer about the “best” theory, but that scholarship progresses by scholars engaging with each other’s insights.  

 

In the case of big data, “winnowing” means a researcher reviews the results of any particular algorithm to ask how the data and algorithm fit her question.  It might mean, for example, discussing how the same algorithm produces different answers at different scales, or how using a different measurement produces different results.  For example, in one digital history experiment, three different commonly-accepted equations for divergence produced three radically different answers from the data.  Comparing the results of different algorithms means foregrounding the bias inherent in a particular algorithm, equation, or choice of scale.  
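As a hedged illustration of how much the choice of measure matters, the short Python sketch below computes two standard divergence measures (Kullback-Leibler in both directions, and Jensen-Shannon) over two invented word-frequency distributions. The distributions and the resulting numbers are toy values for illustration only; they are not the data from the experiment mentioned above.

```python
import math

# Two invented word-frequency distributions over a tiny shared vocabulary,
# standing in for two slices of a hypothetical corpus.
p = {"justice": 0.6, "reform": 0.3, "empire": 0.1}
q = {"justice": 0.2, "reform": 0.5, "empire": 0.3}

def kl(a, b):
    """Kullback-Leibler divergence KL(a || b); note that it is asymmetric."""
    return sum(a[w] * math.log(a[w] / b[w]) for w in a)

def js(a, b):
    """Jensen-Shannon divergence: symmetric, built from KL against the mixture."""
    m = {w: (a[w] + b[w]) / 2 for w in a}
    return 0.5 * kl(a, m) + 0.5 * kl(b, m)

print("KL(p||q) =", round(kl(p, q), 4))
print("KL(q||p) =", round(kl(q, p), 4))
print("JS(p, q) =", round(js(p, q), 4))
# The three numbers differ, so a claim such as "these two slices diverge
# sharply" depends on which measure was chosen -- a choice worth documenting.
```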

 

In data science work, as in problems traditionally addressed by the humanities, the right answer affords room for debate and interpretation. The point is that engineers, even when working with big data, take care to transparently document the choice of a particular algorithm and the ways that it can be seen to bias the results. Iterative seeding and winnowing provides a safety barrier against naïvely embracing the results of a computational algorithm. At present, it is unclear how dependable most of our best tools for modeling text are, and where careful limits need to be provided. For instance, computer scientists who deal with topic models have themselves called for more studies of whether, why, and how the topic model aligns with insights gained in traditional approaches. Eric Baumer and his colleagues have warned that there is "little reason to expect that the word distributions in topic models would align in any meaningful way with human interpretations." Iterative winnowing and reading offer insurance against embracing foolhardy conclusions from digital processes. A truly critical search requires human supervision wherever the fit between algorithms and humanistic questions is unclear.

 

The next step in the process is “guided reading,” which mirrors how a gardener picks over moldy and damaged fruit for those good for eating and those good for pie.  Presented with an archive, traditional scholars in the humanities actively choose passages for study. 

 

Digital scholars too must reckon with the choice of which findings to present. At this stage in the process, the scholar carefully inspects the results returned by a search process, sometimes sampling them, sometimes generalizing about them (for instance by counting keywords again or topic modeling), before iterating the process again. Ensuring that there is a human step of inspecting the data – or “guided reading” – is important to making sure that the research process is producing meaningful findings. The process of continuously "checking" the work of the computer allows the expert to judge better whether and how the resulting subcorpus fits the scholarly questions at hand. Sampling the results in a structured, regular process allows the scholar to assess the results of a search confidently. 

 

Critical search in itself attunes the scholar's sensitivity to the bias and perspectival nature of particular algorithms. In many cases, however, one pass through the algorithms is not enough. Keyword search, topic models, and divergence measures may all be used to narrow a corpus down to a smaller body of texts, for example identifying a particular decade of interest. In order to precisely "tune" the algorithms to the researcher's question, successive rounds of the critical search process may be necessary.
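The following minimal Python sketch shows the shape of that iterative loop (seed, winnow, guided reading, revise) on a toy in-memory corpus. The corpus, scoring rule, threshold, and sample size are all invented stand-ins; a real project would substitute its own algorithms, such as topic models or divergence measures, at each step.

```python
import random

# A toy corpus standing in for a real archive (documents are plain strings).
corpus = {
    "doc_a": "justice and equity in the reform debates",
    "doc_b": "harvest prices and grain tariffs",
    "doc_c": "petitions demanding fair trials and rights",
    "doc_d": "railway timetables and freight schedules",
}

def seed_score(text, keywords):
    """Seeding: a deliberately crude relevance score (raw keyword hits).
    A real project would swap in, and compare, richer algorithms here."""
    return sum(text.count(k) for k in keywords)

def winnow(docs, keywords, threshold):
    """Winnowing: keep only documents the current algorithm ranks as relevant."""
    return {name: text for name, text in docs.items()
            if seed_score(text, keywords) >= threshold}

def guided_reading_sample(subcorpus, n, rng):
    """Guided reading: draw a structured sample for human inspection."""
    names = sorted(subcorpus)
    return rng.sample(names, min(n, len(names)))

keywords = ["justice", "fair", "rights"]   # hypothetical seed terms
rng = random.Random(0)

for round_number in (1, 2):                # successive rounds of critical search
    subcorpus = winnow(corpus, keywords, threshold=1)
    to_read = guided_reading_sample(subcorpus, n=2, rng=rng)
    print(f"round {round_number}: kept {sorted(subcorpus)}, reading {to_read}")
    # After reading, the researcher revises the seed terms and documents why;
    # appending a term here stands in for that human revision.
    keywords = keywords + ["equity"]
```

In a real workflow, each pass would also log which algorithm and cut-off produced the subcorpus, in keeping with the transparency the author calls for.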

 

Critical search means adapting algorithms to the research agendas we already have—feminist, subaltern, environmental, diplomatic, and so on—and searching out those tools and parameters that will enhance our prosthetic sensitivity to the multiple dimensions of the archive. Documenting the choice of seed, algorithm, cut-offs, and iteration can go a long way towards a disciplinary practice of transparency about how we understand the canon, how we develop a sensitivity to new research agendas, and how we as a field pursue the refinement of our understanding of the past.

 

By emulating the humanities and embracing the skills of critical thought, individuals who engage with the critical search process can make visible and transparent their choices about how they dealt with the data they were presented.  Like traditional humanists, they will compare and combine insights from secondary sources and canonical texts as they decide which categories will be extracted and what those categories mean.  In explaining any given approach to data, they will fully document the choices they made around different algorithms and their results, thus helping the community as a whole to make room for consensus about facts where they exist and dissent around different interpretive approaches. 

 

Editor's note: The title was corrected on February 27th at 10:30 A.M. 

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171186 https://historynewsnetwork.org/article/171186 0
Teaching the Fall of the Roman Republic During the Trump Presidency

 

The summer of 2016 changed the way I teach Roman history.  That summer saw Mitch McConnell block the nomination of Merrick Garland to the Supreme Court and Donald Trump throw protesters out of political rallies. As these violations of American political norms accumulated, I began to get questions from students and relatives who were alarmed by the striking similarity of Trump and McConnell’s behavior to actions taken by Roman politicians during the last years of the Roman Republic.

 

This is not the first time that students have turned to Roman history for tools that might help them understand the present.  During the Iraq war, students and journalists alike looked to the later Roman Empire to diagnose the condition of American global hegemony. It was the Empire that mattered then as students worked to understand the contemporary relevance of moments of Roman imperial overreach like the emperor Trajan’s disastrous invasion of Mesopotamia or Maximinus Thrax’ failed attempt to conquer Germany. I had become used to talking about modern analogies to Roman imperial history. But it was the Roman Republic that students and relatives evoked in the summer of 2016.  At first, it seemed easy to dismiss their concerns. I assured my friends and relatives that American voters had learned from the past. Our fellow citizens could not possibly reward such cynical and menacing behavior. Of course, I was wrong. We had not learned enough to know better.

 

As I stood in front of a room filled with almost 100 college undergraduates to give my first lecture after Donald Trump’s travel ban was issued in early 2017, I realized how lessons from the Roman past could be newly meaningful to my students. I scanned that room and I saw students struggling to understand the abrupt transition from the administration of Barack Obama to the regime of Donald Trump. They did not want me to give them answers and they certainly did not want me to preach ideology to them. They wanted me to offer tools that could help them understand what this new world meant and how they might navigate it on their own terms. I decided to discard my planned lecture that day and instead spoke about Roman efforts to limit immigration in the 120s BCE. This prompted a wide-ranging discussion of how a representative democracy should balance the obligations it has to its citizens with the fair treatment of the non-citizens who live within its boundaries. The discussion proved so fruitful that I decided to redesign my class to focus on issues that Rome confronted 2000 years ago and that the US now struggles to address. 

 

 

I believe that this is now the main job of a Roman historian. The US republic is a descendant of the Roman Republic, patterned on Rome by our founders. When our country began, Rome offered history’s most successful republic. Roman historians have unique insights into how these two related republics successfully confronted social and political problems, as well as the particular dangers they might face. My class now focuses on these issues. We discuss the rise of finance in Rome. We examine how Roman economic inequality grew in the mid-second century BCE in ways that resemble what has unfolded in the US over the last 30 years. We recount the sort of corruption inequality fostered in Roman political life. We study Roman crackdowns on illegal immigration in the second and first centuries and how tensions about who deserved Roman citizenship eventually erupted into violence that nearly destroyed the state. We analyze the political dangers posed by legislative obstruction and how this obstruction in Rome fostered the rise of Julius Caesar and put the Republic on the path to becoming an empire. And, most of all, we talk about how menacing rhetoric, political violence, and ultimately civil war destroyed a Roman Republic that for centuries had been dominated by compromise and collegiality. 

 

These examples are not abstract. When Donald Trump calls on his supporters to throw protestors out of a rally, I am reminded of the way that the Roman populist politician Tiberius Gracchus used threats of violence as a political tactic. When Mitch McConnell uses parliamentary tactics to block votes on popular and necessary legislation, I am reminded of how the gridlock created by the Roman senator Cato led to the rise of Caesar. And when Democratic politicians call for retribution against Trump and his associates, I think of the cycles of Roman political dysfunction that primed Rome for autocracy by fostering popular cynicism about the responsiveness of Roman representative democracy in the first century BCE.

 

At every turn I emphasize to my students that the US is no more Rome than I am my grandfather—but, as citizens of the Roman Republic’s constitutional descendant, we ignore Rome’s failings at our peril. While Rome cannot tell us our future, it gives us important ways to think about the possible consequences of the disorienting political events happening around us.

 

Until January of 2017, I never imagined that the lessons from the fall of Rome’s republic would resonate so strongly in the America in which I live. It alarmed me when they did. But now, 2 years later, I take heart. My students are trying to understand Rome’s mistakes and discover sources of its political resilience. The Roman past is informing their efforts to shape the American future. With luck, they may use Roman lessons to help stabilize and even improve our Republic. 

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171189 https://historynewsnetwork.org/article/171189 0
UPDATED: What Historians are Saying: In Response to Max Boot's Op Ed on Historians

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171293 https://historynewsnetwork.org/article/171293 0
Why Preserving Historical Places and Sites Matters

 

Why do old places matter to people?  Why should old places matter to historians, or to the general public that historians serve? What can we learn from the continued existence of old places in our communities, and in our nation?  Why does it matter if we save these old places or if we don’t?

 

There are many reasons old places matter, from memory, to civic identity, to history, to architecture, to beauty, to economics. While even the fourteen reasons I name in Why Old Places Matter don’t fully capture all the many meanings old places have for people, for the readers of History News Network, I’d like to emphasize one main idea: old places give us an understanding of history that no other documents or evidence possibly can.  

 

At Civil War battlefields like Antietam, historians and visitors alike can understand how a slight rise in the lay of the land could mean victory or defeat, and how one division was lost, while another survived.  At artists’ homes and studios like Chesterwood, the home of Daniel Chester French, who sculpted the Seated Lincoln, we can understand how a certain quality of light, or a clear mountain view, or the ticking of a clock, may have inspired a painting, poem, or sculpture – and may inspire visitors today. 

 

At the Lower East Side Tenement Museum, we can understand something profoundly visceral about the cramped, dark, and crowded lives of immigrants in New York in the late 19th and early 20th centuries.  

 

And at dirt-floored, often roughly-built slave dwellings, we can try to glean an inkling of the reality of human bondage that we cannot understand from documents alone.  We experience old places with all of our senses, like full body immersion, and because of that, we understand different aspects of history as it was lived.

 

This would be enough.  But I believe that these old places play a larger role.  The continued existence of these old places may foster a deeper understanding of history that tells a more full and true story. 

 

 

Yes, these places can be manipulated to spin a particular viewpoint, like the way, for many years, the reality of slavery wasn’t acknowledged at plantation houses, or Native American perspectives weren’t expressed at frontier forts, or the way countless workers were left out of the story altogether.  One reason people weren’t acknowledged is that their places were not often recognized, valued, and retained.  These are the places that were easy to erase – to pave over with interstates, sports stadiums, and urban renewal.  Many have literally been erased from our landscape and our memory.  

 

It’s easier to pretend that slavery was benevolent if the reality of the poor living conditions of slave dwellings isn’t confronting visitors. Or that labor unrest didn’t happen if the places where it happened are bulldozed. Erasure of places can serve to hide truths that can’t be hidden if the place survives. The recognition of sites by the National Trust’s African American Cultural Heritage Action Fund functions as an act of social justice. As a descendant of the Chinese American builders of an 1850s Taoist temple in Mendocino, California, said to me, the fact that the place exists – a Taoist temple from the 1850s – announces to everyone that “we were here.”

 

If the place survives, it can also become the vortex and venue for understanding our changing civic and national identity. The places we choose to save – or not – reflect our identity. That’s why we see places that are important to the “enemy” being targeted in times of conflict, such as the Mostar Bridge. The destruction of the old place is tantamount to the destruction of the group identity. Old places may also be targeted precisely because they tell a deeper, older, and different story, such as the Bamiyan Buddhas, which were destroyed because they represented a different religion, or the archaeological sites of Babylon or Palmyra. 

 

I don’t want to suggest that we can understand everything about history simply by experiencing the old places where history happened.  In fact, I’d like to emphasize a completely different point.  These old places matter not only for what they can tell us, but precisely because they raise questions.  There are often things about an old building, or a battlefield, or a working landscape that will surprise or puzzle us.  It may only be a quirky door, or the etching of initials on glass, or an unexpected rise in an otherwise flat field, or an unusual place name.  

 

An old place continues to carry memories of other stories that we don’t necessarily understand today, like the way the bones of our ancestors continue to surface in our cities and towns where we thought there were no people buried, or the way a Hebrew letter on an ancient column reminds us that the Jews of Rome were not always forced to live in the ghetto.  

 

These puzzles upend what we thought we knew and help us remember that we can never know everything about the past.  These quirks at old places jab us to be less arrogant and remind us to be humble and open as we try to understand the past and what it means for us today.   

 

Old places matter because they give us a deeper understanding of the past – an understanding no other documents possibly can, while reminding us to be humble about what we know.  

 

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171192 https://historynewsnetwork.org/article/171192 0
One Of The Greatest Cover-Ups Of The 20th Century

 

How frequently do we learn that our good friend-- the one we think we know so well-- we never knew at all? No surprise that almost all humans harbor secrets, but what if those secrets, once disclosed, are darker than midnight?

In Full Flight is a story of two lives in one body. They belonged to Dr. Anne Spoerry, a good friend of mine who kept her two selves separated by a vertiginous wall of silence. At one level, In Full Flight pays tribute to Africa for its gift of secrecy—faithfully providing cover for those on the run. At another, it illuminates the burden a person of substance must shoulder even in the far reaches of a wild land. The last fifty years of Anne’s life, all in Kenya, were distinguished by her drive to help the rural poor-- a source of considerable pride to her, her adopted country and an ever-widening circle of admirers from around the world. It also illuminates the personal cost of such secrecy. 

What about her European past? I was told never to ask. So I became one of the intemperate ones who did repeatedly, only to suffer the consequences—thunderclaps of temper that reduced me to quivering jello. 

I knew Anne exclusively in Africa during the last 20 years of her life, at a time when many Kenyans had taken to calling her “saintly” for her work as a médecin volant, a flying doctor. Since I was only able to judge her through my Africa lens, I too focused on her self-sacrifice and bravery. 

Anne and I had met because of a magazine assignment. Later my profile of her grew into a chapter in one of my books. In time, our professional association mutated into a wonderful friendship—not unusual for Africa. During the many dinners Anne and I shared in Nairobi there was no disguising my admiration. Her tales of derring-do were mesmerizing. Eight years before her death, I filmed her for a PBS film. In whatever medium I chose, I reported only on triumphs—those perilous cross-country flights to East Africa’s distant reaches-- all to better the health of forgotten peoples. I wrote that, over her lifetime, the number of patients Dr. Spoerry treated and saved was simply incalculable.

When Anne died in 1999, age 80, her admirers were so numerous her family organized four memorial services to accommodate every sector of the population.  At each, the air virtually trembled with tributes. So moved was I by this outpouring of admiration for Anne I determined to write a biography, focusing on her singular approach to health care. 

A year after Anne’s death, while honeymooning with my wife on the island of Lamu, I came upon Anne’s nephew, Bernard Spoerry, and told him of my plans. “But why,” I asked offhandedly, “was she so secretive about the war?” Bernard seemed startled. All he knew was that she had bravely served in the Résistance, had been captured by the Germans and interned in a concentration camp. As for details, he had no ready answer—only that his aunt’s time in Ravensbrück had always been a well-guarded family secret. Bernard’s father, Anne’s beloved brother, knew all, but, alas, he had taken those secrets to his grave. Perhaps, Bernard suggested, the contents of Anne’s farmhouse safe would shed light on my question. A week later I found the file, hidden away in that up-country safe for at least 50 years. On it Anne had written: “Do not open.”  

I opened it. Topmost were three attached pages, dated 1947 and headed: “CROWCASS: Central Registry of War Criminals And Security Suspects.” I read down the list by nationality. Under the heading “Switzerland,” there was one entry: “Anne Marie Spoerry… Crimes against humanity including torture.” 

After I overcame the shock, the focus of my research swiveled from the broad savannahs of Africa to the musty recesses of European libraries. I began in Basle and Zurich, and then I turned to Kew’s wartime archives. In time I visited Ravensbrück and afterwards enjoyed a correspondence with its head librarian. Finally, after nine years imploring le commandant des archives to allow me to read the Anne Spoerry files, I received a response with a begrudging “yes.” I was directed to a military installation outside the medieval town of Le Blanc in central France. There I was allowed three hours, under the laser glare of an armed guard, to pore over a beribboned collection of letters, affidavits and testimony. These, I found, contained essential intelligence that would allow me to complete my book.

While I was confident detailing Anne’s Africa story, my knowledge of World War II was, at best, a series of generalities, most related to my father’s landing at Omaha Beach in the US VII Corps. I now needed to catch up, fully aware there were over 16,000 books dedicated to the subject of the Holocaust and concentration camps— most written by scholars and eyewitnesses. Being neither, I embarked on a learning blitz. Devouring these books was just the beginning. Critical were Anne’s own files. So too was spending a day in Ravensbrück and pacing out Block 10 (exclusive to tuberculosis victims and “idiots”) where Anne exerted power in the final months of 1944. My hours spent in Ravensbrück chill me even now. My final step was to interview three women who had known Anne in Ravensbrück—Violette LeCoq, Odette Allaire Walling and Dr. Louise LePorz. These remarkable survivors made me understand the true enormity of Nazi evil and Anne’s complicity in it. So taken was I by Dr. LePorz’s 20/20 recall and her dogged commitment to the truth that I interviewed her four times. 

From these many sources I learned that, beginning in September and ending on New Year’s Day 1945, Anne fell under the spell of fellow prisoner, double agent, provocateur and seductress Carmen Mory. Mory was Block 10’s “block elder,” a prisoner the Germans designated “privileged” in return for her help informing against her fellow prisoners. Having helped Mory recover from pleurisy in August 1944, medical student Anne became her willing amanuensis. Intoxicated by the older woman’s wiles, Anne crossed the line from dutiful prisoner to exterminator. Together with Mory, Anne helped conduct a reign of terror, brutalizing the insane, making selections for the gas chamber, and, in a number of cases, administering lethal injections— all to curry favor with SS guards and... Carmen Mory. In a statement, buried in court testimony from the Hamburg War Crimes Trials in 1947, Anne claimed she had been “bewitched.”

What makes a 26-year-old member of “a good family” turn from tender humanity to brutal malice? Doesn’t this lie at the heart of life’s great riddles? The question may shadow our lives today, with its association to cover-ups and shameless falsehoods.  After the war, Anne was tried in three separate jurisdictions. Bolstered by family wealth, she escaped punishment. In the only tribunal where she was not represented by counsel she confessed that she had been spellbound by Carmen Mory. To make amends she vowed to spend the rest of her life “caring for lepers.”  Her jurors forbade her ever to practice medicine in France or its dependencies. 

Through family connections, Anne fled first to Aden and then to Kenya where, for two years, she practiced medicine without a medical diploma. Later, when she learned to fly, her medical influence expanded exponentially. The further she flew, the more famous she became—the world’s “first woman flying doctor,” celebrated by patients as “the angel from heaven.” Fame was an inspired strategy. With its trappings came the additional privilege of an apology-free way with words. Few dared risk the famous doctor’s ire with reckless questions about her past. 

Thanks to Africa, Anne was given a second chance, and then a third. Even now, her cover-up and renascence borders on the miraculous. 

While few of us harbor secrets as dark as Anne Spoerry’s, many of us cannot fail to admire her decades of private atonement, serving as her own judge, jury and executioner, even though her crimes were unspeakable.

In Full Flight raises questions that, for me, will linger forever. The first of many is “what would I have done?”

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171334 https://historynewsnetwork.org/article/171334 0
What Can One Ecologist Teach Historians?

 

On the December 12, 2018 episode of Jeopardy, Alex Trebek gave Faris Alikhan, a speechwriter from Atlanta, Georgia, a clue: “Subtitled ‘The Plant that Changed the World’ in John Gaudet’s study, this species was first widely used in Ancient Egypt.” What was the answer?

 

Faris registered a blank even though he had $16,000 riding on it, and so the host moved on to the second contestant, Elana Schor, a journalist from Washington, D.C., who also passed. Mary Kate Moriarty, a dentist from Dumfries, Virginia, volunteered an answer: “What is wheat?” 

 

This was not a bad response, considering how important wheat was and still is in Egypt, but it is wrong in several ways, the least of them being that wheat is a collective name for several species. Alex specifically asked for only one. 

 

The word they were looking for was “papyrus.” Once I got over the surprise of seeing my name on my TV screen, I looked at the three contestants and couldn’t help noticing that none of them knew the answer. This confirmed my suspicion that, much like the contestants on Jeopardy, many historians might still be confused about papyrus and ecological history.

 

A recent review of my latest book (The Pharaoh’s Treasure: The Origin of Paper and the Rise of Western Civilization, Pegasus, NY) brought this point home to me when the reviewer reminded me that I’m an ecologist and history is by no means my expertise. I realized that at least in this case, I had missed my chance to get across my message. The story told in my book was supposed to be an account of what happened during the early days when papyrus paper was the most common medium used throughout the world. The making of this kind of paper, and the books and documents that came from it, represented to me one of the most astonishing and exciting stories in the history of the world. 

 

Ancient Egyptians used papyrus to build millions of reed boats, huts, houses, mats, and fences, and to make millions of miles of rope and millions of sheets and rolls of paper. Much of the rope and paper was exported to earn foreign exchange (hence the name of the book, The Pharaoh’s Treasure). It also provided millions of tons of locally dried stems used as fuel for the bath-fire stokers in Roman times in towns like Alexandria.  

 

Despite the importance of papyrus, as far as I could see its history had never been told in its entirety until I wrote the book. The reviewer wasn’t concerned about that, and my excuses that the book was written as narrative non-fiction and was designed for a general reading public didn’t cut it. According to him there was no substitute for history plain, simple and accurate to a fault.  

 

I still wondered if there wasn’t something in the book worth his while. Then I thought of one item that should have impressed him, a quote from the 1911 edition of the Encyclopedia Britannica. It was an item written by “E. M. T.”, the abbreviation used by Sir Edward Maunde Thompson, the first director of the British Museum and an historian of the first degree.

 

He wrote, “it seems hardly credible that the Cyperus papyrus could have sufficed for the many uses to which it is said to have been applied and we may conclude that several plants of the genus Cyperus were comprehended . . .” 

 

As an ecologist, and particularly one specializing in the ecology of tropical African swamps, I immediately knew that Sir Edward was very far off base. E.M.T. was writing after papyrus had disappeared from Egypt (after 1000 A.D.) and well before the modern ecological studies of the 20th and 21st centuries. These modern studies show that both ancient and modern papyrus plants have an enormous rate of production. We now know that in the ancient world no other species was needed, because papyrus was more than capable of providing for the needs of the people.  

 

Understanding ecology can help historians explain ancient history and human behavior. For example, ecological factors drove people’s choice of early writing materials. The Sumerians lived in the ancient floodplains of the Tigris and Euphrates on the eastern arm of the Fertile Crescent. Their environment was dominated by reeds, a situation very similar to the wetland environment of ancient Egypt on the opposite, western arm of the same Crescent. But the Sumerians used clay tablets to write on while the Egyptians used paper. Why? Because the dominant reed growing in the wetlands of Mesopotamia was a grass reed, bardi, that was useless as a source of paper. 

 

In the modern world we could convert this reed into paper by the pulping process, but in ancient times this was a nonstarter. At maturity, the grass reed of the Sumerians was too stiff to be worked into a great number of handicrafts or to be made into paper or rope. So the people of Egypt were lucky in that they had a natural resource, the papyrus sedge, a flexible reed of infinite possibilities, while the Mesopotamians had to make do with a clumsy substitute for paper.  

 

Both wetland resources eventually disappeared: papyrus vanished from the Delta before the Crusades, while the grass reed marshes were recently drained by Saddam Hussein. Though neither region suffers today from a shortage of paper (they import what is needed), the loss of wetlands that acted as natural filters has resulted in a serious decrease in water quality in both places.

 

Ecology is also important in explaining the history of climate change. In my book, I discuss changes in the aquatic fauna and flora in ancient times that reflect what is happening in today’s global climate change. During the Holocene wet phase in Northern Africa, a period sometimes referred to as the “Aqualithic,” wetlands, papyrus swamps and large lakes flourished. It was after this that papyrus paper appeared. Eventually, the wet phase ended and was followed by the drying that resulted in the present north African desert. The whole process reflects the changes affecting the large modern-day papyrus swamps that are threatened along the Nile river basin in South Sudan.

 

Egypt was the only one among the many early hydraulic civilizations blessed by the papyrus plant.  This is another fact that seems to have escaped modern historians, and there is precious little in the general or scientific literature to indicate how or why this plant, almost from the beginning of recorded history, was revered and treasured by common folk and the pharaohs. The Pharaoh’s Treasure provides some answers and indicates some recorded and projected changes in ecology that can help historians improve and sharpen their conclusions in this area.  It can also help substantiate historical changes that might otherwise go unnoticed.

 

 

]]>
Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171181 https://historynewsnetwork.org/article/171181 0
A Dozen Books To Help Weather The Political Storm

 

This is already shaping up to be an extraordinary year in the United States. The country is deeply polarized and it is quite possible that Robert Mueller’s investigation of President Donald Trump could ignite a political and constitutional crisis. The endless barrage of tweets, newspaper headlines, and “Breaking News” bulletins on cable television has an exhausting and disorienting effect. It is becoming increasingly difficult to distinguish the important from the mundane, the consequential from the sensational. 

During a recent talk to a group of retirees in Carbondale, Illinois, I was asked how to navigate this tense and fraught time. I surprised myself when I urged them to turn away from their televisions (particularly the partisan cable offerings) and toward books. Specifically, books that would allow them to see our country’s political traditions in a fuller and more nuanced way and provide context to evaluate future choices. I suggested they read a balanced biography of a leader of the party they affiliate with and an equally balanced biography of a member of the opposing party. 

I’m a journalist, historian, and the director of a public policy research institute that is affiliated with a public university. As a dedicated and dogged reader of non-fiction I propose a dozen political books that can guide us through this gathering political storm. 

Let me begin with several caveats. I am not arguing that the following are “the best,” the “most impactful,” or the “most inspiring” books in American history. Nor am I positing that these twelve books provide a comprehensive and coherent framework to view American political life. They reflect my personal preference for history and biography and do not include works of sociology, psychology, spirituality, or literature, all of which also offer critical perspectives that are relevant for this turbulent time. 

I believe these books tell important stories, introduce us to consequential people from our past, describe our best traditions, and demonstrate that positive change is possible but often only after years of hard work and frequent setbacks. They exemplify the fact that America is a creative, and sometimes chaotic, country that tends to get things right, but often only after perplexing and disappointing detours. 

Twelve books that can help us weather the coming political storm:

 

1. A Brilliant Solution: Inventing the American Constitution by Carol Berkin, 2002.  

Berkin is a professor of American history at the City University of New York and Baruch College. Her book describes how an effort to fix the Articles of Confederation morphed into a negotiation that resulted in a new Constitution. A Brilliant Solution chronicles how chaotic and fiercely contested the drafting of this document was. Nothing was certain when the 1787 Constitutional Convention began in Philadelphia, and failure was a distinct possibility. Deep divisions persisted between those who supported a strong federal government and those who wanted the states to retain substantial powers. The final Constitution was an elegant compromise that emerged from a messy and unpredictable process. Berkin also argues that the subsequent battle for ratification was a hard-fought endeavor that could easily have failed. She makes it clear that not everything in the Constitution has worked out as the founders intended. For example, they were determined to create a government in which the legislative branch was more powerful than the executive. This was once the case but clearly no longer is. The founders feared a powerful executive and worried that a tyrant might one day govern the nation. Thus they took care to create procedures for removing such a person from the presidency. “The founding fathers did not expect their constitution to endure for centuries,” Berkin concludes. “They could not predict the social, economic, or technological changes produced by the generations that followed them. Perhaps their ultimate wisdom, and their ultimate achievement, was their willingness to subject the Constitution they created to amendment. With this gesture—a true leap of faith—they freed future generations from the icy grip of the past.” 

 

2. Eisenhower in War and Peace by Jean Edward Smith, 2012.  

Dwight Eisenhower was a solid but unspectacular West Point graduate from America’s Heartland who grew into a world-class military leader, helped win World War II, and served two terms as the president of the United States. Smith, one of America’s pre-eminent biographers and historians, depicts Eisenhower as a man of decency, force, intelligence, moderation and competence. He argues that except for Franklin Roosevelt, Eisenhower was the most successful president of the 20th century. Smith credits Ike for ending a three-year stalemated war in Korea, resisting calls for preventive war against the Soviet Union and China, deploying the Seventh Fleet to protect Formosa (Taiwan) from invasion, facing down Soviet leader Nikita Khrushchev over Berlin, moving the Republican party from its isolationist past, balancing the federal budget, and building the interstate highway system. He argues that Eisenhower understood the demands of leadership although he often concealed his political acumen. “All of his life Eisenhower managed crises without overreacting. He made every task he undertook look easy. Ike’s military experience taught him that an outward display of casualness inspired confidence, and he took that lesson into the White House,” Smith writes.

 

3. Truman by David McCullough, 1992. 

This Pulitzer Prize-winning biography by one of America’s most popular writers introduced President Harry Truman to a generation of Americans.  Few stories are more remarkable than Truman’s maturation from a mostly obscure senator to a mostly obscure vice president to a magnificent president. Following the death of Franklin Roosevelt, Truman became Commander in Chief in 1945, a critical time in American history. He confronted the sternest challenges imaginable and handled them successfully. Truman built the foundation for the United States and the West to eventually win the Cold War with the Marshall Plan, NATO, and the U.S. national security apparatus which includes the Department of Defense, the National Security Council, and the CIA. Truman was the first president to recommend that Congress take action on civil rights. He also is remembered for desegregating the armed forces. “Ambitious by nature, he was never torn by ambition, never tried to appear as something he was not,” McCullough writes of Truman. “He stood for common sense, common decency. He spoke the common tongue. As much as any president since Lincoln, he brought to the highest office the language and values of the common American people. He held to the old guidelines: work hard, do your best, assume no airs, trust in God, have no fear. Yet he was not and had never been a simple, ordinary man. The homely attributes, the Missouri wit, the warmth of his friendship, the genuineness of Harry Truman, however appealing, were outweighed by the larger qualities that made him a figure of world stature, both a great and good man, and a great American president.” 

 

4. The China Mission: George Marshall’s Unfinished War, 1945-1947 by Daniel Kurtz-Phelan, 2018.

George Marshall was a quiet giant in American history. He served as the Army chief of staff who organized the American victory in World War II and later as Secretary of State and Secretary of Defense under Truman. He also won the Nobel Peace Prize. Truman called him “the greatest military man this country ever produced--or any other country produced.” Time magazine named him “Man of the Year” in January of 1948 and wrote that Americans “trust General Marshall more than they have trusted any military man since George Washington.” The China Mission chronicles Marshall’s impossible quest to broker an agreement between China’s warring communist and nationalist forces. Even in failure, Marshall emerges as honorable, creative, and devoted to duty. Kurtz-Phelan, executive editor of Foreign Affairs, offers a meticulous account of Marshall’s diplomacy as he tried to forge a peace deal between two sides that ultimately did not want an agreement. “It is a story not of possibility and ambition, but of limits and restraint; not of a victory achieved at any cost, but of a kind of failure ultimately accepted as the best of terrible options,” Kurtz-Phelan writes. “Marshall came away with a more limited sense of America’s place in the story. A master of self-control, here he came to terms with what could not be controlled…Yet that did not mean settling into fatalism. Marshall also returned home with a deeper sense of what it would take to succeed in the larger struggle just beginning.” Kurtz-Phelan portrays Marshall as a remarkable man who was respected “not so much for brilliance of insight as quality of judgment.”

 

5. Lincoln by David Herbert Donald, 1995.  

Abraham Lincoln remains the most towering figure in American political life and our archetypal statesman. Donald, a revered Lincoln scholar and biographer, shows Lincoln’s large spirit, clear intelligence, implacable will, and deep humanity. He describes Lincoln’s striking and inspiring capacity for growth, which enabled one of the least experienced and most poorly prepared men ever elected to high office to become America’s greatest president. Donald sees Lincoln as a man of ambition, vision, and tactical shrewdness. “The pilots on our Western rivers steer from point to point as they call it—setting the course of the boat no farther than they can see and that is all I propose to myself in this great problem,” Lincoln once told a lawmaker who asked about the president’s post-Civil War plans for the United States. Donald does not shy away from Lincoln’s flaws, such as his sometimes passive and reactive approach to problems. “I claim not to have controlled events but confess plainly that events have controlled me,” Lincoln once acknowledged. However, Lincoln’s wisdom, decency, vision, and persistence ultimately prevailed. Few nations can claim a leader of Lincoln’s stature as part of their historical inheritance.

  

6. The Walls of Jericho: Lyndon Johnson, Hubert Humphrey, Richard Russell, and the Struggle for Civil Rights by Robert Mann, 1996.  

Mann, a former Senate aide, offers a compelling account of the struggle to enact civil rights legislation, from the bitterly divisive 1948 Democratic Convention, when three dozen Southern delegates walked out over the issue of civil rights, to the passage of historic legislation in the 1960s. Years of stalemate, failure, and small advances preceded the Civil Rights Act of 1964 and the Voting Rights Act of 1965. Mann homes in on three of the dominant players in this drama: Senator Hubert Humphrey, a passionate and relentless advocate for sweeping civil rights legislation; Senator Richard Russell, a fierce, formidable opponent and segregationist; and Lyndon Johnson, the senator and then president who helped secure the critical legislative victories. Mann details Russell’s unrelenting battle to defeat civil rights initiatives but also makes the important point that once civil rights legislation became the law of the land, Russell implored all Americans to respect these laws. “I have no apologies to anyone for the fight I made. I only regret that we did not prevail. But these statutes are now on the books, and it becomes our duty as good citizens to live with them,” Russell said. Mann argues that passing civil rights and voting rights legislation was important, but it was just a first step. “The easy part was over,” he writes. “Congress had finally enacted powerful legislation to guarantee the civil and voting rights of all black Americans. Enforcing those new rights would be difficult, but not as daunting as the task of creating and nurturing an economic and social environment in which black citizens could achieve the American dream of economic independence and prosperity.”

 

7. The Warmth of Other Suns: The Epic Story of America’s Great Migration by Isabel Wilkerson, 2010.  

Wilkerson, a former New York Times reporter and journalism professor, chronicles the historic migration of millions of African-Americans from the South to the Midwest, the Northeast, and the West between 1915 and 1970. “Over the course of six decades, some six million black southerners left the land of their forefathers and fanned across the country for an uncertain existence in nearly every other corner of America,” Wilkerson writes. “The Great Migration would become a turning point in history. It would transform urban America and recast the social and political order of every city it touched. It would force the South to search its soul and finally to lay aside a feudal caste system.” Wilkerson focuses on three people who illuminate this larger drama: Ida Mae Brandon Gladney, the wife of a sharecropper who moved from Mississippi to Chicago in the 1930s; George Swanson Starling, a laborer who left Florida in the 1940s for New York City; and Robert Joseph Pershing Foster, a doctor who departed Louisiana in the early 1950s for Los Angeles. Their stories highlight this critical demographic event in American life and also offer inspiring examples of resilience. The Warmth of Other Suns provides a strong complement to The Walls of Jericho. Wilkerson’s protagonists benefited from civil rights and voting rights legislation, but also endured discrimination and employment challenges. “Over the decades, perhaps the wrong questions have been asked about the Great Migration,” Wilkerson concludes. “Perhaps it is not a question of whether the migrants brought good or ill to the cities they fled to or were pushed or pulled to their destinations, but a question of how they summoned the courage to leave in the first place or how they found the will to press beyond the forces against them and the faith in the country that had rejected them for so long. By their actions, they did not dream the American Dream, they willed it into being by a definition of their own choosing.”

 

8. The March of Folly: From Troy to Vietnam by Barbara W. Tuchman, 1984. 

Tuchman was one of America’s great narrative historians, and in this book she explores why governments throughout history have so often acted in ways that have been harmful to their own interests. She examines four episodes: the Trojan decision to accept a Greek horse into its city, the failure of six Renaissance popes to effectively deal with the Reformation, King George III’s mistakes that fueled the American Revolution, and America’s debacle in Vietnam. Tuchman argues that in all of these cases, leaders were warned against their courses of action, they had feasible alternatives, and critical mistakes were made by groups, not just one misguided person. “A phenomenon noticeable throughout history regardless of place or period is the pursuit by governments of policies contrary to their own interests. Mankind, it seems, makes a poorer performance of government than of almost any other human activity,” Tuchman writes. “Why do holders of high office so often act contrary to the way reason points and enlightened self-interest suggests? Why does intelligent mental process seem so often not to function?” Tuchman does not find clear answers to her questions, but observes that self-deception “is a factor that plays a remarkably large role in government. It consists in assessing a situation in terms of preconceived fixed notions while ignoring or rejecting any contrary signs.” Some critics have challenged Tuchman’s use of four very different historical examples as well as her definition of governmental folly, but she raises profound questions that resonate today. Tuchman’s final chapter, “America Betrays Herself in Vietnam,” is sobering, especially given that the disastrous experience and outcome did not lead to clearer thinking by policymakers when they launched wars in Iraq and Afghanistan.

 

9. Diversifying Diplomacy: My Journey from Roxbury to Dakar by Harriet Lee Elam-Thomas with Jim Robinson, 2017.  

Harriet Lee Elam-Thomas grew up in Boston, studied at Simmons College and the Fletcher School of Law and Diplomacy, and had a rich and consequential career as an American diplomat. She represented the United States in Senegal, Cote D’Ivoire, Mali, Athens, Brussels, and Istanbul. In her inspiring and deeply evocative memoir, Elam-Thomas describes the promise of America and the challenge of being an African-American woman diplomat. “Whenever I encountered colleagues in diplomatic settings, they were usually men. I saw very few women—and even fewer women of color. Wearing a skirt in the Foreign Service was ten times more difficult than having brown skin. Few of my colleagues looked like me. Although I do not profess to have been an effective diplomat because of my race, ethnicity or gender, I believe these elements of my persona paid dividends. Though I thoroughly prepared for each new assignment, I am certain the key to making a contribution toward a credible articulation of U.S. foreign policy was the fact that I had the opportunity to serve and felt included. Without inclusion, all of the lip service to diversity would have been suspect,” she writes. Elam-Thomas argues that the example the United States offers, and the respect she extends, to other countries is a deeply powerful force. American diplomats are most effective when they are culturally sensitive and modest. “The best leaders are sincere and humble,” she writes. “Real leadership has to do with integrity and performance; neither one can take a holiday. They reflect on your character and soul.”

 

10. The Glory and the Dream: A Narrative History of America, 1932-1972 by William Manchester, 1974.  

Manchester, a skilled journalist and historian, chronicles life in the United States from Hoover and the Depression to Nixon and Watergate. This narrative largely focuses on the politics of this era but includes memorable descriptions of American life from the 30s through the 60s. We learn about the books people read, the clothes they wore, the movies they watched, the music they listened to, the trips they took, the celebrities they followed, the companies they worked for, the churches they attended, and the cultural fads that influenced their lives. The Glory and the Dream is a vivid and nostalgic journey through important decades in American history. It transports you back in time, while also raising larger issues about the country. “Change is a constant theme in the American past,” he writes. “The United States is the only nation in the world to worship change for its own sake and to regard change and progress as indistinguishable.” Manchester also detects a periodic “yearning to renounce the present and find restoration in the unconsummated past.”

 

11. These Truths: A History of the United States by Jill Lepore, 2018.  

A professor at Harvard and staff writer for the New Yorker, Lepore tells the story of the United States from Christopher Columbus to Donald Trump. These Truths is packed with broad assessments, fascinating vignettes, compelling sketches, and provocative questions. She believes the American experience can be understood by exploring three ideas, which Thomas Jefferson referred to as “these truths”: political equality, natural rights, and the sovereignty of the people. “The roots of these ideas are as ancient as Aristotle and as old as Genesis and their branches spread as wide as the limbs of an oak,” Lepore writes. “But they are the nation’s founding principles: it was by declaring them that the nation came to be. In the centuries since, these principles have been cherished, decried, and contested, fought for, fought over, fought against.” She believes that it is important for Americans to understand the full sweep of their nation’s history and appreciate the country’s successes, failures, accomplishments, and inconsistencies. “There is, to be sure, a great deal of anguish in American history and more hypocrisy. No nation and no people are relieved of these. But there is also, in the American past, an extraordinary amount of decency and hope, of prosperity and ambition, and much, especially, of invention and beauty…The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it,” writes Lepore.

 

12. Dangerous Games: The Uses and Abuses of History by Margaret MacMillan, 2008.  

A professor of history at the University of Toronto, MacMillan has written popular and highly regarded books on the Versailles Peace Conference of 1919, the British Raj, World War I, and Richard Nixon’s 1972 trip to China. In Dangerous Games, MacMillan argues that history should be read, studied, and savored, but it should be used cautiously when considering public policy. Examining the past is useful and sometimes edifying, she posits, but it does not provide a prescription for navigating the present or predicting the future. Studying history allows you to delve into complex situations, evaluate leaders, and render informed judgments. It encourages you to ask hard questions, study evidence, and probe assumptions. “If the study of history does nothing more than teach us humility, skepticism, and awareness of ourselves, then it has done something useful,” she writes. MacMillan is concerned that some people, either through malice or sloppiness, use history in ways that are harmful. She believes, “History can be helpful; it can also be very dangerous. Sometimes we abuse history, creating one-sided or false histories to justify treating others badly, seizing their land, for example, or killing them. There are also many lessons and much advice offered by history, and it is easy to pick and choose what you want. The past can be used for almost anything you want to do in the present.”

 

In addition to the specific merits of each of these books, the discipline of serious reading helps us slow down, think more carefully, weigh evidence, and respect - and expect - careful argument. I acknowledge that these dozen books will not provide a clear guide to our current challenges. They do, however, offer wonderful stories and introduce us to remarkable people, many of whom were important, not famous. They remind us that America has endured much and accomplished great things, and they reinforce the fact that both parties have honorable traditions and have been led by impressive people. Great successes have often occurred when ordinary people have acted responsibly, fairly, and with an eye to the future. We owe it to them to conduct ourselves honorably in these trying times and with concern for those who will come after us.

Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171188
The Many Joys of Writing History

 

“It was finally time to move the body,” I began my book on Alexander II and his times (text edition here). “The funeral bells were tolling in the churches of St. Petersburg. For nine days the corpse of the dead Emperor, Nicholas I of Russia, had remained within the red walls of the Winter Palace. On some of these days the odor of his decomposing body had been almost unbearable. But it was now Sunday, February 27, 1855, and the winter sun was shining brilliantly.”

Writing and rewriting that paragraph was very pleasurable. Capturing the reader’s interest from the get-go is important, and I thought I had done it well. When writing, cutting, or rearranging words to produce better sentences and paragraphs, I sometimes feel like a sculptor—say a Rodin, only chiseling away not marble but words until more beautiful prose appears.

President Kennedy said he was happy being president because he thought of happiness as Aristotle did—“the full use of one’s faculties along lines of excellence”—and the presidency provided ample opportunity for such use. Similarly, researching and writing history should be joyous because it gives us many opportunities to pursue excellence.

They begin with our choice of subjects. For many historians, their first major writing project might be a thesis or dissertation. Mine was a Ph.D. dissertation, “Vladimir Soloviev and the Russophiles.” I picked the topic because it was interesting and important to me and, I hoped, to others. It was the late 1960s, and the Cold War, civil rights, nationalism and imperialism (e.g., in regard to Vietnam), the ecumenical religious movement, and tolerance all seemed significant. As opposed to the Russophiles, the Russian philosopher Soloviev was a critic of antisemitism and nationalism and a proponent of ecumenism and tolerance.

In subsequent decades, interest and importance have remained major criteria. In various years, the Cold War, Russia, nationalism, imperialism, racism, capitalism, war and peace, our relationship to technology, terrorism, climate change, and the dangers of Trumpism have seemed significant. Thus, these topics have been major subjects I have written about in books, essays, and reviews. My 2008 book, An Age of Progress? Clashing Twentieth-Century Global Forces, for example, begins with a chapter on twentieth-century violence (especially wars and terrorism) and subsequently treats many of the other subjects mentioned above. 

In writing about topics we consider important we might sometimes feel like we are battling for truth or “fighting the good fight.” Psychologist William James once wrote, “What we now need to discover in the social realm is the moral equivalent of war: something heroic that will speak to men as universally as war does, and yet will be as compatible with their spiritual selves as war has proved itself to be incompatible.” As James realized, such heroic endeavors, even if only in our own minds, can bring us great satisfaction.  

Besides the importance of issues, our interest in various topics might be stirred by other considerations. Several of my books evolved out of teaching, primarily courses on Russian history and twentieth century global history. For the latter course, seven editions of a joint-authored text and a book of readings came about because several of us at Eastern Michigan University thought we could create better books than the ones we were using. 

A similar thought process led to A History of Russia (Vol. I and Vol. II). There was a need for a treatment not only of Russian political history, but also of everyday life: the role of women, rural life, law, religion, literature, and art. My Alexander II book also developed out of a course I team-taught called “Russia in the Age of Tolstoy and Dostoevsky.” In addition, the book reflected an experiment I wished to attempt. In interweaving the story of Alexander II with those of the great writers and thinkers of his era, I wanted to write a scholarly work that read like a good novel.

In my retirement years, I have become fascinated by the topic of wisdom. In An Age of Progress? I had quoted General Omar Bradley, “Ours is a world of nuclear giants and ethical infants.  If we continue to develop our technology without wisdom or prudence, our servant may prove to be our executioner.” Wisdom now seemed more important to me than ever, and I discovered the Wisdom Page and began contributing to it. 

Whatever the reason or reasons we choose to write on a topic, however, it should bring us joy. No one is forcing a subject on us. We should choose it because it is important, at least to us, and we want to write about it. The choice should be the beginning of a great adventure. 

This is not to deny that there may be rough patches along the way, especially if our writing is constrained by considerations like trying to please a dissertation adviser with an ax to grind. As in sports, we perform best when we do so freely, without mental constraints. Mundane considerations, like time constraints and the availability of sources, might also influence us.

Working in libraries, archives, and on the Internet can sometimes be daunting, as Robert Caro has recently indicated in regard to the Lyndon Johnson archives, but if we maintain an adventuresome spirit we can often overcome our fears. Sometimes it may be necessary to visit new and foreign places, which can be especially enjoyable and adventurous. In researching my Russian history books, I spent much time traveling in the Soviet Union/Russia, not only in Moscow and St. Petersburg, but in more remote areas such as the Siberian city of Irkutsk. One of the chapters in my Alexander II book was set there. Other chapters took place in various other Russian cities, in the Crimea, at royal palaces, Tolstoy’s estate, and foreign places that the prominent Russians of the time visited, including the German resort areas of Baden-Baden, Bad Ems, and cities such as Geneva, Paris, and London. I greatly enjoyed stopping at many of these places and taking pictures that I later included with the online version of my Alexander II book. (I realize that foreign travel expenses might be a problem for some researchers, and I was fortunate in being able to combine my research with paid work abroad.)

One of the great pleasures of research and writing is the opportunity it gives us to spend so much time writing and thinking about great people. My list includes not only Soloviev, Tolstoy, and Dostoevsky, but also the subjects of my long wisdom portraits—Anton Chekhov, Carl and Paula Sandburg, Dorothy Day, Andrei Sakharov, and E. F. Schumacher—and many others such as Franklin Roosevelt (FDR), Martin Luther King Jr., Pope Francis, and the writers W. H. Auden and Wendell Berry. They have all taught me important lessons, for example, FDR on political imagination and creativity, Chekhov on the dangers of dogmatism, and Day and MLK on various dimensions of social compassion and love.

In addition, being trained in intellectual history and the history of ideas has provided me the background to explore and clarify in my own mind important concepts such as wisdom, love, empathy, humor, humility, imagination, and creativity. Such an effort has been most enjoyable. 

The writing process itself should bring us joy. Storytelling is as old as human existence, and central to our lives, and history itself is an attempt to tell stories truthfully. If we do it well, we exercise virtues such as wisdom, empathy, imagination, humility, and humor, and, as Aristotle and JFK realized, that should make us happier.   

One of the enjoyable tasks in writing textbooks is relating humorous historical anecdotes—it gives students a bit of relief. Here are a few from my A History of Russia. In recounting the appearance of the mid-thirteenth-century Mongol khan Batu, Friar William Rubruck noted that he was “about the height of my lord John de Beaumont.” Unfortunately, as the historian Karamzin said: “It’s a pity we have not had the honor of knowing Monsieur de Beaumont!” Much later, during Stalin’s purges of the 1930s, admittedly not a laughing matter, the writer Isaac Babel said, “Today a man talks frankly only with his wife—at night, with the blanket pulled over his head.” (Ironically, however, Babel was unwise enough to have also spent time under the covers with the wife of NKVD head Ezhov; after Ezhov himself was arrested, he denounced Babel, who was subsequently shot.)

A final pleasure that we historians should enjoy is a type of satisfaction that must have come to writers like Shakespeare and Chekhov. It is the chance to stand back and observe the whole human story, with all its tragedies and comedies, and like Shakespeare think “What a piece of work is man!” but also “what fools these mortals be!”

 

Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171190
What 2020 Presidential Hopefuls Can Learn From Carter, Clinton, and Obama's Foreign Policy

 

In the past half century, America has had three Democratic Presidents—Jimmy Carter, Bill Clinton, and Barack Obama—who all pursued peace and reconciliation in international affairs and diplomacy. Their goal was to repair historical divisions and resolve conflicts where possible.

Jimmy Carter promoted peace and diplomatic ties between Israel and Egypt, inviting Israeli Prime Minister Menachem Begin and Egyptian President Anwar el-Sadat to Camp David in 1978. The subsequent Camp David Accords resulted in diplomatic recognition and an historic exchange of visits, ushering in what is now 40 years of ties between the Jewish state and the first Arab nation to recognize Israel.

Carter also negotiated the Panama Canal Treaty, which led to the eventual takeover of the Canal and the Canal Zone by the Panamanian government in 2000, improving relations with Latin America and overcoming unfounded fears that Fidel Castro and Cuba would seize control of the Canal. At home, however, the treaty helped fuel the rise of the Far Right in the Republican Party and contributed to the election of Ronald Reagan in 1980.

In 1978 Carter went beyond the economic and travel ties Richard Nixon had established with the People’s Republic of China by recognizing the mainland government, rather than Taiwan, as the government of China, reversing a thirty-year-old policy. While U.S. recognition enhanced the relationship between the world’s most populous nation and the world’s leading democracy, it did not lead to a long-term liberalization of the Chinese leadership.

Early in his first term, Bill Clinton pursued peace negotiations with Israeli Prime Minister Yitzhak Rabin and Palestine Liberation Organization leader Yasser Arafat. The resulting Oslo Accords were a step toward acceptance of co-existence between the Palestinians and Israelis, which failed miserably after Rabin’s assassination by a right wing Israeli in the fall of 1995. A later attempt in 2000 to reach a Camp David II Accord also failed. (Clinton blamed Arafat for its failure, at a time when the Israeli government was amenable to compromise, after the defeat of Benjamin Netanyahu as prime minister in 1999).

Like Carter with the People’s Republic of China, Clinton reconciled with a long-time adversary when he established diplomatic relations with the Communist nation of Vietnam in 1995, sending former prisoner of war Senator John McCain of Arizona, and Vietnam War participant Senator John Kerry of Massachusetts to finalize the reconciliation. 

Also during the Clinton administration, former Senate Majority Leader George Mitchell successfully negotiated the Good Friday Agreement between the warring Protestant and Catholic factions in Northern Ireland, a settlement now in its twenty-first year.

Following through on his pledge, in 2011 Barack Obama withdrew U.S. forces from Iraq after eight years of bloodshed, although he later sent some troops back to counter ISIS, the terrorist group. 

Four years later, Obama opened relations with Raul Castro’s Cuba, despite opposition from Republicans and Cuban Americans in South Florida. Relations with Cuba have cooled under Donald Trump and the issue of the Guantanamo Bay Naval Base, which the US has controlled since the Spanish American War, remains an area of dispute, as well as the issue of hearing loss suffered by western diplomats in Cuba. 

Obama’s decision in 2015 to join with the United Kingdom, France, and Germany, along with Russia and China, in pursuing a nuclear deal with the Islamic Republic of Iran, intended to prevent that nation from developing offensive nuclear weapons that might threaten Israel, Saudi Arabia, and other Middle Eastern nations, was a well-regarded attempt to solve a major foreign policy dilemma, although his successor has repudiated it.

None of the three Democratic Presidents had a “perfect” record of peace and reconciliation in diplomacy. Carter failed to secure the release of the American hostages held in Iran in 1980; Clinton led the belated bombing of Bosnian Serb forces in 1995 and the 1999 air campaign over Kosovo; Obama continued military action in Afghanistan and Iraq. Nevertheless, the three Democratic presidents of the last half-century pursued peace and reconciliation within reasonable means and circumstances.

As one examines the major foreign policy issues that will probably be factors in the 2020 presidential primary, what stands out is the historical desire of Democrats to promote peaceful dialogue and avoid war.  However, the next president, Democrat or Republican, will face a daunting set of foreign policy dilemmas---everything from terrorism, immigration, North Korea’s nuclear program, and possible Russian collusion in U.S. politics to global warming, the Middle East, and the repair of the U.S.’s fractured relationships with its traditional allies.  

It would be wise if the next President of the United States examined the records of accomplishment in foreign policy of the three most recent Democratic Presidents. It is important to pursue dialogue while making clear that the nation will uphold its basic values of respect for human rights and maintain a strong defense. Working through international organizations and restoring the strong ties America has had since World War II with its allies in Western Europe, Canada, and the Far East are essential to promote democracy and international stability and to deal with the growing crisis of global warming. The next president must respect the sovereignty of other nations and avoid future military ventures to shape relations in the Western Hemisphere.

It is essential to make it clear to the Russian and Chinese governments that we insist on fruitful relations even while remaining rivals. Working with Kim Jong Un of North Korea to resolve the issue of his nuclear program is likely to be a major challenge as well. Trying to resume diplomacy with Iran and Cuba would also be a positive move. The Middle East will remain the most complex international challenge. We should remain committed to the survival of Israel but not allow it to dictate American policy in the region. Nor should Saudi Arabia have an undue impact on American policy, as that would undermine the chances of a negotiated resolution of tensions in the area. And the constant danger of terrorism is a never-ending nightmare that the next American President will face.

The Democratic candidates must show they are ready to handle these challenges. Looking to their Democratic predecessors offers a valuable model for navigating international policy.

Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171332
What I’m Reading: An Interview With Russianist Historian Katherine Antonova

Katherine Pickering Antonova is Associate Professor of History at the City University of New York, Queens College. She is the author of An Ordinary Marriage: The World of a Gentry Family in Provincial Russia (Oxford University Press, 2013) and the forthcoming Essential Guide to Writing History for Students (Oxford University Press, 2019) as well as A Consumer's Guide to Information: How to Avoid Losing Your Mind on the Internet (Amazon, 2016).

 

What books are you reading now?

 

Imagining Russian Regions: Subnational Identity and Civil Society in Nineteenth-Century Russia by Susan Smith-Peter, which I’m reviewing, and I’m also enjoying Naomi Novik’s new novel Spinning Silver. 

 

What is your favorite history book?

 

Books that had the biggest impact on me were Barbara Alpern Engel’s Mothers and Daughters: Women of the Intelligentsia in Nineteenth-Century Russia and Laurel Thatcher Ulrich’s A Midwife’s Tale: The Life of Martha Ballard, Based on Her Diary, 1785-1812. There are so many others I’ve loved for other reasons: some that may not be familiar include Loren Graham’s Ghost of an Executed Engineer, Sheila Fitzpatrick’s Stalin’s Peasants, and the new biography of Rasputin by Douglas Smith. Everyone should read the gutting new book by my colleague, Deirdre Cooper Owens, Medical Bondage: Race, Gender, and the Origins of American Gynecology, and a very different but equally great new book by another colleague, Julia Sneeringer’s A Social History of Early Rock ‘n’ Roll in Germany: Hamburg from Burlesque to the Beatles, 1956-69.

 

Why did you choose history as your career?

 

I usually say it’s because I like to read other people’s diaries, and it’s true: I love reading primary sources, preferably the originals, in an archive where you can feel the texture of the paper and spot the occasional hundred-year-old dead bug still stuck to the page. As a kid I always read history and historical fiction for fun. About a year into college I realized that if I majored in history, my fun reading could also be my required reading. It was a couple of years after that before I really learned what a historian actually is. When I was growing up, people who liked history became k-12 teachers and the only other kinds of professionals people associated with history were archeologists, genealogists, or writers (I didn’t like science or family trees and writers starve or live off trust funds, as I was told at the time). I didn’t encounter any clear sense of what academic historical research looks like until late in college when the secondary sources we were reading were connected to stories of the archives told by my profs – most notably Sheila Fitzpatrick, who was one of the early western researchers in Soviet archives and has some very good stories to tell.

 

What qualities do you need to be a historian?

 

This is such a great question because I think there’s a popular perception that all a historian really needs is a great memory for names and dates, which is of course not remotely true. Some might go a bit farther and wish that historians were also “good writers.” But scholarship that can be vetted and built on by other scholars can’t be written the same way as historical fiction or popular history intended for pleasure reading, though of course it should be well-written for its purpose. It’s difficult to articulate a historian’s qualities because I think we rarely try. A historian looks at the world as continuously changing, not as a separate “past” that to many people can feel as remote as fiction. A historian sees what happened in the past not as a set story, but as disparate bits of evidence that might or might not cohere enough to answer our questions. A historian sees each event or action or phenomenon as contingent: no outcome was inevitable, every factor depended on other related factors. Our important questions are not “what happened” – that’s just a means to an end – but always “why” and “how”: why did things go this way and not that way, how do systems and processes and ideas work and do they vary depending on different contexts? Being a good historian is about being meticulous with details, like names and dates, sure, but also much more important details like the nuances of meaning in a text, the shifting perspectives of multiple narratives, the interactions of multiple causal forces, and the infinite ways that context affects people’s behavior and views. A good historian doesn’t just find, track, organize, and weigh all these many factors in an infinitely complex system with imperfect evidence. She also synthesizes all of it to figure out what it might mean: what questions it can answer, with what implications. And we have to do all that while vetting and citing every source and constantly checking ourselves for errors of logic or bias, so that our work can serve its purpose as the foundation of further research, as a reliable teaching tool, and as a reliable basis for all the other kinds of history that rest wholly or partly on scholarship: fiction, popular history, and public history.

 

Who was your favorite history teacher?

 

I was incredibly lucky to study with Sheila Fitzpatrick, one of the great historians of the Soviet Union, as an undergraduate at the University of Chicago in the mid-90s. I actually had no idea how important her work was until I was about to finish and a grad student clued me in. I only found out in retrospect that a lot of the readings she assigned were by people who virulently disagreed with her. She led her own discussion sections, where she walked us through making our own interpretations of primary sources. She modeled what historians do and helped us try it out, and that ultimately is what all the best history teachers do.

 

What is your most memorable or rewarding teaching experience?

 

My university introduced a new general education program that added to the typical “freshman comp” introductory writing course a second semester of writing instruction that was explicitly disciplinary. I developed the version of the course for history at my college, taught many sections of it over several years, and ultimately brought this together with my grad school training in Composition Studies to write a manual for students on how to write history essays – not just the typical research essay (which is now often assigned only toward the end of an undergraduate program) but the other common kinds of history writing, from primary source close-readings to exam essays and imaginative projects like role-playing games and historical fiction. In many ways it’s the culmination of a journey I started as a grad student with a fellowship in the University Writing program -- I found ways to answer questions that have bothered me from the first baby steps of my career. Students led me to those answers because this course gave us the time to explore the meta questions like “why are we here?” “what is this for?” and “can this be better?”

 

What are your hopes for history as a discipline?

 

I’m very glad to see, and be a part of, efforts to get much better at articulating what historians do and why it matters. The American Historical Association has taken the lead with this through their Tuning Project, but we’re also seeing a new generation of historians who came through grad school at a time when training in pedagogy and composition studies were finally beginning to be recognized in places like history Ph.D. programs. I was trained in composition studies as part of a fellowship to teach a year of freshman comp, and it completely changed the way I approach teaching. At the same time my university instituted its first formal teaching training programs for grad students, which I took part in. Those experiences opened the door for me to a whole world of evidence-based problem-solving and purpose-driven teaching. Earlier generations were often left to either continue what was “traditional” or reinvent the wheel on their own. Now there are a lot more people who got at least some teaching training, and there’s much greater access to conversations about teaching via blogs and Twitter and so on, and all of that is gradually having a very positive impact on how history is taught, which in turn is helping historians be more articulate in general about what we do. I hope it is also the beginning of a substantive course correction in how the public understands what history is all about. But we’re doing all this in an environment of crippling austerity and short attention spans, so my hope is qualified by quite a bit of anxiety that things may just keep getting worse despite everyone’s efforts. 

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

Ha! I don’t get paid enough for that. I have some trinkets from having spent a lot of time in Russia on research trips, but nothing valuable. My favorite souvenir other than the deck of cards with the Romanovs on them is a pair of thick, felted mittens with tiny holes in the pad of the forefinger and thumb to make it just possible to hold a pen. I made them for working in the archive in Ivanovo where I spent 10 months researching my dissertation. Archivists keep the building chilly partly for preservation, partly from lack of funding, but after sitting still for hours day after day in the reading room the cold seeps into your bones. It’s worth it to read other people’s diaries, though. 

 

What have you found most rewarding and most frustrating about your career? 

 

As one of the incredibly lucky few who got a tenure-track job (just before the 2008 crash obliterated the market), I’m very aware of the incredible privilege of being able to teach with the speech protections afforded by tenure, not to mention the steady salary even though it’s low and the workload is ridiculous – at least I don’t have to teach this many or more courses at several different campuses for a fraction of the money! This means I am able to mostly focus on my teaching, research, and professional service, which is what I’m good at and worked so hard for for so many years. This is fulfilling work, and though I work all the time, I can do it with a pretty extraordinary degree of flexibility and autonomy and I know how rare and valuable that is. 

 

The most frustrating thing is how few qualified, brilliant historians share my luck, and the unspeakable loss to society of so much knowledge and talent being thrown away by the adjunctification process that exploits people as long as possible until they leave the system. The stupidity and waste of it horrifies me. Similarly, the pervasive myths about what higher ed and the humanities are, how they work, and the value we bring to society are really frustrating, not least because these myths are deliberately perpetuated as a way of continuing this process of taking all the money out of public education and putting it in the pockets of private companies and their executives. It’s a looting process that’s happening throughout our society, though – not specific to public education, though we’re a relatively easy target because everyone has at least one teacher they’re still angry about, and it’s easy to exploit those feelings. At the same time, we’re a valuable target because we actually do have such a big impact on society that knocking us down a few pegs really disrupts the whole system. 

 

How has the study of history changed in the course of your career?

 

The whole climate of education has changed so much in my lifetime – the loss of funding and public support for education, particularly in the humanities, the rise of testing culture in k-12 schools, the adjunctification and commodification of higher ed, and the resulting crises in tuition/indebtedness and textbook costs, faculty security, workload, and pay, and the impact of all those crises on what anyone can do in a classroom – have been so huge that sometimes it’s hard to remember to look at the relatively smaller changes within my discipline or field. Historical research has been greatly enriched in the past few decades by increased diversity in who can do history and how we do it and the kinds of questions we ask. But all that progress is now at risk because of these outside pressures, and much of the great recent work historians have done doesn’t ever really reach the public thanks to the defunding and privatization of academic publishing alongside the limitations on everyone’s attention. 

 

In my own subfield of imperial Russia, I’ve seen tremendous new insights thanks to a generation of archive-based work in the wake of the Cold War. Most people are aware that the collapse of the Soviet Union opened archives and enriched the study of that period, but it has had a huge impact on the study of Russia before 1917 as well, and on our whole conceptualization of the Russian empire as a continuous entity before and after that date. For example, our understanding of regional diversity and the importance of developments outside the capital cities is only just beginning to inform the broader narrative – that’s one of the contributions of the book by Susan Smith-Peter that I mentioned above.

 

 

What is your favorite history-related saying? Have you come up with your own?

 

I love the Lamartine quote, “History teaches everything, including the future.”

 

What are you doing next?

 

This fall I’m hoping to finish the research for my second monograph, which centers on Russian police investigations of women mystics and sectarians in the 1820s and 30s. I’m also developing another book project about writing, with a different focus and working with a co-author. 

Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171034
The Price of Thomas Scott is Pretty High

 

For years, British drapery store owner Thomas Scott and his family dreamed of selling the store and the house attached to it and using the money to follow their dreams. Scott’s wife wants to move to a high-end suburban town, his son wants to earn a job as a government auditor, and his daughter wants to move to Paris and work for one of the world’s great millinery shops. They nearly sold the store several times, but the sales always fell through. Now, finally, in 1913, Mr. Wicksteed arrives from the Courtney Company and he wants to purchase the building. He offers Scott an enormous amount of money, far more than expected and more than enough to pay for the dreams of everybody in his family. The problem is that the Courtney Company runs a chain of dance halls and wants to make Scott’s drapery store one more of them, clinging couples, night-long drinking, and wild music included.

Scott hates dance halls. He finds them morally repugnant, the scourge of Great Britain and the “Devil’s work.”

What should he do, take the big pot of money, hand over the keys and please everybody in his family or turn down the offer and please his conscience?

That’s his problem in The Price of Thomas Scott, a fine 1913 play by Elizabeth Baker that is being revived long after its British premiere. The play, nicely directed by Jonathan Bank, is a delightful look at the millinery business just after the turn of the 20th century and the men and women who work in it. The drama is a sturdy look at the lives of women in that era and shows how thousands of them found work in millinery and other women’s shops, creating a whole new underclass of fashion workers.

The play is full of skilled actors who deliver fine performances. The production is part of an 18-month-long tribute to Ms. Baker, who was an early sensation as a playwright in the 1909 to 1920 era but faded from view over the years.

Thomas Scott is played well by Donald Corren, who races about the stage, sales contract in hand, debating what he should do. His wife Ellen is played smartly by Tracy Sallows, by turns joyful and mournful as she worries about what decision her husband will make. Daughter Annie Scott, the Paris-dreaming milliner, is played well by Emma Geer. The dashing and charming Wicksteed is played by Mitch Greenberg. Others in the talented ensemble are Nick Medica, Mark Kenneth Smaltz, Jay Russell, Andrew Fallaize, Josh Goulding, Avana Workman, and Arielle Yoder.

The play opened last night at the Mint Theater, housed at Theater Row on W. 42nd Street. The Mint revives plays from the first half of the 20th century, finding chestnuts here and there that for some reason did not enjoy long runs or were forgotten by theater lovers over the years.

Ms. Baker is one of those forgotten playwrights. A typist, she burst into the spotlight with her play Chains, about shop girls and their troubles, which debuted in 1909. She received much attention and wrote a succession of successful plays, including The Price of Thomas Scott. Right after that, she faded in popularity and penned her last drama in 1932, taking time off to live in a hut on a South Pacific island, pounding away on her typewriter under a thatched roof.

Now the Mint has brought back Ms. Baker.

The Price of Thomas Scott sets up the dance hall (the Devil) vs. simple street life (God), but there are numerous subplots involving the children and wife of Scott, a very conservative businessman, and the men in his circle who are all reformers of some kind. It is, like Baker’s other plays, a vivid story about the ordinary lives of ordinary people, with all of their tension seething beneath the surface.

The play is also a nice historical look at middle- and working-class England in 1913, a country just invaded by the automobile and battling for fashion supremacy in Europe. Under all of that, in the UK as in the U.S., are the armies of millinery girls and shop girls who really were the machine of the fashion industry in that era. No Guccis, Versaces, St. Laurens or Valentinos to be seen anywhere.

The play does have its problems. The first twenty minutes are terribly slow moving. Wicksteed’s arrival at the shop finally gives it the jolt of electricity that it needs. Some of the subplots are a bit thin. Scott’s connection to his buddies seemed a little hollow. His son gets lost in the shuffle. Scott himself is not as defined as he might be.

Despite those weaknesses, The Price of Thomas Scott is certainly worth the price of admission.

PRODUCTION: The Mint Theater is the producer of the play. Scenic Design: Vicki R. Davis, Costumes: Hunter Kaczorowski, Lights: Christian Deangelis, Sound:  Jane Shaw. The play is directed by Jonathan Bank. It runs through March 23.   

Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171331
Why Aren't Americans More Enraged About Russian Interference? World War 2 History Helps Explain

 

No one should be surprised if a substantial minority of American voters remains unconvinced that Russian agents interfered in the 2016 presidential election, or if an even larger percentage of the American public downplays the urgency of the Russian threat to the nation’s electoral system.  Complacency in the face of foreign dangers is nothing new in the United States; during World War Two, it was plainly visible among Americans less than two months after the Japanese attack upon Pearl Harbor.

 

In the days immediately following December 7, 1941, outraged patriots flocked to recruiting stations, purchased more than $1 billion worth of war bonds, and destroyed all the Japanese-manufactured merchandise they could find — including Christmas tree ornaments, silver bells on street lights, and china plates which rampaging customers smashed on department store floors.  Political leaders of both parties pledged their loyalty to the Roosevelt administration. But the initial outburst of enthusiasm and unity soon faded.

 

By the middle of January — following the fall of Guam, Wake Island, and Manila to Japanese forces — numerous commentators noticed that Americans seemed to be going about their business “apparently unmindful,” as the Chicago Tribune complained, “of the soldiers who are dying in the Philippines,” notably on the Bataan peninsula.  After traveling across the country, newscaster Edward Murrow agreed that most Americans regarded the war with no sense of urgency, viewing the conflict instead as a spectacle, and themselves as mere spectators. Upon completing his own cross-country journey, columnist Walter Lippmann lamented “the unawareness, the overconfidence, and the complacency” of Americans.

 

“I find all around me a smugness and satisfaction which to my mind are entirely unjustified,” concluded William Batt, director of the War Production Board’s materials division.  “It is no secret, that, by and large, the American people have not yet settled down to the grim task of fighting for their freedom and their lives,” noted the Washington Post with a palpable sense of frustration. On Capitol Hill, veteran Congressman Hatton Sumners of Texas complained that “I do not see yet that vital, stirring consciousness of responsibility, consciousness of danger . . . which we have got to have.”

 

After touring defense production plants on the Pacific coast in early 1942, Congressman Lyndon Johnson noted that Americans’ “lack of mental concern is evident on every facial expression. Their complacency, indifference, and bewilderment are an open invitation to [enemy] direction.”  In Louisiana, Governor Sam Houston Jones grew so alarmed at his constituents’ lethargy that he mounted a three-week, statewide speaking tour in a sound truck with the words “Awake America” on the side.

 

Declining war bond sales provided hard evidence that a significant segment of the public remained disconnected from the war effort.  In January, Americans purchased $1.075 billion worth of bonds, although only one in seven American wage earners bought any bonds at all. February sales tumbled to $711 million, and March fell to $565 million; some people reportedly hesitated to buy bonds because they believed rumors that Roosevelt was putting the cash into his personal savings account.

 

Instead of buying war bonds, American consumers bought and hoarded goods which they expected to soon become scarce:  sugar (sometimes 100 pounds at a time), nylon stockings, radios, wool clothing (after federal officials announced that the Army would require most of the nation’s wool supply over the next twelve months), bicycles, soap, socks, and women’s foundation garments, which were disappearing rapidly due to a shortage of rubber.  From coast to coast, retailers sold more merchandise to consumers in the first several months of 1942 than at any time in the nation’s history.  Treasury Secretary Henry Morgenthau repeatedly warned the public that excessive spending and the subsequent rise in prices could cripple the war effort, and merchants in New York, Philadelphia and Cincinnati took out newspaper ads begging their customers to stop buying so much.  “You cannot buy victory and luxury in the same market at the same time,” warned the New York Times, but consumers kept on stockpiling merchandise.

 

Other Americans, fearful that the Roosevelt administration planned to confiscate their savings to pay for the war, withdrew their savings from banks and buried their cash in a safe (rentals of safe-deposit boxes surged) or stuffed it in a sock, instead of lending it to the federal government or investing it for productive use.  “Hoarded dollars are idle dollars, slacker dollars,” one New York bank reminded its customers.  “Hoarding in times of war amounts to sabotage against the Government.” Still the drain continued; by mid-April, Americans were hoarding an estimated $500 million to $1 billion.

 

By far the most prevalent explanation for the public’s “dangerously complacent” attitude was the insistence of military censors and administration spokesmen on painting the military situation in a far too optimistic light.  As the War Department and the Navy controlled the release of military information, official communiqués downplayed or concealed American defeats — “Never before in history,” complained one critic, “have so few kept so much from so many” — and released stories that transformed relatively minor triumphs into brilliant victories.  “They are misleading,” charged the Washington Post, “in the sense that they are utterly out of proportion.”

 

The government’s practice of distorting news reports fueled the conviction of many Americans that “one American can lick ten Japs or five Germans and that is all there is to it.” The public’s overconfidence was boosted further by the Roosevelt administration’s decision to keep arms-production figures secret, while promoting reassuring  — but misleading — news stories about the progress of the nation’s output of war goods, typically accompanied by multicolored charts and graphs of upwardly spiraling production and the same adjectives used over and over: “huge,” “enormous,” “immense,” “tremendous,” or “magnificent.”  “We here in the United States,” observed journalist Alistair Cooke, “studied our own production story and assumed the victory.”

 

Not even the loss of Bataan and Corregidor in April and May could shock Americans out of their complacency.  Factory workers who were earning more than a subsistence income for the first time in their lives spent their newfound wealth on good times in restaurants, theaters, night clubs, and strip joints.  Sales of jewelry and champagne soared.  And throughout the spring and summer, Americans bet more money on sports than ever before.  One after another, horse-racing tracks across the nation broke their own records for wagers; on Kentucky Derby Day, bettors at Churchill Downs wagered nearly $2 million on the first leg of the Triple Crown, while nearby booths selling defense bonds reportedly took in less than $200.

 

Meanwhile, German U-boat commanders were waging an increasingly successful campaign — “Operation Paukenschlag,” or “Drumbeat” — against American and Allied merchant shipping along the East Coast.  The task of sighting and sinking slow-moving vessels, particularly oil tankers, was made easier by carelessly illuminated shops, homes, and thoroughfares along the coast.  Sometimes visible for ten miles or more at sea, the lights from shore silhouetted the tankers and created a “neon shooting gallery” for the submarines.  Despite the horrific toll — by the end of April, nearly two hundred ships had been sunk, and more than four thousand sailors and passengers killed — a disturbing number of Americans ignored repeated requests by civilian and military authorities to dim or extinguish their lights.  Dimout inspections in early June consistently revealed widespread and “flagrant violations” of Army regulations; New York City remained a “murderous mound of light”; and a sailor on a merchant ship passing a New Jersey resort town at night noted that “the lights were like Coney Island.  It was lit up like daylight all along the beach.”

 

In late summer and autumn, a new problem emerged as a rising wave of absenteeism at war plants and shipyards undercut arms production efforts.  “The extent to which absenteeism impedes our war effort is beyond belief,” grumbled a high-ranking official of the War Manpower Commission, as workers increasingly took time off to nurse hangovers (“Monday morning sickness”), to look for better-paying jobs, or simply to engage in shopping sprees (“Pay-day richness”).  In November, a Senate investigating committee headed by Harry Truman reported that excessive worker absences were reducing production by as much as ten percent in many war plants; in some shipyards, the absentee rate reached 18 percent, while Ford’s massive Willow Run bomber plant in Michigan suffered absences of nearly 25 percent.  Federal officials attempted to combat the trend with posters and slogans designed to make “work skippers” feel like slackers, but to little effect.

 

One year after Pearl Harbor, many Americans still refused to give their wholehearted commitment to the war effort; calls for sacrifice often had been ignored, and restrictions evaded.  “There’s one thing America hasn’t yet got around to,” claimed the Carrier Corporation in an early December appeal to the public.  “We’re still waiting for that old-fashioned American ‘drive’ that hits the line head-on and sweeps everything before it.”

 

The war would have to wait a little while longer.

Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171184 https://historynewsnetwork.org/article/171184 0
The Polar Bear Expedition: When America Intervened in Russian Affairs

The Great War had been over for more than two months when, on the frigid morning of January 19, 1919, Lt. Harry Mead peered into the slim, gray light of dawn and saw hundreds of ghostlike figures, armed and dressed in white, skimming the frozen ice of Russia’s Vaga River on skis. They were coming towards the forty-five men of his first platoon of Company A, 339th United States Infantry Regiment.

 

Soon enough, the Americans were overwhelmed, and as men fell left and right, the survivors of the initial assault began a fighting retreat through deep snows and temperatures of minus fifty; more than half of the platoon would be wiped out at the hands of some 1,700 Bolshevik attackers.

 

The long-anticipated attack by the Reds marked the beginning of the end of a particular episode in U.S. history that few today are aware of:  the intervention into the Russian Civil War by America and its British, French, Polish, and even Italian allies. 

 

As allegations and investigations swirl around the questions of Russian intervention in our 2016 elections, and whether the Republican campaign of now-President Trump actively colluded with agents of Vladimir Putin, it’s worth a look back one hundred years to our own intervention—invasion, if you will—in Russia, if only to add some context to the current air of distrust and enmity that exists between the U.S. and its former Cold War opponent.

 

While most Americans might point to the post-World War 2 Cold War between East and West as the starting point of antagonism between the two superpowers, the story line must in fact be pushed back much farther, to the summer of 1918, when then-President Woodrow Wilson was agonizing over whether to accede to Allied insistence that the U.S. supply troops for a planned foray into Russia.

 

The aims of the intervention, as put forward mainly by the British, were twofold:  (1) the reestablishment of the Eastern Front, which the Bolsheviks had quit in March, 1918, leaving Germany free to transfer some eighty divisions to the Western Front; and (2), more grandly, the incitement of a counter-revolution that would throw the Reds out of power and alleviate the nascent “Red threat” that caused some governments in the Western democracies to worry. 

 

Wilson wanted no part of an intervention, but under tremendous pressure he finally agreed in mid-July 1918 to send a single U.S. regiment to northern Russia. His orders were that the Americans were only to guard the tons of materiel that the Allies had sent to Russia during the war, and stay out of the post-revolution turmoil that saw Reds fighting so-called “Whites”—Tsarist loyalists—across Russia.

 

That regiment was the 339th, part of the 85th Division. Mostly draftees from Michigan and Wisconsin, the men had arrived in England in August and were anticipating a trip across the Channel to France when they were plucked for assignment in Russia. 

 

The regiment sailed for Archangel, Russia in late August of 1918, and arrived at the bustling northern port on September 5. But almost as soon as they arrived, the men were hustled off their transports by their British overseers and sent by train south in the direction of Vologda, and southeast on the wide Dvina River towards Kotlas, hundreds of miles away.

 

Left behind were some seventy men stricken with or already dead from the flu that had haunted the ships that had carried them to Archangel. Many more—more than 150—Americans would die in battle or from wounds in the coming months. 

 

Living or dead, all were mere pawns in an epic drama that would play out through the coming fall, winter and spring—and well past the November 11, 1918 Armistice that ended World War 1. 

 

Company by company, the men took up posts in isolated positions across a front of four hundred miles, the farthest-flung being Company A, which after seeing duty on the lower Dvina River was sent down the Vaga River, a tributary, to a God-forsaken village called Nijni Gora, 250 miles from Archangel.

 

Polar Bear Memorial at White Chapel Cemetery in Troy, Michigan

On November 11, ironically, Company B was attacked by hundreds of Bolsheviks—the men called them “Bolos”—at Toulgas on the Dvina. Firing from hastily built blockhouses, and aided by a single battery of Canadian artillery, the Americans finally held the Bolos at bay after several days of fighting. But the company would remain in that isolated spot until March, under constant attack and harassment by the Bolsheviks.

 

On the railroad, the third battalion struggled through the fall and winter to push south towards Vologda, but never got to within three hundred miles of the city.  Meanwhile men died, and men suffered in the brutal cold—so cold that the men slept with their water-cooled machine guns in an effort to keep them operable.   

 

Meanwhile, the Reds’ minister of war, Leon Trotsky, was building an army of 600,000 men, which he vowed to use to push the slim invading Allied force into the White Sea. That process began at the farthest Allied base at Nijni Gora, where Company A was routed in the third week of January, and each man subsequently fought for his survival during a two-week retreat through the harsh elements.

 

By then the main reason for the intervention—the recreation of the Eastern Front—was a non-issue. So, too, was Woodrow Wilson’s stated intent of guarding war materiel from the predations of the Germans. 

 

And still the men—American and Allied—suffered and fought bravely and wondered why they had never been given a single good reason for their being there. As the winter progressed, they wondered even more when they might be withdrawn. 

 

At the same time, a clamor was rising back home in the U.S., and in February Wilson made the decision to pull out—an impossibility, however, until the frozen White Sea broke up. Finally, the Americans began withdrawing across the fronts and shipped out of Archangel in early June, followed by the British in September.

 

The intervention would be quickly forgotten in the U.S., if not by the survivors, by the public in general. Few lessons would be learned from the expedition, and similar mostly unsuccessful invasions—in Vietnam, in Iraq, in Afghanistan—would be undertaken with no look back at our experience in Russia. 

 

But the average Russian has been taught about the intervention, and the average Russian remembers. Though Presidents Richard Nixon and Ronald Reagan would both proclaim separately that Americans and Russians never faced off in battle, the Russians still remember a time when foreign, Western nations interfered in their affairs. 

Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171182 https://historynewsnetwork.org/article/171182 0
Roundup Top 10!  

Why Are We Still Segregating Black History in February?

by Christina Proenza-Coles

Even before the U.S. was a nation, African-Americans played crucial roles in nearly every stage of history in the new world. ‘Honoring’ that history in one month is a travesty.

 

Americans’ ignorance of history is a national scandal

by Max Boot

You simply can’t understand the present if you don’t understand the past.

 

 

Max Boot’s Screed Against Historians

by John Fea

Boot is the latest public intellectual to chide academic historians for failing to speak to public audiences.

 

 

A quick response to Max Boot’s critique of historians

by Glenn David Brasher

Aren’t retention and anti-intellectualism the real problems?

 

 

Do American Women Still Need an Equal Rights Amendment?

by Susan Chira

We’re already living in Phyllis Schlafly’s nightmare.

 

 

Winthrop's "City" Was Exceptional, Not Exceptionalist

by Jim Sleeper

There are compelling anthropological reasons why almost every society in history has invented “special covenant” and “origin” myths, or “constitutive fictions.”

 

 

The black men of the Civil War were America’s original ‘dreamers’

by Colbert I. King

Like dreamers of today, those black soldiers and sailors also had families and attended churches; some were enrolled in schools.

 

 

What we get wrong about the roots of slavery in America

by Eric Herschthal

How we remember the past shapes the fight for racial justice today

 

 

Protesting on Bended Knee: Race, Dissent and Patriotism in 21st Century America

by Eric Burin

This digital book is available for free download!

 

 

The "Historovox" and the Bad Synergy Between Historians and Journalism

by Corey Robin

When academic knowledge is on tap for the media, the result is not a fusion of the best of academia and the best of journalism but the worst of both worlds.

 

 

Why History is Important Today

by Luis Martínez-Fernández

"The most effective way to destroy people is to deny and obliterate their own understanding of their history.”

 

 

The Catholic Church is bursting with secrets. Investigating one will unravel them all.

by Garry Wills

Secrecy in one clerical area intersects with secrecy in others.

 

 

How George Washington would fix partisan politics in America today

by Eli Merritt

The United States' first President George Washington would prescribe rule of law and emotional intelligence to help us heal.

 

Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171319 https://historynewsnetwork.org/article/171319 0
We Need to Acknowledge the Power of the Israel Lobby

 

Last week, Rep. Ilhan Omar (D-MN) flippantly tweeted, “It’s all about the Benjamins,” alluding to the financial role of pro-Israel groups in unstinting US support for Israel. Widely condemned for invoking an anti-Semitic trope pertaining to Jewish money, Omar apologized for the remark but not for the essence of her larger point, which is unquestionably true: namely, that money in politics plays a role in the lopsided pro-Israeli policy that the United States has pursued for decades. The absurdity is that everyone in Congress already knows this.

 

While reams of type and hype have spilled forth concerning the intrusions by the big, bad Russian bear (yes, he’s back after a post-Cold War hibernation) on American politics, we hear very little about Israel’s influence, which has profoundly shaped United States Middle East diplomacy since World War II. As I document in a forthcoming book, the Israel lobby goes much deeper historically than most people realize and has long exercised an outsized influence on Congress and presidential elections.

 

 

On February 15 the British Guardian did something American newspapers, magazines—and historians—rarely do: it published an analysis of pro-Israeli financing in American politics. “The data examined by the Guardian suggests that the pro-Israel lobby is highly active and spends heavily to influence US policy.” (“Pro-Israel Donors spent over $22 million in lobbying and contributions in 2018,” The Guardian online.) There’s more to the story than the Guardian grasps, namely individual campaign contributions that are made with the explicit or implicit understanding of unquestioned political support for Israel.

 

Fear of baseless charges of anti-Semitism must not prevent us from making relevant and scarcely disputable arguments about money and political influence. Let’s examine some basic facts: Israel, a tiny country of 8.5 million people, is the largest recipient of US foreign assistance since World War II (more than $100 billion), according to the Congressional Research Service. In 2016 President Barack Obama—despite being treated contemptuously by Israeli Prime Minister Benjamin Netanyahu—signed a 10-year, $38 billion military assistance pact with Israel.

 

Well, you might say, Israel needs the money to defend itself from the hostile forces that surround and threaten to destroy it. This cliché misrepresents realities that have prevailed in the Middle East since the creation of Israel, namely that Israel more often than not has been the aggressor in the region, has won every war, and is more than capable of defending itself without receiving another dollar in US military assistance. Since 1967 the country has illegally occupied Palestinian territory and illegally sponsored settlements, which have been funded in part with American money, and has repeatedly engaged in indiscriminate warfare, notably in Lebanon and the Gaza Strip. Beyond dispute, Israel has suffered from terror attacks, but it is far from innocent and has killed many times more people than it has lost in the conflict. And now it is illegally absorbing, with Trump’s blessing, a historic holy city that under international law was meant to be shared by people of all faiths. 

 

Even more absurd than over-hyping Russian influence on US elections while ignoring that of Israel is the widespread condemnation of Iran for supposedly pursuing a nuclear weapon, while ignoring the history of Israel’s utter contempt for nuclear non-proliferation in defiance of the United States dating back to the Eisenhower administration. Israel has the bomb, scores of them, acquired secretly and mendaciously—that is, it was developed even as the American special ally was told Israel would not introduce such weapons to the Middle East. Turns out what Israel meant is that it would not hold a press conference and say, “Hey, we’ve got the bomb!” Meanwhile Netanyahu and the Israel lobby spare no effort to condemn Iran, which unlike Israel proved willing to negotiate a non-proliferation agreement, which it entered into with Obama in 2015. Netanyahu—joined by Trump, Mike Pompeo, John Bolton, Elliott Abrams, and other assorted fanatics of the far right who are now in power—is lusting for an excuse to go to war with Iran. 

 

In sum, beyond a doubt, Israel and its American supporters have assembled the most powerful lobby pursuing the interests of a foreign country in all American history. Certainly, there are cultural and historical affinities on the part of American Christians (think Mike Pence) as well as Jews that help explain broad-based and historical US support for Israel. But only a fool—or an apologist—would argue that pro-Israeli money and influence do not play a significant role in American politics.

 

I hope this gets into print while it is still legal to criticize a foreign country (other than Russia, China, Venezuela--and France and Canada when they don’t do what we tell them). In an effort to head off a growing boycott, divestment and sanctions movement, Israel and the lobby now have set their sights on curbing freedom of speech and expression in this country. The matter is before Congress.

 

The time is past due to speak truth to power about Israeli policies and the American Israel lobby. 

Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171292 https://historynewsnetwork.org/article/171292 0
“National Security Crisis” or “Power Grab”?

 

On February 14, 2019, Congress allotted Donald Trump some 1.4 billion dollars for 55 miles of wall between Mexico and the United States. Trump was flabbergasted—a poor Valentine’s gift indeed!—as he had demanded 5.7 billion dollars for the wall. Consequently, the next day Trump declared a state of national emergency in order to divert money earmarked for defense spending toward his wall. Trump said defiantly and with rodomontade: “We’re going to confront the national security crisis on our southern border and we’re going to do it one way or the other. It’s an invasion. We have an invasion of drugs and criminals coming into our country.” Trump’s declaration, signed later that day, will enable him to use 3.6 billion additional dollars, earmarked for military projects, for his beloved wall, so long as there are no impediments. He plans to allocate an additional 2.5 billion dollars for the wall, for a total of some 7.5 billion dollars.

Why such precipitancy? What is the national crisis?

 

An emergency is defined as an unexpected and urgent state of affairs that requires immediate action—here, a crisis of national significance. The invasion of drugs and criminals of which Trump spoke, it seems, had taken on the status of an immediate threat to national security. Yet Trump added in his Rose Garden speech: “I didn’t need to do this, but I’d rather do it faster. I want to get it done faster, that’s all.” Those words undermined the notion that his actions were precipitated by a national emergency. They were not. The border problem has been longstanding. There was, however, a large and looming crisis: Trump did not get what he wanted.

 

As anticipated, Democrats, headed by Speaker of the House of Representatives, Nancy Pelosi, argued that Trump’s maneuver was not on account of a national emergency. “This is plainly a power grab by a disappointed president, who has gone outside the bounds of the law to try to get what he failed to achieve in the constitutional legislative process.” They vowed to do what they could, with the help of sympathetic Republicans, to block the power-hungry president.

 

The prickly issue is one of Congressional authority, and thus, the constitutionality of Trump's gambit. Congress failed to allot Trump the money he wanted, so he decided to ignore Congress and to do whatever he could to get the needed money for his wall. Yet the problem is this: Does the president have the constitutional authority to sidestep Congress? If there is a real state of emergency, then the answer is yes, but as we have seen, the real crisis is that a man who is used to getting his way, usually through bullying, did not get his way. It is a matter of presidential pouting, and that is a dangerous precedent.

 

Still, one could argue that the border wall was a campaign pledge of Trump’s, that he has steadfastly stuck to keeping that promise, and that this seems laudable, does it not? Promises ought not to be lightly made, and once made, they ought to be kept. 

 

Yet there is a larger, more fundamental nodus. According to a Gallup poll conducted in January 2019, which randomly surveyed 1,022 American adults across the United States (95 percent confidence level, margin of error of roughly ±3 percentage points), 60 percent of Americans oppose construction of the wall (39 percent strongly opposing), while 40 percent favor it (26 percent strongly in favor). Even allowing for the margin of error, there is no question that a majority of Americans disfavor the wall. The question redounds: Ought Trump to build a wall if the American citizenry does not want one?
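For readers who want to check the pollster’s arithmetic, the reported figure is consistent with the textbook margin-of-error formula for a simple random sample at the 95 percent confidence level. The calculation below is only a rough sketch under the conservative assumption p = 0.5; it is not Gallup’s actual weighting procedure, which is not described here.

\[
\mathrm{MOE} = z\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.5 \times 0.5}{1022}} \approx 0.031,
\]

or roughly ±3 percentage points, so the 60–40 split against the wall lies well outside the poll’s stated uncertainty.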

 

After the Revolutionary War, American politicians and visionaries, in keeping with Jefferson’s sentiments in his Declaration of Independence, sought to do politically something that had never been done before: build a nation beholden to the will of the majority of the people over, pace Athens, a large expanse of land. For Jefferson, vox populi (the voice of the people) was the axial principle of a representative democracy, and he was fond of using the metaphor of a machine to describe governmental efficiency in keeping with vox populi in two distinct senses.

 

A fine illustration of the first occurs in a letter to Comte de Tracy (26 Jan. 1811), in which Jefferson writes fondly of Washington’s cabinet. “Had that cabinet been a directory, like positive and negative quantities in algebra, the opposing wills would have balanced each other and produced a state of absolute inaction. But the President heard with calmness the opinions and reasons of each, decided the course to be pursued, and kept the government steadily in it, unaffected by the agitation. The public knew well the dissensions of the cabinet, but never had an uneasy thought on their account, because they knew also they had provided a regulating power which would keep the machine in steady movement” (see also, TJ to Joel Barlow, 11 Apr. 1811, and TJ to Jedediah Morse, 6 Mar. 1822). Machine-like efficiency meant, thus, that the various, often disparate, parts of government would strive to work together with a common aim. That common aim was to give expression politically to vox populi, which Jefferson, in his First Inaugural Address, called a “sacred principle”—“The will of the majority is in all cases to prevail,” though he added that that will “to be rightful must be reasonable.”

 

Given government united by the aim of actualizing vox populi, Jefferson was also fond of describing the function of an elected official to be, in some sense, machine-like—as it were, perfunctory. In Summary View of the Rights of British America (1774), Jefferson enjoined King George III: “This his Majesty will think we have reason to expect when he reflects that he is no more than the chief officer of the people, appointed by the laws, and circumscribed with definite powers, to assist in working the great machine of government, erected for their use, and consequently subject to their superintendance.” The sentiment is that a governor is a superintendent of the great machine of government—a steward, not a lord. To Benjamin Rush (13 June 1805), Jefferson said, “I am but a machine erected by the constitution for the performance of certain acts according to laws of action laid down for me.” The notion here is that, once elected, a president’s will is no longer his own, but that of the people.

 

With today’s amaranthine political bickering, the metaphor of government as a machine, whose parts work toward the efficient functionality of the machine, is laughable. What is more ludicrous is the notion of elected representatives functioning as machines in working for the will of their constituency, and the president being beholden to the will of the general American citizenry. The aim of the Revolutionary War, if we follow Jefferson in his Summary View and Declaration, was resistance to tyranny. He was adamant in his Declaration that no revolution ought to be begun for “light & transient causes,” but only on account of “a long train of abuses & usurpations pursuing invariably the same object.” The greatest tyranny is government indifferent to the will of the people.

 

That is where we are with Trump’s wall and the national security crisis, birthed by Congress’ failure to acquiesce to the will of The One. We are in a state of national emergency because of presidential pouting. In all of the partisan bickering, vox populi is too infrequently mentioned. 

 

Jefferson also wrote in his Declaration that “mankind are more disposed to suffer while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed.” We are today flooded with governmental evils, and we suffer them, not because they are sufferable, but because we have become apathetic to injustice and the ideals for which our forebears fought. As Jefferson prophesied in Query XVII of his Notes on the State of Virginia, when the Revolutionary War is long forgotten and Americans fix themselves on “the sole faculty of making money,” they will become mindless of their rights. Our shackles “will remain on us long, will be made heavier and heavier, till our rights shall revive or expire in a convulsion.” If we follow the general trend of indifference, expiration of rights through convulsion, if it has not already occurred, seems the most likely path.

Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171291 https://historynewsnetwork.org/article/171291 0
Proof that Bombs Can Stop Genocide

 

A new study has documented the Syrian government’s role in more than 300 chemical weapons attacks against its own citizens. Significantly, the new report found no evidence of any such gassing attacks in the ten months since last year’s U.S. missile strike on Syrian chemical warfare sites. 

It’s time for some soul-searching by those who denounced that U.S. military action—including one of the current candidates for the Democratic presidential nomination and officials of the U.S. Holocaust Memorial Museum. It’s time to acknowledge that American bombs can stop genocide.

The landmark Syria report, released February 17 by the Berlin-based Global Public Policy Institute, found there were at least 336 chemical weapons attacks in Syria between 2012 and 2018, and 98% of them were perpetrated by the Assad regime.

(Link to the full text of the report:  https://www.gppi.net/2019/02/17/the-logic-of-chemical-weapons-use-in-syria)

According to the report’s detailed timeline of the Syrian chemical atrocities, there have been no such attacks since April 7, 2018. That date is significant—it was one week before the United States carried out major missile strikes on multiple Syrian chemical weapons facilities. 

The strikes were praised by a wide range of foreign leaders and by prominent voices across the American political spectrum. But there were some notable, and disturbing, exceptions.

Rep. Tulsi Gabbard (D-Hawaii) said that since “Syria has not declared war against the U.S.,” missile strikes on the chemical sites were “illegal” and unconstitutional. Gabbard earlier said she was “skeptical” as to whether the Assad regime really was gassing civilians. She is now a candidate for the Democratic presidential nomination.

Rebecca Erbelding, a staff historian at the U.S. Holocaust Memorial Museum, tweeted in response to the missile strikes: “There are viable ways that the US can aid those being persecuted under an evil regime. Bombing isn't one of them.”

A few months earlier, the Museum had stirred controversy by issuing a report arguing it would have been “very difficult” for the US “to take effective action to prevent atrocities in Syria.” That sounded like a justification of President Barack Obama’s embarrassing failure to act on his famous “red line” ultimatum. After an outcry, the Museum backed down and deleted the report’s most objectionable language.

Amnesty International responded to the U.S. strike on the chemical weapons targets with a press release characterizing Assad’s atrocities as “alleged violations of the Syrian government.” Meanwhile, Code Pink and other antiwar groups staged “Hands off Syria” rallies around the country.

Seventy-five years ago this spring, the Roosevelt administration learned the full details of the mass gassing of Jews in the Auschwitz death camp. Thanks to two escapees from the camp, US officials even received detailed maps pinpointing the location of the gas chambers and crematoria.

Jewish groups in the United States and elsewhere pleaded with US officials to order air strikes on the mass-murder machinery, or on the railway lines and bridges over which hundreds of thousands of Jews were being deported to Auschwitz.

Since US planes were already bombing German oil fields within five miles of the Auschwitz gas chambers as well as railway lines and bridges throughout that region of Europe, it would not have diverted from the war effort to drop a few bombs on the transportation lines to Auschwitz or the mass-murder machinery. But the Roosevelt administration refused.

Those who have not learned from the moral failures of the Holocaust era should at least pay attention to more recent evidence of how U.S. military force can be used to interrupt genocide or other atrocities.

Recall that President Bill Clinton used air strikes to put an end to atrocities in the Balkans. President Obama used military force to preempt the plan by Libyan dictator Muammar Qaddafi to carry out what the president called “a massacre that would have reverberated across the region and stained the conscience of the world.” Obama also took military action to end the ISIS siege of thousands of Yazidi civilians in Iraq.

The new report on Syria urges the US to further damage Assad’s chemical weapons potential by “directly targeting the military formations that would be responsible for any future attacks.” It argues that “the Syrian helicopter fleet, which has played a critical role in the delivery of conventional and chemical barrel bombs, should be a primary target.”

If President Roosevelt had heeded the pleas that were made in 1944 to use force against the machinery of genocide, many lives could have been saved. Let’s hope the current president will learn from FDR’s mistake.

Sat, 23 Mar 2019 06:49:22 +0000 https://historynewsnetwork.org/article/171263 https://historynewsnetwork.org/article/171263 0