The Most Alarming Argument in Jill Lepore's These Truths
tags: Republican Party, climate change, Internet, political history, technology, Democratic Party
Walter G. Moss is a professor emeritus of history at Eastern Michigan University. He is a Contributing Editor of HNN, and his latest book is In the Face of Fear: On Laughing All the Way Toward Wisdom (2019). For a list of other recent books and online publications, click here.
“Hiroshima marked the beginning of a new and differently unstable political era, in which technological change wildly outpaced the human capacity for moral reckoning.” We find these words near the beginning of “The Machine (1946-2016),” the last part (some 270 pages) of Jill Lepore’s lengthy and highly praised These Truths: A History of the United States. The rest of this section provides little hope that the gap she writes of is narrowing. This failure of ours is what is most alarming about these years.
Lepore’s survey of our post-WWII years addresses computing developments, polling, and political polarization. UNIVAC, the Universal Automatic Computer, was first revealed to the public in 1951. Along with subsequent computing, it helped turn “people into consumers whose habits could be tracked and whose spending could be calculated, and even predicted.” It also wreaked political “havoc, splitting the electorate into so many atoms,” and it contributed to newer forms of alienated labor.
Lepore thinks that conservatives took over the Republican Party in the late 1970s and early 1980s and gained a “technological advantage” over Democrats that “would last for a long time.” In this same period, corporations increasingly used computers to conduct their own polls, the accuracy of which Lepore often questions. By the 1990s, conservatives were increasingly using “targeted political messaging through emerging technologies” and were contributing to “a more atomized and enraged electorate.”
Although collapsing communist regimes and the end of the Cold War, culminating in the disintegration of the USSR in 1991, boosted Americans’ confidence in the future, Lepore believes “they were unable to imagine the revolution in information technology that would resist regulation and undermine efforts to establish a new political order.”
Despite the early Republican advantage in information technology, its impact on Democrats was also great. In the 1990s, Silicon Valley entrepreneurs and other professionals came to dominate the party, which deemphasized concerns of blue-collar workers, as it “stumbled like a drunken man, delirious with technological utopianism.” In February 1996, in what “would prove a lasting and terrible legacy of his presidency,” Bill Clinton signed the Telecommunications Act. By deregulating the communications industry, it greatly reduced antimonopoly stipulations, permitted media companies to consolidate, and prohibited “regulation of the Internet with catastrophic consequences.”
Despite claims that the Internet helped democratize political life, Lepore thinks that social media, expanded by smartphones, “provided a breeding ground for fanaticism, authoritarianism, and nihilism.” She writes of how the alt-right used web sites like Breitbart to spread its influence and how the Internet was “easily manipulated, not least by foreign agents. . . . Its unintended economic and political consequences were often dire.” The Internet also contributed to widening economic inequalities and a more “disconnected and distraught” world.
Beginning in the 1990s the concept of innovation “gradually emerged as an all-purpose replacement” for progress. The newer goal was more concerned with profit than any moral improvement, and it was often perceived as “disruptive innovation.” One of its proponents was Mark Zuckerberg, who in 2004 founded Facebook. Lepore quotes him as saying, “Unless you are breaking stuff, you aren’t moving fast enough.”
Newspapers were one of the casualties of this disruption. Compared to them, Internet information was “uneven, unreliable,” and often unrestrained by any type of editing and fact-checking. The Internet left news-seekers “brutally constrained,” and “blogging, posting, and tweeting, artifacts of a new culture of narcissism,” became commonplace. So too did Internet-related companies that fed people only what they wanted to see and hear. Further, social media “exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right. . . . The ties to timeless truths that held the nation together, faded to ethereal invisibility.”
During the twenty-first century political polarization accelerated as the Internet enabled people “to live in their own realities.” Lepore quotes conservative talk-radio host Rush Limbaugh as saying in 2009 that “science has been corrupted” and “the media has been corrupted for a long time. Academia has been corrupted. None of what they do is real. It’s all lies!” Instead, the “conservative establishment” warned audiences away from any media outlets except those that reinforced right-wing views. Such polarization also affected people’s ability to deal with our most pressing global problem—climate change—because, as Limbaugh believed, the science of the “alarmists” could not be trusted.
Although one can argue that Lepore pays insufficient attention to all the pluses of technological change, her main point that our moral advances have failed to keep pace with technological developments is irrefutable. One can further argue that many of our main problems today, such as climate change, nuclear buildups, cybersecurity, growing economic inequality, and the Trump presidency, are related to our inability to relate wisely to our technological changes. The popularity of our tweeting president, for example, was greatly boosted by his starring role in the twenty-first-century reality TV show The Apprentice.
More than four decades ago economist and environmentalist E. F. Schumacher bemoaned that “whatever becomes technologically possible . . . must be done. Society must adapt itself to it. The question whether or not it does any good is ruled out.” A decade ago I concluded that “it was indeed evident how difficult it was for people’s prudence, wisdom, and morality to keep pace with technological change.” More recently, I updated this perspective by citing the brilliant and humane neurologist Oliver Sacks, who shortly before his death in 2015 stated that people were developing “no immunity to the seductions of digital life” and that “what we are seeing—and bringing on ourselves—resembles a neurological catastrophe on a gigantic scale.”
Undoubtedly, how to ensure the use of digital and other technology to improve the common good is a tough problem. One place to look is to futurists. Psychologist Tom Lombardo is one of the wisest ones. He recognizes that “the overriding goal” of technology has often been “to make money . . . without much consideration given to other possible values or consequences,” but in his 800-page Future Consciousness: The Path to Purposeful Evolution he details a path by which we can evolve toward a more noble way of managing technology: by developing “a core set of character virtues, most notably and centrally wisdom.”
Another source of wisdom regarding technology is religious and philosophical thinkers. In the 1970s Schumacher, in his chapter on “Buddhist Economics” in Small Is Beautiful, sketched out a way of looking at technology and economics wholly different from the dominant Western one. More recently, in an encyclical on climate change—which the nonbeliever neurologist Sacks referred to as “remarkable”—Pope Francis devoted many pages to technology and acknowledged that at present it “tends to absorb everything into its ironclad logic.” But in opposition to our present “technocratic paradigm” he called for a “bold cultural revolution” based on noble values and goals.
Finally, on a history site such as HNN it is appropriate to ask, “Does history give us any hope that such a ‘bold cultural revolution’ can occur?” Can our approach to technology change from that of the dominant Western one of the last few centuries? Despite indicating our many post-WWII failures to cope wisely with technological change, Lepore does provide examples of movements and individuals that changed our history’s trajectory.
She writes of the Second Great Awakening, a religious revival movement that swept over the USA in the 1820s and 1830s. It increased church membership from one out of ten Americans to eight out of ten. She recalls the long struggle to end U.S. slavery from Benjamin Franklin, whose “last public act was to urge abolition,” to Frederick Douglass, whose writings helped inspire Lincoln and continue to inspire Lepore. She notes that after the money-grubbing Gilded Age, the Progressive Era emerged, and that “much that was vital” in it grew out of the Social Gospel movement, which “argued that fighting inequality produced by industrialism was an obligation of Christians.”
She recounts the many battles for civil rights from the Civil Rights Act of 1866, through Martin Luther King’s efforts and the Civil Rights Act of 1964, to the contested battles for the rights of blacks, women, immigrants, and LGBTs during the Trump presidency. She also details Franklin Roosevelt’s New Deal, whose scope was “remarkable” in combatting the Great Depression, when “nearly five in ten white families and nine in ten black families endured poverty,” and during which President Herbert Hoover argued against government relief, believing it would plunge the nation “into socialism and collectivism.”
One significant historical change that Lepore pays scant attention to is the end of the Cold War, noting simply, “by 1992, more than four decades after it began, the Cold War, unimaginably, was over.” That “unimaginable” ending, however, was due to individuals (like Soviet leader Mikhail Gorbachev and Ronald Reagan) who acted in unexpected ways to carry out steps that other individuals (like activist Andrei Sakharov and other protesters) and movements had long been demanding. In other parts of the world, leaders like Gandhi and Nelson Mandela also changed history’s trajectory through achievements like nonviolent resistance and the end of apartheid.
As discouraging as post-WWII efforts to manage technology wisely have been, there may be, paradoxically, glimmers of hope emerging from our present dire climate-change situation. In a recent New York Times op-ed, “Time to Panic,” we read that “we’re at a point where alarmism and catastrophic thinking are valuable, for several reasons.” One is that “politics, suddenly, is on fire with climate change.” Just as the catastrophe of the Great Depression led to the imaginative New Deal, so too the present climate-change crisis might soon alarm us enough to spark new actions and ways of interacting with our planet—and with technology in general.