History News Network


The Anti-Democratic Origins of Voter Prediction (Review)

I’m always a bit relieved when a tech company’s algorithm seems to fall short of mind reading. It’s somewhat reassuring when a so-called smart ad displays hideous shoes I would never buy just because I once bought a different pair from the same brand, or when Facebook suggests I befriend a good friend’s ex, having determined that he’s in my extended orbit, but lacking the sophistication to comprehend that I dislike him. In an age of unchecked data harvesting, in which corporations have free rein over the entirety of your internet history and personal information—not to mention, in some cases, your movements, thumbprint, face, and DNA—there’s a certain comfort in seeing where the machine still hasn’t quite mastered the quirks and nuances of human thought.

That sense of comfort, though, is almost certainly illusory. Tech companies' surveillance and rampant data collection practices don't need to replicate the exact intricacies of a person's brain in order to invade the privacy of billions of people. There's Google, for one, which quietly yet meticulously collects data on users' searches, locations, and video-watching habits, and has even partnered with a private health care chain to obtain the medical information of millions of people. Then there's Facebook, which stores information on the pages its users have clicked on, the people they've searched for, and their ethnicities. These tech giants' primary use of all this personal data so far seems to be to sell ads. But the Cambridge Analytica scandal surrounding the 2016 election also raised the possibility that these troves of information could be marshaled in an attempt to influence a national election, or change the course of democracy.

Though such predations may look like a uniquely twenty-first-century problem, Jill Lepore's new book, If Then, demonstrates that this technological dystopia has been in the making since at least the Cold War era. Lepore traces the birth of one strain of predictive technology through the rise and fall of the mysterious Simulmatics Corporation, an advertising and political consulting company that produced a "People Machine," a computer program designed to profile and predict the behavior of voters. Among other ventures, Simulmatics consulted on John F. Kennedy's 1960 campaign and (it claimed) helped him win, in part by persuading him not to obscure or downplay his Catholicism and to take a stronger stance in support of the burgeoning civil rights movement. Though the company was eventually shuttered and forgotten, it was also, Lepore writes, a shadowy and indirect participant in some of the most significant events of the decade, including the Vietnam War, the civil rights movement, and the construction of Lyndon Johnson's Great Society.

Compared to today's still-imperfect algorithms, Simulmatics' People Machine was rudimentary, and it very often either missed the mark or produced rather commonsense results. And the company itself, though staffed by several brilliant minds, was at least as much the product of savvy hype and its founder's well-placed connections as it was of groundbreaking technological innovation. What Lepore's rich account unearths is the impetus behind the project, a set of attitudes that continues to drive psychographic microtargeting efforts today: For the stratum of professionals who developed voter prediction, politics was primarily a code to be cracked, rather than a means to a better life, let alone a matter of survival. The men who ran Simulmatics were archetypal Cold War liberals, committed, in a sense, to an idea of human progress and the prospect of a brighter world, but also completely in thrall to the idea of engineering and controlling that world themselves.

Read entire article at The New Republic