
Health Insurance



  • How the "Job" Became the Center of American Life

    "Since about the 1940s, Americans have been encouraged to look to their jobs for nearly all of life’s necessities: a living wage, health insurance, and retirement benefits, as well as intangibles like friendship, identity, and a sense of purpose." Historians Nelson Lichtenstein and James Livingston explain why. 



  • Our Insurance Dystopia

    by Caley Horan

    America's health insurance morass is the result of privatizing risk in pursuit of profit, displacing the older ideal of mutual, universal risk sharing.



  • Inequality and the Coronavirus

    by Liz Theoharis

    Here’s the simple truth of twenty-first-century America: all of us live in a time and an economic system that values our lives according to our ability to produce profits for the rich, or according to the wealth we already possess.