Teachers Edition: Grades 3-6 (Backgrounders) – articles brought to you by History News Network. Thu, 18 Apr 2024

Environmentalism

Worth Reading

  • Frank Uekoetter: Global Warming – It's 1970 All Over Again
  • Brian Hamilton: Making “Environmentalism” Relevant for Everyone
  • Nancy Unger: Fifty Years After “Silent Spring,” Let's Not Roll Back Environmental Protections
    Background

    On February 17, 2013, nearly 35,000 activists gathered on the National Mall in Washington, D.C. to protest the construction of the Keystone XL pipeline, an oil pipeline intended to connect oil fields in Alberta, Canada with refineries in Kansas, Oklahoma, and Texas. Critics of the project have contended that by passing through the fragile Sand Hills in Nebraska, the pipeline could contaminate one of the largest freshwater sources in the United States (the oil company constructing the pipeline has since altered its construction plans to avoid the Sand Hills), and that the oil the pipeline would carry – extracted from a substance called “tar sands” – emits approximately 12 to 17 percent more greenhouse gases (the major contributing factor in climate change) than conventional oil.

    The uproar over the Keystone XL project is but the latest and most visible issue around which the environmental movement has rallied.

    Environmentalism in the United States, broadly stated, is a political and social philosophy that focuses on human interactions with the natural world. There are environmentalist groups dedicated to wilderness preservation; protecting endangered species; fighting climate change; improving public health through government regulation of, for example, pollution; improving the urban environment; and a whole host of other issues.

    Environmentalism as an organized political movement has a relatively recent history. Earth Day, celebrated every year on April 22, was first held in 1970 as an offshoot of the anti-Vietnam War movement. Green parties, political groups which fused environmental politics with social democracy, began appearing in Western Europe and the United States in the late 1970s and early 1980s; while the U.S. Green Party remains small, with its electoral appeal limited mainly to a few big cities, green parties in other countries have a broader appeal. The German Green Party, for instance, is the fifth-largest political party in Germany, and has formed part of liberal coalition governments in that country.

    Of the two major political parties in the United States, it is generally the Democratic Party which is most identified with environmental politics, but this was not always the case. The Environmental Protection Agency was established while Republican Richard Nixon was president, and John Lindsay, the Republican mayor of New York, was a key player in the first Earth Day celebrations in New York City (though Lindsay later became a Democrat).

    What the Left Says

    The modern environmental movement was an outgrowth, in many ways, of left-wing social activism in the 1960s, and so it's not surprising that environmental politics are primarily identified with liberals. But there are, as we noted earlier, different groups within the movement. Democratic politicians will generally voice support for environmental causes, but this can vary based on geography and the demands of constituents. Democrats from West Virginia or Michigan, for example, are generally fiercely opposed to additional environmental regulation because of the coal and automobile industries in their states. Democrats from states like California tend to be much more supportive of, for example, emission standards on automobiles.

    What the Right Says

    Modern American conservatives tend to fiercely oppose environmental regulation, and even deny the existence of climate change – either by outright calling climate change a “hoax,” as Oklahoma senator James Inhofe, the ranking Republican on the U.S. Senate's environmental committee, has done, or by insisting that climate change is a natural process (the overwhelming majority of climate scientists – over 90 percent – believe that the Earth is rapidly warming due to human activity). Ironically, conservatives and Republicans were once leaders in the environmental movement, but as the party became more ideologically rigid in the 1990s and 2000s – and as big business became more and more opposed to additional environmental regulation – environmental politics gradually became one of the Republican Party's third rails. Rick Santorum, a contender for the Republican presidential nomination in 2012, decried a “reign of environmental terror” by the Environmental Protection Agency – an agency established, ironically, by a Republican president.

    Historical Background

    The deeper historical origins of the modern environmental movement date back to the Industrial Revolution of the late eighteenth and nineteenth centuries, which saw tremendous changes to the physical landscape and introduced modern industrial pollution in the form of coal emissions (it is at roughly this moment in history that human-caused climate change began, as huge amounts of greenhouse gases started to be emitted into the atmosphere by coal-burning factories and engines).

    The factories of the nineteenth century belched noxious gases. London, a heavily industrialized city in heavily industrialized Britain, became notorious for its air quality – the first “smog,” air pollution so thick it resembles fog, dates from roughly this period, and was responsible for many thousands of deaths. British writers – not least among them Charles Dickens – frequently compared the factories and living conditions of Victorian Britain to hellish furnaces.

    The environmental degradation of the industrial era was part of what motivated the New England “transcendentalists” of the nineteenth century. They advocated a simple, back-to-nature, pastoral lifestyle, as opposed to industrial urban living. The most famous transcendentalist work, Walden, by Henry David Thoreau, would prove very influential within the wilderness preservation/conservation movement.

    John Muir, a Scottish-born naturalist, became probably the most important figure within the conservation movement. Muir was one of the co-founders of the Sierra Club, which is still one of the largest environmental groups in the United States. He was also instrumental in the establishment of Sequoia and Yosemite National Parks in California in 1890 (Yellowstone, created in 1872, was the first U.S. national park); the national park system has since expanded to fifty-nine separate parks.

    But the end of the nineteenth century saw challenges for wilderness preservation in the United States. The passenger pigeon, a species of North American bird which migrated in massive flocks numbering in the hundreds of millions, went extinct due to overhunting and human encroachment; the last passenger pigeon died in captivity in 1914 (though there is ongoing discussion about potentially reviving the species through cloning). The American buffalo nearly followed; at its peak, the species numbered nearly 50 million individuals. After systematic overhunting in the nineteenth century, the population declined to only a few hundred individuals. Private efforts by a handful of ranchers managed to save the species from extinction, and there are now nearly 350,000 bison in North America – but the overwhelming majority of those bison belong to domesticated herds.

    Though the federal government did not act to save either the bison or the passenger pigeon, the disappearance of the two species brought preservation and conservation efforts into the public sphere. It would not be until the 1960s, however, that the modern environmental movement was born.

    After the end of World War II, it became common for farmers to spray their crops with the insecticide DDT – and for the chemical to be used indiscriminately in the effort to eradicate malaria, a disease carried by mosquitoes, from Europe and the United States. In her 1962 book Silent Spring, Rachel Carson advanced the shocking (for the time) thesis that it probably wasn't a good idea to dump massive amounts of DDT – the side effects of which were not fully understood – into the environment, and she tied increased cancer rates and the near-extinction of some bird species, especially the bald eagle, to indiscriminate DDT use. DDT was eventually banned by the U.S. government in 1972.

    To address increasing public concern about the environment and health, the U.S. Congress passed a flurry of legislation enacting new environmental regulations: the Clean Air Act of 1963, which regulated emissions of noxious gases; the National Environmental Policy Act of 1970, part of a package of federal initiatives which also birthed the Environmental Protection Agency; and the Clean Water Act of 1972. At the same time, environmentalism entered the public consciousness through the space program; the Blue Marble, a photograph of the Earth taken by Apollo 17 astronauts in 1972, remains to this day an enduring symbol of the environmental movement.

    The one issue, however, which has gradually overtaken every other aspect of the environmental movement, only really entered public awareness in the 1980s: global warming. Climate scientists had long been concerned about the potential for catastrophe due to runaway global warming, but a consensus on the reality of global warming only cemented at the end of the 1980s.

    In brief: the average temperature of the Earth is quickly rising, this is primarily due to the massive amounts of greenhouse gases dumped into the atmosphere due to industrialization, and the consequences of this increase in temperatures will be disastrous (think more extreme weather events like hurricanes and droughts; a decrease in agricultural production, social instability, sea level rises... and that's just the beginning).

    The environmental movement has long advocated reducing global emissions of greenhouse gases, scoring a notable victory with the Kyoto Protocol in 1997, but because the protocol does not require developing countries to reduce their greenhouse gas emissions – and because the United States, one of the largest emitters of greenhouse gases along with China and India, has not ratified the treaty – its effectiveness in slowing climate change has been limited. Increasingly, environmentalists have begun to argue for preparing for climate change rather than preventing it, on the grounds that it is too late for meaningful measures to stop the process.

    https://historynewsnetwork.org/article/152221

    Taxes

    Worth Reading

  • Ray Raphael: The Income Tax Amendment Turns One Hundred and It’s Worth Celebrating
  • Is the Income Tax Illegal?
  • Michael Lind: What If All Sides are Wrong about Taxes?
  • Q&A: How FDR Built Today’s Tax System
    Background

    How does the federal income tax – due on April 15 each year – work in the United States?

    It’s complicated.

    Since the passage of the 16th Amendment, income has been divided into tax brackets, and the portion of a person’s income that falls within each bracket is taxed at that bracket’s rate.

    In other words, say a single person makes $500,000 a year in income. That puts him in the top income bracket of 35%, which applies to income over $388,350. But he will only pay that 35% on the income he’s made over $388,350 – he will pay the lowest marginal rate of 10% on his first $8,700 in income, then 15% on income between $8,700 and $35,350, then 25% on income from $35,350 to $85,650, then 28% on income from $85,650 to $178,650, then 33% from $178,650 to $388,350, then the top rate of 35% on the rest of his income.

    So, his taxes break down thusly:

  • $8,700 of his income is taxed at 10%, so he pays $870 in taxes for the first income bracket.
  • $25,650 of his income is taxed at 15%, so he pays $3,847.50 in taxes for the second income bracket.
  • $50,300 of his income is taxed at 25%, so he pays $12,575 in taxes for the third income bracket.
  • $93,000 of his income is taxed at 28%, so he pays $26,040 in taxes for the fourth income bracket.
  • $209,700 of his income is taxed at 33%, so he pays $69,201 in taxes for the fifth income bracket.
  • $111,649 of his income is taxed at the highest tax rate, 35%, so he pays $39,077.15 taxed at this rate.
  • Our hypothetical taxpayer theoretically pays a total of $151,761 in federal taxes on an income of $500,000, meaning he is actually taxed at about 30.4% of his income.

    This is what is referred to as a progressive income tax: as taxable income increases, the tax rate increases. If our example had made $1,000,000, he would have paid the top tax rate of 35% on $611,650, meaning he’d pay $214,077.50 at the top rate and $326,761 in taxes overall, an effective rate of about 32.7%.
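    The bracket-by-bracket arithmetic above can be sketched in a few lines of code. This is an illustrative example using the 2012 single-filer brackets quoted in this article; the function name and structure are our own, not part of any official tax software, and it ignores deductions and other kinds of income.

```python
# 2012 single-filer brackets from the article: (upper bound of bracket, rate).
# The final bracket has no upper bound.
BRACKETS = [
    (8_700, 0.10),
    (35_350, 0.15),
    (85_650, 0.25),
    (178_650, 0.28),
    (388_350, 0.33),
    (float("inf"), 0.35),
]

def federal_tax(income: float) -> float:
    """Tax owed under a progressive (marginal) rate schedule."""
    tax = 0.0
    lower = 0.0
    for upper, rate in BRACKETS:
        if income <= lower:
            break
        # Only the slice of income that falls inside this bracket
        # is taxed at this bracket's rate.
        taxed_slice = min(income, upper) - lower
        tax += taxed_slice * rate
        lower = upper
    return tax

print(round(federal_tax(500_000), 2))    # 151761.0 -- about a 30.4% effective rate
print(round(federal_tax(1_000_000), 2))  # 326761.0 -- about a 32.7% effective rate
```

    Note that the effective rate (total tax divided by total income) is always lower than the top marginal rate, because the lower brackets apply to everyone.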

    If that’s not confusing enough, that doesn’t take into account tax deductions for home mortgages, dependents, charitable contributions, state and local taxes, and more. Especially at the upper end of the income scale, relatively few people pay their theoretical tax rate.

    That also doesn’t take into account other kinds of income and taxes, such as capital gains (income generated from selling assets like stocks, bonds, or real estate), which are taxed at a top rate of 15%; payroll taxes for Social Security and Medicare; and other kinds of personal taxes.

    What the Left Says

    The left has long been a proponent of the progressive income tax – arguing that it is more fair (those who have the ability to pay more should do so), that progressive taxation reduces income inequality, and that there is a correlation between the perception that a tax is fair and the general happiness of a citizenry. However, some critics on the left point out that the American system of taxation – handing out exemptions as a form of social policy – is ineffective, and that a fair tax policy should also target consumption, i.e. sales taxes.

    What the Right Says

    Since the 1990s, thanks largely to anti-tax advocates like Grover Norquist, the Republican Party has become the “no new taxes” party – a Republican in Congress hasn’t voted for a tax increase since 1990. But the right has long been skeptical of the progressive income tax, arguing that it discourages people from making more money (because it will be taxed at a higher rate) and encourages loopholes. Higher tax rates are also ineffective, other conservative economists argue – in particular Arthur Laffer – because after a certain tax rate is reached, federal revenue will actually decline. This theory, known as the Laffer curve, remains controversial to this day.

    Historical Background

    Americans have never liked paying taxes – we started a revolution in no small part because Great Britain was levying taxes against American colonists without their direct consent. It’s not surprising that under America’s first constitution – the Articles of Confederation – Congress had no power to levy taxes. It’s also not surprising that this ended up being a disaster, and that in the U.S. Constitution, which took effect in 1789, Article I, Section 8, Clause 1 specifically grants Congress the power to impose “Taxes, Duties, Imposts, and Excises.”

    In practice, this meant that Congress could tax states on the basis of population (including the slave population) and property (again, including slaves), but could not directly impose income taxes on U.S. citizens. This was less of an issue in the eighteenth century, when a person’s wealth was mainly measured by property and land as opposed to income (and indeed, property taxes of the day affected mainly the wealthy).

    By 1861, though, the country faced a problem: how to pay for the Civil War? The first federal income tax was levied that same year (and was replaced by another income tax in 1862). These wartime income taxes were allowed to expire after the end of the war. The Supreme Court ruled that they were “indirect” taxes and therefore perfectly constitutional, but in 1895 the first peacetime federal income tax was struck down by the Court as a “direct” tax.

    Twenty years after that, in 1913, the 16th Amendment was ratified by the states, explicitly granting Congress “power to lay and collect taxes on incomes, from whatever source derived, without apportionment among the several States, and without regard to any census or enumeration.”

    Since the passage of the 16th Amendment, the federal income tax has been the primary source of federal revenue, especially during wartime. The top tax rate soared to 77% on income over $1,000,000 during World War I, and the highest top rate in history was 94% on income over $200,000 during World War II (but remember, income under $200,000 was taxed at much lower rates). Top rates declined over the following decades, to the point where the current top tax rate, at 35%, is less than half what it was six decades ago.

    But it’s also important to remember that relatively few people paid the highest tax rate on their income, since there were so many deductions available through the tax code.

    This is an IMPORTANT POINT: Congress has repeatedly shown a preference for adding exceptions/deductions in the federal tax code to benefit certain groups as a form of social policy. For example, tax deductions are handed out to encourage child-rearing, home ownership, small businesses, clean energy, and a variety of other social policies favored by the government.

    These special provisions are one of the biggest sources of criticism, both by anti-tax activists and even by IRS employees, as they complicate the tax code and make it much harder to collect taxes.

    Which brings us to the Internal Revenue Service, the dreaded IRS. The IRS, founded in 1862 as the Bureau of Internal Revenue (it received its current name in 1953), is a bureau of the Department of the Treasury and is responsible for collecting federal income taxes. For perspective, it has around 106,000 employees and collects $2.4 trillion from around 234 million tax returns, 140 million of which are individual income tax returns (which in turn net $1.1 trillion in revenue).

    https://historynewsnetwork.org/article/151575

    Medicare

    Worth Reading

  • HNN's History of Healthcare Reform http://hnn.us/articles/hnns-history-healthcare-reform
  • New York Times Topic: Medicare http://topics.nytimes.com/top/news/health/diseasesconditionsandhealthtopics/medicare/index.html?8qa
  • Colin Gordon: “Hands Off My Medicare: The Deadly Legacy of Social Insurance” http://hnn.us/node/122017
    Background

    As people get older, they tend to get sicker. You've probably seen this yourself with your parents and grandparents. According to the Centers for Disease Control and Prevention (CDC), those 65 and older (usually called “the elderly” in the media) are nearly twice as likely to need prescription medication to stay healthy; they are also much more likely to require hospitalization -- in fact, the CDC estimates that nearly 25 percent of those aged 65 and up are in “fair or poor health.”

    In order to help older people struggling with health problems, the U.S. government created the Medicare program in 1965. Medicare provides near-universal health insurance for those 65 and up – meaning that doctor's appointments, hospital visits, pills, and other kinds of health care are paid for by the federal government. The program currently insures 45 million people (not all are elderly – Medicare also covers younger people with disabilities, such as kidney disease). Projections show that nearly 80 million people will be covered by Medicare by 2030 due to America's aging population (today's median* age of 36.8 years will be over 39 years by 2035). Nearly half of all Medicare recipients have incomes below the federal poverty line.

    Medicare is one of the most popular federal programs, and is widely considered to be a “third rail” in American politics, meaning that if a politician “touches” Medicare (by slashing benefits or doing away with the program entirely), it will destroy his or her political career.

    The program is divided into four parts, each with its own separate budget: Medicare Part A, which covers hospital stays; Medicare Part B, which covers ordinary doctor's visits; Medicare Part C, added in 1997, which allows Medicare recipients to receive their benefits through private insurance companies; and Medicare Part D, which took effect in 2006 and covers prescription drugs.

    Medicare is an expensive program -- in 2009, it cost the federal government $480 billion, and that's expected to increase in the future. Though Medicare is funded by a Medicare tax paid by all employed persons, Medicare Part A's expenses will start outpacing the income generated by this tax by 2024 (though there will still be enough money to cover most benefits, and Medicare Part B will still have a balanced budget). This long-term budget problem is one of the looming governmental crises, and it is expected to be a major part of upcoming budget talks between President Obama and the Republican-held House of Representatives in March.

    *meaning that half of all people are above this age, half of all people are below it.

    What the Left Says

    Medicare is one of the fundamental parts of the social welfare state, along with Social Security and the similar Medicaid program, which provides health care for people in poverty and/or with limited incomes. In order to “fix” Medicare (meaning making the program fiscally stable), President Obama has proposed raising Medicare premiums, increasing the eligibility age for Medicare benefits from 65 to 67, and increasing Medicare payroll taxes.

    What the Right Says

    The right has long had a conflicted relationship with Medicare, and with the concept of government health insurance in general. In the 1960s, Barry Goldwater and Ronald Reagan called the program socialistic, and there have been concerns that Medicare delivers low-quality health care outcomes. Today, conservative politicians generally express support for Medicare due to its overwhelming popularity with voters – George W. Bush signed Medicare Part D, which covers prescription drugs, into law in 2003 (it took effect in 2006) – though cost-cutting reform plans by prominent Republicans like Paul Ryan would change the nature of Medicare by introducing vouchers for seniors to buy private (as opposed to government) insurance plans.

    Historical Background

    The earliest government health insurance program was introduced in Germany in the 1880s as part of Otto von Bismarck's social welfare reforms, the first of their kind anywhere in the world.

    With the increasing complexity of health care in the early twentieth century (sparked in part by the development of new technologies like x-rays, electrocardiographs, and antibiotics, along with the general increase in the complexity of medical knowledge), health care costs began to rise dramatically.

    In the 1930s, Democrat Franklin D. Roosevelt tried to include some form of government health insurance in his New Deal reforms, but failed in the face of intense resistance from the American Medical Association (AMA). His successor, Harry S. Truman, also pushed hard for national health insurance after World War II, but again failed due to strong AMA resistance (a similar effort, incidentally, succeeded in Britain, leading to universal health care through the National Health Service, established in 1948).

    Republican President Dwight D. Eisenhower was little interested in expanding the social welfare state, but in 1960 he signed the Kerr-Mills Act, which created a federal program to subsidize individual state health care programs for the elderly. John F. Kennedy called Kerr-Mills “grossly inadequate” in his 1960 campaign and proposed a program he called “Medicare.” Again, the AMA fiercely resisted, even enlisting the aid of Hollywood star (and future president) Ronald Reagan to act as spokesman against Medicare (he called it the first step toward socialism in the United States).

    However, two things happened: one, Kennedy was assassinated and his successor, Lyndon Johnson, pushed most of Kennedy's social legislation through Congress – the Great Society programs. Two, the stance of the public changed, due both to increasing medical costs and an aging population (the elderly made up 10 percent of the population  in 1965, up from 4 percent in 1900, though still a far cry from the 2030 projections). Medicare passed in 1965, along with the Medicaid program for low-income Americans.

    Since 1965, Democrats have been trying to expand national health insurance to cover all Americans, though not necessarily through expanding Medicare. (And indeed, the Affordable Care Act of 2010, which  did comprehensively expand health insurance to cover almost all Americans, did so through a combination of various policies, most significantly and controversially through mandated purchasing of private insurance plans.)

    Medicare has since developed into the so-called “third rail of American politics” – if a politician touches it, he or she risks political suicide (and in fact one argument frequently employed by critics of the Affordable Care Act was that it would take money away from Medicare). But with a population that continues to age and with medical expenses that continue to climb, the future financial stability of Medicare is in doubt.

    Questions

  • Should the health care of the elderly be a government responsibility?
  • Do you know anyone on Medicare?
  • How would you try to reform Medicare to make it solvent?
    https://historynewsnetwork.org/article/150811

    Social Security

    Worth Reading

    Background

    Social Security is the nation's largest social program. More than 50 million people receive benefits totaling more than $600 billion a year. Originally established to provide retirement benefits, the program was later extended to cover widows, the disabled, and, in some cases, children. It is estimated to keep 40 percent of the elderly out of poverty. Currently, full retirement benefits are paid to people when they reach the age of 65, though younger people will have to wait until they are 66 or 67, depending on the date of their birth.

    Both political parties now say that they support Social Security, the landmark program of FDR's New Deal. But they have different ideas about it. Democrats insist the program is essentially sound, though changes are needed to assure it can meet its obligations into the 2040s, when the Trust Fund into which workers have been paying is expected to take in fewer dollars than it pays out. Republicans insist the program is in danger of going broke and favor a plan to let workers divert contributions into personal investment accounts, a plan Democrats refer to as privatization.

    Polls show that an overwhelming majority of the American people support the program. No effort to substantially change it has ever succeeded. It is funded by the payments workers make into the Trust Fund through the payroll deduction system: 6.2% of a worker's paycheck, on earnings up to $110,100 (in 2012). Employers match the contributions made by workers.
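    As a rough illustration of how the payroll deduction above works, here is a short sketch. It uses only the 6.2% rate and the 2012 wage cap described in this article; the function name is ours, and real paychecks also include the separate Medicare tax, which this sketch leaves out.

```python
WAGE_CAP = 110_100   # 2012 Social Security taxable wage base
RATE = 0.062         # employee share; the employer pays a matching 6.2%

def social_security_tax(annual_wages: float) -> float:
    """Employee's annual Social Security contribution:
    6.2% of wages, but only on earnings up to the cap."""
    return min(annual_wages, WAGE_CAP) * RATE

print(round(social_security_tax(50_000), 2))   # 3100.0 -- employer pays another $3,100
print(round(social_security_tax(250_000), 2))  # 6826.2 -- earnings above the cap are untaxed
```

    The cap is why Social Security taxes are sometimes called regressive: a worker earning $250,000 pays the same dollar amount as one earning $110,100.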

    What the Left Says

    From economists Mark Weisbrot and Dean Baker, authors of "Social Security: The Phony Crisis":

    Social Security is our largest and most successful antipoverty program, keeping about half of the nation's senior citizens from falling below the official poverty line. In 1959, the poverty rate among the elderly was more than 35 percent; by 1970, it was twice the rate of that for the general population. Largely as a result of the Social Security program, it has since fallen to 10.8 percent, or slightly less than that for the general population. For two-thirds of the elderly, Social Security makes up the majority of their income; for the poorest 16 percent, it is their only source of income.

    Social Security provides about $12 trillion worth of life insurance, more than that provided by the entire private life insurance industry. The program's 44 million beneficiaries today include 7 million survivors of deceased workers, about 1.4 million of whom are children. Some 5.5 million people receive disability benefits, including not only disabled workers but also their dependents. For a typical employee, the value of the insurance provided by the program would be more than $200,000 for disability and about $300,000 for survivors insurance.

    The coverage of the program is nearly universal -- about 95 percent of senior citizens either are receiving benefits or will be eligible to receive them upon retirement. For a society that wants to ensure some minimum standard of living for its elderly, this is an important achievement in itself. But it also allows for other accomplishments that would be difficult or impossible to replicate in the private sector. For example, Social Security provides an inflation-proof, guaranteed annuity from the time of retirement for the rest of the beneficiary's life. The cost of retirement, survivors, and disability insurance does not depend on the individual's health or other risk factors. And the benefits are portable from job to job, unlike many employer-sponsored pension plans.

    What the Right Says

    From the GOP party platform 2012

    While no changes should adversely affect any current or near-retiree, comprehensive reform should address our society's remarkable medical advances in longevity and allow younger workers the option of creating their own personal investment accounts as supplements to the system. Younger Americans have lost all faith in the Social Security system, which is understandable when they read the nonpartisan actuary's reports about its future funding status. Born in an old industrial era beyond the memory of most Americans, it is long overdue for major change, not just another legislative stopgap that postpones a day of reckoning. To restore public trust in the system, Republicans are committed to setting it on a sound fiscal basis that will give workers control over, and a sound return on, their investments. The sooner we act, the sooner those close to retirement can be reassured of their benefits and younger workers can take responsibility for planning their own retirement decades from now.

    From the Heritage Foundation

    Social Security's finances significantly worsened last year, according to the new 2012 trustees report, because of a weakened economy and structural problems with the program. The April 23 report shows that all people who receive Social Security benefits face about a 25 percent benefit cut as soon as 2033 -- three years earlier than predicted in last year's report. The program's long-term deficit is now larger than it was before the 1983 reforms. In order to pay all of its promised benefits, Social Security would require massive annual injections of general revenue tax money in addition to what the program receives from payroll taxes. These additional funds would be needed for the next 75 years and beyond.

    Historical Background

    Social Security was established in 1935 to help protect older Americans from poverty, then the demographic group most in need. A product of the New Deal, it was originally conceived to cover only industrial and commercial workers, about half the workforce. It wasn't terribly popular; even many Democrats in Congress wanted to vote against it. Only FDR's insistence overcame their opposition. It hurt the economy in the short run, taking money out of the financial system at a time when money was needed to help lift the cloud of the Great Depression. It was a contributing factor in the Recession of 1937.

    In 1939 survivors' benefits were added for widows and their children. In 1956 benefits were extended to cover the disabled. Starting in 1975, benefits were automatically adjusted for inflation.

    In 1981 Ronald Reagan attempted to reduce benefits for early retirees and people in select categories in order to shore up the system, which was in danger of running short of cash. The U.S. Senate voted unanimously to reject the reform. Later, President Reagan appointed a commission under Chairman Alan Greenspan to address the problem. The commission's reforms, which were adopted by Congress, increased Social Security taxes substantially while postponing a scheduled cost-of-living increase. Soon surpluses began accumulating in the Trust Fund, saving the system. The surpluses were invested in special government bonds, essentially IOUs. For the next several decades politicians from both political parties used the surplus to finance their favorite government programs or tax cuts, knowing it wouldn't be needed until the baby boomers began retiring in large numbers in the 2010s. At that point less money would be coming into the system than going out in the form of benefits. That is just what has happened. Today the system spends more than it takes in, requiring Congress to dip into general funds as Social Security redeems the IOUs it received in return for the surplus money paid into the treasury over the decades.

    When FDR established the system he was pressed by some to finance the program from general revenues. He insisted the system be funded through deductions from workers' paychecks: "We put those pay roll contributions there so as to give the contributors a legal, moral, and political right to collect their pensions and their unemployment benefits. With those taxes in there, no damn politician can ever scrap my social security program. Those taxes aren't a matter of economics, they're straight politics."

    Fun Facts

    From the website of Social Security:

    Q: Is there any significance to the numbers assigned in the Social Security Number?

    A: Yes. The first three digits are assigned by the geographical region in which the person was residing at the time he/she obtained a number. Generally, numbers were assigned beginning in the northeast and moving westward. So people on the east coast have the lowest numbers and those on the west coast have the highest numbers. The remaining six digits in the number are more or less randomly assigned and were organized to facilitate the early manual bookkeeping operations associated with the creation of Social Security in the 1930s.

    Q: Who was the first person to get Social Security benefits?

    A: A fellow named Ernest Ackerman got a payment for 17 cents in January 1937. This was a one-time, lump-sum pay-out -- which was the only form of benefits paid during the start-up period January 1937 through December 1939.

    Topics for Discussion

  • Do you think Social Security will be around for you?
  • Should the federal government provide the elderly and the disabled with a pension?
    Source: https://historynewsnetwork.org/article/150810
    Presidential Inaugurations Download this backgrounder as a Word document

    Worth Reading

  • HNN Hot Topics: Presidential Inaugurations
  • Highlights of Past Inaugurals
  • Nixon Pigeon-Proofed His Inaugural Parade Route
    Background

    Presidential inaugurations -- which are held every four years on January 20, except when the date falls on a Sunday, in which case the public ceremony is held that Monday -- serve many purposes, but the main one is symbolic. Following the usually heated conflict attending elections, inaugurations serve to unite the country around the president, lifting him above the regular hurly-burly of politics and giving him a chance to strike a pose of statesmanship. The symbolism of a country uniting around its chosen leader is powerful. It is a testament to the people's commitment to self-government. In other countries the transition in power between leaders of opposing parties is often marked by bloodshed. But not here in the United States.

    The chief event at an inauguration, and the only one mandated by the Constitution, is the swearing in of the president. At the swearing-in ceremony the president takes the following oath, which is provided in the Constitution: "I do solemnly swear (or affirm) that I will faithfully execute the Office of President of the United States, and will to the best of my Ability, preserve, protect and defend the Constitution of the United States."

    By tradition presidents deliver an inaugural address in which they lay out their vision for the country. Some of these addresses have been so memorable that we remember select lines generations later.

    What the Left Says

    In recent decades inaugurations have become so extravagant that they have had to be sponsored by contributions from the wealthy and corporations. This has elicited criticism from liberals, who bemoan the influence of money in politics. It is hard, they argue, for a president to make decisions in the best interest of the country as a whole when his inaugurations are paid for by the wealthy. Periodically liberals have objected to the role of the military in inaugural day parades.

    What the Right Says

    Conservatives generally approve of inaugurations as a magnificent tradition. But for a time after the election of Democrat Franklin Roosevelt some small-government conservatives criticized the extravagance of modern inaugurations, which seemed to them to celebrate big government.

    Historical Background

    The first president to hold an inauguration was the first president: George Washington. He was sworn in on April 30, 1789, and every president thereafter followed his example, marking the swearing in with an address and a celebration (though in some years, especially during wartime, celebrating was kept to a minimum). The ceremony was long held on March 4; the date was changed to January 20 by the Twentieth Amendment, during FDR's administration, to reduce the length of time between the election and the start of the new president's term.

    With just a few exceptions, the outgoing president has attended the ceremony of his successor -- but not always in good cheer. Harry Truman and Dwight Eisenhower barely spoke on the ride from the White House to Ike's inaugural.

    The song "Hail to the Chief" was first played at an inauguration in 1845, for James K. Polk. The music comes from a tune by the English songwriter James Sanderson.

    Greatest Inaugural Addresses

  • George Washington's first (1789)
  • Thomas Jefferson's first (1801)
  • Abe Lincoln's first and second (1861, 1865)
  • Franklin Roosevelt's first (1933)
  • John F. Kennedy (1961)
    Common Themes in Inaugural Addresses

  • The People
  • Chosen people
  • Mission and American history
  • City on a Hill
  • Sacrifice
  • Unity
  • Founding Fathers
  • President as instrument of the people
  • Confidence
    Here are some highlights from past inaugurals:

    George Washington: At the suggestion of James Madison, who drafted the inaugural address, Washington included a plug for the as-yet-unratified Bill of Rights. It helped win passage of the amendments. Washington was apprehensive about his inauguration as president. He confided to a friend that he felt like a culprit going to the place of his execution.

    John Adams: His inaugural address included one sentence that had over 700 words!

    Thomas Jefferson: His most famous line, "we are all republicans, we are all federalists," was actually less an appeal to nonpartisanship than an acknowledgement of the support he had received from some federalists. When he assumed office he threw out federalists who hadn't supported him and installed members of his own party -- a practice that remained common until the establishment of civil service reforms late in the 19th century.

    James Monroe: His first inauguration in 1817 was the first to take place outdoors. It was by necessity: the British had burned the Capitol in the War of 1812.

    Martin Van Buren: He took office just as the country slipped into a depression. (So did Grover Cleveland in 1893.)

    William Henry Harrison: He served the shortest term (1 month, then died) and delivered the longest inaugural address (1 hour and 40 minutes). His inaugural featured the first formal parade. Harrison was the first president to travel about the country to meet people on the campaign trail, much to the horror of traditionalists, who thought it was too great a concession to democratic norms.

    Franklin Pierce: He memorized his address. (So did Grover Cleveland.) But his was the saddest of all inaugurations. His young son had been killed in a train accident just two months earlier. His wife refused to move into the White House when she learned that he had lied to her. He had sworn that he hadn't lifted a finger to win the nomination. He had.

    James Buchanan: Candles dripped on Buchanan's guests at his inaugural ball. His was the first inauguration to be photographed.

    Abe Lincoln: His son lost the president's first inaugural address on the train heading from Springfield to the capital; Lincoln found it in the luggage compartment. Blacks were allowed to march in an inaugural parade for the first time in 1865.

    James Garfield: He got a bad case of writer's block and kept putting off the drafting of his address. He didn't finish it until 2 in the morning on the day of the event.

    Grover Cleveland: His second inauguration parade featured trained seals, dancing horses, and dog acts.

    Teddy Roosevelt: He was the only president not to use the first person pronoun in his address. (The average is 15 uses of the first person pronoun in inaugural addresses.)

    Warren Harding: He was the first to use a device to amplify his voice.

    Calvin Coolidge: Coolidge, who ascended to the presidency when Harding died in office, was sworn in by his father in an emergency ceremony held at his family farm in Vermont. His inaugural address in 1925 was the first broadcast by radio.

    Herbert Hoover: His was the first recorded on film in 1929.

    Franklin Roosevelt: FDR had 300 members of the Electoral College in his 1933 parade--a kind of civics lesson. His 1933 parade was the first to feature a movie star--Tom Mix. He was the president who began the tradition of attending church the day of the inaugural. In 1945 he added the invocation to the ceremony.

    Harry Truman: His was the first broadcast on television in 1949. It was said that up to a million turned out to watch his swearing in, one of the biggest crowds in presidential history, enticed by Democrats who were shocked by Truman's victory. (1.8 million turned out for President Obama in 2009.)

    Dwight Eisenhower: His first inaugural parade lasted an astounding 5 hours! (Most last no more than 3 hours.) A cowboy lassoed Ike.

    John Kennedy: The temperature was 22 degrees, but JFK refused to wear an overcoat. (He did wear long underwear.) He remained for the entire parade, which ran more than three hours (Jackie left early). He had learned there were no blacks in the Marine Corps units in the parade and ordered that they be included.

    Ronald Reagan: Reagan's first inaugural was the warmest on record (55 degrees). His second was the coldest on record (20 below with the wind chill factor), so cold the ceremony had to be moved indoors. He broke the tradition of being inaugurated on the east side of the Capitol. A westerner, he wanted to face west, toward the Washington Monument.

    Bill Clinton: Clinton didn't finish his address until 4:30 in the morning. Hillary joked he "never met a sentence he couldn't fool with."

    Topics for Discussion

  • Inaugural addresses used to be written at a college level. In the past few decades they have been written at a fifth grade level. Why?
  • Eight presidents were under age 50. Is it better to have a younger or older president?
  • Why do we have parades at inaugurations? Why was the first, held in 1841, significant?
    Source: https://historynewsnetwork.org/article/150173
    Deficits and the Debt Ceiling Download this backgrounder as a Word document.

    Worth Reading

  • Bernard Weisberger, "What History Tells Us Will Likely Happen to Those Giant Surpluses"
  • HNN Hot Topics: Tea Party History
  • HNN Hot Topics: The Debt Ceiling
    Background

    In President Clinton's last full year in office (2000), the federal budget was roughly $1.8 trillion. When President Bush was inaugurated in January 2001, his administration inherited the previous year's $236 billion budget surplus. By fiscal year (FY) 2010, a mere decade later, the United States' budget had doubled to nearly $3.6 trillion, and the government incurred a $1.6 trillion budget deficit. A budget deficit is the difference between outlays (expenditures) and income for a single fiscal year.

    So how exactly did we get to this point, where our federal deficit is almost as large as our entire federal budget was just 10 short years ago? To answer that question we must explore the concept of deficits and our nation's financial history.

    When economists refer to "the deficit" they are talking about the annual fiscal difference between what the government spends (outlays) and what it receives (taxes). More formally, this difference is referred to as the budget deficit, as distinguished from the trade deficit, the net difference between imports and exports. While we will focus here primarily on the budget deficit, it's important to note that historically the United States has covered its budget deficit with a trade surplus, or vice versa. More recently, however (since the early 1970s), we have developed an unhealthy pattern of simultaneously spending more than we receive in tax revenue and importing more than we export.

    One way we compensate for going over budget is by selling United States government securities, or bonds. Private investors, including foreign governments such as China's, can purchase these securities from the Department of the Treasury. We apply the proceeds from these sales against the annual deficit. The aggregation of outstanding bonds and securities -- that is, the accumulation of deficits over a period of years -- is referred to as the national debt. Our government, regardless of which political party is in office, tends to operate on the assumption that no matter the size of our deficit, investors and foreign countries will always cover the difference by buying these securities.
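
    The arithmetic behind these two definitions can be sketched in a few lines of Python. The budget figures below are hypothetical, chosen only to illustrate how annual deficits accumulate into the national debt; they are not actual Treasury data:

```python
# Deficit vs. debt: a deficit is one year's shortfall (outlays minus receipts);
# the national debt is the running accumulation of those shortfalls.
# All figures are hypothetical, in trillions of dollars.

budgets = [
    (2008, 2.9, 2.5),  # (fiscal year, outlays, receipts)
    (2009, 3.5, 2.1),
    (2010, 3.6, 2.2),
]

debt = 0.0
for year, outlays, receipts in budgets:
    deficit = outlays - receipts  # this year's budget deficit
    debt += deficit               # deficits accumulate into the national debt
    print(f"FY{year}: deficit ${deficit:.1f}T, cumulative debt ${debt:.1f}T")
```

    The point of the sketch is the distinction the paragraph above draws: a surplus in a single year (receipts exceeding outlays) shrinks the deficit to a negative number for that year, but the debt only falls if such surpluses persist long enough to offset past borrowing.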

    We reduce the amount of bonds we need to sell by minting new currency. This raises the danger of inflation, according to many economists, though the extent of the danger is a matter of opinion.

    But why do we have deficits in the first place? Many economists believe that deficits are the result of naturally recurring cycles of the economy. During periods of economic expansion, employment tends to be near capacity, so taxable income peaks. The government's receipts come close to meeting (and in some years actually exceeding) expenditures. Conversely, during times of recession, when unemployment is relatively high, taxable income decreases. At the same time, social welfare programs are stressed when more people are unemployed, so the government paradoxically ends up spending more at the same time that it's receiving less -- a double-edged sword. What's important to realize is that deficits are often the result of endogenous factors, the natural ebb of economic cycles. While economists such as Milton Friedman might argue that the term "cycle" is a misnomer (economic history tends to be asymmetric and sporadic), there is no denying the presence of at least a cyclical nature.

    Sometimes, however, deficits are caused by systemic irritants, or exogenous factors. Unforeseeable events such as wars, natural catastrophes, political unrest, or poor financial planning (the selling of subprime mortgages, for example) mean the government has to spend more than it has planned for, or more than the current economy can support. These deficits can be compounded by cyclical deficits -- for example, we went to war with Iraq during a trough in the business cycle.

    What the Left Says

    Did the Bush tax cuts increase the deficit? Paul Krugman, a self-professed liberal economist, argues vehemently that the tax cuts did in fact play a crucial role in the subsequent deficits. He charges that there is more than a correlation between the Bush tax cuts and the increase in the size of the deficit. He denies the notion that lower taxes lead to a smaller deficit, pointing out that when Reagan cut taxes deficits rose, just as they did after Bush cut taxes.

    Krugman, a Nobel prize winning economist, cites statistics showing that most of the deficit over the last four years is owing to the Great Recession's effect on government revenues and the necessity of outlays for the unemployed.

    What the Right Says

    The conservative Heritage Foundation's top budget analyst, Brian Riedl, argues that uncontrolled spending, not tax cuts, is to blame. Conservatives contend that the only way to control spending is to cut taxes -- in effect, taking away the checkbook so the big spenders can't keep spending.

    Historical Background

    Having established a cursory explanation of "why" deficits occur, it is now necessary to establish "when" they occur. Is economic history really asymmetric and sporadic, or are there certain observable trends? Do certain economic variables have high predictive validity regarding the future condition of the budget? In short, can we empirically divine when periods of large deficit spending will occur?

    Historian Bernard Weisberger thinks so. In this essay on the website of the History News Network, he argues that large budgetary surpluses are also the harbingers of massive future deficits.

    President Jefferson inherited an $80 million national debt (largely from the Revolutionary War) when he was inaugurated in 1801. However, tumult on the European continent allowed America's fledgling economy to prosper from manufacturing and trade. During his tenure Jefferson was able to cut taxes, reduce the debt by half, and buy the Louisiana Territory, all while accumulating budgetary surpluses each year from 1801 to 1807.

    To accomplish this, Jefferson, always the proponent of a small federal government, cut military expenditures significantly. When James Madison succeeded Jefferson as president, America's foreign rivals took notice of our paltry defenses; in 1812, Britain resolved to test the strength of our still-young nation. Responding to the threat of British usurpation, Madison borrowed heavily to rebuild the military, pushing the debt back toward the pre-Jeffersonian level of $80 million.

    The debt wasn't repaid entirely until 1835 (the first and last time this has happened), under President Andrew Jackson. Public land sales in the western territories provided substantial revenue for the federal government. Free of debt, and with ostensibly limitless surpluses ahead, Jackson sought to "return" $37 million in surpluses to the states. The plan backfired, causing "widespread speculation in land" and a recession, and resulting in $20 million in deficits over the next two years.

    Similarly, the United States emerged from the Civil War with roughly $2 billion of debt, but by the 1880s the economy had rebounded. High tax rates and a rapidly expanding industrial economy led to a string of huge surpluses. As Weisberger notes, the question facing Grover Cleveland's administration was much like the one that had faced Jefferson and Jackson: How do we return surplus tax revenue to the people? Cleveland decided to subsidize the private sector, grant entitlements to veterans, and expand the military. In 1893 the stock market plummeted and the country again ran substantial deficits.

    The 19th-century trend Weisberger notes is this: large surpluses tend to precede large deficits. But the trend didn't die with the century. The surpluses of the 1920s led to tax cuts, and subsequently, the Great Depression. The surpluses of the late 1990s led to President Bush's tax cuts, and subsequently, our most recent recession.

    Do surpluses cause deficits? It's likely not that simple, but the correlation is clear. Large surpluses seem to trigger a sort of fiscal emetic reflex: politicians instinctively want to throw excess money back to the electorate. Often they overcompensate and tamper with the very system that produced the surpluses in the first place. While saving the money might be more prudent, it's not popular.

    Following a huge post-World War I expansion in the 1920s, the economy was destined for a cyclical downturn, much as the dotcom bubble was destined to "burst" around the new millennium. The deficits of the early to mid 1930s naturally paralleled the record-high unemployment rates that came to define the era. As we noted above, high unemployment means less taxable income. This, coupled with FDR's New Deal legislation, meant government spending increased as government income decreased.

    Some economists argue that when the economy is under-producing, as happened during the Depression, excess deficit spending actually may prove beneficial in the long run as it serves as a catalyst to "speed-up" the economy, providing the foundation for long term future growth and more taxable income. John Maynard Keynes, the famous British economist, was a proponent of this idea.

    Factually, we know this much: the deficit spending of the 1930s, coupled with high unemployment, resulted in deficits equal to roughly 5 percent of gross domestic product (GDP) from 1932 to 1935.

    Compared with the deficits of the 1930s, those of the 1940s were enormous, equaling more than 22 percent of GDP. As a generation of young men dropped out of the workforce to fight overseas, the American economy was forced to reinvent itself. Many women entered the workforce, and big industry geared up to buttress the war effort. Military spending ballooned, and the economy boomed.

    After the war soldiers returned home and became civilian auto-manufacturers, steel producers, and owners of businesses. Deficit spending, which had been feared, had helped build the platform for a larger economy, indicating that deficit spending is not always the evil villain it is often caricatured as. Still, the spending contributed to a massive public debt that, while periodically reduced, has never been completely paid off and is currently growing. In fact, the last time the national debt was paid off in full was 1835, when Andrew Jackson was president.

    Topics for Discussion

  • Is our problem that taxes are too low or that spending is too high--or some combination of the two?
  • How much of an impact on the deficit did the Great Recession have?
  • To reverse the trend in deficits, do we need to reform entitlement programs? (And just what is an entitlement program anyway?)
    Source: https://historynewsnetwork.org/article/150131
    What Does the President Actually Do? Download this backgrounder as a Word document

    Worth Reading

  • Peter M. Shane: How Has the Presidency Changed Most in the Last Thirty Years?
  • David E. Kyvig: How Presidential Power Became Untouchable
  • Ray Raphael: Foreign Policy and Original Intent: The Powers of the President
  • Warmaking: President vs. Congress
    Background

    The powers and responsibilities of the president are enumerated in Articles I and II of the Constitution. The president is commander-in-chief of the military (though the power to declare war is reserved to Congress), has the power to pardon crimes and to make treaties "with the Advice and Consent of the Senate," and, again with the approval of the Senate, can appoint ambassadors, judges, Supreme Court justices, Cabinet officers, and a host of other public officials. The Constitution obligates the president to "give to the Congress Information on the State of the Union." Also, the president has the power to veto congressional bills, but the veto can be overridden by a two-thirds congressional majority.

    That's it. That's what the Constitution says about the powers and responsibilities of the president.

    So what does a modern president actually do?

    The simplest answer is that the president does a little bit of everything.

    As the head of the executive branch, he/she exercises authority over four million federal employees in over 200 different executive agencies (like NASA, the CIA, and even the post office) and the fifteen executive departments (State, Treasury, Defense, etc.). Most senior appointments, including Cabinet officers like the secretary of state and the heads of many executive agencies, require Senate confirmation, though all serve at the discretion of the president. Modern presidents also appoint officials known as "czars," policy specialists given charge of coordinating the national effort on a particular issue (for example, the "drug czar" is responsible for drug enforcement).

    As the political leader of his/her party, the president is responsible for setting the agenda and using the "bully pulpit" of the presidency, as Teddy Roosevelt put it, to further his/her party's goals. Under the Constitution, the president cannot introduce a piece of legislation into Congress, but he/she can certainly pressure congressional leaders to propose certain pieces of legislation. A good example of this dynamic at work was the 2010 health care reform bill -- President Obama had pledged to pass comprehensive health care reform, but it was his Democratic allies in Congress who proposed and eventually passed the legislation.

    Presidents also have broad budgetary powers. While Congress has the power of the purse under the Constitution, the president must sign any congressional budget and also negotiates with the Congress over issues such as the federal debt (an issue which reared its ugly head in 2011).

    In terms of foreign policy, while Congress theoretically possesses the sole power to declare war, in practice wars have been fought more or less at the discretion of the president since the end of World War II, the last conflict in which Congress officially declared war. And while the Senate theoretically must ratify treaties signed by the president (and controversial treaties, ranging from the Treaty of Versailles in 1919, which ended World War I, to the Kyoto Protocol on climate change, often are not ratified), the president can enter into "executive agreements" with other countries (which, to make things even more confusing, are recognized as treaties under international law).

    In addition to his/her governmental and policymaking roles, the president is also the head of state, which in many other countries is a strictly ceremonial role. As a result, the president hosts state dinners and meets with other heads of state.

    But perhaps the most important role of the president, at least in terms of how average Americans see him/her in their day-to-day lives, is as the celebrity-in-chief. Only about four in every ten Americans can correctly identify the Speaker of the House, but everyone knows who the president is -- after all, he/she is on TV all the time, has a legion of reporters at his/her beck and call, and is an intrinsically newsworthy personality given the amount of power the presidency possesses.

    The celebrity president is a relatively recent development, made possible in no small part due to advances in communications. Franklin D. Roosevelt was the first president to effectively take advantage of radio with his “fireside chats” beamed straight into the living rooms of ordinary Americans; Dwight D. Eisenhower was the first TV president, with his Madison Avenue public relations consultants and campaign commercials; Ronald Reagan took TV performances to a new level; and Barack Obama has effectively become the first social media president through savvy use of social networking sites to communicate directly with voters.

    What the Left and Right Say

    Both the Left and the Right have a love-hate relationship with the presidency and the extent of presidential powers -- they love it when their candidate is in office, and they hate it when their candidate isn't in office. Generally speaking, both Republican and Democratic presidents take an expansionary view of presidential power, and both liberal and conservative scholars and pundits have found much to object to in the use and abuse of presidential power.

    Historical Background

    The powers of the presidency have expanded greatly over time, just as the federal government has expanded greatly over time. Even at the beginning of the republic, presidents often took an expansionary view of their powers -- Thomas Jefferson, for example, approved the Louisiana Purchase despite objections that such an agreement was unconstitutional. But it was Abraham Lincoln, taking office after a series of weak presidents, who most dramatically expanded presidential and federal power during the Civil War (a perennial theme throughout American history is the expansion of presidential power during wartime). He suspended habeas corpus, printed and spent money without congressional approval, imprisoned 18,000 Confederate sympathizers, and unilaterally signed the Emancipation Proclamation.

    In 1973, the presidential historian Arthur M. Schlesinger Jr., a liberal scholar closely connected to the Kennedy administration, wrote a book which applied a label to the expanding powers of the president: the "imperial presidency." According to Schlesinger, the dramatic expansion of the executive branch's bureaucracy in the twentieth century removed effective congressional oversight of the president, so that the president effectively held the power of war and peace despite the prescriptions of the Constitution. This last element in particular -- especially in light of Vietnam -- was instrumental in the congressional passage of the War Powers Act over a presidential veto in 1973: it requires the president to seek congressional approval for military action abroad before the sixtieth day of operations. In practice, however, both Democratic and Republican presidents have regularly ignored the act -- most recently, with President Obama's commitment of air forces to Libya.

    Source: https://historynewsnetwork.org/article/149462
    What Does the Vice President Actually Do?

    Worth Reading

  • Joshua Spivak: Why Do We Have Vice Presidents?
  • Joel K. Goldstein: How the Vice President Can Serve as the President's Most Unbiased Advisor
  • Christopher Bates: The Vice Presidency Should Not Be an Accident Waiting to Happen
    Background

    According to John Nance Garner, Franklin D. Roosevelt's first vice president, the vice presidency "isn't worth a bucket of warm [spit]." (He actually used a ruder word than spit, but it was bowdlerized by reporters.) And indeed, for most of American history, the office of the vice president really wasn't much more than an afterthought -- there have been sixteen non-consecutive occasions throughout American history when the office has been vacant.

    The office was created by the founders largely to prevent deadlocked presidential elections (before 1804, the runner-up in presidential elections became vice-president). All the Constitution has to say about the office is that the vice-president serves as the president of the Senate and casts tie-breaking votes in that body, and that the vice-president “assumes the powers and duties” of the president should he/she die or become incapacitated (of the forty-seven vice presidents, nine have become president due to the death or resignation of the sitting president).

    It's for this reason that perhaps the most important function of the vice-president is that he/she is a single heartbeat away from the Oval Office.

    Still, until quite recently, the vice presidency has been a place of a sort of political exile, as vice presidents were (and still are) often political rivals of the president under whom they served, but who were included on the ticket for political reasons.

    Lyndon B. Johnson, John F. Kennedy's vice president who ended up becoming president himself upon Kennedy's assassination, is a perfect example. Johnson and Kennedy had butted heads in the Senate, where Kennedy was a junior senator and Johnson was the Senate Majority Leader, and both had vied for the Democratic presidential nomination in 1960. In office, Kennedy excluded Johnson from major decision-making.

    Harry S. Truman provides another case in point: selected as Franklin Roosevelt's running mate in 1944 because the sitting vice president, Henry Wallace, was viewed as too left-wing by FDR and his advisors, Truman met with Roosevelt all of twice before the latter's death. Truman wasn't even informed of the Manhattan Project, which developed the atomic bomb, until after he became president.

    Nevertheless, since the 1970s, the vice president has taken an increasingly prominent and important part in U.S. government and politics (compared to such nonentities as Daniel D. Tompkins, James Monroe's vice president, Truman and Johnson are giants -- though admittedly it helps that they actually became president).

    It was Walter Mondale, Jimmy Carter's vice president from 1977 to 1981, who established the contours of the modern office. Far from just sitting in a room waiting for the president's pulse to stop, Mondale pioneered the “activist vice president,” serving as a close advisor and point man for the president, a role which has continued to this day.

    But indisputably, the most prominent and powerful vice president in American history was Dick Cheney, who served under George W. Bush. A veteran politician and former secretary of defense under the first President Bush, Cheney wielded huge influence over the administration's foreign policy, including the decision to invade Iraq. Though he was not, as was often claimed by liberal critics, a “shadow president” who was really the man in charge in the Oval Office, he was consistently the person who had the president's ear and, working with political allies within the administration like Secretary of Defense Donald Rumsfeld, was able to outmaneuver his foreign-policy opponents within the administration, like Secretary of State Colin Powell.

    The incumbent vice president, Joe Biden, has taken a much more Mondalian than Cheneyite approach to the office. President Obama has compared his role in the administration to a basketball player “who does a bunch of things that don’t show up in the stat sheet,” often playing devil's advocate within the administration to keep other decision-makers on their toes.

    Electing the President: Who Actually Votes?

    Download this backgrounder as a Word document

    Worth Reading

  • HNN Hot Topic: Election 2012
  • HNN Hot Topic: Electing Presidents
  • HNN Hot Topic: The Electoral College
    Background

    Ever wonder why presidential candidates talk so much about issues like Medicare that affect people in the upper age brackets? It's because older people vote at much higher rates than others. In 2008, 70 percent of eligible voters 65 and older voted. Only 49 percent of young voters (ages 18 to 24) cast a ballot. If you were a politician, which group's concerns would you care most about?

    Buried in the statistics are other interesting findings.

    Year in, year out, voter turnout among non-Hispanic whites is higher than that of any other group. In 2008, 66 percent voted. Asians and Hispanics voted at the same rate as young people, who traditionally turn out in deplorably low numbers. In 2008 it was just 49 percent -- less than a majority. (That was higher, though, than in 2004, when just 47 percent of young people bothered to vote.)

    One group dramatically increased turnout in 2008. The opportunity to vote for the first African-American nominee of a major party prompted 65 percent of blacks to vote--the highest percentage in history. Normally, blacks vote at a far lower rate than whites.

    In 2008 131 million people voted in the presidential election. The eligible voting population was 206 million.

    As in past election cycles, people with less education and lower income voted at a lower rate than people with college degrees and higher incomes.

    As one website reports: "The voting rate among citizens living in families with annual incomes of $50,000 or more was 77 percent, compared with 48 percent for citizens living in families with incomes under $20,000."

    What about gender? Women vote in slightly higher numbers than men. Married people vote in substantially higher numbers than people who are single.

    Bottom Line: Civic leaders often bemoan the low turnout rates of voters. But when you look at the data you see that low voting rates are not a problem across the board. Non-Hispanic whites with college educations and annual incomes over $50,000 vote at high rates. So the challenge for society is raising the rates among the groups that don't participate in elections.

    What the Left Says

    Liberals bemoan low turnout rates -- with good reason. They draw support from the groups with some of the lowest rates: the poor and minorities. If those demographics voted at the same rate as whites with college degrees, Democrats would be the majority party.

    Barack Obama has had a particularly difficult time drawing the support of whites with college educations.

    What the Right Says

    Conservatives also bemoan low turnout rates. But low rates tend to work to the benefit of the Republican Party. Republicans can count on the support of a majority of non-Hispanic white voters at the presidential level, and those voters turn out at a higher rate than other groups. The last Democratic candidate for president to win a majority of the white vote was Lyndon Johnson in 1964.

    Historical Background

    In the beginning -- that is, after the ratification of the Constitution in 1789 -- only white males voted. And in many states, they had to own property. Limiting the suffrage to people who owned property sounds undemocratic to our ears. But the thinking was that people who worked for others for a living were unable to exercise independence, leaving them susceptible to pressure to vote the way their employer instructed. It was also believed that only voters with property had a strong enough stake in the community to exercise their voting rights responsibly. But property qualifications quickly began to fall. According to Alexander Keyssar's comprehensive history of the suffrage, The Right to Vote:

    "Delaware eliminated its property requirement in 1792, and Maryland followed a decade later. Massachusetts, despite the eloquent opposition of [John] Adams and Daniel Webster, abolished its freehold or estate qualification in 1821; New York acted in the same year. Virginia was the last state to insist on a real property qualification in all elections, clinging to a modified (and extraordinarily complex) freehold law until 1850. And North Carolina finally eliminated its property qualification for senatorial elections in the mid-1850s."

    By the 1820s the suffrage had been extended to the masses of white male citizens, giving rise to Jacksonian democracy. This changed our politics forever. From then on presidents would be selected from a pool of candidates who demonstrated popular appeal rather than solely on account of a stellar resume.

    After the Civil War blacks in theory won the right to vote, but only under Reconstruction were they allowed to vote in large numbers. Once Reconstruction ended, the vast majority of blacks in the South lost the right to vote for nearly a century.

    This two steps forward, two steps back history of voting is at odds with the theme of progress and is largely forgotten. But it shaped American politics for generations.

    The first national pressure to give women the right to vote came in 1848, when a convention of women in Seneca Falls, New York demanded that the suffrage be extended to women. As abolitionism grew in the 1850s, leaders of the women's suffrage movement grew hopeful that the fight to free blacks would enhance their fight to win the vote. It did not. Abolitionists in general declined to link the two causes. Blacks won the right to vote before women (though in a few states women were permitted to vote in state elections).

    Women finally won the national right to vote in 1920 with the ratification of the 19th Amendment. But women did not vote differently than their husbands, as critics had feared they would. Not until the 1980s did a significant gap develop in the voting patterns of men and women. (Women tilted toward the Democratic Party.)

    The Civil Rights Act of 1964 outlawed racial discrimination in public accommodations and employment, but it was the Voting Rights Act of 1965 that secured blacks' right to vote in practice: it suspended the literacy tests that had blocked blacks from voting and authorized the Justice Department to force state and local governments in the South to enroll black voters.

    As the 1960s wore on, there were two further developments affecting the suffrage. First, the Supreme Court ruled that the states could no longer maintain congressional districts of unequal populations, which had allowed thinly populated rural districts to send more members to Congress than heavily populated urban districts. Second, the Nixon administration backed the extension of the suffrage to people aged eighteen and over in response to critics who argued that if an eighteen-year-old was old enough to fight and die in Vietnam, he was old enough to vote.

    Topics for Discussion

  • Why do people with college educations vote at a much higher rate (78 percent) than those who only completed high school (40 percent)?
  • Why do a majority of young people not vote?
  • How does the pool of voters affect our politics and government policies?
    Electing the President: Who Has the Right to Vote?

    Download this backgrounder as a Word document

    Worth Reading

  • News21: Voting Rights
  • VIDEO: Democracy Distilled: a History of Our Nation's Voting Rights
  • HNN Hot Topics: Election 2012
  • HNN Hot Topics: Electing Presidents
  • HNN Hot Topics: The Electoral College
  • 13th Amendment to the U.S. Constitution
  • 14th Amendment to the U.S. Constitution
  • 15th Amendment to the U.S. Constitution
  • 19th Amendment to the U.S. Constitution
  • 26th Amendment to the U.S. Constitution
    Background

    All U.S. citizens over 18 years of age have the right to vote. This was not always the case. Before the 26th Amendment was ratified in 1971, only those 21 or older were guaranteed the right to vote by the federal government. Before the 19th Amendment was ratified in 1920, only men 21 years old or older were guaranteed the right to vote by the federal government. Before the 15th Amendment was ratified in 1870, only white men 21 years old or older were guaranteed the right to vote by the federal government. (Much, much more on all of this in the historical background section).

    But now everyone over 18 can vote. Sounds good, right? So what's with all the controversy recently about voting rights?

    The right to vote has always been a messy business in America thanks to our system of federalism. Since the very beginning of the United States and right down to the present day, each state has been largely responsible for setting its own qualifications for voters.

    What does this mean today?

    For example, in most states -- Florida is a case in point -- a person convicted of a felony cannot vote, even after being released from prison (this affects nearly 6 million Americans, most of them convicted of nonviolent drug charges). But in a handful of states like Maine, a felon can vote by absentee ballot from prison.

    Voter registration also varies from state to state. Minnesota regularly leads the nation in voter turnout thanks in no small part to same-day voter registration -- voters can simply turn up at the polls. In Pennsylvania, on the other hand, voters must register 30 days before an election in order to vote.

    The biggest controversy this year has been over voter ID laws requiring voters to present a photo ID before voting, ostensibly in order to prevent voter fraud. Voter ID laws have been passed in Georgia, Indiana, Pennsylvania, Tennessee, and Kansas, and are under consideration in many other states.

    But a recent non-partisan study determined that of the hundreds upon hundreds of millions of votes cast in elections since 2000, there have been only around 2,000 cases of electoral fraud -- most attributable to mistakes rather than malicious intent.

    What the Right Says

    Republicans argue that measures like voter ID laws are necessary to prevent voter fraud, and that felons forfeited the right to vote by choosing to commit crimes in the first place.

    What the Left Says

    Democrats are adamant that voting restrictions, particularly voter ID laws, are an attempt by Republicans to prevent young people, the poor, and minorities -- groups that tend to vote Democratic and are also the most likely to lack valid photo ID -- from voting, and they point to studies which show that voter fraud is essentially nonexistent.

    Some (though by no means all) liberals also oppose permanently barring ex-convicts from voting, arguing that such a harsh policy encourages former felons to re-offend.

    Historical Background

    Before independence from Great Britain, each individual colony had its own voting laws, many of which excluded certain groups from voting -- a 1737 New York law, for example, banned Jews from the ballot box. Most colonies also had property requirements, meaning that a (white male) citizen had to own a certain amount of land or possess a certain amount of money in order to vote (the idea was that men of property had more of a stake in their communities).

    Property requirements were gradually removed at the state level throughout the late eighteenth and early nineteenth centuries. Vermont introduced universal male suffrage in its 1777 constitution (though it did not become a state until 1791); by the age of Andrew Jackson, most white male citizens were able to vote, regardless of their wealth. This still left a huge number of people who were disqualified: women, slaves, free blacks, Native Americans, and other ethnic minorities.

    As part of Reconstruction (1863-1877), the reincorporation of the defeated Confederacy into the Union after the Civil War, Congress passed (and the states ratified) the 13th, 14th, and 15th Amendments to the Constitution, respectively freeing the slaves, extending citizenship to the freed slaves, and guaranteeing the right to vote to all citizens regardless of race. Newly enfranchised freed slaves flocked to the polls, electing hundreds of African American state legislators throughout the South, not to mention fifteen African American representatives and two senators to the U.S. Congress.

    When Reconstruction ended in 1877, the Southern states successfully disenfranchised the overwhelming majority of black voters. This was done not by passing laws banning blacks from voting -- that's unconstitutional! -- but by imposing poll taxes (requiring voters to pay a fee in order to vote), literacy tests (most freed slaves were illiterate), and exempting white voters from meeting these requirements through the so-called “grandfather clause” (if your grandfather or father was qualified to vote, then you were exempt from voting restrictions). The number of African American voters plummeted, and most Southern blacks would remain disenfranchised until the mid-twentieth century -- in 1940, only 3 percent of black Southerners were registered to vote, despite the fact that in some states African Americans were in the majority.

    At the same time, Chinese immigrants in the West were denied a pathway to citizenship, and thus the vote, by the Chinese Exclusion Act of 1882. Native Americans were also generally denied the vote through a series of complicated legal mechanisms similar to those blacks faced in the South. This situation only began to change after the turn of the century, and even then only very slowly. An act of Congress in 1924 gave all Native Americans full citizenship, but even this did not allow all Native Americans to vote, despite the provisions of the 15th Amendment.

    Women, on the other hand, had their voting rights cemented nationally by the 19th Amendment in 1920, though women's suffragists had been protesting for the vote since 1848 -- and in fact the Wyoming Territory became the first to grant women the right to vote, in 1869.

    Voting rights for African Americans, Native Americans, and other ethnic minorities continued to be suppressed throughout the first half of the twentieth century, especially in areas that had sizable minority populations. However, legal barriers erected against certain ethnic minorities gaining citizenship gradually began to be removed in the 1940s, and black voter registration in the South began to modestly increase (from 3 percent in 1940 to 13 percent in 1947). But resistance to black voter registration in the South was savage -- infamously, three civil rights volunteers who were registering black voters in Mississippi in 1964 were murdered by local members of the white supremacist Ku Klux Klan, and a year later local police in Selma, Alabama attacked a peaceful march in support of voting rights (itself held in response to an Alabama state trooper killing an unarmed voting rights activist).

    1965 marked a turning point in the history of voting in America with the signing of the Voting Rights Act by President Lyndon B. Johnson and the final repeal of immigration and citizenship quotas. Poll taxes, literacy tests, and other measures designed to prevent African Americans from voting were declared illegal under federal law. An extension to the Voting Rights Act was signed by George W. Bush in 2006.

    With the ratification of the 26th Amendment in 1971, for the first time in U.S. history, the right to vote was extended to all American citizens over 18 years of age regardless of race, color, gender, or creed. In theory, anyway -- the controversy continues over the exercise of the vote, as opposed to the legal right to vote.

    Topics

  • Is voting an inalienable right or a privilege? Why or why not?
  • Are voter ID laws a good idea? Why or why not?
  • Should a person convicted of a crime, provided he or she has paid his/her debt to society, be allowed to vote, or should he/she permanently be disenfranchised? Why or why not?
  • Can you draw connections between the exercise of the vote and important events/themes in American history (i.e. civil rights)?
    Constitution Day: Backgrounder

    Worth Reading

  • Full Text of the U.S. Constitution
  • National Endowment for the Humanities: Constitution Day
  • Jonathan Dresner: The Constitution, Ritual, and History
    The date on which we recognize the signing of the U.S. Constitution -- Constitution Day, September 17 -- is very much the kid sibling of Independence Day. Unlike the Fourth of July, with its gigantic spectacle of fireworks and parades to celebrate that fateful day in the hot summer of 1776, Constitution Day (until 2005 known as Citizenship Day) is a time to learn about how the Founding Fathers drafted the Constitution, the challenges they faced and the solutions they came up with, as well as the constitutional controversies that rage to this very day.

    Historical Background

    Though the thirteen colonies declared independence from Great Britain in 1776 and the Revolutionary War ended in American victory in 1783, the Constitution was not drafted until 1787 or ratified until 1788, and George Washington did not become the first president of the United States until 1789. So how was the U.S. governed between 1776 and 1787?

    The answer is: through the Continental Congress under the Articles of Confederation (which weren't actually ratified until 1781, but served as the governing principle under which the Continental Congress operated throughout its existence). The Articles of Confederation provided a very loose form of national government -- which made sense. Both before and after the Declaration of Independence in 1776, most Americans didn't even think of themselves as Americans -- you were first and foremost a Virginian or a Massachusettsian or a New Yorker or a Georgian, then you were a British subject, and then you were an American, but only in the sense that you lived in British North America. Each state was effectively independent, but in association with the others. Under the Articles, there was no president, no Senate, nor even a firm assertion that the United States as such was a sovereign entity; the Articles instead described a “firm league of friendship” between the states. The Articles were successful in the sense that they kept the colonies together until the war with Britain could be brought to a successful end, but they were unable to effectively govern the states in the peace that followed. For example, the Continental Congress had promised army veterans generous benefits after the war -- benefits it was unable to pay, because Congress had no effective power either to levy taxes or to regulate interstate trade. By 1787, an increasingly unstable economic and political situation demanded a fundamental revision of American government.

    Throughout the spring and summer of 1787, delegates from the thirteen states debated in Philadelphia over the shape of the new government. The person most directly responsible for the actual writing of the Constitution was James Madison (later the fourth president), who largely wrote the first draft of the document in his Virginia Plan (named so because Madison was a member of Virginia's delegation). Drawing on Enlightenment thinkers like John Locke and Montesquieu as well as on classical Greece and Rome, Madison's plan established most of the Constitution's basic principles: separation of powers between the executive, legislative, and judicial branches, as well as a system of checks and balances to prevent one branch from gaining too much power and governing tyrannically, as had been the case under British rule. For example, under the final draft of the Constitution, Congress has the power to declare war, but the president is the commander-in-chief of the military. The president nominates judges, but his/her appointees require Senate approval. The Supreme Court can determine the constitutionality of a law (which is not actually a provision of the Constitution, but a power the Supreme Court gained as a result of the Marbury v. Madison decision in 1803), but justices are appointed by the president and confirmed by the Senate.

    But we're getting a bit ahead of ourselves -- back to Madison in 1787! The biggest problem with Madison's plan was how it apportioned representatives to Congress. Madison envisioned two legislative bodies, one upper (the Senate), one lower (the House of Representatives). The Senate would be elected by the members of the House (who would be popularly elected by the voters of their respective states), and in both the Senate and House representation would be allocated proportionally, meaning that more populous states would receive more representatives (Virginia was the largest state at the time). Representatives from smaller states naturally feared their states would lose political power under this plan, and in what became known as the New Jersey Plan they proposed a unicameral legislature -- a single house -- in which each state would receive one representative.

    The states were deadlocked until the delegation from Connecticut proposed what became known as the Great Compromise, under which a bicameral legislature (two different houses in Congress, one upper, one lower) was established. The lower House would consist of popularly elected representatives allocated by population, while the upper Senate would consist of two senators per state selected by state legislatures. (After the passage of the Seventeenth Amendment in 1913, senators became directly elected by state populations -- one of the reasons for passage was that a Montana businessman literally bought a Senate seat from his state legislature in 1900.) Part of the Great Compromise was the so-called Three-Fifths Compromise: Southern states wanted their slave populations counted when it came to allocating representation, and a compromise was reached under which a slave counted as three-fifths of a person for representation, but also for taxes. This was and remains one of the most infamous provisions of the Constitution, as it officially sanctioned slavery. Indeed, the seeds of the Civil War seventy-four years later were being sown at the Constitutional Convention.

    The Convention also had to establish how to elect the president, eventually deciding to create an Electoral College in which each state would receive a certain number of electoral votes, the equivalent of their total House delegations plus their two senators (for example, today California has 55 electoral votes -- 53 electors for the state's House delegation, 2 electors for the Senators). It was up to the states how to select their electors, but today electors almost always vote for the presidential candidate who received the most votes in their states.

    Another important part of the original draft of the Constitution was the set of small clauses, or ideas, found deep within the text. Two of the most important are the commerce clause, which empowers Congress to regulate trade and business between the states -- a clause which provides the legal foundation for laws ranging from the New Deal in the 1930s to the desegregation of businesses in the 1960s to the health care law of the 2010s -- and the elastic clause, which empowers Congress to “make all Laws which shall be necessary and proper for carrying into Execution the foregoing Powers, and all other Powers vested by this Constitution in the Government of the United States.” The elastic clause has been one of the most controversial, as ideas about the proper scope of the federal government have been debated throughout the years.

    After the Constitution was publicly released on September 17, 1787, two camps quickly emerged. Federalists supported the proposed Constitution, while Anti-Federalists opposed it, fearing that it took too much power from the states and would create a tyrannical federal government. When the First Congress convened in 1789, one of the first orders of business was to pass a set of amendments to the Constitution; the ten that were ratified became known collectively as the Bill of Rights. They guarantee the freedoms of speech, the press, religion, public assembly, and petition (First Amendment) and the right to bear arms (Second Amendment); prohibit the forcible quartering of soldiers during peacetime (Third Amendment) and unreasonable searches and seizures of property (Fourth Amendment); guarantee due process under law (Fifth Amendment) and a “speedy and public trial” (Sixth Amendment) judged by a jury of one's peers (Seventh Amendment); prohibit “cruel and unusual punishments” (Eighth Amendment); and reserve powers not otherwise specified to the people and the states (Ninth and Tenth Amendments). The Bill of Rights was ratified on December 15, 1791.

    The Current Debate: What Conservatives Say

    Modern conservatives tend to take a literal and restrictive view of the Constitution, arguing that the document was drafted primarily to limit the power of the federal government in order to protect the liberty of the states and the people. Strict constructionist readings emphasize that the Constitution means literally what it says -- you cannot and should not read beyond it. Originalists go further still: original intent proponents argue that the Constitution must be interpreted in a spirit consistent with the intentions of the Founding Fathers, while original meaning proponents hold that the interpretation of the Constitution should be determined only by what “reasonable people” at the time of its drafting understood it to mean.

    The Current Debate: What Liberals Say

    Liberals generally interpret the Constitution much more broadly -- the Living Constitution doctrine, a favorite of liberals and progressives, argues that the Constitution was drafted not as the final word, but as an adaptable framework. The writers of the Constitution, proponents argue, could not foresee all of the changes in the world throughout the subsequent two hundred years and never intended solutions for the eighteenth century to be used uncritically in the twenty-first. Thus, by including things like the elastic clause, they gave the Constitution enough flexibility to be able to address, for example, the immense social and political changes of the Information Age.

    Key Terms

  • Articles of Confederation
  • James Madison
  • Virginia Plan
  • New Jersey Plan
  • Great Compromise
  • Three-Fifths Compromise
  • Electoral College
  • Commerce Clause
  • Elastic Clause
  • Federalist
  • Anti-Federalist
    Electing the President: What Makes for a Great President

    Download this backgrounder as a Word document


    Worth Reading

  • HNN Hot Topics: Election 2012
  • HNN Hot Topics: Electing Presidents
  • HNN Hot Topics: The Electoral College
    Background

    The challenge of finding good leaders has preoccupied people since human beings first formed groups in the days of hunters and gatherers hundreds of thousands of years ago. Plato held that one of the key qualities a leader should possess is the ability to set aside emotions and make cool judgments. Most people, he noted, are beholden to illusions. A leader can't afford to be. He went so far as to conclude that good leaders should be taken away from their families at an early age so that they will have a chance to develop an independent and clear-eyed view of matters of state concern without regard to special interests.

    What makes for a good president? It's an almost impossible question to answer, for there are many considerations. The president should be able to fulfill the many duties prescribed by the Constitution and custom. He must be a good commander-in-chief, educator-in-chief, diplomat-in-chief, moral leader-in-chief, and legislator-in-chief. But that's just for starters. He (or she) also must be able to correctly gauge public opinion and move it in a desired direction without alienating so many people that he triggers a forceful backlash. He needs to be able to inspire people. He has to be both resolute and flexible, judging coolly which approach is needed. And he needs to be able to win passage of his legislation in Congress.

    As the federal government was purposefully designed to be slow-moving, with checks and balances to prevent the abuse of power, only the president is in a position to provide strong direction. The members of the other two branches cannot. The president therefore has to be able to provide leadership. But what's leadership? The concept is slippery. People think they know it when they see it, but it's not easily pinned down. What works for one president in one set of circumstances may not work for another president in different circumstances.

    Historians regularly rate the presidents. But their judgments change with the times. No ruling authority on Mount Olympus decides for all time the order in which the presidents should be ranked. As human beings make these judgments, error is inevitable. Moreover, it is not clear what metric should be employed to judge presidents. A few years ago a libertarian ranked the presidents by their ability to advance the cause of "peace, prosperity, and liberty." By this standard, which gave high ratings to presidents who showed a strong commitment to limited government, Abraham Lincoln came in 29th and Warren Harding 6th.

    What the Left Says

    Liberals generally favor strong presidents with populist appeals who succeed in pushing through Congress large social welfare programs like Social Security and Medicare. After Watergate some liberals expressed qualms about the model they had long embraced of a vigorous presidency, worrying that the office had become imperial. But they largely remained enchanted with the presidents who fought for social change through strong federal action. The iconic liberal president remains Franklin Delano Roosevelt.

    What the Right Says

    Conservatives opposed FDR but didn't have a president of their own to point to with pride until Ronald Reagan. He demonstrated how a conservative president could seize the public imagination through deft handling of Congress, inspired speeches, witty quips, and a clear program (lower taxes, higher defense spending, and a smaller deficit; he got two out of three, which by presidential standards is quite good). No previous conservative president had seemed to loom as large; dedicated to smaller government, Reagan's predecessors had earned little praise for saying no to government initiatives.

    Historical Background

    George Washington was every Founding Father's idea of a great president. The framers agreed to give the presidency enormous power only because Washington was expected to fill the office first, giving them the assurance that good precedents would be established to limit future incumbents. Washington was regarded as the right man for the job for several reasons. He was regarded as impartial. He was a war hero whom the country revered. And he had proved during the Revolution that he could be trusted with power. In short, he was known for his good character. Character counted!

    Washington is ranked by historians as one of the two greatest presidents. (The other is Abraham Lincoln.) But he wasn't always popular. Many people opposed his support of the Jay Treaty with Great Britain, some accusing him of treason. In death he became an unassailable figure, lionized by both the left and the right. Both claimed him as their own.

    Once the masses got the right to vote, starting in the 1830s, presidents like Washington became rare. From then on we chose leaders less for their résumés than for their common touch. Like Lincoln, they had to demonstrate that they understood ordinary people's problems. If they happened to come from the people rather than the upper classes, so much the better.

    When the country became a world power at the end of the nineteenth century, with possessions in the Caribbean and the Pacific, presidents needed to demonstrate an acute sensitivity to foreign affairs. Few had the background. Many stumbled into crises.

    War repeatedly gave presidents an opportunity to change history and demonstrate leadership. Most rose to the challenge. Teddy Roosevelt, himself a war hero from the Spanish-American War, lamented that he had not been a war-time president.

    In the 1970s it was often said that the world had become so complicated that it was impossible to show greatness in the presidency anymore. Reagan demonstrated that wasn't true. As even liberal historian Sean Wilentz has conceded, Reagan bent the institution to his will and succeeded in changing American politics for at least a generation. Once it had been FDR in whose shadow presidents stood. Now it was Reagan.

    Topics for Discussion

  • Do you know anybody you think would make a great president?
  • Do we need great presidents, or can we get by with so-so ones?
  • How do leaders of democracies differ from leaders in other forms of government?
    https://historynewsnetwork.org/article/148235
    Electing the President: Voter Apathy

    Worth Reading

  • HNN Hot Topics: Election 2012
  • HNN Hot Topics: Electing Presidents
  • HNN Hot Topics: The Electoral College
  • Should We Take Away the Voting Rights of 18 Year Olds?
    Background

    Voter apathy is mostly a modern disease. Early in our history voter turnout in presidential elections was high. In the late nineteenth century it was common for more than 70 percent of the voting-age population to cast a ballot. Some years the figure rose as high as 81 percent. Those figures declined somewhat in the first half of the twentieth century, but even in the 1950s turnout exceeded 60 percent, a sign of a healthy democracy. Then, beginning at the end of the 1960s, turnout rates began plummeting. In 1960, 63 percent of eligible voters voted. By 1980 the figure was down to 53 percent. In 1988 we hit rock bottom: 50 percent. Since then the numbers have bounced around a bit, but we've never approached the high numbers recorded earlier in our history.

    A sign of voter apathy is the indifference of voters to civics. A majority of voters can't name the three branches of government, don't know that there are 100 United States senators, and don't realize that the president's appointments to the Supreme Court are subject to a vote by the Senate.

    A majority of voters cannot name their own member of Congress or their two US senators. Nor can a majority tell you who the chief justice of the Supreme Court is.

    Ignorance of civics has been widely confirmed by surveys of Americans conducted by researchers at the University of Michigan since the 1950s. It was more easily excused in earlier times, when fewer Americans received proper schooling. In 1940, 6 in 10 Americans had not graduated from the 8th grade. Today most Americans have attended college, yet the surveys indicate they are generally as ignorant as earlier generations, and by some criteria even more ignorant. More Americans in the 1950s could say what divided the two major parties than could in the 1970s.

    What the Left Says

    Both liberals and conservatives decry voter apathy. But they focus on different aspects of the problem. Liberals concentrate on voter turnout. They complain that conservatives try to keep people from voting by establishing stricter requirements for registration, thereby driving down the participation of voters most likely to support Democrats. (The poor and minorities usually vote in lower numbers than people drawn from other demographics.) Many states in conservative hands have recently enacted laws requiring voters to produce more substantial identification than was common in the past. Conservatives say that this is necessary to protect the integrity of elections. Liberals charge that it's a solution in search of a nonexistent problem. They contend there is little evidence of election fraud.

    What the Right Says

    Conservatives traditionally feared the power of the people and worried that they would be unwise in their judgments. Since Ronald Reagan, however, conservatives have celebrated the wisdom of the people, blaming bad government on liberals and the media. Voter apathy? Conservatives usually frame it in the context of their larger fight with liberals. It makes sense to them that voters are apathetic given the dysfunction of government. Respect for government, they argue, will increase when its powers are limited.

    Historical Background

    What accounts for the apathy of Americans? When life was simpler in the nineteenth century there were few activities suited to leisure. Politics offered one of the chief forms of community entertainment. At the famous Lincoln-Douglas debates tens of thousands turned out to watch. They brought their lunches with them and sat for hours as the candidates debated. Today Americans can choose from so many leisure activities that they no longer feel the need to turn to politics for entertainment.

    Why do so few bother to vote? One primary factor has been the decline of party bosses. When the bosses ran things, people voted in higher numbers at their behest; in return, the bosses arranged to provide services and jobs to voters and their families and neighbors. An unintended consequence of the reforms that limited the power of bosses by giving government employees guaranteed civil service protections was to break the direct connection voters felt with their government.

    Trust in government has declined in tandem with voter turnout, leading many critics to conclude there is probably a connection. It makes sense. People who don't believe in their politicians probably feel less inclined to take part in politics. Instead, they turn away in disgust. Several critical developments of the last half century have contributed to high levels of distrust: Vietnam, Watergate, and Iran-contra. But the issue is terribly complicated. Robert Samuelson, the Washington Post columnist, wrote a book arguing that the inflation of the 1970s was mainly to blame for mistrust. As the money in voters' pockets came to be worth less and less, they grew suspicious and lost faith in their leaders. Others contend that trust in our institutions declined when government started making large promises it couldn't keep.

    One factor everybody agrees is critical to the decline in voter turnout was the passage of the 26th Amendment to the Constitution, which gave 18-year-olds the right to vote. While young people initially voted in high numbers, turnout declined dramatically soon thereafter, partly in response to the end of the Vietnam War and of the threat of the draft. Today, the demographic that votes the least is young people.

    Topics for Discussion

  • Should voting be required?
  • Should voters have to take a test to make sure they understand civics?
  • What steps could be taken to encourage young people to vote?
    https://historynewsnetwork.org/article/146517
    Syria

    Worth Reading

  • Juan Cole: The Dilemma over Whether to Intervene in Syria
  • Daniel Pipes: Fin de Regime in Syria?
  • Wadah Khanfar: Syria Between Two Massacres … Hama's Memory Endures
  • David W. Lesch: What Could Shake Syria's Regime
    Background

    Syria has been embroiled in a civil war for over a year -- a war which has claimed over ten thousand lives. The uprising against the regime of Syrian president Bashar al-Assad began in March 2011 as part of the broader Arab Spring. Among the many factors behind the uprising: discontent with the authoritarian regime; high youth unemployment (around 25 percent, about average for the Middle East, though the unemployment rate for older adults was only 4 percent in 2007, one of the largest such imbalances in the world); and the fact that the Assad regime is dominated by the Shi'a Alawites (Assad himself is one), while the country is majority Sunni Muslim.

    Mass protests against the Assad regime began in February 2011 and were met with beatings and other brutalities by police. As clashes between demonstrators and the authorities intensified, the death toll began to climb. By April, the Syrian Army was being deployed against the protesters, and by the end of the month fissures appeared within the military, as some soldiers showed reluctance to fire on their own people. By July, several major Syrian cities were effectively under siege by the military, though Syria's two largest cities, Damascus (the capital) and Aleppo, remained relatively quiet.

    What began as a mass protest movement has morphed into a quasi-guerrilla campaign against government forces by the Free Syrian Army, a group of deserters from the government forces as well as civilian volunteers. The Assad government dismisses the rebels as a foreign-tainted insurgency.

    How has the crisis played out overseas? U.S. relations with Syria have generally been cool for the past several decades (Syria is Iran's most important Arab ally and has been listed as a state sponsor of terrorism by the U.S. since 1979), and American politicians have universally condemned the recent actions of the Assad regime. The Obama administration had actually begun a policy of rapprochement with Syria in 2010 -- Secretary of State Hillary Clinton called Assad a "reformer" as late as March 2011 -- but the past year has seen a reversal of that policy, particularly after the U.S. ambassador to Syria, attacked by pro-Assad mobs, was recalled out of fear for his safety.

    The regime has relied on the support of Russia and China to shield itself from United Nations intervention. Both Russia and China are permanent members of the U.N. Security Council and have indicated that they will veto any resolution authorizing the use of force against Assad, and they have already vetoed a U.N. resolution calling upon Assad to relinquish power. Russia continues to be a major weapons supplier to Syria, and China has its own economic interests in the country and the region generally.

    What the Right and Left Say

    Given the ongoing U.S. presidential campaign's focus on domestic matters, differences between liberals and conservatives on Syria have remained largely in the background, and are in any event ambiguous. President Obama has gone on record calling for Assad to step down, but some Republicans -- notably John McCain -- want the United States to go further and actively arm the anti-Assad forces.

    Notably, neither the president nor his Republican opponents are calling for airstrikes.

    Even Middle East experts, generally speaking a contentious group when it comes to politics, are ambivalent about intervention in Syria. Daniel Pipes, a conservative commentator, has written that "I favor a U.S. policy of inaction, of letting events transpire as they might." Liberal historian Juan Cole, despite his support for the uprising, maintains that without a U.N. resolution, which does not appear to be forthcoming, any intervention in Syria would be illegal under international law.

    Historical Background

    Like many of the countries in the region, Syria has a history both profoundly ancient -- agriculture made its first appearance in Syria nearly 12,000 years ago -- and profoundly modern -- the independent state itself dates only to 1946. To give some idea of the centrality of Syria in the vast theater of history, the region was ruled at one time or another by the Assyrians (who lent the country their name), the Hittites, the Egyptians, the Babylonians, the Persians, the Greeks, the Armenians, the Romans (under whom the province of Syria peaked in population and influence), the Arabs, the Crusaders, the Mongols, the Mamluks, the Ottoman Turks, and finally the French before becoming independent. Syria was critical to the growth and expansion of early Christianity, and Damascus served for a time as the capital of the Islamic Empire.

    The modern Syrian state was born in the fires of the two world wars. After World War I, the French took control of the Mandate of Syria under the auspices of the League of Nations, and the country played host to fighting between pro-German and pro-British French forces in 1940. By 1946, however, the French were forced out and an independent Syrian republic was established. A series of military coups (one of which was probably sponsored by the CIA) left the country politically unstable, and by the late 1950s Syria was caught up in the great geopolitical games of the Cold War (Syria was an early and close ally of the Soviet Union in the Middle East) and pan-Arab nationalism (the country briefly unified with Gamal Abdel Nasser's Egypt as the United Arab Republic).

    In 1963, the Ba'ath Party, which mixed Arab nationalism and socialism in its ideological program, came to power in a coup. Ba'athism has played an important, if somewhat misunderstood, role in the Arab world since the 1960s -- the Ba'ath Party came to power in Iraq in 1968 and eventually spawned Saddam Hussein's regime. But it's important to remember that the party split into pro-Syrian and pro-Iraqi factions in 1966, and that Syria even went so far as to support the U.S.-led coalition in the First Gulf War. What both Iraqi and Syrian Ba'athists shared, however, was an authoritarian bent that increasingly relied upon the military as a power base. In 1970, Hafez al-Assad, father of current president Bashar al-Assad, became president in a bloodless military coup.

    Assad's was the last of the successful military coups, but his regime has faced popular uprisings and insurgencies before, particularly from Muslim extremists. In February 1982, Assad leveled the city of Hama (also a hotbed of discontent in the current uprising), killing anywhere between 17,000 and 40,000 people.

    Hafez al-Assad's death in 2000 and his succession by his son sparked hopes in the West that the regime would relax its vise-like grip on the politics of the country and, perhaps, back away from the anti-American and anti-Israeli nature of its foreign policy (Syria was a major participant in the 1948, 1967, and 1973 wars). Bashar al-Assad, after all, speaks French and English, was educated in London, and has a British-born and educated investment-banker wife (who controversially appeared on the cover of Vogue magazine, but who has subsequently been described as "a Syrian Imelda Marcos") -- the very portrait of the modern transnational class of businesspersons and technocrats. None of this prevented Assad from ordering a bloody and brutal crackdown against initially peaceful protesters in 2011.

    Discussion Topics

  • Should the U.S. and its allies aid the anti-government forces? Why or why not?
  • Why are Russia and China supporting the Syrian government?
  • Why was Bashar al-Assad expected to be a liberalizer?
    https://historynewsnetwork.org/article/145940
    Iran

    Worth Reading

  • Walid Phares: It's a MAD, MAD, MAD, MAD World for Iran
  • Juan Cole: How Zoroastrianism Influences the Worldview of Iran's Leadership
  • John T. McNay: The Road Not Taken by the U.S. in 1950s Iran
    Background

    There is perhaps no situation more fraught with peril than the unfolding crisis in the Middle East over Iran's alleged nuclear weapons program. Israel is rumored to be preparing a military strike against key Iranian nuclear facilities, the Iranian government alleges that the Israeli Mossad and the American CIA are behind a series of assassinations of nuclear scientists and computer sabotage, and Israel alleges that Iranian agents are behind a series of bombings targeting Israeli embassies in Georgia and India. Iranian agents may also have plotted to assassinate the Saudi ambassador to the United States.

    The next few months will be critical, as sources within the Israeli government have indicated that this spring will be the most favorable moment for a pre-emptive strike to delay or cripple Iran's nuclear program. Such a strike, were it to occur, could potentially involve the United States, particularly since there are questions about whether the Israeli military is capable of launching an effective strike on Iran without American military support. Israel has launched pre-emptive attacks on its neighbors' nuclear programs before, destroying an Iraqi reactor in 1981 and a Syrian reactor in 2007. However, the size of Iran, its distance from Israeli military bases, and the geographical dispersion of its nuclear installations would make a pre-emptive strike on Iran far more difficult. Furthermore, Iran is a major state sponsor of terrorism, and it is widely expected that an Israeli or joint American/Israeli attack would lead Iran to retaliate against American and Israeli targets throughout the Middle East.

    There is also an economic dimension to all of this: Iran is a major oil producer (though under heavy sanctions from the United Nations, the European Union, and the United States) and is adjacent to some of the world's most heavily-trafficked sea lanes. Oil prices have already spiked in no small part due to tensions in the region, and a military strike could send the cost of gasoline into the stratosphere. This would have serious repercussions on the U.S. economy and U.S. politics.

    Iran's nuclear energy (as opposed to weapons) program dates back to the 1950s (Iran signed the Nuclear Non-Proliferation Treaty, pledging not to develop nuclear weapons, in 1968), and was ironically enough sponsored by the United States. Since the 1979 Islamic revolution, however, there have been growing concerns that the Islamic regime is developing nuclear weapons under the auspices of the civilian program. Iran's leaders have repeatedly denied that the country has a nuclear weapons program, and even some within the U.S. Intelligence Community (meaning the sum of the various intelligence agencies like the Central Intelligence Agency, the National Security Agency, and the Defense Intelligence Agency, among others) admit that Iranian intentions are ultimately unknown.

    The Iranian military has missiles capable of hitting targets in Israel, widely considered to be the most likely target of any future Iranian nuclear strike (however, it is far from the only one -- several European and Arab capitals are also potentially vulnerable). Due to Israel's small size and high population density, only a handful of nuclear weapons would be required to completely annihilate the Israeli population (and, given the geographical proximity, the Palestinian population of the Gaza Strip and the West Bank). This fact, combined with the anti-Semitism of the Iranian leadership (the Iranian president, Mahmoud Ahmadinejad, has repeatedly denied that the Holocaust ever happened and has allegedly called for the annihilation of the State of Israel), has triggered fears in the Israeli government of a possible "second Holocaust" of the Jewish people.

    Israel developed its own nuclear arsenal in great secrecy in the 1960s, and though it has never officially declared itself as a nuclear power, there are, according to most estimates, around 200 warheads in the Israeli arsenal. The Israeli navy fields submarines capable of launching ballistic missiles, ensuring that, even in the event of a nuclear strike annihilating Israel proper, the remnants of the Israeli military will be able to launch a retaliatory strike. This is one of the main arguments against a pre-emptive strike against Iran's nuclear facilities -- if Iran's leadership is not suicidal, then they would never actually launch a nuclear attack on Israel, since the Israeli counterattack would destroy Iran.

    What the Left & Right Say

    Both liberals and conservatives are divided on what to do about Iran -- not just with each other, but among themselves. President Obama has repeatedly said that no options are off the table when it comes to containing Iran's nuclear program, including military force. The president's stance comes from two distinct schools of thought. One is that the United States must protect its ally Israel from a potential nuclear attack from Iran; the other is the president's commitment (widely shared by liberals) to the principle of nuclear non-proliferation.

    Many to the left of the president, however, are against attacking Iran, arguing that the U.S. should not involve itself in yet another long, bloody, and expensive war in the Middle East; that the dangers of a war with Iran are far greater than a nuclear-armed Iran, which can be contained through classic principles of nuclear deterrence; that Iran may not actually even be developing nuclear weapons; that Israeli and American interests are, in this case, not the same; and that Israel is exercising an undue amount of influence on American policy.

    Conservatives are also divided, though less so than liberals. Neo-conservatives, out of favor since the Iraq War, go further than most, arguing that a war with the goal of regime change ought to be waged against Iran. More moderate conservatives limit themselves to calling for air strikes on Iran's nuclear installations, since, they argue, Iran's leaders are religious zealots who cannot be expected to act rationally and will therefore not be deterred through mutually assured destruction. The GOP presidential campaign has brought forth declarations of support for Israel and military action against Iran from all candidates save one. Libertarian Ron Paul is on record as opposing war with Iran, and libertarians in general are against a first strike against Iran.

    Historical Background

    What follows is a brief sketch of Iranian history -- and it is an ancient history indeed. The Persian Empire, in its various forms and dynasties, spanned some 2,500 years. The Achaemenids ruled from Greece to India, butted heads with the Spartans at Thermopylae, allowed (according to the Bible) the Jews to return to Israel from the Babylonian captivity, and were only finally conquered by Alexander the Great; the Parthians and later the Sassanids were Rome's and Byzantium's great enemy in the East; the Muslim conquerors of Iran adopted many Iranian customs themselves while thoroughly Islamizing Iran; the country felt the wrath of Genghis Khan and Tamerlane. Zoroastrianism, one of the earliest monotheistic religions, was founded in Iran by the 6th century BCE and retains some adherents even to this day. Iranian culture, with roots stretching back millennia, is "Iran's prize possession," according to scholar of Iran Richard Nelson Frye, and Iran's ancient history and culture are a source of immense pride and passion among Iranians.

    Modern Iran, however, is the product of a twentieth century replete with despotism, democracy, revolution, and theocracy. In 1900, Iran was a highly decentralized country dominated by Great Britain and Russia -- the king, or shah, had little power outside of Tehran. Reza Shah, who came to power in a coup in 1921, modernized the Iranian state and dramatically expanded the size of the army, which formed the basis of his power. Despite Reza Shah's frequent efforts to break free of the British orbit (he was sympathetic to Nazi Germany and even requested that the name Iran -- which means "Land of the Aryans" -- be used instead of Persia on official documents), Britain still had an outsized influence. When the Germans invaded the Soviet Union, the British and the Soviets invaded Iran and deposed Reza Shah, installing his son Muhammad Reza Pahlavi in his place. The monarch's power was somewhat circumscribed from 1941 to 1953 -- the British encouraged constitutional monarchy and the Soviets supported the socialist movement in Iran.

    This period of liberalization ended after the war, when Prime Minister Muhammad Mossadeq, a liberal and nationalist politician, attempted in the early 1950s to nationalize Iran's oil industry, which had hitherto been controlled by the British Anglo-Iranian Oil Company. Bluntly admitting in internal documents that "control of [Iranian oil] is of supreme importance," the British MI6, along with the CIA, launched a coup against Mossadeq in 1953, the result of which was to make the shah, backed by the army, again the supreme power in Iran. The shah presided over an oppressive, authoritarian state -- SAVAK, the shah's secret police, was renowned for its effectiveness in suppressing dissent. He was also firmly pro-American, and even maintained cordial relations with Israel.

    The shah's reign came to an end in 1979, when a popular revolution forced him to flee the country. Ayatollah Ruhollah Khomeini, a political and religious leader and fierce opponent of the shah, was declared the Supreme Leader of the newly-formed Islamic Republic of Iran later that year, outmaneuvering his more secular and less zealous fellow revolutionaries. The new republic, despite some democratic elements (including an elected president and parliament), was a theocracy, with supreme political power vested in unelected clerics.

    The new government regarded both the Soviet Union and the United States with hostility, and relations with America deteriorated to the breaking point after a group of Tehran students took the staff of the U.S. embassy hostage. Khomeini also spoke of exporting the Islamic revolution abroad, alarming the secular dictators who dominated the Arab world. Seeking to capitalize on Iran's weakened position after the revolution, Iraqi dictator Saddam Hussein launched an invasion in 1980, sparking an eight-year-long bloodbath that involved both the use of chemical weapons and the reappearance of World War I-style trench warfare, and killed nearly half a million Iraqis and half a million Iranians. The war concluded in a stalemate despite Soviet, American (the U.S. went so far as to engage Iranian naval and air forces in the Persian Gulf), French, and Arab support for Iraq.

    Peace with Iraq in 1988 and Khomeini's death in 1989 moderated the regime somewhat, with the pragmatic Akbar Rafsanjani and the moderate Mohammad Khatami successively serving as president, though the hardliner Ali Khamenei succeeded Khomeini as Supreme Leader. Iranian politics took a turn back toward extremism, however, after American president George W. Bush labeled Iran part of the "Axis of Evil" in 2002 (Iran had hitherto been cooperating in a limited capacity with the United States in Afghanistan). In 2005, the conservative populist Mahmoud Ahmadinejad, who caught global headlines with his bellicose rhetoric, was elected over the more moderate Rafsanjani.

    In 2009, however, Ahmadinejad's re-election was marred by allegations of massive fraud, which sparked immense national protests that culminated in the deaths of nearly one hundred people. Western analysts believe that as a result of the crackdown the government enjoys little legitimacy in the eyes of its urban population; this could easily change, however, if a foreign power began bombing, as the Iraqis did in 1980.

    Topics for Discussion

  • Do you think Israel and/or the United States should attack Iran's nuclear facilities? How does this fit in with the history of foreign intervention in Iranian history?
  • Should the U.S. and Great Britain not have launched a coup against Mossadeq? What were some of the consequences of that coup?
    https://historynewsnetwork.org/article/145210
    Electing the President: How Do You Make Up Your Mind?

    Worth Reading

  • HNN Hot Topic: Election 2012
  • HNN Hot Topic: Electing Presidents
  • HNN Hot Topic: The Electoral College
  • Should We Take Away the Voting Rights of 18 Year Olds?
    Background

    What qualities should one look for in a presidential candidate? Since the advent of television, many Americans seem to have decided that presidents should be selected on the basis of their personality and image: how they come across on television. The way many Americans choose presidents today marks a sharp departure from the past. While personality and image were always important factors, they were usually not decisive until TV came along. Before TV, voters placed heavy emphasis on a candidate's résumé and political party affiliation.

    In the current polarized political climate party affiliation, to be sure, is still important. But how voters pick their party differs from the past, when economic considerations drove the decision. Today it is often cultural factors, rather than, say, membership in a union, that shape a voter's preference for a party.

    Studies conducted by the University of Michigan demonstrate that by some measures voters today know less about the issues than their counterparts forty years ago. Several factors are responsible for this decline. Three stand out: (1) television, which transmits emotion well but not information; (2) the weakness of the party system, which formerly helped voters identify particular parties with particular policies; and (3) the collapse of the union movement, which formerly helped educate voters about the issues.

    The role of the media in this decline is unquestionable. The media's focus on sound bites, gotcha journalism, and personality obscures the issues voters need to understand to make sensible decisions.

    The great journalist and historian Theodore White used to say that presidential elections turned on three issues: war and peace, bread and butter, black and white. No doubt these issues remain important. But today elections are just as likely to turn on media caricatures of the candidates, including such superficial questions as whether they seem comfortable on TV.

    What the Left Says

    Like conservatives, liberals are often drawn to outsiders. But unlike conservatives, they've been more willing to give relatively young candidates a shot at the presidency: John Kennedy, Gary Hart, John Edwards, Barack Obama. Experience has seemed to count less than imagination and charisma. Since Kennedy, Democrats have frequently longed for another Camelot candidate who could inspire change.

    What the Right Says

    Conservatives have tended to shy away from younger candidates, favoring instead party war horses who have already gone around the track a couple of times. Bob Dole and Ronald Reagan won the GOP nomination on their third tries. George H.W. Bush and John McCain won the nomination on their second tries.

    In the postwar period Republicans have been drawn to outsiders and military leaders. They went for Gen. Dwight Eisenhower over Sen. Robert Taft, Gov. Ronald Reagan over George H.W. Bush, Gov. George W. Bush over Sen. John McCain. Although they nominated Sen. Barry Goldwater, it wasn't because he was a senator so much as because he championed outsiders. Dole and McCain became party nominees because they were war heroes, not because they had served in the Senate.

    While conservatives say that they admire business people, no businessman between Wendell Willkie and Mitt Romney made it very far as a presidential candidate.

    Historical Background

    They knew they didn't want a king. But the Founding Fathers also knew concretely what they wanted in a president: George Washington. They designed the office with him in mind. By their lights Washington was ideal: he was a hero with a long, impressive resume. Most importantly, he was above politics. His signature appeal was his oft-stated desire not to be selected as president. He preferred, he insisted, to return to Mount Vernon where he could tend to his farm. His ideal was Cincinnatus, the Roman military hero who returned to his farm after his military career had ended.

    No one who succeeded Washington ever approached his heights before entering office. But the first six presidents fit the founders' ideal. All had established themselves as outstanding individuals with national reputations and long resumes before their selection as president. Thomas Jefferson and John Adams were heroes of the Revolution. James Madison was regarded as a father of the Constitution (though modern historians note that many of the provisions he championed were dropped). James Monroe and John Quincy Adams had distinguished themselves as diplomats. Three of the six had served as secretary of state, which was regarded as a stepping stone to the presidency.

    Then the masses began voting. And once they did the resumes of presidents came to count for a lot less. Instead, voters were drawn to men (all were men) who in one way or another reminded them of themselves either by the way they dressed or talked or had made their way in the world. Unlike Washington, these men could not remain above politics, though most pretended they were.

    Topics for Discussion

  • Would George Washington be able to be "George Washington" in today's political climate?
  • Do voters make rational choices?
  • Should we prefer candidates with long resumes?
    North Korea

    Download this backgrounder as a Word document

    Background

    North Korea caught headlines at the very end of last year when Kim Jong-il, the supreme leader of the country since 1994, died suddenly, though not unexpectedly; the elder Kim had been grooming his son Kim Jong-un to take power since June 2010. Before Kim Jong-il came his father Kim Il-Sung (who, after his death, was proclaimed “Eternal President of the Republic”). A Kim has been supreme leader of North Korea since the end of Japanese rule in 1945.

    North Korea—officially known as the Democratic People’s Republic of Korea—is one of the most repressive and authoritarian states in the world. Commonly referred to as a communist country, North Korea’s governing ideology is in fact difficult to define. Officially, the country adheres to the juche idea, which roughly translates to self-reliance. Western commentators have variously described North Korea as Stalinist, fascist, national socialist, neo-monarchist, and theocratic. The late Christopher Hitchens described Pyongyang as “rather worse” than the dystopic London of George Orwell’s 1984. B.R. Myers has argued that North Korea’s defining ideology today is not Marxism-Leninism but “an implacably xenophobic, race-based worldview derived largely from fascist Japanese myth.”

    Power is centralized in the hands of the Workers’ Party of Korea and the military. Kim Jong-il initiated a “military-first” policy in the late 1990s that emphasized the pre-eminence of the military in North Korean life. One of the direct consequences of this policy has been North Korea’s nuclear weapons program. North Korea unsuccessfully tested a nuclear weapon in late 2006 and in 2009 successfully tested a small warhead (2-6 kilotons yield, a fraction of the power of the Hiroshima bomb). In response, the United Nations imposed further sanctions.

    The North Korean economy is moribund: per capita GDP is among the lowest in the world and private enterprise is officially illegal. It has been estimated that North Korea, which has about 50 percent of the population of South Korea, has an economy only 3 percent as large. International trade, except for the black market in arms and drugs, is almost nonexistent (partly due to U.N. sanctions), and what little above-the-board trade does occur goes overwhelmingly to China.

    In short, there’s a reason why North Korea is colloquially known as “the hermit kingdom.”

    With the death of Kim Jong-il, there has been renewed speculation that the country may undergo significant changes. Kim Jong-un is young and relatively inexperienced, and may therefore not enjoy the full support of the military. Kim Jong-nam, Kim Jong-il’s eldest son, half-brother of Kim Jong-un and current resident of Macau, told a Japanese journalist that the regime needed to embrace reform based on the Chinese model or face destruction.

    North Korean refugees paint a portrait of an almost impossibly closed society, with almost no information from the rest of the world allowed. The regime permitted the use of cell phones in 2008 and even established a 3G wireless network, but the state controls almost all content and does not permit phone calls outside the country.

    The collapse of the North Korean regime is desired by neither China nor South Korea. The former fears a potential influx of huge numbers of refugees (since the Korean demilitarized zone is heavily mined, North Korean refugees would most likely flee across the Chinese border, and indeed most refugees and defectors use this route); the latter fears the immense economic and social problems of integrating the populace of such a repressive and backward state into one of the most advanced economies on the globe.

    Recommended Reading

  • B. R. Myers: Dynasty, North Korean-Style
  • HNN Hot Topics: North Korea

    What the Left & Right Say

    North Korea has been a source of relative unanimity amongst American politicians, in that both Democrats and Republicans believe North Korea to be a hostile totalitarian dictatorship. (The North Koreans, for their part, view the United States as their most implacable enemy.) How exactly to deal with North Korea has, however, been a subject of contention between the two parties.

    The defining issue in U.S.-North Korean relations for the past twenty years has been North Korea’s nuclear weapons program. The Clinton administration signed an agreement with North Korea in 1994, over congressional Republican opposition, to freeze its weapons program (the GOP believed, correctly, that North Korea could not be trusted to keep its word). The Bush administration refused to negotiate bilaterally, preferring the six-party talks between the U.S., North Korea, South Korea, China, Japan, and Russia. Despite President Bush’s famous declaration that North Korea was a member of the “Axis of Evil” alongside Iraq and Iran, relations gradually thawed until 2009, when, in relatively quick succession, North Korea successfully tested a small atomic bomb and arrested two American journalists; in 2010 it attacked and sank a South Korean corvette.

    Historical Background

    Korea, a Japanese colony from 1910 to 1945, was divided into American and Soviet occupation zones at the end of World War II; the Soviet-backed Democratic People’s Republic of Korea was officially established in 1948 under Kim Il-Sung. At the time, it was one of over half a dozen Soviet-backed republics in Europe and Asia. In 1950, with the support of the Soviet Union and the newly established People’s Republic of China, North Korea launched an invasion of the South; only U.N. intervention stopped Kim from forcibly reunifying the country.

    A peace treaty was never signed between the belligerents, only an armistice agreement which established the still-extant 2.5-mile-wide demilitarized zone between North and South. After the war, Kim Il-Sung consolidated his power over the country, rejecting Soviet-style de-Stalinization—indeed, Kim intensified his personality cult to the point where it reached God-like dimensions—while navigating a political tightrope between China and the Soviet Union after the Sino-Soviet split. By the 1970s, North Korea had begun to embrace juche rather than Marxism-Leninism as its fundamental ideology, and by 1991 the collapse of the Soviet Union had left the already cash-strapped North Korean state without its major source of foreign aid. South Korea, on the other hand, enjoyed explosive economic growth throughout the latter half of the twentieth century and in 1987 transitioned to democracy from an authoritarian anti-communist regime.

    In the 1990s, North Korea suffered a severe famine, caused by a combination of flooding, economic mismanagement, and the loss of Soviet aid, which by some estimates killed upwards of 3 million people. The famine, combined with the death of Kim Il-Sung in 1994, was possibly the most serious challenge to the North Korean regime since the Korean War, but the regime survived, partly by virtue of Kim Jong-il’s military-first policy, which mollified one of the few potential power bases outside of the Kim family.

    Stories

  • North Korean propaganda has a reputation in the West of making some rather … extravagant claims about the abilities of the Kim family. Kim Il-Sung was credited with defeating Japan and the United States almost single-handedly, while Kim Jong-il was credited with the ability to control the weather with his mind.
  • Kim Jong-il was the subject of much ridicule in the West for his tyranny and extravagant lifestyle. He was reported to be a huge fan of Western films, owning nearly 20,000 DVDs, even going so far as to kidnap a famous South Korean director and his actress wife to make movies in the North. He loved basketball and among his most prized possessions was a basketball autographed by Michael Jordan; he also reportedly based his trademark bouffant hairstyle on Elvis Presley.
  • In 2004-2005, North Korean state television aired a hybrid government announcement/reality show entitled Let’s trim our hair in accordance with the socialist lifestyle as part of a campaign against long hair (viewed as Western decadence). Hidden cameras were placed around Pyongyang and citizens who had long hair not in accordance with the socialist lifestyle were publicly shamed.
  • North Korea’s soccer coach during the 2010 World Cup claimed to receive “regular tactical advice” during matches from Kim Jong-il himself “using mobile phones that are not visible to the naked eye.” The North Korean leader was reputed to have invented the device, along with, as ABC News dryly noted, the hamburger.
  • New North Korean leader Kim Jong-un reportedly underwent plastic surgery to make him more closely resemble his grandfather Kim Il-Sung before being introduced to the North Korean people in 2010.

    Recommended Reading

  • To Sell a New Leader, North Korea Finds a Mirror is Handy

    Questions

  • How would you characterize the regime in North Korea? Is it communist, fascist, monarchist? All of the above? None?
  • How has the North Korean regime managed to survive?
  • Does North Korea pose a threat to its neighbors and/or the United States? Does the United States pose a threat to North Korea?
  • Why has the Kim family developed a personality cult of truly gargantuan proportions?
  • What is Historical Thinking?

    Electing the President: Caucuses and Primaries

    Download this backgrounder as a Word document

    Worth Reading

  • HNN Hot Topics: 2012 Elections
  • HNN Hot Topics: Electing Presidents
  • HNN Hot Topics: The Electoral College
  • Timothy R. Furnish: Should We Take Away the Voting Rights of 18 Year Olds?

    Background

    Every four years the United States elects a president. In the modern era the two main parties (Democrats and Republicans) select their nominees at caucuses and primaries that take place during the first six months of the year. Candidates compete for delegates to the national conventions, which formally select a nominee during the summer.

    Candidates compete for delegates in caucuses and primaries. At a caucus voters meet in neighborhood gathering places to discuss the election and then hold a vote. In primaries voters simply cast a ballot. In primary states politicking at the polls is actually illegal. By custom Iowa holds the first caucus and New Hampshire holds the first primary. Both states say that it's useful to the country for candidates to compete in their small states first. They argue that this system gives lesser-known candidates a chance to compete since costs are relatively low. Most candidates use the elections in these two states to hone their message and engage in retail politics: meeting voters one-on-one. Critics argue that the states are unrepresentative of the country as a whole. After Iowa and New Hampshire, it's mostly a free-for-all, with states vying to hold their elections according to a schedule set by state and national party leaders. On Super Tuesday -- which is usually held in February or early March -- more states hold elections than on any other date. Super Tuesday often settles the race in both parties.

    Because modern campaigns cost millions of dollars -- in 2012 Barack Obama's campaign is expected to spend more than a billion dollars -- most candidates run out of money quickly and drop out early unless they are on a winning streak. Winners attract donors, who want to know that their money isn't going for a lost cause.

    Unlike the general election, the winner-take-all principle does not hold during the primary season. Candidates win delegates in proportion to their totals on election night, though Republicans shift to a winner-take-all approach in contests that take place starting in April (Florida, which held its primary at the end of January, is the sole winner-take-all primary that occurs before April). The system is designed to give losing candidates an incentive to remain in the race in hope of breaking out later on.

    What the Left Says

    Liberals bemoan the role of money in elections and insist the current system is basically corrupt. They argue that presidential contests have become so expensive that they favor either rich candidates or candidates who sell out to groups that possess the ability to generate large contributions. The birth of social media has changed the dynamic of elections somewhat, giving candidates the chance to raise millions in small donations from large numbers of ordinary voters. But most campaigns still rely heavily on large donations. While donations to a candidate's campaign are limited by law, there are many ways around the restrictions. Contributions can be bundled together from the employees of a corporation or labor union, giving those entities more leverage over a campaign than individual voters who make small contributions. And nothing prevents a wealthy individual from spending millions of dollars to help elect a candidate as long as they do not coordinate their activities with the candidate's campaign. Similarly, Super PACs (political action committees) are allowed to spend as much as they want on a candidate's behalf as long as they too do not coordinate with the campaign they are benefiting. But liberals say it's easy to get around the restriction. Nothing stops a candidate from announcing publicly how these Super PACs could be helpful.

    What the Right Says

    Conservatives say it's a fool's errand to try to place limits on money in politics. While some conservatives like Sen. John McCain (R-AZ) from time to time have supported campaign finance reform, most disparage restrictions as an impairment of political freedom. They argue that donors should be able to give as much money as they wish as long as the identity of the donors is disclosed. When liberals cry that corporations are influencing the political process, conservatives retort that labor unions also have an outsized impact. Conservatives decry the use of labor dues on behalf of political candidates the union supports given that union members are usually not given an opportunity to say who should receive support and who shouldn't. Conservatives like columnist George Will argue that liberals are misguided in thinking that there's an excess of money in politics anyway. As Will likes to joke, we spend more on potato chips every year ($7 billion) than we do on elections.

    Historical Background

    What set the United States apart from the very beginning was the country's commitment to elections. In the early years of the Revolution, before the ratification of the Constitution, suffrage was restricted to white males with property. By the time George Washington became president, property qualifications had been abolished in most states, though some still restricted voting to taxpayers. It's estimated that between 60 and 70 percent of adult white males could vote. Nowhere else in the world did so many have voting rights so early.

    The Founding Fathers' embrace of the principle of self-government was married to a deep suspicion of popular opinion. To curb the possibility that demagogues might whip voters up into a frenzy from time to time, the Founders limited direct popular control of the federal government to one half of one branch: the House of Representatives. The Senate was elected by the state legislatures. The President was elected by the Electoral College. The Supreme Court was appointed by the President with the confirmation of the Senate. Later amendments to the Constitution eased some of these restrictions. By the twentieth century the Electoral College had become a nearly worthless relic and senators were elected directly by the people.

    In the early years of the Republic elites (primarily concentrated in New York and Virginia) held sway over the government. Not until the election of Andrew Jackson did political parties in the modern sense develop. But once they did the elites learned that they had to share power with ordinary people, who now could express their will at the ballot box through organized parties. The invention of the party system is one of America's chief political innovations. From the start the parties were dominated by bosses, which somewhat diminishes the halo that hovers over our political history. They remained a keystone in the American political system until the 1960s, when television gave American voters the opportunity to see politics up close and decide for themselves which candidates they wanted to support. Once the candidates figured out that they could appeal to the people over the heads of the bosses through television, the days of the bosses were numbered.

    The first experiment with primaries took place early in the twentieth century in connection with the birth of the Progressive movement. But within a few years party bosses beat back the attempt to give voters direct control over the nominating process. They retained control through 1968, when the Democratic Convention gave the nomination to Hubert Humphrey, even though he hadn't won a single primary. Angered by this turn of events, reformers revamped the system, establishing the now-familiar process in place today.

    Topics for Discussion

  • Does the system produce good candidates?
  • Does money play too big a role in the selection of winning candidates?
  • Should Iowa and New Hampshire continue to hold the first contests?
  • What is Historical Thinking?

    Occupy Wall Street

    Worth Reading

  • HNN Hot Topics: Occupy Wall Street

    Background

    Occupy Wall Street began on September 17, 2011, prompted by a July 13 blog post in which the Canadian anti-consumerist magazine Adbusters proposed that “20,000 people flood into lower Manhattan, set up tents, kitchens, peaceful barricades and occupy Wall Street for a few months.” The protesters borrowed tactics from the protests in Cairo’s Tahrir Square, which toppled the Mubarak regime, and from the Indignants movement in Spain, especially their use of online social networks like Facebook and Twitter for communication.

    Though the movement itself has been criticized, even by its supporters, for its lack of specific demands and goals, Occupy Wall Street has indisputably changed the national conversation from the debt and deficit talk of August to a discussion of income inequality and the fading sense of opportunity in modern America, particularly for young people.

    Unemployment, particularly high among young people, is one of the protesters’ major grievances; another related problem is costly college tuition, which is a leading cause of exploding student debt. Without jobs, many recent college graduates feel there’s no way for them to pay back their loans and get on with their lives.

    The protesters have also voiced opposition to the influence of money in politics, particularly the 2010 Citizens United decision by the Supreme Court—the “corporations are people” ruling which held that corporations, unions, and other special interest groups could spend an unlimited amount of money on elections.

    The protest itself coalesced in Zuccotti Park in Lower Manhattan, a few blocks away from Wall Street itself. Eight days after the protest began, on September 24, the New York Police Department clashed with protesters and arrested some eighty people. Video of a police officer pepper-spraying nonviolent protesters went viral on the Internet, generating a great deal of media attention for the protests, causing them to go national, even global.

    At the peak of the movement in October 2011, nearly one hundred cities in the United States saw Occupy protests, and similar protests occurred in Canada, the United Kingdom, Israel, Japan, and even Mongolia. Violent riots at Occupy Oakland culminated in the first general strike in the U.S. since 1946.

    One of the more novel aspects of the Occupy movement has been its emphasis on consensus-based democracy in its decision-making. Indeed, one of the movement’s major talking points has been that it’s “leaderless,” though observers (especially at the original encampment in New York) have noticed individuals who have taken on the role of “organizers.” Since the City of New York does not allow megaphones in its parks without a permit, the Occupiers used a call-and-response “human microphone” that supporters claim is a unique community-building tool to ensure that every voice is heard—critics, on the other hand, say it’s a silly gimmick.

    What the Left Says

    Liberals, or at least mainstream liberals, have been ambivalent about Occupy Wall Street. On the one hand, it’s a movement that finally looks like it could be the Left’s answer to the Tea Party—it’s changed the national conversation from debts and deficit reduction to income inequality and corporate greed; it’s energized the grassroots; it’s even gotten support from blue-collar union workers.

    On the other hand, much of Occupy’s ire has been directed at Democrats in general, and President Obama in particular, for their purportedly close relationship to Wall Street. Democrats have come under fire for their support of the bank bailouts and the Dodd-Frank reform of the financial system (which Occupy regards as weak), and their pursuit of Wall Street dollars for re-election campaigns.

    The reaction from the Far Left has been much more positive—many have made triumphal comparisons to the ‘60s counterculture, and some even see Occupy Wall Street as the next step in an ongoing global protest movement, pointing to the Arab Spring, the anti-austerity protests in Europe, and now the anti-Putin protests in Russia as proof.

    Recommended Reading

  • Simon Hall: Occupy Wall Street Should Remember that No One Liked Vietnam Protesters, Either
  • James Livingston: Occupy Wall Street has a History Student on Its Press Team—I Sit Down with Him for an Interview

    What the Right Says

    Conservatives have criticized Occupy Wall Street from the beginning as an unrealistic, uninformed, pointless, and illegal waste of taxpayer money and public space. Conservatives allege that their goals are incoherent, their understanding of economics in general and capitalism in particular is laughable, and the Occupy movement does not offer any specific solutions. In any event, the Occupy movement’s politics of social justice, its emphasis on the dangers of inequality and the corrupting influence of money are at odds with the conservative belief in individualism, personal responsibility, and the free market.

    One-time Republican presidential candidate Herman Cain bluntly said, “Don’t blame Wall Street, don’t blame the big banks, if you don’t have a job and you’re not rich, blame yourself. It is not someone’s fault if they succeeded, it is someone’s fault if they failed.”

    It’s not just liberals who have noted the Occupy protesters’ resemblance to the hippie counterculture of the 1960s—conservatives have, too, but with much more negative undertones. Conservatives have sneered that protesters are indigent, shabby, vague about their political goals, and hedonistic. They have also pointed to tensions within the Occupy movement between homeless participants and the more well-heeled college grads as evidence of its insincerity.

    Recommended Reading

  • Niall Ferguson: Yes, Wall Street Helps the Poor
  • Sheldon Richman: Wall Street Couldn’t Have Done It Alone

    Historical Background

    Mass economic protests are as American as baseball and apple pie. Shays’ Rebellion in Massachusetts in 1786-1787 and the Whiskey Rebellion of 1791-1794 are early examples of unrest caused by a bad economy. Indeed, in both cases, rural farmers were facing heavy taxes supported by urban business elites. In the case of the Whiskey Rebellion, it took a 13,000-strong army led by President Washington himself to end the unrest, a move which upset many at the time (not least the farmers themselves) and which some historians believe was an overreaction.

    A more direct predecessor was Coxey’s Army, a protest march of the unemployed to Washington D.C. during a depression in 1894 to lobby the government for jobs (they were promptly arrested upon arrival).

    More famous (and much larger) than Coxey’s Army was the Bonus Army. In 1932, unemployed veterans of World War I marched on Washington to demand the immediate payment of bonuses which had been promised to them by the federal government but which were not redeemable until (ironically) 1945. The protest was massive—43,000 people, including the vets’ families—so President Herbert Hoover sent in the army. Douglas MacArthur, then the army chief of staff, personally led two regiments of infantry and cavalry (backed by tanks commanded by none other than George S. Patton) to clear the Bonus Army’s encampment. Fifty-five veterans were injured and a hundred and thirty-five were arrested. The vets ended up getting their bonuses in 1936, after Congress overrode a veto from Franklin Delano Roosevelt.

    Stories

  • Western farmers in the 1790s distilled their excess grain as whiskey, since whiskey was easier to transport and there was a high demand for it. In some areas, whiskey was used as currency—so when the federal government raised taxes on whiskey the farmers rebelled.
  • The Bonus Army returned to Washington in 1933 to petition the new president, Franklin Roosevelt. Though he refused to grant the bonuses (and, as noted above, even tried to veto the bonus bill in 1936), FDR was remembered far more fondly by the veterans. Why? Eleanor Roosevelt came to visit with the protesters. Said one to the press: “Hoover sent the army, Roosevelt sent his wife.”
  • Protesters at Occupy UC-Davis (the University of California at Davis), in response to campus police pepper-spraying peaceful protesters, staged a sit-in between the campus chancellor’s office and her car. When she came out of her office, she was met by hundreds of students sitting in absolute silence along her path.

    Topics for Discussion

  • What’s the role of protest in our democracy?
  • How will Occupy Wall Street affect the 2012 election?
  • Is Occupy Wall Street the beginning of a new global protest movement?
  • Why do conservatives (and some liberals) dislike Occupy Wall Street?
    Tea Party

    Worth Reading

  • HNN Hot Topics: The Tea Party Movement
  • Simon Hall: The Tea Party, Patriotism, and the American Protest Tradition
  • Vote iQ Hot Topics: Tea Party History
  • Vote iQ Hot Topics: What's the History Behind the Deficit?

    Background

    The origins of the Tea Party are broad and deep, but the spark came on February 19, 2009, when CNBC Business News editor Rick Santelli, in a broadcast from the Chicago Mercantile Exchange, called for a “tea party” to dump derivatives into the Chicago River. His call to action went viral, and by Tax Day, April 15, over 750 Tea Party rallies had been held across the country, with nearly 300,000 people participating in total.

    The Tea Party is neither a political party nor a monolithic political movement—it is rather a loose collection of like-minded groups. Some are small grassroots organizations; others are well-funded, corporate-backed entities. What unifies these groups is anger over taxpayer-funded bailouts of big banks and companies and wasteful federal spending, antipathy to social entitlements in general and health care reform in particular (though polls indicate that a majority of Tea Partiers do not want cuts made in existing Medicare and Social Security benefits), and self-identification as Republican and conservative.

    Tea Party supporters tend to be older, whiter, somewhat more affluent, and more likely to self-identify as conservative than the general population. The most important element of Tea Party ideology is economic: supporters tend to be staunch free-marketers and staunch opponents of government intervention in the economy—in other words, they want less government, and lots of it. Tea Partiers also tend to hold orthodox conservative positions on social issues like gay marriage, are skeptical of climate change, and support measures against illegal immigration; many are also skeptical of America’s involvement in foreign wars. Nonetheless, those policy preferences are less central to Tea Party identity than economics.

    The Tea Party had an outsized influence on the 2010 midterm election. Tea Party-aligned Republican candidates won House, Senate, and gubernatorial seats throughout the country, with some big winners being Senator Scott Brown in Massachusetts, Senator Rand Paul in Kentucky, and Governor Nikki Haley in South Carolina. However, Tea Party candidates also unseated many establishment Republicans in the primaries, and in some cases proved too extreme for the general electorate: Christine O’Donnell in Delaware, Sharron Angle in Nevada, and Joe Miller in Alaska all lost close Senate races to their Democratic opponents.

    What the Left Says

    Liberals have been very critical of the Tea Party since the beginning of the movement, seeing it as an overreaction to the election of a Democratic president. Some liberals go even further, stating that the Tea Party is an overreaction to a black president; indeed, one of the more common liberal criticisms of the Tea Party is that it’s racist, citing the movement’s largely white demographic make-up and the occasional insensitive protest placard. Another common liberal criticism is that the movement is fundamentally inauthentic: that it was hyped and sold by the conservative Fox News Channel, and that too many of the most influential Tea Party groups are in fact fronts for wealthy and powerful Republican interests. FreedomWorks, one of the largest, is run by former GOP House majority leader Dick Armey.

    Recommended Reading

  • Barbara Smith: Comparing the Modern Tea Party to the Original
  • Richard Striner: The Mad Hatter's Tea Party
    What the Right Says

    Conservatives maintain that the Tea Party is an organic movement that has been misrepresented, even slandered, in the mainstream media as “neo-Klansman and knuckle-dragging hillbillies.” Far from embodying these odious stereotypes, conservatives believe Tea Partiers are law-abiding, patriotic American citizens who are very worried about the direction of the country. They’re concerned about excess regulations, taxes, and government corruption and crony capitalism. Indeed, for conservatives, the Tea Party provided a much needed dose of reality in contrast to the almost messianic expectations liberals had for Barack Obama.

    Recommended Reading

  • Todd Zywicki: Repeal the Seventeenth Amendment
  • Fred Siegel: Insatiable Liberalism
    Historical Background

    The Tea Party is a movement very aware of the power of history as a force in politics. The use of the iconic Revolutionary War name and clothing has given it an instant identification in the public mind, helping reinforce its commitment to the Constitution.

    Indeed, the political power of the original Boston Tea Party of 1773 has been recognized since the bicentennial of 1976, when anti-tax activists donned colonial garb to protest what they viewed as unconstitutional taxes. Even Ron Paul, the GOP’s most uncompromising libertarian and candidate for the 2012 Republican presidential nomination, held a “Tea Party” fundraiser on the anniversary of the Boston Tea Party.

    Indeed, Tea Party-influenced commentators like Glenn Beck have been instrumental in sparking a popular reassessment of many American presidents. Woodrow Wilson, Theodore Roosevelt, and other progressive presidents and public figures have been reinterpreted not as crusaders for social justice, but statists who wished to limit individual freedoms. The most extreme reinterpretations have drawn lines connecting Wilson to Italian Fascist dictator Benito Mussolini.

    On an organizational level, the Tea Party also has antecedents in the several waves of right-wing populism that have ebbed and flowed since the Great Depression. Much of the rhetoric leveled against President Obama today—that he’s a socialist who wants to dismantle capitalism—echoes charges leveled against Franklin Delano Roosevelt back in the 1930s.

    Stories

  • The first Tea Party may have been held several days before Santelli’s broadcast. Keli Carender, a Seattle-based activist and improv comedy performer, organized a Tea Party rally on Presidents’ Day, February 16, 2009.
  • The original Boston Tea Party had an unlikely critic—Benjamin Franklin. Such was his belief in the sanctity of private property that he insisted the British be compensated for all of the destroyed tea—a hefty £9,000 (roughly £888,000 in today’s pounds). Indeed, several merchants from New York actually offered to repay the British, but were turned down.
  • Modern Tea Partiers deliberately took symbols and language associated with the left—the raised fist, a favorite anarchist emblem; use of the terms “Nazi,” “fascist,” and “Big Brother” as insults—and adapted them to their own purposes.
    Topics for Discussion

  • What is the Tea Party’s political platform? What are its historical predecessors?
  • With the challenge of Occupy Wall Street on the Left, will the Tea Party continue to be as politically influential in 2012 as in 2010?
  • What connections can be drawn between the Tea Party and global unrest since the 2008 financial crisis?
  • Why do liberals dislike the Tea Party?