Did You Know?

  • How Many Emails Did Bill Clinton Write as President?
  • Who Wrote More Words than Anybody Else?
  • What Are the Origins of the Jewish Blood Libel Myth?
  • NASA Considered -- Then Rejected -- Female Pilots in the 1960s
  • Ben Franklin's Musical Invention Enjoys a Revival
  • How the Zip Code Changed America
  • Does the Stock Market Do Better Under Democrats or Republicans?
  • Why Is Washington DC So Much Smaller than the Founders Envisioned?
  • What's a Spider Hole?
  • They Can't Both Be Right
  • How the Smithsonian Finally Got an African-American History Museum
  • Why Is There a Pyramid on U.S. Money?
  • The Anasazi Ate Turkey Long Before the Pilgrims
  • Who Invented Port?
  • Lincoln Never Said That
  • So the Pilgrims Celebrated the First Thanksgiving?
  • When Was Cloture Devised?
  • George McGovern's Faux Pas
  • JFK Was Almost Killed as President-Elect
  • The Woman Behind Thanksgiving
  • So Where Did the First Thanksgiving Take Place?
  • Iraq's Casualties Greater than Vietnam's During the First 3 Years of War in Asia
  • His Father Fought in the Civil War -- HIS FATHER!
  • Racist Place Names
  • Name the Person Who Made These Anti-Semitic Remarks
  • What Happened to the Riderless Horse at JFK's Funeral?
  • Oldest Condom in the World
  • What Does the Stock Market Tell Us About Next Year's Election (And Vice Versa)?
  • Islam's Division into 2 Main Camps Began When ...
  • What? Charlie Chaplin Wasn't Jewish?
  • Just How Lewd Was Elizabethan England?
  • When the Papal Chair Is Empty
  • Benedict Arnold's Flawed Gravestone
  • How Did Mobsters Hide Bodies?
  • Noted Suicides Through History
  • Did the Japanese Use Mustard Gas in WW II?
  • How the Poverty Rate Is Determined
  • Dr. Rice, I Presume!
  • Reagan Opposed the Recall
  • Edison and the Electric Chair
  • How Many Generals Have Been Elected President?
  • Did the U.S. Count Civilian Dead in Vietnam?
  • Voting in California 100 Years Ago: How Things Have Changed
  • Korea Used to Be Spelled Corea: Why That Is Now Important
  • The 10 Commandments?
  • Bush Is a Divider not a Uniter
  • Peter L'Enfant?
  • Quick: Guess Which Film Clip from the 20th Century Is the Most Popular?
  • Muslim Against Muslim: The War Within Islam
  • Keep the 10 Commandments in the Alabama Courthouse?
  • What Was the Triangle Trade?
  • Why Do Suburbs Feature Grand Lawns?
  • House-Senate Conference Committees: A Tempestuous Past
  • Road Accidents ... Are Cars to Blame?
  • Has a UN Official Ever Been Specifically Targeted for Assassination?
  • When Did World War II End?
  • What Is the Baath Party?
  • Productivity Isn't Increasing Faster Now than in the Past
  • Was Duke University Built with Tobacco Money?
  • Why Don't Editors Show Dead People on TV or in Newspapers?
  • What Event in History Cost the U.S. the Most Money?
  • Why Do Museums Now Have to Worry About the Origins of Their Artifacts?
  • Bush's 2 Question Rule
  • Did the British Invent Lasagne?
  • Putting the Death Toll of Saddam's Victims in Perspective
  • The Story of the Only Enlisted Man to Be Honored with a Memorial at Gettysburg
  • The Origins of the Great Seal
  • Wright Brothers Didn't Hail from North Carolina
  • Ohio's Importance in the Presidential Sweepstakes
  • Paul Revere's Ride: A Team Effort
  • How Easy Would It Be for a PhD to Make an Atomic Bomb?
  • Do You Know the Story Behind the Nina, the Pinta and the Santa Maria?
  • How Much Do States and Localities Spend?
  • What War Since World War II Has Been the Deadliest?
  • Is the American Economy Really the Best Off in the World?
  • Just How Badly in Debt is the United States?
  • What Kind of Gas Mileage Did the Model T Get?
  • The Gay Betsy Ross
  • How Did St. Petersburg Come into Existence?
  • Does the Stock Market Affect Presidential Elections (Or At Least Reflect Economic Conditions that Decide Elections)?
  • How Did Arlington National Cemetery Come into Being?
  • Is It Iraq or Irak? (And Is It E-raq or Eye-raq?)
  • When the U.S. Government Tested Chemicals on American Citizens
  • Did State Spending in the 1990s Get Out of Control?
  • Why Do Soldiers Shout HOO-AH?
  • Where Does the Word Quarantine Come From?
  • "Render to Caesar": What Jesus Meant
  • What's a Cakewalk?
  • How Does the Bush Tax Cut Compare with Reagan's and Kennedy's?
  • When Did Presidents Begin Releasing Their Tax Returns?
  • Was "Americanism" Always the Preserve of the Right?
  • What the Marshall Plan Cost
  • Terrorists and Pirates
  • Imbedded Journalists: Nothing New
  • New Zealand: We Beat the Wright Brothers
  • On the Relationship Between War and Money
  • Will the Bush Tax Cuts Prove Supply-Side Economics Works?
  • War-Speak
  • Iraq's Indebtedness Is as Deep as Its Oil Wells
  • War of Words
  • Public Opinion in the 1930s When Danger Lurked Everywhere
  • Do Wars Bring About Prosperity?
  • Blacks Were Not Over-Represented in the Ranks of the Vietnam Troops
  • Science and War
  • How Long Do Our Wars Last?
  • Why Do We Put Up Yellow Ribbons During Wars?
  • Will Iraq Begin Selling Oil in Dollars Again?
  • The History of the Marine Units Fighting in Iraq
  • Why Do Soldiers Wave a White Flag When Surrendering?
  • Say Goodbye to the Cold War Peace Dividend
  • Just How Big Is the Projected Deficit?
  • Artists and War
  • How Often Has the U.S. Used the Veto at the UN?
  • Bush's Infrequent Press Conferences
  • How the NYT Missed the News That Crick and Watson Had Discovered DNA
  • Out of Africa
  • So How Long Will We Stay in Iraq?
  • It Took Less Time to Try Eichmann
  • 25 Percent of Gulf War Vets Disabled
  • Is the Stock Market Doomed This Year?
  • Hard to Get Elected in a Recession
  • Antisemitism on the Rise Among the Young
  • The Man Who Claimed to Fly an Airplane in 1901
  • Did You Miss John Hancock's Birthday?
  • 2 More Bubbles to Pop?
  • 14 Women in the Senate -- A Record
  • Dow in the Dumps
  • The Dow in Perspective
  • The Senator Who Became a Governor and Appointed His Daughter in His Place
  • When the Smithsonian Snubbed the Wright Brothers
  • The Boom in Nuclear Bombs
  • Superstitions Aren't as Old as You Think
  • Killer Fog of 1952
  • What the U.S. Never Knew About Soviet Nukes
  • Immigrants Constituted Half the New Workforce in the 1990s -- A Record
  • Sterilization
  • The Housing Bubble
  • Halloween: Debunker's View
  • Japan: Basket Case?
  • Globalization Is No Myth
  • Requiring the People to Vote on a Declaration of War
  • Immigration Stats
  • Are We Saving Enough?
  • Bush and the Stock Market
  • Bush and Polls
  • Unemployment Rate
  • First World War
  • 9-11 Gallup Poll



    How Many Emails Did Bill Clinton Write as President?

    Ben Hoyle, writing in the London Times (Jan. 28, 2004):

    WHEN the historians are let loose in the Bill Clinton presidential library, they will have four million White House e-mails to trawl through. Fortunately for impatient biographers, it seems that the great man sent only two.

    Perhaps one of the great political communicators in recent history preferred the personal touch or a telephone call, according to Skip Rutherford, president of the Clinton Presidential Foundation, which is raising money for the library. "He's not a technoklutz," Mr Rutherford said.

    One of the Clinton messages was a test e-mail to see if the Commander-in-Chief could press the "send" button -- which leaves John Glenn, the former Ohio senator and the first American to orbit the Earth, with a further distinction. He was the only person to receive an e-mail from President Clinton.

    The President e-mailed Glenn, then 77, when he returned to space, after 36 years, aboard the space shuttle Discovery in 1998. "We are very proud of you and the entire crew, and a little jealous," the message read. "Back on Earth, we're having a lot of fun with your adventure."


    Who Wrote More Words than Anybody Else?

    From the London Times (Jan. 30, 2004):

    Who is, or was, the world's most prolific author in terms of words written?

    ... [T]he ancient scholar Didymus of Alexandria (1st century BC) is credited with having written between three and four thousand books (of which a single papyrus, found in a rubbish heap, survives).

    Rhazes, polymathic Persian author of several hundred works, once recalled: "In a single year I have written as many as 20,000 pages in a script as minute as that used for amulets. I was engaged fifteen years upon my great compendium, working night and day, until my sight began to fail and the nerves of my hands were paralysed".

    His contemporary, the Arab historian Tabari, is said to have written forty pages every day throughout his long life. Voltaire left behind 15 million words; his letters alone fill 100 volumes. Jean-Paul Sartre wrote up to 10,000 words a day.

    The bibliography of Bertrand Russell lists more than 3,000 published items and itself fills three volumes; he also wrote 40,000 letters. A.C. Benson, Master of Magdalene College, Cambridge, published 100 books and kept a diary of 5 million words, filling 180 manuscript volumes, enough to fill 40 volumes of print; after his death, whole shelf-fuls of unpublished books, stories, essays and poems were consigned to the flames.


    What Are the Origins of the Jewish Blood Libel Myth?

    Hugh Levinson, BBC Radio 4, producer of "The Blood Libel," writing in BBC News (Jan. 23, 2004):

    For hundreds of years, it's been said that Jews kill Christian children and drain their blood for ritual purposes. Why has this myth persisted for so long?

    Raya Beilis still remembers the day in 1911 that her father was arrested by the Kiev secret police, and put on trial for the alleged Jewish ritual murder of a Christian boy.

    "I was too young to understand," says Raya, who now lives in a nursing home in New York City. "All I knew was that they said if he's guilty, they're going to kill every one of you."

    The authorities in Kiev said Mendel Beilis had lured a teenager called Andrey Yustschinsky away from his family, killed him and drained his blood for the production of matzah, the unleavened bread eaten at Passover.

     

    The court threw out the charges, which were clearly fabricated. Mendel Beilis was freed and the feared pogrom against the Kiev Jews never happened. But where did this bizarre accusation come from?

    The origins of this anti-Semitic myth, known as the blood libel, lie in medieval England. In 1144 a skinner's apprentice called William went missing in Norwich. When his body was found, the monks who examined the corpse claimed that the boy's head had been pierced by a crown of thorns.

    Some years later, a monk called Thomas began to gather evidence about William's death. His main aim was to establish the boy as a holy martyr and draw pilgrims to the cathedral. Almost as an incidental matter, he accused the Jews of Norwich of killing the boy.

    "The unforeseen outcome of what Thomas did was to create the blood libel, which then itself takes on a life of its own," says Dr Victor Morgan, of the University of East Anglia.

    Hysteria, not evidence

    The accusation that Jews would drain the blood of children and then use it for ritual purposes is bizarre, as Judaism has a powerful taboo against blood. Indeed, kosher butchering is meant to remove all blood from meat. But the idea seems to have had a powerful hold on the mediaeval imagination.

     

    "It's not just an act of murder and of a ritual murder," says Professor Robert Wistrich, of the University of Jerusalem.

    "Removing the blood from the body and then using it for a ritual or religious purpose - there is something horrific, but yet as fascinating as it is repulsive in this notion."

    The blood libel spread across England and Continental Europe over the centuries, with hundreds of accusations, all based on hysteria rather than evidence. There were notorious blood libel cases in Lincoln in 1255 and Trento, Italy, in 1475. Many Jews were executed. Others were killed by mobs seeking revenge.

    There was another rash of accusations in the late 19th and early 20th centuries in Eastern Europe - societies gripped by economic transformation and political uncertainty, climaxing with the Beilis case of 1913.

    Even though the blood libel has been disproved countless times, it refuses to fade away. Racist groups in the US still sell videos which maintain that Jews commit ritual murder.


    NASA Considered -- Then Rejected -- Female Pilots in the 1960s

    Preston Lerner, writing in the LAT Magazine (Jan. 18, 2004):

    In 1961, then an ambitious, irrepressible 22-year-old flight instructor, [Wally] Funk was the youngest of 13 women who were secretly evaluated as candidates for NASA's space program. In several tests, she and her cohorts outperformed the men--the Mercury 7--who would rocket so famously into history. But America wasn't ready for female astronauts. "The time wasn't right," Funk says. "And the old-boy network didn't want us." The program was killed before it got off the ground, and the female pilots, who much later were dubbed the Mercury 13, faded into obscurity.

    Yet more than 40 years after her brief stint as an understudy, Funk still hungers for a star turn on an astral stage, and there's nothing she won't risk to achieve her lifelong dream of rocketing into space. Her life savings? Check. Her reputation? No problem. Her life? In a heartbeat. She has signed on as a test pilot for Interorbital Systems, a tiny Mojave-based company with grand plans to make her the first human to fly into space in a privately funded spacecraft. This unprecedented launch could occur within a year if adequate funding is secured--a really big if.

    For now, Funk has the publicity machine cranked up to redline. This summer morning she's at Santa Monica Airport to fly the flag while competing in the Palms to Pines Air Race from Southern California to Oregon. Seventy-five years ago, this was the starting point of the country's first transcontinental air race for women. Amelia Earhart and Pancho Barnes were among the celebrated aviatrixes (as they were known in those days) who flew in the inaugural Air Derby. Today, for better or worse, female pilots no longer fascinate the public. So instead of the star-studded crowd on hand in 1929, the atmosphere at the airport is as sedate as lunchtime at a laundromat--except for the whirlwind being kicked up by the human tornado with a shock of short white hair.

    Although Funk reluctantly admits to being 64, she doesn't look or act her age. Dressed in red cargo pants and work boots, she's trim, athletic, gregarious and immensely likable--think instant confidante, a ball of fire who greets acquaintances with hearty hugs and refers to friends old and new as "babe." But the first thing most people notice is her tireless energy and boundless enthusiasm for the task--usually tasks--at hand. Her teammate in the race, Lou Ann Gibson, smiles indulgently as Funk wipes down their Cessna 172 while orchestrating photos, conducting interviews and mingling with a group of well-wishers large enough to constitute an entourage.

    Gibson will pilot while Funk navigates as they compete against 19 other two-woman teams over the next two days. Gibson, an American Airlines pilot, is one of 800 or so students who have soloed under Funk's tutelage. In 1958, when Funk earned her wings, pilot instructor was just about as high as a woman could go in aviation. But she soared higher still with pioneering jobs as an inspector for the Federal Aviation Administration and an accident investigator for the National Transportation Safety Board in Los Angeles. She also completed her astronaut training on her own after the Mercury 13 program fizzled, even though NASA never showed the slightest inclination to send her into space....

    Funk's Cessna looks puny and insignificant as it waits on the wide expanse of blacktop, and her dream of spaceflight seems far, far away. Then again, could Orville Wright have imagined, as he skimmed along the sand dunes of Kitty Hawk in 1903, that Charles Lindbergh would cross the Atlantic in 1927? That Chuck Yeager would break the sound barrier in 1947? That Neil Armstrong would walk on the moon in 1969? Can Wally Funk fly into space in 2004? She's got the ability. God knows she's got the drive. She's in the right place. Who's to say it's not the right time?


    Ben Franklin's Musical Invention Enjoys a Revival (posted 1-14-04)

    Brendan Miniter, writing in the WSJ (Jan. 14, 2004):

    One of America's Founding Fathers invented a musical instrument that inspired original scores from Mozart, Beethoven and other greats. That instrument is the glass armonica (named after the Italian word for harmonic), devised by Benjamin Franklin in 1761. And out of all of his inventions, Franklin once said it was the one that gave him the "greatest personal satisfaction." But for more than a century and a half this once-popular instrument -- which employed glass bowls stacked horizontally inside one another and mounted on a small table -- sat in disrepute, nearly lost to history.

    Now, however, it's enjoying a bit of a revival. Thanks to the hard work of a handful of men and women over the past 20 years, the glass armonica is being heard at festivals, at elementary-school concerts and in at least one movie score. And Philadelphia's Franklin Institute will mark the inventor's 298th birthday on Saturday by having the instrument played during their celebration....

    Franklin found he could make beautiful, haunting music using glass bowls if they had a hole in their center and were stacked inside one another while mounted on a horizontal rod. He dipped his fingers in water, spun the bowls using a foot treadle and then played them almost like a piano. Except that he could sculpt each note by varying the speed of the bowls and the amount of pressure he applied -- similar to how a violinist uses a bow.

    The idea of using glass to make music didn't originate with Franklin. It was already centuries old when he watched music being made from drinking glasses -- tuned by being filled with varying amounts of water -- while in England in the late 1750s and early '60s. But Franklin wanted to make the process less cumbersome. So with the help of London glass blower Charles James he figured out how to tune a glass bowl by varying its thickness.

    Franklin spent much of the American Revolution as a diplomat in France and often played his instrument for parlor audiences. Soon Europeans fell in love with it and began building their own. One story has Franklin curing Polish Princess Izabella Czartoryska of "melancholia" by playing the armonica for her. She liked it so much he gave her lessons. Marie Antoinette is said to have studied the instrument. Mozart and his father, Leopold, heard the armonica in Vienna in the 1770s. Wolfgang "has played upon it," Leopold wrote his wife. "How I should love to have one." And in 1791, the younger Mozart composed an Adagio for the armonica solo and the Adagio and Rondo for the armonica, flute, oboe, viola and cello.

    By the 1820s and '30s, however, the armonica was gaining a reputation for driving musicians out of their minds. Marianne Kirchgessner eventually went insane after touring Europe playing it. J.C. Muller warned of its effect on the "temperament" in a 1788 instruction manual. Today many suspect the armonica's leaded glass and paint to be the real culprit, perhaps even contributing to Beethoven's likely lead poisoning.


    How the Zip Code Changed America (posted 12-30-03)

    John Schwartz, writing in the NYT (Dec. 28, 2003):

    When the Postal Service introduced its Zone Improvement Plan in 1963, the mundane goal was to identify the mail delivery station associated with an address. It drew a border between past and present, says Edward Tenner, the author of "Why Things Bite Back: Technology and the Revenge of Unintended Consequences." What resulted was a more efficient mail system, but also "a new style of demographic and social analysis, marketing and clustering" that shapes everything from the allocation of bargain fliers and mail-order catalogs to the placement of stores.


    Does the Stock Market Do Better Under Democrats or Republicans? (posted 12-17-03)

    Stephen J. Glain, writing in the Boston Globe (Dec. 12, 2003):

    According to an article published in the October issue of The Journal of Finance, share prices have for much of the last century fared better under Democratic presidents than Republicans. Using a broad index of stock prices, professors Pedro Santa-Clara and Rossen Valkanov at the University of California at Los Angeles found that the stock market from 1927 to 1998 returned about 11 percent more a year under Democrats and 2 percent more under Republicans.

    Treasury bills also performed better under Democratic presidents, yielding 5.3 percent compared with 3.7 percent under Republicans.

    The cause of the disparity is uncertain, according to the study, which suggested share prices may have more influence over the outcome of presidential elections than a sitting president has over share prices.

    "In sum," the authors write, "the market seems to react very little, if at all, to presidential election news."


    Why Is Washington DC So Much Smaller than the Founders Envisioned? (posted 12-16-03)

    Derrill Holly, writing for the AP (Dec. 13, 2003):

    On Dec. 12, 1800, the federal government officially moved to the District of Columbia. And 203 years later, historians say the congressionally created seat of government would be a far different place if politicians had remained true to the vision of the founding fathers.

    Had the nation's capital remained 100 square miles -- and included what are today the city of Alexandria and Arlington County, Va. -- it might have nearly 900,000 residents and be a commercial trading center rivaling New York or Philadelphia.

    "A wide swath of what is now northern Virginia was actually part of the district," noted Robert Bernstein, a Census Bureau spokesman. "In 1800 it was just a small area of 14,000 people and a lot of it was rural."

    President George Washington personally took part in the positioning of the south cornerstone for the "seat of government at Jones Point" in 1791. The stone, eight miles north of his Mount Vernon estate, was the first marker placed as surveyors plotted a federal site, measuring 10 miles on each side, as authorized by the first Congress earlier that year.

    Several other stone blocks along Virginia Route 7 also are among the surviving markers. Others exist at the boundary of the district and Maryland.

    "We have a piece of pottery here that's marked with the maker's mark that says Alexandria, D.C., which strikes people as odd," said Jim Mackay, director of the Lyceum, Alexandria's history museum.

    Until 1847, what was then known as Alexandria County was one of three major jurisdictions in the District of Columbia. The others were Georgetown and what was originally called Washington County, both formerly part of Maryland.

    "Washington was very adamant that the capital be located as close to Alexandria as possible," said Mackay. The first president and others who promoted the site on the banks of the Potomac River envisioned the region becoming the cultural and economic epicenter of the young nation.

    But members of Congress with seaport constituencies opposed appropriating federal funds to build public wharves and other facilities that would have benefited the capital area. Congress also passed the Residency Act, precluding construction of government buildings on the west bank of the Potomac.

    "Alexandria didn't receive any benefits that Georgetown and Washington County received," said T. Michael Miller, the Alexandria City historian. All of this fed retrocession sentiments that had existed as early as 1801.

    The effort finally succeeded in 1846, when George Washington Parke Custis -- the grandson of Martha Washington through her first marriage -- reversed his long-standing opposition to breaking up the district. Custis opposed retrocession for years as counter to the wishes of George Washington, his step-grandfather, who had raised him after his father died.

    "If that hadn't occurred, the district would be a fiscally more viable city, not as dependent on the federal government," said Kenneth R. Bowling, a historian at George Washington University. The city's tax base would also be larger, and its population would be more diverse.


    What's a Spider Hole? (posted 12-15-03)

    William Safire, writing in the NYT (Dec. 15, 2003):

    Another useful bit of information is the origin of "spider hole," a phrase used by Lt. Gen. Ricardo Sanchez to describe the dugout hiding place in which the fugitive Saddam was cowering.

    This is Army lingo from the Vietnam era. The Vietcong guerrillas dug "Cu Chi tunnels" often connected to what the G.I.'s called "spider holes" — space dug deep enough for the placement of a clay pot large enough to hold a crouching man, covered by a wooden plank and concealed with leaves. When an American patrol passed, the Vietcong would spring out, shooting. But the hole had its dangers; if the pot broke or cracked, the guerrilla could be attacked by poisonous spiders or snakes. Hence, "spider hole."


    They Can't Both Be Right (posted 12-12-03)

    From History Today (Dec. 5, 2003):

    The Edward Jenner Museum in Gloucestershire and the George Marshall Medical Museum in Worcester are both displaying the horns of a cow used in the development of the smallpox vaccine by Dr Edward Jenner in the late 18th century. The George Marshall museum’s Dr Frank Crompton acknowledged: “I have communicated with the Jenner Museum and we have come to the conclusion that we cannot be absolutely certain which ones are genuine.”


    How the Smithsonian Finally Got an African-American History Museum (posted 12-11-03)

    Bruce Craig, writing in his newsletter on behalf of the National Coalition for History (Dec. 11, 2003):

    Perhaps the most significant history-related accomplishment of this Congress ... is enactment of legislation (H.R. 3491) to establish within the Smithsonian Institution the National Museum of African American History and Culture. This legislation is the culmination of a 15-year effort by the principal sponsor of the bill -- civil rights leader Rep. John Lewis (D-GA).

    Since 1988 Lewis has introduced legislation creating the museum, but for one reason or another his bills failed in the House or Senate -- politics makes for strange bedfellows. Because public opinion polls suggested low popularity of Republicans within the African-American community, the Republican leadership took direct action to boost support within this community. Consequently, under orders from their leaders and the White House, rank and file Republican congressmen enthusiastically embraced various funding and legislative proposals designed to benefit the African-American community, including Lewis' long-ignored bill. Republicans have reason to be proud for enacting this legislation that repeatedly failed for partisan reasons when the Democrats controlled Congress.


    Why Is There a Pyramid on U.S. Money? (posted 12-10-03)

    From the newsletter of the American Revolution Round Table (Dec. 2003):

    According to British scholar David Ovason, the Great Seal on the dollar bill reveals America's destiny. Ovason, who wowed reviewers with The Secret Architecture of Our Nation's Capital a couple of years ago, claims that there are two significant images on the dollar bill -- the truncated pyramid and the American eagle with the shield at its midsection, both framed in circles. The greatest secret is the pyramid, which includes the irradiated triangle that seems to complete the larger structure below it.

    The pyramid, with its lopped-off capstone -- the Egyptians revered the top -- was an historic reality. "The Islamic invaders, once they captured Egypt, removed the face of the pyramid at the top and used it to build their mosques in Cairo," Ovason says.

    "It means that for man to return to his spiritual heritage, as we must do eventually, the Americans are charged with the destiny of replacing that pyramid. It's the destiny of the U.S. to build on the foundations already given. The stones on the foundation bear the date 1776. So that means the pyramid is specific to the United States of America."

    Editor's Note: In response to this posting, we received the following email from Vern Bullough, SUNY Distinguished Professor Emeritus of History and Social Science:

    I have lived and traveled in Egypt and I never saw a lopped off pyramid. The pyramids were often robbed of their covering stone but the thieves started from the bottom and never reached the top. I don't know where the explanation given in one of your releases comes from. It was probably if anything a Masonic symbol but I am not certain. The explanation you printed is just not true, at least from anything I have seen and observed. Egyptians did not build on top of pyramids although they robbed them for building material.


    The Anasazi Ate Turkey Long Before the Pilgrims (Who of Course May or May Not Have Eaten Turkey) (posted 12-4-03)

    Brett Prettyman, writing in the Salt Lake Tribune (Nov. 27, 2003):

    Long before the famous pilgrim feast of 1621, residents of what would later be called southern Utah gathered in redrock canyons and ate their own turkey dinner.

    And while historians say turkeys were not on the menu of the first Thanksgiving celebration, archaeologists have physical evidence that Meleagris gallopavo merriami was part of the Anasazi diet as far back as A.D. 700. No word on side dishes of the time.

    "They had them in pens, they used them for food and they used the feathers for ornaments and blankets," said Ron Rood, Utah's assistant state archaeologist.

    "The early archeologists talk about digging through turkey poop in Grand Gulch sites," said Dale Davidson, an archaeologist at the Bureau of Land Management's Monticello field office, which manages the Grand Gulch Plateau Primitive Area in southeastern Utah. "It is pretty dry in those alcoves. Backpackers complain about fresh cow turds, but there hasn't been any grazing [in Grand Gulch] since 1972. Things just don't go away down here."

    Ancient petroglyphs and pictographs also depict the turkey as part of Anasazi daily life in Utah. There is a ruin in Grand Gulch called Turkey Pen, although there is debate about whether the small enclosure was actually used for turkeys, and petroglyph panels in Nine Mile Canyon east of Price contain turkey tracks, turkey pens with birds in them and what appears to be a male turkey displaying his tail feathers for all to see.


    Who Invented Port? (posted 12-2-03)

    Isambard Wilkinson, writing in the London Independent (Nov. 29, 2003):

    HIGH in the Douro valley [in Oporto, Portugal] amid the endless slopes of vine-studded hills, the dwindling band of port dynasties will gather in the coming weeks to quietly celebrate Britain's love affair with the fortified wine.

    The merchants will raise a glass to a little known, 300-year-old trade agreement, the Treaty of Methuen, which assured their future and the beginning of a national obsession: tippling the ruby nectar.

    "I suppose we will quietly gather round and drink a bottle of port," said Paul Symington, managing director of a British family concern. "It was the Treaty of Methuen that really first encouraged port to be sold in large quantities."...

    The original port shipping pioneers left Britain in the 17th century in search of their fortune.

    They fortified the middling-quality local red with brandy to create port, which quickly replaced French claret, a wine regularly unavailable because of war.

    As a result of the Treaty of Methuen in 1703 and the shortage of French competitors, port became, as one author noted, "as British as Roast Beef and God-damn". William Hogarth's victims gained bile from its juice. "Claret is for liquor; port for men," declared Dr Johnson, a noted three-bottle man....

    Yet the treaty is a source of some bitterness with Britain's oldest ally. Portuguese historians maintain that it destroyed the Portuguese textile industry by allowing cheap British imports and that British traders dealt with the country as if it were a colony.

    There is a lingering, mild resentment because of the British dominance of trade....

    Britons drink £53 million worth of port per year, nearly half of which is consumed by women. Only the French drink more, though they prefer a cheaper, less gouty variety.

    For the remaining Britons of the port trade the Treaty of Methuen has become a quiet symbol of survival.


    Lincoln Never Said That (posted 12-1-03)

    Recently, the Illinois Historic Preservation Agency began collecting spurious Lincoln quotes. They are being published on the agency's website:

    Anyone who has glanced at a cereal box, herbal tea package, inspirational book, or restaurant place mat has probably encountered a Lincoln quotation that rings hollow. Lincoln is often quoted and misquoted by public officials and celebrities. Members of Congress have access to researchers at the Library of Congress to keep right with Lincoln's words. But even this resource cannot keep spurious Lincoln quotations from being uttered by members of Congress....

    The "Ten Points" appear every February 12 in newspaper ads honoring Abraham Lincoln. In fact, these aphorisms are from the pen of Reverend William John Henry Boetcker (1873-1962).

    * You cannot bring about prosperity by discouraging thrift.
    * You cannot strengthen the weak by weakening the strong.
    * You cannot help small men by tearing down big men.
    * You cannot help the poor by destroying the rich.
    * You cannot lift the wage-earner by pulling down the wage-payer.
    * You cannot keep out of trouble by spending more than your income.
    * You cannot further the brotherhood of man by inciting class hatred.
    * You cannot establish sound security on borrowed money.
    * You cannot build character and courage by taking away a man's initiative and independence.
    * You cannot help men permanently by doing for them what they could and should do for themselves.

    ...

    Undoubtedly the most famous questioned utterance of Abraham Lincoln was allegedly part of a speech delivered in Clinton, Illinois, in September 1858:

    "You can fool all the people some of the time and some of the people all the time, but you cannot fool all the people all the time."


    So the Pilgrims Celebrated the First Thanksgiving? (posted 11-26-03)

    George Allen, the former senator from Virginia, writing in the Washington Post (Nov. 23, 2003):

    As families come together this week, it is time to tell the truth about America's first Thanksgiving.

    For decades, children across America have donned the buckle-topped hats and plain dress of the Puritan pilgrims who landed near Plymouth Rock in 1620. As the old story goes, William Bradford, Miles Standish and the rest of the pilgrims held a harvest festival and were joined by their Indian friends, Samoset and Squanto, in 1621. Thankful for their safe journey and good harvest, and in celebration of their friendship with the neighboring Indians, the pilgrims feasted on turkey, venison, fish, berries and Indian corn meal. This is a good and honorable story, but it was not America's first Thanksgiving.

    Here, as Paul Harvey might say, is the rest of the story: America's first Thanksgiving occurred in what is now Charles City County, Va., on land that became part of the Berkeley Plantation on the James River. There, 38 men landed after a 10-week voyage across the Atlantic Ocean aboard the ship Margaret. The London Company, which had sent the expedition, sent explicit instructions for the settlers:

    "Wee ordaine that the day of our ships arrivall at the place assigned for plantacon in the land of Virginia shall be yearly and perpetually keept holy as a day of thanksgiving to Almighty God."

    On Dec. 4, 1619, a year before the pilgrims set foot on Plymouth Rock, the first Thanksgiving was held at Berkeley Plantation as Capt. John Woodlief and his band of settlers planted roots upriver from Jamestown in the growing colony of Virginia and gave thanks for their good fortune.

    In 1863, Thanksgiving became a national holiday. At that time there was no official connection between Abraham Lincoln's proclamation and the 1621 event held in Massachusetts; that association came later. The reasons for affiliating our November holiday with the pilgrim feast and not the day of thanksgiving observed by Capt. Woodlief and his men are uncertain. My good friend Ross MacKenzie, who was raised in Illinois and now serves as the editor of the editorial pages of the Richmond Times-Dispatch, surmised that this myth is the result of a "northern bias." Shenandoah University history professor Warren Hofstra says New England historians were just "quicker on the jump." But in 1963, President John F. Kennedy recognized Virginia's claim to the holiday in his Thanksgiving Proclamation, and Berkeley Plantation is proud to make the claim today.

    Visitors at Berkeley Plantation can find a plaque on the plantation grounds with the words of the London Company's instructions. The plantation was the birthplace of Benjamin Harrison as well as the home of President William Henry Harrison. It was also the site where Union Gen. Daniel Butterfield composed the melody for taps while camped on the grounds in 1862. Berkeley Plantation is truly one of our nation's historical jewels, and an important part of our Thanksgiving history.


    When Was Cloture Devised? (posted 11-26-03)

    Elaine S. Povich, writing in Newsday (Nov. 25, 2003):

    Before 1917, there was no such thing as "cloture" in the U.S. Senate. Senators could debate as long as they wanted, mounting an endless filibuster, with no parliamentary way to stop them.

    Today, cloture is used nearly weekly, as Senate leaders scramble to assemble the 60 votes needed to bring debate to a close on everything from judicial nominations to the Medicare bill. Cloture, which limits debate to a maximum of 30 hours, was invoked on the Medicare bill yesterday by a 70-29 vote.

    At the suggestion of President Woodrow Wilson, the Senate in 1917 adopted Rule 22, which allowed it to shut off debate by invoking cloture. At the time, a two-thirds majority, 67 votes, was required.

    "It was a rarely used device until 1964, when President Lyndon Johnson organized the effort to invoke cloture after the 87-day debate against the Civil Rights Act of 1964," Senate Historian Donald Ritchie said. But, he said, once the issues of integration and civil rights were off the table, cloture began to be used more often.

    By 1975, in the post-Watergate reform era, Democrats succeeded in reducing the number of senators needed for cloture from a two-thirds to a three-fifths majority, or 60.

    Today, even the threat of a filibuster prompts the majority leader to file a cloture motion. He needs only 16 senators to join in filing the motion, and sometimes just that is enough to stop debate.


    George McGovern's Faux Pas (posted 11-26-03)

    Eric Alterman, on his blog:

    An historical aside: You know, you can trace the entire history of neoconservatism to the time when the then-still liberal Norman Podhoretz was having lunch with George McGovern about three decades ago, and they were picking a table to eat at and McGovern said something unkind about the looks of a woman at one table spoiling his appetite that I fear even included a canine reference. The woman turned out to be Decter, Podhoretz's wife, and the rest is history. The story originally appeared in Sid Blumenthal's book, "The Rise of the Counter-Establishment," and was repeated in a Washington Post review of it. With a perfect talent for making an already ugly situation even uglier, Podhoretz wrote in a letter demanding a retraction, thereby calling attention to what must have been a horrifying situation for Decter, only to have McGovern confirm the story for everybody. And yes, this does explain a lot about John P. "Normanson" Podhoretz too, but let's leave that for another day.


    JFK Was Almost Killed as President-Elect (posted 11-26-03)

    Robin Erb, writing in the toledoblade.com (Nov. 21, 2003):

    On a bright Sunday morning nearly 43 years ago, a ramshackle Buick crept through the posh streets of Palm Beach, Fla., toward a sprawling, Mediterranean-style mansion.

    At the wheel was a disheveled, silver-haired madman. His aged right hand rested near a switch wired to seven sticks of dynamite.

    Inside the two-story stucco home was his target - president-elect John F. Kennedy - readying for morning Mass.

    Richard Pavlick stopped a short distance from the house and waited, unnoticed by U.S. Secret Service agents outside.

    It was decades before today’s proliferation of suicide bombers, but Pavlick’s plan on Dec. 11, 1960, was simple: ram the president-elect’s car and detonate the dynamite.

    Pavlick’s suicide note had been written to the people of the United States, reading in part: "it is hoped by my actions that a better country ... has resulted."

    The mansion’s door opened. Mr. Kennedy emerged.

    But the 73-year-old Pavlick hesitated, then relaxed his fingers.

    What saved the future president from assassination that day was neither the intervention of law enforcement nor a malfunction of Pavlick’s device - a bomb that the Secret Service chief later said would have "blown up a small mountain."

    It was timing and perhaps a moment of conscience for Pavlick.

    Just steps behind the president, Jacqueline Kennedy appeared with toddler Caroline and newborn John, Jr.

    "I did not wish to harm her or the children," Pavlick would later explain. "I decided to get him at the church or someplace later."

    Pavlick never got the chance: He was arrested the following Thursday by authorities acting on information about his deep hatred for Kennedy. Sticks of dynamite were found in his vehicle.


    The Woman Behind Thanksgiving (posted 11-25-03)

    Candy Sagon, writing in the Washington Post (Nov. 25, 2003):

    Sarah Josepha Hale was relentless. She wanted a national Thanksgiving Day holiday and, by God, she would use every iota of her personality, prestige and power to get it.

    It was 1846, and Hale was editor of a highly popular women's magazine, Godey's Lady's Book. The North and the South were inexorably squaring off over the issue of slavery, and Hale believed that a nationally recognized day of thanksgiving could have a unifying effect. So she wrote letters, hundreds of them, during the next 17 years, to the governors of each state, to presidents, to secretaries of state, urging them to proclaim the last Thursday in November as Thanksgiving Day.

    "She was a marketing genius," says New Jersey writer Laura Schenone, author of A Thousand Years Over a Hot Stove (Norton, 2003), a new history of women and food in America. "She used her magazine to create an emotional aura around Thanksgiving that focused on home, hearth and family. She ran tear-jerker stories and gave advice on what to cook. She pushed a New England menu, with pumpkin pie, a roast turkey at the center of the table and vegetables in cream sauce."

    She also wrote impassioned editorials, urging all of the states and territories to celebrate the holiday on the same day "so that there will be complete moral and social reunion of the people of America," as Hale wrote in 1860. Until Hale began her crusade, says Schenone, Thanksgiving had been an erratic event, if celebrated at all. It was largely unknown in the South; in the North, it varied from state to state, held sometimes in October or November, but also in December, depending on the whims of the governors.

    An indomitable woman, Hale was widowed when she was pregnant with her fifth child. Her oldest at the time was 7. As editor of Godey's, Hale was sort of the Martha Stewart (minus the financial scandals) of her day.

    "She was a trendsetter and arbiter of national good taste," says Schenone.

    At least one writer mentions that Hale may have even visited with President Lincoln about the holiday. Whether or not that happened, Lincoln must have liked her idea. In 1863, as the Civil War raged on, he declared a national day of Thanksgiving on the last Thursday of November and asked the country to be thankful for its bounties of nature and to come together in peace.


    So Where Did the First Thanksgiving Take Place? (posted 11-25-03)

    Randy Boswell, writing in the Ottawa Citizen (Oct. 12, 2003):

    Everyone knows the story of the first Thanksgiving in the New World: how a group of English settlers who sailed to America on the Mayflower gathered in the fall of 1621 to celebrate the bountiful harvest in their Massachusetts colony.

    Although a much-mythologized tale, its essential outline is true -- except for the part about being first.

    Forty-three years earlier, on a tiny, windswept island in the Canadian Arctic, a group of ill-starred English sailors who survived a stormy Atlantic crossing knelt in prayer on a desolate shore 5,000 kilometres from home. At the urging of Martin Frobisher, leader of the 1578 expedition, and guided by a fiery ship's preacher named Robert Wolfall, 100 or more men gathered to give thanks for their deliverance from death, then devoured hearty meals of salt beef, biscuits and peas.

    It's a little known nugget of authentic Canadiana, lost in a sea of cranberry sauce and warmed-over stories from the south. But during the past decade, archeologists have been quietly exploring the place where a beloved holiday tradition has Canadian roots.

    They've not only rediscovered the romance and folly of an epic voyage of exploration. They've also unearthed tangible traces of the 16th-century adventure, including fragments of clay, a small basket and even bits of food left behind by Frobisher's party -- 425-year-old leftovers from North America's real first Thanksgiving.


    Iraq's Casualties Greater than Vietnam's During the First 3 Years of War in Asia (posted 11-18-03)

    From a Reuters report published by the Weekend Australian (Nov. 15, 2003):

    THE US death toll in Iraq has surpassed the number of American soldiers killed during the first three years of the Vietnam War, the brutal Cold War conflict that cast a shadow over US affairs for more than a generation.

    A Reuters analysis of US Defence Department statistics showed yesterday that the Vietnam War, which the army says officially began on December 11, 1961, produced a combined 392 fatal casualties from 1962 to 1964, when American troop levels in Indochina stood at just over 17,000.

    By comparison, a roadside bomb attack that killed a soldier in Baghdad yesterday brought to 397 the tally of American dead in Iraq, where US forces number about 130,000 troops -- the same number reached in Vietnam by October 1965.

    The casualty count for Iraq apparently surpassed the Vietnam figure last Sunday, when a US soldier killed in a rocket-propelled grenade attack south of Baghdad became the conflict's 393rd American casualty since Operation Iraqi Freedom began on March 20.

    Larger still is the number of American casualties from the broader US war on terrorism, which has produced 488 military deaths in Iraq, Afghanistan, the Philippines, southwest Asia and other locations.

    Statistics from battle zones outside Iraq show that 91 soldiers have died since October 7, 2001, as part of Operation Enduring Freedom, which US President George W. Bush launched against Afghanistan's former Taliban regime after the September 11, 2001, attacks on New York and Washington killed 3000 people.


    His Father Fought in the Civil War -- HIS FATHER! (posted 11-17-03)

    Jim Stingl, writing in the Milwaukee Journal Sentinel (Nov. 17, 2003):

    People are always trying to correct Bill Upham.

    "You mean your grandfather fought in the Civil War," they insist.

    "It would seem more true if it was my grandfather. But it was my father," the Milwaukee man says right back.

    When you hear Bill Upham's story, the first thing you do is the math. His father, William Henry Upham Sr., was born in 1841. That was 162 years ago.

    Bill and his brother, Frederick, both very much alive, were born in 1916 and 1921 respectively.

    "My brother is always saying we should be on 'Good Morning America' telling our story," said Frederick, now 82, still working as a geologist and living in Fort Collins, Colo. "I feel like I should be in a jar of formaldehyde in some medical school."

    A few years ago, a momentarily clueless television reporter asked Frederick, "What is your father doing now?"

    William Upham Sr. was good, but even he wasn't that good.

    The elder Mr. Upham - a Union soldier, successful businessman and for two years the governor of Wisconsin - lost his wife and married a much, much younger woman when he was 75. A year later, Bill showed up. And when William Upham was 80, he begot Frederick.

    Between the father and his sons, they have lived every second of American history save the country's first 65 years.

    The last Civil War veteran was buried nearly a half century ago. The last Civil War widow, who was 21 when she married an 81-year-old veteran, is 96 and living in Alabama. Offspring of the fighters in America's war between the states are more plentiful, but are thought to number only in the hundreds.

    Bill Upham was just 8 when his father died, but he said he remembers him well and with great fondness. As a boy, he was so devastated by his father's death that he was sent to live with an aunt in North Carolina rather than remain at home and disrupt his mother's new marriage, which came rather quickly after William died.


    Racist Place Names (posted 11-17-03)

    Bill Cotterell, writing in tallahassee.com (Nov. 17, 2003):

    Appalled by worldwide news reports that a rural Florida bridge bore the offensive name of a character in Mark Twain's "The Adventures of Huckleberry Finn," a veteran South Florida legislator wants public agencies to check their maps for any racial slurs.

    State Sen. Steve Geller, D-Hallandale, filed a bill after seeing a Reuters news report in a South Florida newspaper that said there are 144 places throughout the country with names that use the word "nigger" in some fashion. As an example, the British wire service cited "Nigger Jim Hammock Bridge" in Hendry County, on a two-lane road near Clewiston.

    The news story was picked up on several Web sites featuring political commentary.

    "It's not the highest priority on my or anybody's agenda, but there is no reason today that anybody ought to have 'Nigger Jim Bridge,'" Geller said. "If there was a 'Long-nosed Jew Highway' somewhere in the state, I'd feel the same."

    But some conservatives are worried that a politically correct witch hunt could result from his bill. And the Hendry County manager says he's never heard of a "Nigger Jim Hammock Bridge."

    Typing in the pejorative name on the index of the home page of the U.S. Board of Geographic Names, the federal agency cited in the Reuters report, brings up a place map for "Negro Jim Hammock Bridge" and a site map showing a location southwest of Moore Haven.

    Roger Payne, executive secretary of the federal board, said it officially changed all such names to "Negro" in 1963 and changed "Jap" to "Japanese" wherever it occurred in U.S. Geological Survey records in 1971. But he said "the records retain the variant or former name" of all 144 places as a secondary reference in federal databanks.

    Payne said there are 13 places in Florida with names like "Negro Cove, Negro Island, Negro Camp Island." But he said some might be rooted in the Spanish word for black, rather than referring to a race of people.

    Only four of the 13 Florida places in the USGS National Mapping Information site list the pejorative word as an original name - the Hendry County bridge and Negrotown Knoll and Negrotown Marsh, both in Highlands County, and Negro Head, a cape in Lee County.


    Name the Person Who Made These Anti-Semitic Remarks (posted 11-11-03)

    Janadas Devan, writing in the Straits Times (Nov. 9, 2003):

    The former Malaysian prime minister, Tun Dr Mahathir Mohamad, said recently that Jews rule the world by proxy. Opinion polls in Europe show a majority of Europeans feel Israel is a threat to world peace. Anti-Semitic 'hate speech' and 'hate acts' seem more frequent lately. But as Janadas Devan finds out, anti-Semitism has a long, persistent and troubling history.

    CONSIDER the following examples of anti-Semitism:

    'Reasons of race and religion combine to make any large number of free-thinking Jews undesirable.'

    'You may as well do anything most hard/ As seek to soften that - than which what's harder? -/ His Jewish heart.'

    'How I hated marrying a Jew.'

    'Down in a tall busy street he read a dozen Jewish names on a line of stores... New York - he could not dissociate it now from the slow, upward creep of these people.'

    'Jew York'. 'Jewnited States.' 'Franklin Delano Jewsfeld.'

    Who uttered these statements?

    Dr Josef Goebbels? Some Nazi poet? A blond Aryan, expressing regret for marrying a Jew during the Holocaust? A member of the lunatic Ku Klux Klan?

    None of the above.

    They were made by some of the most prestigious figures in Anglo-American culture: T.S. Eliot, William Shakespeare, Virginia Woolf (who, of course, married Leonard Woolf, a Jew), F. Scott Fitzgerald and Ezra Pound.

    Similar examples of anti-Semitism can be easily multiplied.

    In French literature - Emile Zola, Guy de Maupassant, Maurice Barres.

    In English literature - Rudyard Kipling, Hilaire Belloc, G.K. Chesterton.

    In American letters - Henry Adams, H.L. Mencken. Among industrialists - Henry Ford.

    Among 'All-American heroes' - Charles Lindbergh. Among royalty - King Edward VIII, later the Duke of Windsor.

    And on and on, ad infinitum.

    But these are only examples of 'hate speech'.

    The list of 20th century anti-Semitic 'hate acts' is more gruesome.

    The Holocaust, when six million Jews were exterminated by Hitler, was only the final act.

    Pogroms during and after the 1917 Russian Revolution resulted in the death of 75,000 Jews.

    In Germany, after World War I, Jewish communities in Berlin and Munich were terrorised by anti-Semitic organisations.

    After the Munich Soviet was crushed, all foreign-born Jews were expelled from the city.

    The Holocaust didn't happen out of the blue; Europe was well-primed for the 'Final Solution'. And it was not the work of only a few decades, but of centuries.

    As historian Paul Johnson points out in his History Of The Jews, though the term 'anti-Semitism' was not coined until 1879, anti-Semitism, 'in fact if not in name', undoubtedly existed from 'deep antiquity'.


    What Happened to the Riderless Horse at JFK's Funeral? (posted 11-6-03)

    Ed Turner, writing in the Washington Times (Nov. 6, 2003):

    Among the most indelible images of American history is the caisson bearing President Kennedy's body during his funeral on Nov. 25, 1963. Black Jack, the Army's riderless horse, pranced restlessly and majestically behind the military carriage bearing the fallen president's casket as it was being taken to Arlington National Cemetery for burial.

    Black Jack and the four soldiers and seven horses that led the caisson came from the Caisson Platoon of the U.S. Army's 3rd Infantry Regiment, also known as the Old Guard, the oldest active infantry unit in the Army. The Caisson Platoon, which has been stationed at Fort Myer Army Post in Arlington since 1948, takes part in some 1,500 full honor military funerals each year at Arlington Cemetery and participates in parades, ceremonies and pageants in the Washington area.

    Black Jack and the Caisson Platoon became national icons after Kennedy's funeral. In fact, after Black Jack died in 1976, his ashes were placed in a memorial at Summerall Field at Fort Myer, just blocks from the stable where the horse was kept during its 21 years of service as a riderless horse. Black Jack was famous enough among visitors who toured the post that the Army created a special museum in his memory inside the John C. McKinney Memorial Stables.
    People who come to the stables still ask about Black Jack.

    "Visitors usually ask what Black Jack did and when he died, whether he was the one in the Kennedy funeral," says Alan Bogan, director of the Old Guard Museum. "He's still the most famous horse. I doubt if anyone can name any other one."

    More than 10,000 people visit the Caisson Platoon's stables each year to see the caissons and horses and where Black Jack resided. The Old Guard Museum down the street also houses artifacts and memorabilia from the full regiment, which provides sentinels at the Tomb of the Unknowns, demonstrations by its U.S. Army Drill Team, performances by its Fife and Drum Corps, and presentations of the colors by its Continental Color Guard.

    Visitors who come to the John C. McKinney Memorial Stables, where the Caisson Platoon keeps many of its 44 horses — three of them "riderless" horses like Black Jack — can receive a guided tour from a soldier or explore the premises on their own. The stable, which was built in 1908, consists of tack rooms, a farrier room, caisson rooms and the Black Jack Museum in honor of the famous riderless horse that took part in President Kennedy's funeral march.
    "Black Jack was the last horse that was bred and issued by the Army," says Spc. Matthew Moore, who has been in the Old Guard for 13 months. "The horses that we get now are either donated to us or purchased."


    Oldest Condom in the World (posted 11-5-03)

    From BBC News (Oct. 30, 2003):

    The oldest known condoms in the world - 17th Century creations made of animal and fish intestine - are to leave the UK to be displayed at a Dutch sex exhibition. The five contraceptives were excavated from a medieval toilet in Dudley Castle in 1985 - they are thought to have lain there since before 1646.

    A spokesman for Dudley Council, which has care of the rare items, said they would be on show at the Drents Museum in the province of Drenthe from 11 November to 8 February.

    Because the sheaths are so fragile, Dr Vincent Vilsteren, keeper of archaeology, is making a special visit this weekend to collect them.

    The museum is staging an exhibition called 100,000 Years of Sex.

    "It's good to know that the earth has moved for many generations in the borough," said Councillor Charles Fraser Macnamara.

    Adrian Durkin, exhibitions officer at Dudley Council, said: "It is very rare for such items to survive so well.

    "Indeed the next oldest condoms in the world are over 100 years younger and will also be on display in the exhibition."

    Councillor Charles Fraser Macnamara, lead member for culture and leisure, added: "This exhibition certainly has the opportunity to put Dudley on the map."


    What Does the Stock Market Tell Us About Next Year's Election (And Vice Versa?) (posted 11-4-03)

    Tom Walker, writing in the Atlanta Journal-Constitution (Nov. 2, 2003):

    If you own stocks, you'd best hope that President Bush is re-elected next year. Politics aside, that would be better for the market than his defeat.

    That's history's message, according to the best-known compendium of Wall Street statistics and information, the "Stock Trader's Almanac."

    The 37th edition of this deep mine of data is just out. Its focus, naturally, is on next year's presidential election.

    "Positive market action usually accompanies re-election of a president," says Jeffrey A. Hirsch, who with his father, Yale Hirsch, compiles and edits the annual almanac.

    This time, however, Hirsch detects a note of caution in the numbers that may or may not bode well for Bush as the election approaches in 2004. It has to do with whether the stock market will be performing as well at this time next year as it is now, and Hirsch wonders whether it will.

    The theorizing starts with the premise that bear markets are more likely in the first two years of the four-year presidential cycle, with bull markets more likely in pre-election and election years.

    There's no mystery to this. Incumbents pull out all the stops to stimulate the economy so that voters feel good and prosperous at election time.

    If that's right, then Bush's administration is running true to form. The market was in the tank from early 2000 until October 2002 and has been on a tear ever since.

    But Hirsch says this pre-election year has been unusually strong and not likely to continue at that pace beyond the early part of 2004.

    That means year-over-year returns in 2004 "are likely to be more tame," says Hirsch --- right about the time of the presidential campaign.
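    For readers curious how such a claim is tabulated, the bookkeeping is simple: group each year's market return by its position in the four-year presidential term and compare the averages. The short Python sketch below illustrates only that grouping; the return figures are placeholder values, not data from the "Stock Trader's Almanac," and the convention that an election year counts as "year 4" of the cycle is an assumption of the sketch.

        # A minimal sketch of the "presidential cycle" tabulation described above:
        # bucket annual stock returns by year of the four-year term and average them.
        # The returns below are illustrative placeholders, not the almanac's data.
        from statistics import mean

        annual_returns = {1997: 31.0, 1998: 26.7, 1999: 19.5, 2000: -10.1,
                          2001: -13.0, 2002: -23.4, 2003: 26.4}

        def cycle_year(year: int) -> int:
            """Return 1-4, where 4 is a presidential election year (e.g. 2000, 2004)."""
            return (year % 4) or 4

        by_cycle_year = {}
        for year, ret in annual_returns.items():
            by_cycle_year.setdefault(cycle_year(year), []).append(ret)

        for cy in sorted(by_cycle_year):
            # Average return for post-election (1), midterm (2), pre-election (3)
            # and election (4) years in this toy sample.
            print(cy, round(mean(by_cycle_year[cy]), 1))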


    Islam's Division into 2 Main Camps Began When ... (posted 11-4-03)

    Corinne Atkins, writing in History Today (Nov. 11, 2003):

    On August 29th, 2003, a huge car bomb went off in the central Iraqi town of Najaf, killing more than 100 people, including the Shi’ite cleric Ayatollah Mohammed Baqr al-Hakim. Coming hard on the heels of an equally devastating explosion at the UN headquarters in Baghdad, it emphasised the dangers inherent in the reconstruction of Iraq, and the tensions within the country, many of them derived from the country’s political and religious past.

    It was in this region, then known as Mesopotamia, that some of the most significant and tragic events of early Islam occurred. The three towns of Kufa, Najaf and Kerbala, which all lay relatively close to each other, south of Baghdad, became pivotal to what is now known as the Shia branch of Islam.

    The Sunni-Shia schism in Islam can be traced back to the issues that arose over the leadership of the Muslim community shortly after the death of the Prophet Mohammed in AD 632.

    Since Mohammed’s only daughter, Fatima, could not step into her father’s shoes, three caliphs (deputies) assumed control for the brief period AD 632-656. Some, however, refused to recognise them. Known as the Shias, they were followers of Mohammed’s charismatic son-in-law, Ali ibn Abi Talib (AD 600-661).

    The word ‘Shia’ is an abbreviation of the phrase ‘Shiat Ali’, meaning the ‘partisans of Ali’. Arguing that only the blood line could be the recipient of Mohammed’s divine guidance, they believed the Prophet had designated Ali as his political successor and had imparted to him the power of interpreting religious knowledge. Ali and his descendants were therefore the only rightful successors of the Prophet.

    Their opponents, the Sunnis, supported the view that the Prophet’s legitimate successor could be chosen by man and should be an elected member of the Prophet’s own tribe.

    In AD 656, after the assassination of Uthman, the third caliph, Ali ascended to the caliphate. Seven months after taking charge, he moved the capital of the caliphate from Medina in Arabia to Mesopotamia.


    What? Charlie Chaplin Wasn't Jewish? (posted 10-30-03)

    Michael Ollove, writing in the Baltimore Sun (Oct. 26, 2003):

    Jews liked to think that Charlie Chaplin was Jewish. Nazis liked to think Charlie Chaplin was Jewish. McCarthyites liked to think Charlie Chaplin was Jewish. Sometimes, Charlie Chaplin liked to think Charlie Chaplin was Jewish.

    Charlie Chaplin was not Jewish.

    That he wasn't did not stop people from conceiving of him and the characters he played as Jewish, which colored the way they experienced his films. Or, maybe they recognized something "Jewish" in his films and that gave rise to their assumption that he was as well.

    Whatever the original spark, Jewish audiences delighted in Chaplin's Little Tramp, viewing him as a heroic stand-in for their own painful immigrant experiences. "What do they want from him, the goyim," one woman was overheard crying out in a theater while watching The Gold Rush.

    Hitler's minions denounced Chaplin as a Jew and banned his movies. Anti-Communist witch hunters in the 1950s uttered asides about his alleged Jewishness as a way to discredit him.

    Chaplin himself was coy about his suspected Jewish lineage. "If they wanted me Jewish," he once said, "they would have me Jewish."

    So it is with ironic intent that authors J. Hoberman and Jeffrey Shandler identify Chaplin as the first "Jewish" superstar in modern American entertainment.

    In their book Entertaining America: Jews, Movies and Broadcasting, Chaplin is a beginning point in a fascinating conversation about Jewish identity in the context of American entertainment. The book accompanies an exhibit that was staged earlier this year at the Jewish Museum in New York. Today, it opens here at the Jewish Museum of Maryland, where it will remain until early next year.


    Just How Lewd Was Elizabethan England? (posted 10-23-03)

    John Ezard, writing in the Guardian (Oct. 18, 2003):

    However foul it has got, the language of television soaps pales beside the sexual insults traded publicly on the streets of Britain for three centuries, according to a new book.
    The real-life street theatre in the 16th to 18th centuries drew on a richer, far lewder lexicon, according to Professor Bernard Capp, of Warwick University.

    It included the insults jade, quean, baggage, harlot, drab, filth, flirt, gill, trull, dirtyheels, draggletail, flap, naughty-pack, slut, squirt and strumpet.

    All of these words were synonyms for whore, which had been weakened by massive overuse. The nouns were "generally heightened by adjectives such as arrant, base, brazenfaced or scurvy".

    Prof Capp's book, When Gossips Meet, draws on court documents showing that prostitution was seen as a far worse disgrace than fornication.

    "Venereal disease, especially syphilis or the pox, also featured prominently in abusive language," he adds. "Taunts such as 'burnt-arsed whore' and 'pocky whore' were familiar throughout the country.

    "At Bury St Edmunds, Faith Wilson told her neighbour in 1619 to 'pull up your muffler higher and hide your pocky face, and go home and scrape your mangy arse'."

    Insults and gossip had a function: to give "women some control over erring husbands, abusive employers or sexually disreputable women. When someone is gossiped about, they restrict their behaviour".

    But it could tear apart families and parishes. According to archdeaconry court papers, "Joan Webb of Wittlesford, Cambs, was rumoured in 1596 to be worse than any whore", because she allegedly paid men to have sex with her... The stories prompted a man who had been planning to marry her to break off the match, giving her £5 'to be rid of her'.


    When the Papal Chair Is Empty (posted 10-17-03)

    Paul Collins, writing in the Australian Financial Review (Oct. 17, 2003):

    The Catholic church is by far the largest multinational institution in the world, with more than one billion adherents. As John Paul II's papacy approaches its end, intense interest is building in the process by which popes are elected.

    The papacy of John Paul II [1978- ] has been the third longest in history (leaving aside St Peter, the length of whose papacy is unknown). John Paul would overtake Leo XIII [1878-1903] on March 11 next year. The longest papacy was that of Pius IX [1846-78]. But John Paul's physical weakness indicates that the end of this pontificate may come soon.

    The pope remains pope until he dies or resigns. There have been very few resignations; the last was that of Celestine V in 1294. There has been much discussion of John Paul II resigning. This is unlikely because he sees his illness and sufferings as uniting him with the agony of Jesus on the cross, and failure to see this through would be an abandonment of God's will for him.

    Largely because pre-modern medicine was primitive and dangerous for the patient, sick popes usually died quickly. A number were murdered. The average length of a papacy is about seven years.

    There is a danger that with contemporary medicine keeping people alive so much longer, a pope could become totally incapacitated by dementia, Alzheimer's disease or another form of mental or physical deterioration, leaving the church without leadership for a considerable period.

    This could create a constitutional nightmare. There is nothing in canon law about removing a senile, sick or crazy pope unless he ultimately resigns. For instance, Urban VI [1378-89] was clearly mad, but he was still pope when he died. Despite the fact that the Council of Constance [1414-18] said that in extraordinary circumstances a general church council was superior to a pope, most church lawyers deny that a council can depose a pope. However, it may be the only way to resolve the impasse created by an unhinged incumbent.

    John Paul will almost certainly have entrusted a written resignation to either the dean of the College of Cardinals, Josef Ratzinger, or to the Camerlengo of the Holy Roman Church, Cardinal Eduardo Martinez Somalo, or to his personal secretary, Archbishop Stanislaw Dziwisz. This letter will state something along the lines that if, in the opinion of the College of Cardinals, he has become mentally incapable of continuing in office, his resignation will automatically come into effect.

    The church is run by the College of Cardinals during the 15- to 20-day interregnum between the death of the previous pope and the conclave. The period is technically called Sede vacante, which means "the (papal) chair being empty". In this period the cardinals operate according to strict rules laid down by popes Paul VI in October 1975 and John Paul II in February 1996. These rules cannot be changed by the cardinals during the interregnum.

    The most important person in a papal interregnum is the cardinal camerlengo, or chamberlain. He is assisted by the Apostolic Camera, a small office, originating in the 11th century, that helps him in the administration of the temporal goods of the papacy during a Sede vacante.

    A rotating committee of three cardinals is chosen by the cardinal electors to assist the camerlengo in preparing for the conclave, and in making day-to-day decisions that cannot be deferred. Cardinals are strictly bound not to make important decisions, above all rulings that would be binding on the next pope.

    Daily meetings of all the cardinals are held, presided over by the dean of the College of Cardinals. When the pope dies, all cardinals who are heads of Vatican departments cease to hold office, except the camerlengo and the major penitentiary, the American Cardinal Francis Stafford. Since this cardinal deals with confessional matters the idea is that forgiveness should always be available. Also the vicar-general of the diocese of Rome, Cardinal Camillo Ruini, remains in office so that the government of the local church may continue.

    During the Sede vacante the cardinals will spend time getting to know each other, and quietly discussing the profile of the kind of man they want, and think that the church needs, as the next pope. Cardinals over the age of 80 can participate in these discussions, but are excluded as soon as their colleagues enter the conclave.

    Pre-existing ad hoc groupings of cardinals with theological, political and regional interests in common will have been discussing the issue of the next pope among themselves. Those working in the Vatican will be most active in this type of discussion because of their proximity to each other and their common interests.

    They will have been doing this very discreetly and obliquely, and will always deny that anything like this is happening, especially if asked by the media. Most argue that the aim of the secrecy is to avoid party politics in the church, but the real reason is that Vatican politics, still very much influenced by the Latin mentality, are always played out obliquely and behind closed doors.

    The pope is elected on the basis of an extremely narrow franchise: those members of the College of Cardinals under the age of 80. Since about the middle of the 12th century the popes have almost always been elected by the cardinals. The only real exception to this was at the end of the Great Western Schism, when all three papal pretenders were dismissed by the Council of Constance [1414-18]. Martin V [1417-31] was elected by a mixed group of cardinals, bishops and others representing the council.

    The fundamental role of the pope is to be bishop of Rome. During the first 700 years of church history it was usually the clergy and laypeople of Rome, as well as bishops from nearby towns, who played the major role in electing the pope.

    By the 8th century the franchise had become limited to the senior clergy of Rome. These were the priests who ministered at the "titular" churches, that is the oldest churches in the city. The title "cardinal", from the Latin cardo, meaning "hinge" or "door", was first applied to these parish priests from as early as the 7th century. They became known as "cardinal priests".

    The title was slowly extended to the senior deacons of Rome. These were ordained men who were not priests, but who were in charge of church administration and the distribution of social welfare to the poor.

    In the 8th century the title of cardinal was also extended to the bishops of the central Italian dioceses immediately around Rome. With the pope, these bishops formed the Roman Synod, advising and assisting him in the administration of the Roman church. They eventually evolved into "cardinal bishops".

    As the senior pastors and administrators, cardinal priests, deacons and bishops gradually assumed control of the Roman church during a papal vacancy. They also had an increasing say in the election of the new pope. In order to break the influence of secular rulers in papal elections, Stephen III [768-72] decreed in 769 that only cardinal deacons and priests of the Roman church were eligible for election as pope, and that the laity should have no vote. Lay participation had sometimes led to riots and vicious factional in-fighting.

    Despite this, in the 9th and 10th centuries the papacy came under the influence of lay forces, especially the Mafia-like clans who controlled parts of Rome and its immediate surroundings from fortified mansions. Many of the popes of this period were members of these families, and they were often utterly unworthy of office.

    From about 1030 onwards a reform movement permeated Rome. The greatest figure in the campaign to break lay control of ecclesiastical office was Gregory VII [1073-85]. Reformers saw that the papal election process was the key to ensuring that a worthy person was elected.

    Since the 12th century the College of Cardinals has elected the pope in a closed meeting called a "conclave", from the Latin cum clave, meaning "with a key". This referred to the fact that the cardinals were locked up, sometimes with graduated fasting, until they elected the pope.

    Most modern conclaves have been held in the Sistine Chapel, surrounded by Michelangelo's now gloriously restored paintings of the creation and the last judgement. After the early 14th century the cardinals were isolated from outsiders in uncomfortable circumstances until the new pope was elected. Even in 20th-century conclaves the cardinals and their assistants did not always have separate rooms. They resided in the cramped and very inconvenient makeshift area surrounding the Sistine chapel.

    The purpose of locking them away was to guard against outside influence and to hasten papal elections. In the next and subsequent conclaves cardinals and their assistants will reside in the purpose-built and comfortable Domus Sanctae Marthae, a motel-style building of 130 suites and single rooms with dining facilities, erected in 1996 within the Vatican.

    The election decree of the Third Lateran Council of March, 1179 required that for a valid election a two-thirds majority of cardinals must vote for a candidate. The purpose of this was to force cardinals to compromise in order to preclude the danger of a disputed election. It also avoided the problem of an elected pope's authority being weakened by having to deal with a large minority of disgruntled cardinals who had opposed his election. This rule remained in force until 1996, when it was modified suddenly and without apparent reason by John Paul II.

    On February 2, 1996, John Paul issued a new set of rules governing the election process. Firstly, only election by scrutiny, that is by secret, written ballot, was permitted. There was also a seemingly small, but extremely significant modification to the two-thirds majority requirement.

    John Paul decreed that ballots in the conclave were to proceed at the rate of four per day, two in the morning and two in the afternoon. If after three days no-one has been elected, a day of prayer and discussion is to be held. If, after a further 21 ballots, no-one has received the two-thirds vote required, the camerlengo can invite the cardinals to vote for another election procedure. The cardinals can then decide to drop the two-thirds majority requirement, and elect by an absolute majority, that is elect the cardinal who gets more than half of the votes.

    The problem with this is that in contested elections there is no incentive to compromise. What has actually happened in most modern conclaves is that two candidates have emerged relatively quickly with large blocs of cardinals supporting them. But neither has had the required majority. The two-thirds requirement forced a compromise, and persuaded the great electors (the leading cardinals from various factions) to seek a compromise candidate, someone who would eventually be acceptable to a large majority from both blocs. What John Paul's change does is encourage a small majority to hold out against a large minority. It could prove disastrous in a strongly contested election.

    Since the start of the 20th century the composition of the College of Cardinals has become more and more internationalised. Italians no longer hold the majority. In the first conclave of the 20th century, which elected Pius X, more than half of the cardinal electors (38 of the 62) were Italian.

    As of this October 1, there are 135 electors from 59 countries. There are 23 from Italy. It is often forgotten that the pope's primary title is bishop of Rome, and it could be argued that it is appropriate that he be an Italian, or at least that he be able to speak excellent, idiomatic Italian, and be completely at home in western European culture.
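    Taken at face value, the balloting arithmetic Collins describes is easy to write down. The Python sketch below is a hedged illustration of those rules as summarised in the article, not the canonical text of the 1996 constitution: the ballot count is a rough simplification, and the 135-elector figure is simply reused from the paragraph above. All vote totals are hypothetical.

        # A minimal sketch of the conclave voting thresholds described above.
        # Vote totals are hypothetical; this illustrates the arithmetic only.

        def ballots_before_rule_change() -> int:
            """Roughly three days at four ballots a day, then a further 21 ballots,
            before the camerlengo may invite a change of procedure."""
            return 3 * 4 + 21

        def is_elected(votes: int, electors: int, absolute_majority_only: bool = False) -> bool:
            """Two-thirds of the electors are normally required; once the cardinals
            agree to relax the rule, more than half suffices."""
            if absolute_majority_only:
                return votes > electors / 2
            return votes * 3 >= electors * 2   # two-thirds: 135 electors need 90 votes

        if __name__ == "__main__":
            electors = 135                        # the figure given for October 2003
            print(ballots_before_rule_change())   # 33
            print(is_elected(89, electors))       # False: one vote short of two-thirds
            print(is_elected(89, electors, absolute_majority_only=True))  # True: more than half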


    Benedict Arnold's Flawed Gravestone (posted 10-15-03)

    From the website of Connecticut's TV station, WTNH (Oct. 15, 2003):

    His name is synonymous with traitor. Still, Benedict Arnold is Norwich's most famous native son.

    Bill Stanley, president of the Norwich Historical Society, will place a memorial stone at Arnold's gravesite at Saint Mary's Church in London on the Thames River.

    The granite memorial is being made by a stone-crafting company in Vermont and will be shipped to Norwich before its final destination.

    Stanley says he and his wife are paying the costs.

    He says he wants to correct a mistake on the memorial that is painted on the wall at the British church. It reads that Arnold was born in 1801 and died in 1951.

    He was born in 1741 and died in 1801.

    Stanley says the new memorial also will correct the name of Arnold's wife.

    He says he will accompany the stone to London in May.


    How Did Mobsters Hide Bodies? (posted 10-14-03)

    From the Sydney Morning Herald (Oct. 9, 2003):

    A New York gangster family that was said to be the inspiration for the hit television drama The Sopranos used so-called double-decker coffins to dispose of murder victims, a Manhattan court has been told.

    Anthony Rotondo, a defector from "the Mob", told jurors on Tuesday that caskets with false bottoms were sold to trusting customers of a well-known New Jersey undertaker.

    He said the DeCavalcante crime family had used the coffins to secretly bury the victims of mob executions, along with the bodies of people who had died from natural causes.

    "The family would put the body of the murdered victim below the regular customer, thus disappearing forever," Rotondo explained, while giving evidence in the trial of the reputed DeCavalcante family boss, Girolamo "Jimmy" Palermo.

    The ruse risked exposure at times because of the surprise of pall bearers as they carried the two-for-one coffins.

    "Everyone would kind of look at one another," Rotondo recalled. "There would be six grown men carrying someone's 80-pound [36-kilogram] grandmother, and they looked like they were having a problem."

    Rotondo said the double-decker coffins had been used as early as the 1920s, and were the brainchild of Carlo Corsentino, an undertaker and member of the DeCavalcante family.

    Corsentino's son, Carl, still runs the family's funeral home in Elizabeth, New Jersey.


    Noted Suicides Through History (posted 10-8-03)

    From the Ottawa Citizen, a list of noteworthy suicides (Oct. 6, 2003):

    Socrates, Greek philosopher
    Mark Antony and Cleopatra; Roman politician/general, Egyptian queen
    Judas Iscariot, disciple of Jesus Christ
    Lucius Domitius Nero, Roman emperor
    Vincent van Gogh, painter
    Virginia Woolf, writer
    Adolf Hitler, Nazi leader
    George Reeves, actor, 1950's TV Superman
    Ernest Hemingway, Nobel Prize-winning writer
    Marilyn Monroe, actress
    Sylvia Plath, poet
    Thich Quang Duc, Buddhist monk, self-immolated on a Saigon street, creating an enduring and infamous image of the Vietnam War.
    Brian Epstein, Beatles manager
    Bobby Sands, imprisoned Irish Republican Army hunger striker
    John Robarts, former Ontario premier
    Abbie Hoffman, '60s counter-culture figure, founder of Yippie movement, Chicago Seven defendant
    Margaux Hemingway, model, actress, granddaughter of Ernest.
    Michael Hutchence, rock musician with INXS
    Vince Foster, adviser to U.S. president Bill Clinton
    Kurt Cobain, rock musician with Nirvana
    David Kelly, embattled British government weapons expert



    Did the Japanese Use Mustard Gas in WW II? (posted 10-2-03)

    Brendan I. Koerner, writing in Slate (Oct. 1, 2003):

    A Tokyo court has awarded $1.7 million to 13 Chinese citizens who claim they were sickened by Japanese mustard gas left over from World War II. Did Japan actually use chemical weapons during the conflict?

    Yes, although the extent of Japan's chemical warfare has never been resolved. What's known for sure is that the Japanese Imperial Army left behind thousands of tons of chemical weapons when it left China in 1945. Japan estimates that 700,000 such shells, bombs, and supply drums remain in the country, buried throughout its provinces; China puts the figure closer to 2 million. What's less clear is exactly when and where these weapons were used. The most well-documented instance occurred in 1941 at Yichang, a city in the central province of Hubei. The Japanese reportedly used mustard gas and lewisite when seizing the city and again to repel the Chinese Nationalist troops who attempted to recapture it. Additionally, Japan's infamous Unit 731, a covert biological warfare program, tested chemical weapons on thousands of Chinese citizens.


    How the Poverty Rate Is Determined (posted 9-26-03)

    Jared Bernstein, economist with the Economic Policy Institute, writing in the NYT (Sept. 26, 2003):

    Today the Census Bureau will release the official poverty rate for 2002. While that figure is likely to indicate that the ranks of the poor have increased, it unfortunately won't really tell us much of anything about the true extent of poverty in America.

    The problem is that the official definition of poverty no longer provides an accurate picture of material deprivation. The current measure was created 40 years ago by a government statistician, Mollie Orshansky, and hasn't much changed since. "Anyone who thinks we ought to change it is perfectly right," Ms. Orshansky told an interviewer in 2001.

    The current procedure takes the 1963 poverty thresholds for each given family size devised by Ms. Orshansky and updates them for inflation. For example, if the income of a family of four with two adults and two children fell below $18,244 last year, they were counted as poor by the bureau. Simple, yes, but there are two basic problems.

    First, it fails to capture important changes in consumption patterns since the early 1960's. The research underlying the original thresholds was based on food expenditures by low-income families in 1955. Since her calculations showed that families then spent about a third of their income on food, Ms. Orshansky multiplied a low-income food budget by three to come up with her poverty line. But even she suspected this method underestimated what it took to meet basic needs, and was thus low-balling the poverty rate.

    And that mismeasurement has worsened over time, as food has become less expensive in relation to other needs like housing, health care and transportation, meaning the share of income spent on food by low-income families has fallen further.

    The National Academy of Sciences has estimated what the Orshansky measure would look like today if it were updated for changes in consumption patterns, and found the threshold could be as much as 45 percent higher, implying higher poverty rates.

    Second, the current measure leaves out some sources of income and some expenditures that weren't relevant when it was devised. The Census Bureau counts the value of cash transfers, like welfare payments, but it ignores the value of food stamps and health benefits, as well as newer tax credits that can significantly add to the income of low-end working families. Not only would taking these additions into consideration bring down the poverty rate figure, it would also provide a real measure of the effects of these antipoverty programs.

    On the other side of the ledger, the current method also ignores important costs to low-income families. For example, these days many more women with young children participate in the labor force, yet the money they spend on child care is not factored into the poverty calculation.

    If the Census Bureau's poverty findings were simply an accounting tool, these failures might not be important to anyone but economists and demographers. But the official figure plays an important role in determining eligibility for the federal and state safety nets: if we're not getting the measurement right, we're not providing services to the right people.
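    To make the arithmetic concrete, here is a minimal Python sketch of the procedure Bernstein describes: a low-income food budget multiplied by three, carried forward by a price index, and compared with family income. The function names and every numeric input are illustrative assumptions, not official Census Bureau figures.

        # A sketch of the Orshansky-style threshold described above. All numbers
        # are made up for illustration; they are chosen only so the result lands
        # near the $18,244 figure the article quotes for a family of four in 2002.

        def poverty_threshold(food_budget_1963: float, cpi_1963: float, cpi_now: float) -> float:
            """Multiply a minimal food budget by three (food was assumed to be a third
            of spending), then scale the 1963 threshold forward by the change in prices."""
            base_threshold_1963 = food_budget_1963 * 3
            inflation_factor = cpi_now / cpi_1963
            return base_threshold_1963 * inflation_factor

        def is_poor(family_income: float, threshold: float) -> bool:
            """A family is counted as poor if its income falls below the threshold."""
            return family_income < threshold

        if __name__ == "__main__":
            threshold_2002 = poverty_threshold(food_budget_1963=1013.5, cpi_1963=30.6, cpi_now=183.7)
            print(round(threshold_2002))                     # about 18,253 with these made-up inputs
            print(is_poor(family_income=17500, threshold=threshold_2002))  # True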


    Dr. Rice, I Presume! (posted 9-25-03)

    From Steven Aftergood in his newsletter, Secrecy News (Sept. 25, 2003):

    The background briefing is an odd convention in which a government official speaks to reporters on a not-for-attribution basis. The questionable premise seems to be that the official will speak more freely and frankly if his or her identity is concealed.

    But at a background briefing with an unidentified "Senior Administration Official" this week, a reporter addressed the official as "Dr. Rice," and the White House faithfully included the reference in the transcript of the briefing posted on the White House web site.

    See"Senior Administration Official Briefing," September 24: http://www.whitehouse.gov/news/releases/2003/09/20030924-8.html

    On September 26, 2003 Mr. Aftergood issued the following addendum:

    The White House moved swiftly yesterday to delete a reporter's reference to "Dr. Rice" from the transcript posted on the White House web site of what was supposed to be a background briefing with an unnamed "senior administration official" (SN, 09/25/03).

    But the raw transcript of the September 24 briefing, provided by the Federal Document Clearing House, is still preserved in the Lexis-Nexis database.

    And a replica of the original, unaltered page (printer-friendly version), including the reference to Dr. Rice, has been posted here:

    http://www.fas.org/sgp/news/2003/09/wh092403.html

    The pretense of anonymity in such circumstances is silly. But it is also pernicious since it adds one more veil for officials to hide behind, distancing the public from its government and shielding officials from responsibility for their own words.


    Reagan Opposed the Recall (posted 9-25-03)

    Rick Perlstein, in the course of a review of Lou Cannon's new biography of Ronald Reagan, in the Chicago Tribune (Sept. 21, 2003):

    In summer 1968, [Ronald] Reagan's approval rating in California was 30 percent, and he was the object of a serious recall campaign himself (that this was also the period in which he was being seriously pushed as a Republican presidential nominee gets to the heart of the enigma of Ronald Reagan, that creature of dreams, whose mysteries neither [Edmund] Morris nor [Lou] Cannon, and certainly not conservative propagandists like Dinesh D'Souza, have yet come even close to figuring out). The recall was run by liberals, it was a shoestring operation--as of June 1968 they had spent $7,000 and had $52 in the bank, compared to the nearly $2 million that had been pledged to the Davis recall by June this year--and it did rather well, considering: They got about half the signatures the Davis petitions did, though they were gathered so unprofessionally that only about 30 percent of them counted.


    Edison and the Electric Chair (posted 9-25-03)

    From the Economist (Sept. 18, 2003):

    FOR nearly two centuries America has struggled to reconcile its affection for the death penalty with its image of itself as a humane and just society. Few episodes in this long history have been as strange as the one told by Mark Essig in his new book, “Edison & the Electric Chair”. To describe Mr Essig's tale as shocking would be in poor taste, but its odd mixture of technology-worship, industrial sabotage, judicial complacency and rank hypocrisy should at least make American readers a bit queasy about their continued support for capital punishment.

    In the 19th century America had grown to dislike hanging, the usual method of executing condemned prisoners. Hangings often took place in public, frequently leading to riots and other unseemly behaviour among spectators. Hangings also were often botched, resulting in slow strangulation or decapitation. Opponents of the death penalty gained adherents by arguing that hanging was cruel and barbaric. To restore respectability to executions, supporters of the death penalty came up with the idea of electrocution.

    Electricity was a new and glamorous technology. It was, above all, modern. At the same time, it made people nervous. America was just beginning to wire up its major cities, and although the benefits of electric light were obvious to everyone, no one was quite sure how safe it was. After surveying execution methods from the guillotine (too bloody) to morphine overdoses (too pleasant), a commission appointed by the New York state legislature recommended in 1888 the use of electrocution, which it promised would be “instantaneous and painless” and “devoid of all barbarism”. The man who had persuaded the commission of this was Thomas Edison, America's most famous inventor.

    Edison's primary interest in recommending electrocution was to discredit his chief rival in the race to wire America, George Westinghouse. Edison's company used direct current. Westinghouse's firm used alternating current. Edison not only argued that electrocution would be the best new way to kill condemned prisoners, but that Westinghouse's alternating current would be better at it than his own direct current. In other words, his support for electrocution was a marketing ploy. Edison hoped that using alternating current for executions would indelibly associate it with death in the public mind, and give him an edge in the electricity market.

    Westinghouse responded by secretly hiring a high-profile lawyer to defend the first defendant selected by the state of New York to sit in the electric chair and, after he was convicted, to appeal against the use of the chair all the way to the Supreme Court.


    How Many Generals Have Been Elected President? (posted 9-19-03)

    Twelve presidents were generals: George Washington, Andrew Jackson, William Henry Harrison, Zachary Taylor, Franklin Pierce, Andrew Johnson, Ulysses S. Grant, Rutherford B. Hayes, James Garfield, Chester Arthur, Benjamin Harrison, and Dwight Eisenhower.

    The list of generals who have run for president, won their party's nomination and then lost includes Lewis Cass, Winfield Scott, George McClellan and Winfield S. Hancock.

    Admiral Dewey, Douglas MacArthur and Al Haig wanted to be president but failed to win their party's nomination.

    Note: This entry has been repeatedly updated in light of new information brought to the attention of the editor.


    Did the U.S. Count Civilian Dead in Vietnam? (posted 9-18-03)

    Chris Appy and Nick Turse, writing on www.tomdispatch.com (Sept. 18, 2003):

    "During the Vietnam War, the U.S. military counted virtually everything. Most notoriously it made enemy 'body counts' the central measure of American 'progress.' But it also counted sorties flown, bombs dropped, tunnels destroyed, propaganda leaflets dispersed, and toothbrushes distributed. In the bowels of the National Archives you can even find out how many X-rays were taken at the U.S. Army's 93d Evacuation Hospital in 1967 (81,700). But nowhere in this surreal and grisly record of bookkeeping can you find one of the war's most elemental statistics: civilians killed. Civilian casualties were routinely denied, ignored, or lumped together with those of enemy combatants; thus, the infamous GI saying, 'If it's Vietnamese and civilian, it's Viet Cong.'

    "Mike Davis's assertion that one million Indochinese civilians were killed from the air by the American military is a reasonable estimate, but even figures on overall civilian deaths in Vietnam alone cannot be precisely determined. The Vietnamese government believes that two million Vietnamese, most of them southerners, were killed in the American War (a figure that excludes hundreds of thousands of Cambodian and Laotian civilian deaths). The closest thing to an official American calculation of Vietnamese civilian deaths was done by a Senate subcommittee on refugees. Though relying too heavily on hospital figures (many Vietnamese casualties never made it to hospitals), that committee estimated 430,000 South Vietnamese civilian deaths.


    Voting in California 100 Years Ago: How Things Have Changed (posted 9-17-03)

    Cecilia Rasmussen, writing in the LA Times (Sept. 14, 2003):

    Fewer polling places and myriad candidates await California voters in the gubernatorial recall election, but for all the obstacles, voting may be less complicated than it was 100 years ago. Then, the official system for identifying voters relied on a logbook of their descriptions and deformities.

    Beginning in 1892, well before the driver's license became a universal form of identification, copperplate handwritten logs gave physical descriptions of voters, often noting missing limbs and other characteristics inflicted by a rough-and-tumble frontier life.

    In leather and canvas-bound voter registration books kept under lock and key at a Los Angeles museum are "the great registers," decades' worth of the county's voter rolls from 1866 to 1908. The rosters include each voter's birthplace and, if he wasn't born in the U.S., the date he was naturalized.

    William M. Hamlin was described as missing his "4th finger on his left hand and his 2nd, 3rd and 4th fingers on his right hand," while plumber Harry Ellsworth Dascomb was registered with an "artificial left eye," according to inky entries in the 1892 and 1896 voter registration books -- years when Grover Cleveland and then William McKinley were elected president.

    Irish immigrant Richard Dwyer and Missouri-born Charles Herman Brown were each missing a left foot. Henry Judson Ball, a merchant, had a "ballet girl tattooed on his left arm." Henry Drake was mute. Real estate salesman Casper Caesar Cohn had "locomotion of the eyes," and Edward Griffiths, 22, had a "cork left leg." Andrew Boton Gillett was a 6-foot-1 gardener whose sole description was "Negro" -- the only black voter listed in the 1892 register.

    Such vivid nuggets are found in the registration books at the Seaver Center for Western History Research, in the basement of the Natural History Museum of Los Angeles County. They are relics of a time when reformers tried to clean up corrupt election procedures in the wake of frequent and wide-scale vote-buying. Under the original state Constitution, eligible voters were "white males of U.S. citizenship" and, because of the Treaty of Guadalupe Hidalgo in 1848, "Mexicans 21 years of age." Voters had to have lived in California for at least six months and in the county where they were voting at least 30 days. Native Americans could not vote.

    Before formal voter registration began in California in 1866, counties did not print official ballots with the names of all candidates. Voters simply cast a ballot with a "vest pocket ticket," which was printed by a political party and bore the names of only that party's candidates. Democratic tickets were one color, Republican tickets another....


    Politicians embraced a new technology in the 1920s, using adding machines to tabulate election results. In 1928, the Board of Supervisors bought 150 mechanically operated counters for a total of $225,000. All a voter had to do was flip the lever alongside the candidate's name.

    The machines worked well in small elections, but didn't have the capacity to list all the candidates in larger elections. Each thousand-pound hunk of metal, known as an "automatic," was consigned to storage and later sold to San Francisco for 15 cents on the dollar.

    In 1949, the county Board of Supervisors voted to buy 200 Shoups -- named after the Ransom Shoup family in Pennsylvania. Each clunky voting machine had thousands of parts and cost $1,500. The machine could handle a ballot 10 columns wide, with 50 rows of names in each column, but it still wasn't big enough.

    Soon the Shoups were gathering dust in county warehouses. Although a few labor unions rented them for their own elections, the cost of transporting and setting them up was just too high. In 1953, the county sold them to smaller cities for $180,000 -- a net loss of $120,000.

    In the late 1950s, Los Angeles County sank nearly $1 million into developing a prototype "dream counter," an electronic vote-tallying device. The county earned almost $200,000 in royalties when the machine went into production. But within a few years the machines needed expensive improvements and county officials refused to sink any more money into the project.

    In 1962, Joseph P. Harris, a political science professor at UC Berkeley, came up with the idea of listing the candidates and issues on a single standard IBM card and putting it into a ballot holder. The result was the "Votomatic," which he sold to IBM six years later. The method is still in use today, "chads" and all.

    Although the machines are considered "obsolete, antiquated and unacceptable" and the state decertified their use as of March 2004, several counties plan to use them Oct. 7 -- including Los Angeles.


    Korea Used to Be Spelled Corea: Why That Is Now Important (posted 9-17-03)

    Barbara Demick, writing in the LA Times (Sept. 15, 2003):

    Yes, say Korean scholars and politicians who have begun a drive to change the official English-language name of their country to "Corea." The seemingly arcane campaign is based on an increasingly prevalent belief that the original "C" was switched to a "K" by the Japanese at the start of their 1910-45 occupation of the peninsula so that their lowly colonials would not precede them in the English alphabetical hierarchy.

    The controversy used to be fodder only for linguists and historians, but lately the debate has seeped out of academia and into the realm of the political. Twenty-two South Korean legislators last month introduced a resolution in their parliament calling for the government to adopt the Corea spelling -- the first time such a proposal has been made in official quarters in South Korea.

    North and South Korean scholars, who rarely agree on much, also held an unusual joint conference last month in Pyongyang, the North's capital, and resolved to work together for a spelling change. They hope it can be accomplished in time for the 2004 Olympics in Athens, when the estranged countries intend to field a joint team.

    "Scholars who have studied this more deeply than I believe it was part of the legacy of Japanese imperialists to eradicate our culture," said Kim Sung Ho, a South Korean legislator who was one of the sponsors of the new resolution.

    Most evidence supporting the claim is circumstantial. English books and maps published through the 19th century generally spelled the country's name as Corea, as did the British government in laying the cornerstone of its embassy in Seoul in 1890 with the name "Corea." But sometime in the early 20th century, "Korea" began to be seen more frequently than "Corea" -- a change that coincided with Japan's consolidation of its grip over the peninsula.

    Chung Yong Wook, a historian at Seoul National University, believes the Japanese -- who controlled the peninsula for four years before officially colonizing it in 1910 -- changed the name by the time of the 1908 Olympics in London so that Japan would come ahead in the ordering of athletes. But the closest thing he has found to a smoking gun is a 1912 memoir by a Japanese colonial official that complained of the Koreans' tendency "to maintain they are an independent country by insisting on using a 'C' to write their country's name."

    "I am sure, though, if the Japanese archives were opened you would find much more evidence to support the claim that the name was changed," Chung said.

    The North Koreans have embraced the movement to restore the "C" in Korea with much more enthusiasm than their Southern counterparts. Following the conference Aug. 21 at Pyongyang's Kim Il Sung University, the North Korean news agency KCNA referred to the current spelling as "a never-to-be-condoned, state-sponsored crime."

    "The Japanese imperialists deliberately changed the English spelling of the country's name in a bid to hurt the pride and dignity of the nation, while stretching their tentacles of aggression to it," declared the official news agency.

    Lee Sang Tae, a South Korean government historian who attended the Pyongyang conference, notes that North Korea, unlike South Korea, has not yet received reparations from Japan over the occupation and therefore might want to add the spelling manipulation to its claims for damages.


    The 10 Commandments? (posted 9-12-03)

    Jeffrey Weiss, writing in the Dallas Morning News (Sept. 10, 2003):

    A lot of what Americans think they know about the Ten Commandments may not be true - or at least it's at odds with ancient tradition.

    For instance, there is no mention of "Ten Commandments" in the original Hebrew. In Exodus, when Moses comes down from the mountain, the rules inscribed on his tablets aren't named. When Moses does get around to talking about them, he calls them "aseret ha-d'vareem." The phrase can be translated as "10 words" or "10 statements," or even, in context, "10 things God said."

    That makes sense because, in the traditional Jewish numbering, the first on the list isn't a commandment: "I am the Lord your God, who has taken you out of the land of Egypt, from the house of slavery."

    OK, so maybe they aren't all exactly commandments, but we all know what they looked like, right? Just about everybody recognizes the connected double-tombstone as the shape of the stone Tablets of the Law.

    Except there's no mention of that shape - or any other - in the Bible.

    Jewish sacred tradition, contained in the Talmud, says the tablets were square or rectangular. And some tradition holds that they were made of sapphire, not granite.

    Experts say the familiar shape came from 11th-century English artists. Images comparing Christianity to Judaism - to the detriment of the older faith - used the tablets as a symbol of "outmoded" legalism. The artists used a shape that even those who could not read would instantly recognize - the "diptych," a writing tablet made of two arched, hinged slabs of wood or stone.

    The image was also used in badges that English Jews were forced to wear. Today, however, it has lost for most Jews its connotation as a Christian symbol or a symbol of intolerance, and it can be found in many synagogues.

    OK, so the shape is made up, but at least we agree on the words, right?

    Well, the King James Bible popularized the edict, "Thou shalt not kill." But scholars of Hebrew say the line is much better translated as, "Don't murder."

    Even the seemingly straightforward command, "Don't steal" may not be what it seems. A traditional rabbinic interpretation is that the line really means, "Don't steal another person" - in other words, "Don't kidnap."

    And Jews traditionally understood the end of the first line - "Who has taken you out of the land of Egypt, from the house of slavery" - to mean that these laws applied specifically to them. A shorter set of commands, the so-called Noachide laws, applied to non-Jews.

    Some Jewish sages made a case that the Ten Statements were a summary of all the laws of the Torah. But other sages said that other passages were even more central. Tradition holds that the Ten Statements weren't placed into the standard worship liturgy because to do so might have caused Jews to focus on those 10 laws to the exclusion of some 600 others contained in the Torah.

    Even Christian tradition is ambivalent about how central the Commandments are. In the Gospels, Jesus cited other Bible passages when asked about the most important laws. Paul, in the Letter to the Romans, does likewise:

    "The commandments, `Do not commit adultery,' `Do not murder,' `Do not steal,' `Do not covet,' and whatever other commandments there may be, are summed up in this one rule: `Love your neighbor as yourself.'"


    Bush Is a Divider not a Uniter (posted 9-12-03)

    Craig Gilbert, writing in the Milwaukee Journal Sentinel (Sept. 7, 2003):

    Opposition to the war, bitterness over the 2000 election count and the perception that Bush has governed from the right and not the center have helped deepen the animus many Democrats feel toward this presidency. At a huge Dean rally in Seattle last month, signs read: "I Really Hate Bush" and "Oust the Fink in 2004."

    The flip side of that antipathy is Bush's historic, almost unanimous support among Republican voters.

    The result is a polarized electorate, experts say.

    Poll sees huge gap

    In one CBS poll released last week, there was a roughly 60-point gap between Bush's approval ratings among Republicans (86%) and Democrats (27%). That pattern, repeated in other recent polls, places Bush with Bill Clinton as the most polarizing of recent presidents.


    Peter L'Enfant? (posted 9-5-03)

    Benjamin Forgey, writing in the Washington Post (August 30, 2003):

    Everybody knows who designed the new American nation's capital city back in 1791. It was Maj. Peter Charles L'Enfant, of course.

    Peter, not Pierre.

    Does that sound a little strange? You bet. We've been referring to the great man as Pierre for the better part of a century. Peter became Pierre, more or less officially, on April 28, 1909.

    That was the day L'Enfant's remains were moved from an obscure grave to lie in state for half a day in the Capitol Rotunda, and then were reinterred at that marble tomb high in Arlington National Cemetery, overlooking the beautiful city he conceived.

    But Peter is the name the French-born Pierre preferred during most of his long life in the United States, according to historian Kenneth R. Bowling, who has written a lively little book on L'Enfant. It's not the authoritative, full-scale biography L'Enfant needs and deserves, but it's feisty and informative, and it makes an excellent case for the name Peter.


    Quick: Guess Which Film Clip from the 20th Century Is the Most Popular? (posted 9-5-03)

    From History Today (Sept. 4, 2003):

    ITN has revealed that the most requested news footage of the 20th century is the 1969 moon landing featuring Neil Armstrong. The assassination of John F Kennedy six years earlier is next in the poll of the most popular clips from the news archives. England captain Bobby Moore with the 1966 World Cup is also amongst the top 20, but the oldest film dates to 1913, when suffragette Emily Davison was killed protesting in front of King George V's horse at the Epsom Derby. Managing director of the archive Alwyn Lindsey said: "The ITN archive is probably the finest visual document of the people and events that shaped the world since capturing it on film first became possible." Other moments include Hitler’s 1933 rise to power, the Hindenburg disaster in 1937 and the atom bombs being dropped on Hiroshima and Nagasaki at the end of the Second World War. ITN, whose archives also include Reuters, Channel 4 and British Pathé, compiled another ‘unusual request’ list, which includes the Queen Mother's horse falling over in the 1956 Grand National and the wife of American president Harry Truman unveiling a new battleship in 1945 but being unable to break a bottle of champagne on its side. (Sept 3rd)


    Muslim Against Muslim: The War Within Islam (posted 9-2-03)

    Sydney Freedberg, Jr., writing in the National Journal (May 9, 2003):

    Three thousand Americans died in less than two hours on 9/11. Two Gulf Wars, operations in Afghanistan and Somalia, and numerous Islamist terrorist attacks since 1979 add another thousand Americans dead. In those same conflicts, U.S. troops killed probably more than 30,000 Muslims, mostly Iraqis; peace groups blame the U.S. for another hundred thousand Iraqi deaths. But those figures add up to only a fraction of the Muslims killed by other Muslims in the past two decades: 60,000 Iraqi Kurds and Shiites slain by Saddam Hussein in the 1991 uprisings alone; another 50,000 Kurds -- perhaps as many as 180,000 -- in the six months of Saddam's "Anfal" campaign in 1988; more than 500,000 -- maybe a million -- Iraqis and Iranians dead during the eight-year war between the secular Saddam and the fundamentalist Ayatollah Khomeini; 25,000 Afghans in Kabul alone in a single year, 1994; at least 50,000 Tajiks in their country's civil war; 150,000 Algerians in theirs; and on, and on, and on. Some Muslims have attacked America, and America is striking back. But conflicts, holy or otherwise, between Islam and the West are dwarfed by the war of Muslim against Muslim.

    Keep the 10 Commandments in the Alabama Courthouse? (posted 8-28-03)

    Question:"Do you approve of a federal court order to remove a Ten Commandments monument displayed in an Alabama court building?"

    Answer (according to a USA Today poll 8-28-03): 77 percent said"no."

    What Was the Triangle Trade? (posted 8-22-03)

    Libbie Payne, writing in the Boston Globe (August 17, 2003):

    What was the Triangle Trade?

    A. One of the most profitable international business arrangements involving Europe, the Caribbean, and North America, the Triangle Trade of rum, slaves, molasses, and goods played an important role in the early growth of this country.

    The name comes from the triangular pattern of shipping routes used to transport products unavailable in one part of the world to another. While the direction of shipping could begin at any point of the triangle, the cargo remained fairly consistent.

    Holland, Portugal, Spain, France, and England shipped textiles and other manufactured goods to Africa, where they were traded for slaves. The slaves were, in turn, shipped to the West Indies and put to work on British sugar cane plantations. Sugar and molasses were shipped to the Colonies, where they were traded for tobacco, fish, lumber, and rum, which was then shipped to Europe or the West Indies.

    Rum was big business in the Colonies. In the 1760s there were 22 distilleries in Rhode Island and 63 in Massachusetts. Medford became especially known for its rum, an industry begun in the early 1700s by John Hall and carried into the 20th century by the Lawrence family.

    The British outlawed slavery in 1772 and banned the Atlantic slave trade in 1807. The importation of slaves was declared illegal in the United States in 1808. However, the institution of slavery, and the Triangle Trade, persisted well into the 19th century.

    Why Do Suburbs Feature Grand Lawns? (posted 8-22-03)

    Sam Allis, writing in the Boston Globe (August 17, 2003):

    Welcome to the insanity of lawns. Our national lawn pathology, always severe, is exacerbated this summer by the monsoons that have produced our very own elephant grass. Mowing is now an even more obnoxious activity than usual, if that is possible.

    So why mow? More to the point, why have lawns at all? Give me that ground cover that requires little more than a nod every morning. What does it say about us that the two neighbors in a TV ad - Scotts, I think - spend most of their waking hours competing for the better lawn? Guys, get a grip....

    No other country in the world shares our lawn obsession. Lawn historians say the whole thing started when 18th-century Brit aristocrats favored manicured sweeps at their country homes, and these, in turn, spawned our particular lawn disorder. But then we've aped the Brits about almost everything, badly. (Remember our national genuflection at "Masterpiece Theatre"?) Forget lawns: what the Brits are justifiably lionized for are their magnificent country gardens - confections of loose beauty, designed precisely to reject the strictures of the formal continental gardens of old, where everything looked like an Escher print.

    Contrast the inspired Brit package with the stark uniformity of a suburban street around Boston, where you grow clover at your peril. Where you'd best change the locks if you don't get rid of your dandelions before the fluff blows across neighbors' lawns like pollution from a Midwest power plant. Where lawns without walls that are designed to be open affairs are, in fact, fiercely held turf.

    Face it, lawns define the suburban ethos more than book groups and Fluffernutters combined.

    House-Senate Conference Committees: A Tempestuous Past (posted 8-22-03)

    Bill Walsh, writing in the Times-Picayune (August 18, 2003) about the conflicts that arise between Senate and House conferees:

    No one disputes that the majority party holds sway in conference. But there is a long-running debate on Capitol Hill and in academia about whether the House or Senate has an advantage. That rivalry has sometimes boiled over into petty disputes, such as on which side of the Capitol the conference committee should meet.

    Things got so tense concerning the location of a spending bill conference in the early 1960s, according to Associate Senate Historian Don Ritchie, that deliberations on all federal appropriations stopped. Neither side thought its members should have to walk the length of the Capitol to meet on the "turf" of the other body.

    Fortunately, Ritchie said, a new room being constructed as part of a Capitol expansion happened to fall dead center between the House and Senate. The room, EF-100, served as neutral ground for House and Senate conferees.

    Road Accidents ... Are Cars to Blame? (posted 8-21-03)

    Bob Montgomery, writing in the Irish Times (August 20, 2003):

    THE GOOD OLD DAYS: Road accidents, however regrettable, are by no means a peculiarity of the motor vehicle. They have been happening as long as there have been roads.

    Deaths on the roads of France in 1899, the first year for which comprehensive records are available, numbered the surprising total of 876 - two were the result of motors while the rest involved horse-drawn traffic. In the same year, an incredible 8,700 were injured in horse-related accidents on France's roads.

    Has a UN Official Ever Been Specifically Targeted for Assassination? (posted 8-21-03)

    Steven Edwards, writing in the Montreal Gazette (August 20, 2003), about the death of Vieira de Mello, the UN representative in Iraq:

    If personally targeted in yesterday's attack, Vieira de Mello, 55, would be the first UN official assassinated since 1948, though some historians have speculated Soviet operatives were responsible for causing the 1961 plane crash in Africa that killed Dag Hammarskjold, then secretary-general.

    When Did World War II End? (posted 8-14-03)

    Stroube Smith, writing in the Washington Times (August 14, 2003):

    There is some confusion over when we should say this cataclysmic conflict ended. The AP Stylebook says Aug. 15, the day Japanese Emperor Hirohito broadcast the news to his people. Because of that notation, that is the date most often used by newspapers. Others insist on Sept. 2, when Gen. Douglas MacArthur presided over the formal surrender signing aboard the USS Missouri in Tokyo Bay.

    To me, though, it will always be that evening of Aug. 14 and the wild celebrations Truman's announcement set off on South Lee Street in Alexandria, in the rest of the city and across the nation. It is also the day the killing, for the most part, came to an end.

    What Is the Baath Party? (posted 8-11-03)

    Cameron McWhirter, writing in the Atlanta Journal and Constitution (August 10, 2003):

    What is Baathism?

    At one time, Baathism was a movement espousing lofty ideals of Arab brotherhood and equality, with a goal of uniting all Arabs into one powerful, secular state. But, like other "isms" of the 20th century, Baathism was twisted and corrupted by tyrants. By the time U.S. troops invaded Iraq, Baathism had degenerated into a vehicle of control for dictators, its founding principles long since abandoned.

    "This party embodies all the things that went wrong in the Middle East," said Juan Cole, an historian at the University of Michigan who has written extensively about modern Islamic movements. "It was started by ideologues, but by the '90s, it was a mafia kind of thing."

    In the 1950s and 1960s, Baathism had supporters in Syria, where it was founded, as well as Iraq, Jordan, Lebanon, Yemen and other parts of the Middle East. Today, the only nation where a Baath Party still holds power is Syria, and it long ago ceased trying to spread Baathism to other countries.

    Whatever the future holds for the turbulent Middle East, scholars agree that it won't include Baathism.

    Origins in Syria

    The ideology grew out of discussions among intellectuals in the cafes of Damascus, the capital of French-occupied Syria, in the late 1930s. Two schoolteachers, Michel Aflaq, a Christian, and Salah al-Din Bitar, a Muslim, formed a movement that they called Baath, Arabic for "rebirth." The movement attracted students and others interested in overthrowing the French colonial government.

    The principal tenet of Baathism was unifying all Arabs into one nation. The founders believed that the various colonial states in the Middle East had been imposed in part to divide Arabs and weaken them.

    "The basic idea of Baathism, which was pan-Arabism, made a lot of sense," said Cole at Michigan. "If I was an Arab, I would be pan-Arabist. Imagine if there was a 'United States of Arabs.' It would be a huge powerhouse. The problem is that it's a conclusion that one comes to in the abstraction. When you try to implement it, you run into problems."

    Ole Holsti, a political science professor at Duke University and an expert on the Middle East, said Baathism downplayed Islam and offered what he called "Islam-lite" to supporters. Baathists argued that the Arab people chiefly were not united by religion, but by language, culture and history.

    "Islam had its place, yes, but there was a clear understanding that the Baathists were going to have a secular regime," he said.

    Alan Godlas, associate professor of religion at the University of Georgia and expert on Islamic and Arabic movements, said the Baathist vision was to create a unified democratic Arab state, with state control of the entire economy. "The intention was not to form dictatorships," Godlas said.

    Internal squabbles

    The first Baath political party was organized in Syria in 1943. By 1946, after the French left Syria, the Baath Party grew into a major player in the country's politics.

    In fierce competition for supporters with the communist party, the Baath set up a tightly controlled party structure, similar to the communist concept of "cells," small groups of devoted followers.

    But from its beginnings, the party suffered from internal squabbles. Baathist nationalists thought the party should take over one country and work on socialist reforms there before uniting with other states. Baathist regionalists, though, argued the countries should unite first, then work on reforms. This bickering eventually tore the movement apart.

    In 1963, the Baath Party took control of Syria. As the party consolidated power, a large faction of the leadership --- led by military officers --- were nationalists. A civilian regionalist faction, led by movement founders Aflaq and Bitar, argued that the Baathists now must export their "revolution" to other Arab states.

    Gaining power in Iraq

    Baathist parties already operated in neighboring states, including Iraq, where a young party cadre, Saddam Hussein, was climbing up the ranks.

    In 1966, a split among Syrian Baathists led the military faction to exile Aflaq and Bitar. They both denounced Syrian Baathism as a betrayal of their movement.

    In 1968 a coup in Iraq brought the Baathists to power there, with the help of the military. Initially, Aflaq and Bitar hoped Iraq would follow their lead of spreading Baathist revolution, but soon party leaders focused on rooting out internal enemies. Saddam was in the top leadership from the beginning, but did not seize total control of the party and the country until 1979.

    In Syria, dictator Hafez Assad had taken control of the party and the government in 1970. He ruled until his death in 2000, when his son, Bashar Assad, took over.

    Georgetown University Professor Steve Heydemann, an expert in Middle East politics and the author of a book on the Syrian Baath Party, said the parties in both countries were vehicles for ambitious and ruthless men to set up dictatorships.

    Productivity Isn't Increasing Faster Now than in the Past (posted 8-11-03)

    Economist Dean Baker, commenting on an article in the Washington Post (August 11, 2003):

    The Post article notes the poor employment performance of the economy in the recovery thus far, which it attributes to "unusually big gains in productivity." Actually, productivity growth in this recovery has not been very different from that during past recoveries. The table below shows the average rate of productivity growth in the current recovery and the prior five recoveries, for the non-farm business sector, during the first seven quarters of each recovery.

    2001:3 -- 03:2 – 4.5%
    1991:1 -- 92:4 – 3.7%
    1982:3 -- 84:2 – 3.8%
    1975:4 -- 77:3 – 3.0%
    1970:4 -- 72:3 – 4.3%
    1960:4 -- 62:3 – 5.4%

    While the 4.5 percent rate of productivity growth in the current recovery is somewhat more rapid than the 4.0 percent average of the prior five, the difference is not large enough to explain the huge difference in employment growth. In the past, rapid productivity growth did not seem to impede employment growth. For example, the 4.3 percent rate of productivity growth following the 1970 recession did not keep the economy from creating 4.5 million jobs between November 1970 and November 1972.

    Was Duke University Built with Tobacco Money? (posted 8-6-03)

    Jeff Elder, writing in the Charlotte Observer (August 6, 2003):

    Q. Is it true that Duke University was built with tobacco money?

    Many folks in the Carolinas -- including many Duke alums and students -- will tell you this is true.

    It is not.

    At least according to Robert Durden, professor emeritus of history at Duke and author of several books about the Duke family.

    The Dukes certainly made plenty of money in tobacco early on. But after the turn of the century James B. Duke turned his attention to -- and invested his money in -- hydroelectric power. Duke was convinced that textile manufacturing could transform the Carolinas. All he needed was cheap, plentiful energy.

    He was right. In 1905, the Dukes founded the Southern Power Co., now known as Duke Power. Within two decades, this company was supplying electricity to more than 300 cotton mills and various factories, cities and towns in the Carolinas. The Duke family earned tremendous profits, and that money is what went toward building Duke University.

    Duke's scandalous love affair

    There's another matter Durden says many folks might not know about James B. Duke. A quiet, private man and a devout Methodist, Duke hated personal publicity. At one point he got plenty. About the time he was investing in hydroelectricity, Duke met what professor Durden describes as "a very well-shaped, good-looking divorcee" named Lillian McCredy. He bought her a house in New York and would steal away to see her.

    After a decade of this, Duke's father, Washington, fell and broke his hip. Feeling that he was dying, Washington Duke called his son to his side and made him promise to marry his mistress to make an honest woman out of her.

    McCredy was "a loose lady," Durden says, and not the type Duke wanted to marry. But he felt that he had to honor his father's wishes. The marriage was a disaster. Duke quickly learned his bride was running around on him. He hired private detectives to get proof, and sued her for divorce.

    The tabloids pounced on the story. The scandal literally made Duke sick, and he nursed health problems.

    Later he married a widow from Atlanta, Nanaline Holt Inman, with whom he spent the rest of his life.

    Durden's latest book, "Bold Entrepreneur: A Life of James B. Duke," tells much more about a man who shaped the Carolinas like few others.

    Why Don't Editors Show Dead People on TV or in Newspapers? (posted 7-31-03)

    Ken Ringle, writing in the Washington Post (July 25, 2003):

    War is an unpleasant business, death itself rarely less so. Therefore, if you're a government or a newspaper or broadcaster, how do you treat the visual images of the shattered corpses that war provides?

    Yesterday, the allied forces in Iraq reached a Solomon-like (or maybe Kafka-like) decision regarding photographs of Uday and Qusay, the two repellent sons of Saddam Hussein who were killed in a shootout with U.S. forces near Mosul....

    This squeamishness about violent death is a relatively modern sensibility. Highwaymen and bandits were once drawn and quartered, and hung in pieces at country crossroads as a cautionary display. In the early days of this republic, pirates' corpses rotted in chains on what is now Ellis Island to give mutiny-prone sailors something to think about as they left New York harbor.

    Tamerlane built towers displaying the skulls of enemies who defied him. Six hundred years later, some of those towers still stand. And Pol Pot left warehouses of skulls and bones after the Cambodian genocide of the 1970s.

    In the modern West, however, the industrialization of death has generally been coupled with a curious reluctance to display -- at least immediately -- photographic evidence of what that industrialization means. The famous photos of Pearl Harbor show no American dead. The government banned publication of any photos of dead U.S. servicemen until more than two years into World War II.

    "Eventually they decided this was dishonest and released three photos from Buna Beach in New Guinea," said historian and critic Paul Fussell yesterday from his home in Philadelphia. "The pictures of the dead didn't show any faces, of course. The soldiers looked like they could have been asleep."

    Fussell, a World War II combat veteran and author of "Wartime" and "The Great War and Modern Memory," acknowledged that the photos did at least show bodies, but said, "Unless you show guts hanging from the trees like Christmas decorations, you're not showing what war is really all about."

    There have been exceptions when dealing with the enemy. In April 1945, many newspapers and newsreels showed pictures of Italian dictator Benito Mussolini and his mistress, Clara Petacci, after the two had been caught trying to escape to Switzerland, shot, and hung upside down in a public square.

    The picture, one of the most famous of World War II, showed blood and gore aplenty, but it also showed the fastidiousness of the era: Petacci's skirt had been roped shut against any crotch display that might offend....

    If the rotting detritus of war has rarely graced the front pages and television screens of the United States, some photographs taken at a particular moment of death or horror have become almost iconic: Robert Capa's Spanish Civil War soldier caught, arms flung out, at the moment of his fatal bullet's impact; Eddie Adams's famous Vietnam War shot of the pistol execution of a Viet Cong guerrilla.

    These, however, have been exceptions. One looks in vain in British or American newspapers of World War I for the photographic images to match the horrific recollections of letter writers and diarists chronicling the monstrous Golgotha of trench warfare where, as historian John Keegan has noted, the most frequent foreign material encountered by surgeons treating soldiers' wounds was bits and pieces of other people....

    In October 1862, Mathew Brady opened in New York one of the first exhibits of war photographs. It was called "The Dead of Antietam," and it portrayed in almost elegiac fashion the rotting and bloated human remnants of the bloodiest day of the Civil War. The pictures, of course, were in black and white, and they showed no recognizable faces. But New Yorkers found them profoundly disturbing.

    As the 20th century dawned and wore on, Americans began to fence off death verbally with euphemisms ("passed away," "passed on"), geographically with "sleep rooms" and "funeral parlors," and visually by hiding photographs of corpses as a matter of taste.

    What Event in History Cost the U.S. the Most Money? (posted 7-31-03)

    Peter Hartcher, writing in the Australian Financial Review (July 26, 2003):

    If you had to guess the event that cost the United States more money than any other in its history, would you choose the Civil War? World War II? Or the Wall Street Great Crash of 1929? All of these are in the top five, but none even begins to approach the scale of America's most stupendously expensive event: the stockmarket collapse that the country has just lived through. The cost to date is $US6.5 trillion in lost shareholder wealth. For proportion, World War II ranks second - in today's dollars, it cost the country $US3.4 trillion.

    Why Do Museums Now Have to Worry About the Origins of Their Artifacts? (posted 7-29-03)

    Carol Kino, writing in Slate (July 28, 2003):

    [Why don't museums turn a blind eye to the provenance of their artifacts as used to be the case?]

    The laws that allow countries to seek restitution of what's known as "cultural property" are a byproduct of the early 20th century, when art-rich countries like Turkey, Italy, and Greece began to introduce what are known as "patrimony laws." (These essentially deem all newly discovered artifacts found within their borders to be the property of the state.)

    The movement to protect world culture dramatically intensified after World War II, during which the Nazis and the Russian army confiscated unprecedented numbers of artworks from individuals and public institutions throughout Europe. 1954 saw the drafting of the Hague Convention—the first major international agreement to establish guidelines for protecting cultural property during wartime. Then, in the 1960s, the international art market heated up so much (resulting in increased trade of stolen goods) that UNESCO, in 1970, drafted another convention that encouraged countries to work together as much as possible to enforce each other's export restrictions. (By 2003, UNESCO's guidelines had been ratified by 96 countries, including the United States.) As Thomas Hoving, a former director of the Metropolitan Museum of Art, famously wrote in his 1993 memoir, Making the Mummies Dance, "I recognized that with the UNESCO hearings, the age of piracy had ended."

    Today, trying to make sense of all the different international laws is enough to set anyone but a lawyer wailing like the tortured figure in Edvard Munch's "The Scream." In 1995, UNIDROIT (originally the legal auxiliary of the old League of Nations) drafted a convention that aims to enforce export restrictions and help unify cultural property laws worldwide. Within most countries, illegally gotten cultural property is generally covered by a nation's stolen property laws. But transport that cultural property across a border, and you may have violated civil law, criminal law, an import or an export prohibition, or a combination of the above, depending on which country we're talking about, what the object is, and who owns it—and that's just for starters. Much also depends on the particulars of the bilateral and multilateral agreements, if any, between the countries in question, which stipulate whether and to what degree one will honor another's export restrictions.

    Obviously, when the dispute is between nations, national pride, politics, and political grandstanding tend to take precedence over law. That's probably why such disputes have a habit of becoming so emotional, and so unresolvable—as evidenced by the long-running brouhaha over the Elgin Marbles, which escalated about 20 years ago. Britain holds that the sculptures, removed from the Parthenon in the early 19th century, were legally purchased by Lord Elgin from the Ottoman Empire, which then controlled Greece—a move that thereby saved them from destruction during Greece's War of Independence and by modern-day Greek air pollution. Yet Greece counters that the seller was an occupying force, therefore the purchase shouldn't count. Both nations regard the sculptures as their cultural patrimony. But Greece didn't exist as an independent nation until 1832—and in any case, its 20th-century patrimony laws can't be applied retroactively. Perhaps that's why Greece, so far, has attempted to resolve the matter through diplomacy, rather than in court.

    Last December, an alliance of about 40 major museums, known as the Bizot Group, issued a statement in support of the so-called "universal museum"—one whose collection brings together work from many periods and cultures. (18 museum directors signed the statement, including those of the Metropolitan, the Louvre, the Museum of Modern Art, and the Hermitage.) The statement argues that with time, objects become "part of the heritage of the nations which house them." Clearly, the signatories were also trying to protect their own backs: If the British Museum were ever to return the Elgin Marbles to Greece, the act would likely unleash a torrent of similar claims that could drain the resources—and the collections—of some of the world's great treasure-house museums.

    Nonetheless, the Bizot statement has since been slammed by various museum associations and cultural watchdogs for being "Eurocentric" and for taking "a George Bush approach to international relations."

    Bush's 2 Question Rule (posted 7-16-03)

    Richard W. Stevenson, writing in the NYT (July 13, 2003):

    Mr. Bush almost never holds formal news conferences. Instead, he frequently takes a few questions from reporters, especially after meetings with foreign leaders. He has a strict rule: he calls on two American reporters and his counterpart calls on two reporters from the other country's press corps.

    Mr. Bush is a stickler about the practice, even if it means chiding another leader on his own turf. When President Festus G. Mogae of Botswana tried to start one of these sessions on Thursday by saying, "Does anyone want to ask . . ." Mr. Bush cut him off good-naturedly and said, "That's not the way we do it."

    Did the British Invent Lasagne? (posted 7-15-03)

    From BBC News (July 15, 2003):

    It's so British the court of Richard II was making it in the 14th Century and most likely serving it up to ravenous knights in oak-panelled banqueting halls.

    The claim has been made by researchers who found the world's oldest cookery book, The Forme of Cury, in the British Museum.

    A spokesman for the Berkeley Castle medieval festival, with whom the experts were working, said: "I defy anyone to disprove it because it appeared in the first cookery book ever written."

    It is not known whether he has dared put the claim to outspoken Italian prime minister Silvio Berlusconi.

    But the Italian embassy in London reportedly responded: "Whatever this old dish was called, it was not lasagne as we make it."

    And Bristol restaurateur Antonio Piscopo fired an emphatic warning shot.

    "I think it's rubbish. I think it must have been the Romans who brought it over. It is definitely Italian."

    The recipe book does not mention meat - a staple of a good lasagne.

    And such an early use of tomatoes in food would have had medieval cooks spluttering into their espressos.

    But it does describe making a base of pasta and laying cheese over the top.

    It calls this "loseyns", which is apparently pronounced "lasan", although it fails to mention whether it should be followed with a sweet tiramasu and a glass of Amaretto.

    Pasta faded from the British diet when potatoes arrived, according to the researchers. The hearty roast dinner soon swept all before it.

    Britain would be well advised not to make a meal of the claim, because Italy's track record on food fights is impressive.

    Putting the Death Toll of Saddam's Victims in Perspective (posted 7-9-03)

    Sharon Waxman, writing in the Washington Post (July 7, 2003):

    An estimated 290,000 people are missing and believed to be buried in mass graves throughout Iraq. In a country of 22 million, that is more than 1 percent of the population, the equivalent of about 3.5 million people in the United States. The vast majority of these bodies have not been found.

    By comparison, forensic experts working in the former Yugoslavia estimated that "ethnic cleansing" left 30,000 dead in mass burial pits. It was there that the specialty of forensic archaeology emerged and proved its worth, as the careful evidence-gathering of experts was later used in trials that succeeded in convicting war criminals. In the Iraq war, the U.S. government did not wait long to recruit a group of forensic archaeologists with expertise in things like human anatomy and geophysics. Most of them are in their twenties and come from universities around the globe or from other projects involving crimes of war. Since the fall of Saddam Hussein in April, these researchers have identified 80 to 100 mass graves in Iraq. The number depends upon how one counts, since some sites include several mass graves in close proximity.

    The Story of the Only Enlisted Man to Be Honored with a Memorial at Gettysburg (posted 7-9-03)

    Mark Roth, writing in the Pittsburgh Post-Gazette (July 6, 2003):

    Amos Humiston is the only enlisted man at Gettysburg who has his own monument on the battlefield. It wasn't because of his heroism in the battle. A Union sergeant in New York's 154th "Hardtack" regiment, Humiston was killed on the first day of fighting in Gettysburg, after Confederate troops overwhelmed his company at a spot known as Kuhn's Brickyard.

    What earned him a permanent marker was his love for Frank, Freddie and Alice.

    Humiston was just one of more than 3,000 Union soldiers who died in the monumental three-day conflict. But when his body was found later that week, lying in a secluded spot at York and Stratton streets in Gettysburg, he was holding an ambrotype -- an early kind of photograph -- and on it were the serious, round faces of his three adored children: 8-year-old Frank, 6-year-old Alice and 4-year-old Freddie.

    Somehow, historians believe, Amos Humiston had managed to drag himself to this patch of ground after he had been wounded, and was probably looking at his children's faces when he died.

    Even then, Humiston might have faded into obscurity, because there was nothing on his body to identify him and the few soldiers from his unit who survived the battle had moved on before he was found.

    Somehow, though, the image of his children ended up in the possession of Dr. John Francis Bourns, a 49-year-old Philadelphia physician who helped care for the wounded at Gettysburg. Months after wrapping up his volunteer work there, he decided to try to find out the identity of the children's father.

    His efforts produced a wave of publicity that swept the North and became the People magazine cover story of its day.

    It began quietly enough, on Oct. 19, 1863, when the Philadelphia Inquirer published a story under the provocative headline: "Whose Father Was He?"

    "After the battle of Gettysburg," the article read, "a Union soldier was found in a secluded spot on the battlefield, where, wounded, he had laid himself down to die. In his hands, tightly clasped, was an ambrotype containing the portraits of three small children ... and as he silently gazed upon them his soul passed away. How touching! How solemn! ..."

    "It is earnestly desired that all papers in the country will draw attention to the discovery of this picture and its attendant circumstances, so that, if possible, the family of the dead hero may come into possession of it. Of what inestimable value will it be to these children, proving, as it does, that the last thought of their dying father was for them, and them only."

    When the article appeared 140 years ago, newspapers were not able to publish photographs, and so the story, subsequently reprinted in dozens of newspapers and magazines throughout the North, had to rely on a detailed description of the children. The eldest boy, it said, was wearing a shirt made of the same fabric as his sister's dress. The younger boy in the middle was sitting on a chair, wearing a dark suit. It estimated their ages at 9, 7, and 5, only a year off the mark.

    One of the reprints appeared in the American Presbyterian, a church magazine. That is where Philinda Humiston, living in Portville, N.Y., first saw word of the ambrotype and the dead soldier. She hadn't heard from Amos since weeks before Gettysburg, and when she saw the description of the children, she feared the worst.

    But she couldn't be sure. So she contacted Bourns through a letter written by the town postmaster.

    Bourns had printed copy upon copy of the children's picture to respond to inquiries, but so far, none of the people who had contacted him had turned out to be the right family. He replied to Philinda's inquiry as he had to the others.

    And so it was that one mid-November day, four months after the battle, she opened the envelope from Philadelphia and knew for sure that she had been widowed for a second time, and that her children were fatherless.

    The story might have ended there if it weren't for another idea Bourns had. He believed he could capitalize on the outpouring of sympathy toward the Humistons to raise funds for an orphanage in Gettysburg, to house the children of fallen Union soldiers.

    And so a second publicity campaign began, appealing for donations.

    The Origins of the Great Seal (posted 7-9-03)

    Linda Hales, writing in the Washington Post (July 4, 2003):

    The Continental Congress named three of the best minds in the new country -- Benjamin Franklin, Thomas Jefferson and John Adams -- to devise an emblem for a free people with great aspirations. The imagery that now seems so obvious -- 13 stars and stripes and an American bald eagle carrying weapons of war but facing an olive branch -- required multiple committees, with consultants in tow, over six years. The deliberative process went on so long that the worst ideas were weeded out. An important and timely symbol -- the olive branch of peace -- survived.

    The results can be judged by glancing at the back of the $1 bill. The actual working seal, which resembles a large silver dollar, resides in a faded mahogany cabinet in a plexiglass cage in the State Department's Exhibition Hall. It is put to use almost weekly in the time-honored ritual of stamping presidential appointments and envelopes bearing ambassadorial credentials, plus the occasional peace treaty.

    "This is history," said Sharon L. Hardy, chief of the State Department's presidential appointments staff, as she executed a perfect seal on a document....

    As a coat of arms, the Great Seal decorates military uniform buttons and plaques over entrances at embassies and consulates abroad. It also served as the model for the Presidential Seal, but history records that the original designer of that later seal, in 1880, got one of the most important details wrong: the direction of the eagle's gaze. Instead of facing to its right -- in heraldry, the direction of honor -- the presidential bird was made to face left, toward the sinister side and the talon holding the arrows of war. In an ironic quirk of White House history, the error is preserved in a bronze seal that once graced the entrance to the mansion. According to the White House curator's office, the eagle now hangs over the entrance to the Diplomatic Reception Room on the ground floor of the residence.

    In the beginning, the design committee wandered through an intellectual forest of classical and biblical themes. At one point, Franklin and Jefferson are said to have favored a design with Moses crossing the Red Sea chased by the pharaoh. They consulted with a portrait artist, Pierre Eugene du Simitiere, and managed to wrap up their assignment in 46 days. But the proposal they submitted to the Continental Congress was an unwieldy amalgam of 13 shields and two heraldic figures. Congress dismissed it with an order to "lie on the table."

    The effort was not a total failure. The committee's motto, "E Pluribus Unum" ("Out of Many, One"), has survived for eternity. And an unfinished pyramid under the eye of Providence found its way onto the reverse side of the die, and to the dollar bill.

    Four years later, a second committee was named, and Francis Hopkinson, designer of the American flag, signed up. But Congress was not swayed by his proposal for an oversize red-white-and-blue shield flanked by two 18th-century figurines. Hopkinson deserves credit for adding the olive branch, which endures as a symbol of peace.

    The first eagle appeared two years later, when a third committee, working with a heraldry expert from Philadelphia, William Barton, produced a design with a small imperial bird. Again, the Continental Congress was underwhelmed.

    The following month, in June 1782, lawmakers turned to one of their own, Charles Thomson, secretary of the Congress. He was the official who would wield the seal. Thomson, neither an artist nor a designer, sifted through his predecessors' ideas before adding his own. He liked Barton's idea of a bird but decided the native American bald eagle was more appropriate. He kept the motto, a shield and the olive branch. After sketching a fresh concept, he asked for Barton's help. Their joint description was presented to the Continental Congress without a sketch.

    Essentially, it called for a red-white-and-blue shield floating in front of an eagle that was carrying a ribbon inscribed with a motto in its beak, an olive branch in one talon and 13 arrows in the other, under a constellation of 13 stars.

    Congress adopted the words June 20, 1782. The design sounds more complicated than it became in the hands of a skilled Philadelphia engraver. The name of the man who cast the first die in brass was not recorded, but his work was in use for nearly 60 years. The die is preserved at the National Archives, along with illustrations of preliminary committee designs.

    The current die, at least the sixth in a series, was engraved by Bailey Banks & Biddle based on a highly regarded, but short-lived version produced by Tiffany in 1885. Over time, the eagle has become more muscular, the olive leaves more numerous and the 13 arrows very finely tuned.

    "The arrows are my guide," Hardy said as she worked. "If the arrows are clear, I know that we have a good seal."

    The Great Seal has always portrayed the eagle facing toward the olive branch, in keeping with heraldic custom. How the presidential eagle came to face the arrows remains a mystery. But in 1945, Harry S. Truman gave the order to change it.

    A 1978 book, "The Eagle and the Shield," by the late State Department historian Richardson Dougall and Richard Patterson, includes an account from Clark Clifford, who served as a naval aide to Truman. Clifford reported that Truman considered the dropping of the atomic bomb to be so momentous that he wanted a symbolic reference incorporated into the seal.

    "What Truman did was to turn the eagle's head to face the olive branch," said Milton Gustafson, a National Archives expert who has the out-of-print book.

    Washingtonian George M. Elsey, the naval aide assigned to sketch a new presidential flag and seal for Franklin D. Roosevelt, remembers the story differently. In the winter of 1944, generals had just been offered an upgraded ranking of five stars, and Roosevelt thought the president's flag should rise from four to five stars, too. The flag's central element was the presidential seal.

    The project languished with FDR's death, but was resurrected by Truman. Elsey called on Arthur E. DuBois, the U.S. Army's heraldic expert. He credits DuBois with noticing the errant eagle, which had no basis in heraldic tradition. He also pointed out to Truman that the design had no basis in law.

    "DuBois was a purist," Elsey said. "Truman said okay."

    Presidential aides were left to explain the shift to the public in symbolic terms. Elsey still recalls the line written for the announcement in October 1945: "Truman is now changing from the arrows of war to the olive branch of peace."

    Although presidents have the power to redesign their seals, the nation's Great Seal is sacrosanct. Dies wear out. But there is no worry about what its replacement might look like. In 1986, the U.S. Bureau of Engraving and Printing made a master copy so that future seals will bear the elegant detail of a 1903-04 engraving.

    Wright Brothers Didn't Hail from North Carolina (posted 7-8-03)

    On a recent trip to Dayton, Ohio, President Bush took pains to point out the importance of the Wright Brothers in history. Why? Because the Wright Brothers were from Dayton. And as the NYT account explained, "Much to the annoyance of Dayton residents, there is a mistaken impression that the airplane was invented in Kitty Hawk, NC, because that is where Wilbur and Orville Wright first flew their plane, on Dec. 17, 1903."

    Ohio's Importance in the Presidential Sweepstakes (posted 7-8-03)

    NYT July 4, 2003:

    According to the Almanac of American Politics, no Republican has ever won the presidency without winning Ohio, where registered Republicans outnumbered Democrats by about 400,000 in the 2000 election. Mr. Bush, who is up for re-election in 2004, won Ohio in 2000 with 50 percent of the vote, compared to 46 percent for his opponent, Al Gore.

    Paul Revere's Ride: A Team Effort (posted 7-1-03)

    Rod Paschall, writing for thehistorynet (July 1, 2003):

    According to Paul Revere’s account of his historic 1775 ride, warning the countryside of the approach of the British was more a team effort than is generally realized.

    The enduring image of a lone Patriot nightrider rousing the countryside to arms has been burnished in American poems, books, and movies for two and a quarter centuries. The underlying message is always the same: A single brave man can make all the difference. In a letter written in 1798 to Massachusetts Historical Society founder Dr. Jeremy Belknap, Paul Revere described his actual adventures during his "Midnight Ride" of April 18-19, 1775.

    His mission was to warn of danger to Patriots outside Boston, particularly to two leaders who were opposing the government -- Samuel Adams and John Hancock. Revere began his account by recalling suspicious activities of British forces in Boston during the week preceding April 18. His original letter to Belknap is the property of the Massachusetts Historical Society.

    On Tuesday evening, the 18th, it was observed that a number of soldiers were marching towards the bottom of the Common. About 10 o’clock, Dr. Warren [Joseph Warren, one of the few Patriot leaders who had remained in Boston] sent in great haste for me and begged that I would immediately set off for Lexington, where Messrs. Hancock and Adams were, and acquaint them of the movement, and that it was thought they were the objects.

    When I got to Dr. Warren’s house, I found he had sent an express [fast messenger] by land to Lexington -- a Mr. William Daws [Dawes]. The Sunday before, by desire of Dr. Warren, I had been to Lexington, to Messrs. Hancock and Adams, who were at the Rev. Mr. Clark’s. I returned at night through Charlestown; there I agreed with a Colonel Conant [provincial militia veteran William Conant] and some other gentlemen that if the British went out by water, we would show two lanthorns [lanterns] in the North Church steeple; and if by land, one, as a signal; for we were apprehensive it would be difficult to cross the Charles River or get over Boston Neck. I left Dr. Warren, called upon a friend and desired him to make the signals.

    How Easy Would It Be for a PhD to Make an Atomic Bomb? (posted 6-25-03)

    Oliver Burkeman, writing in the Guardian (June 24, 2003):

    It's one of the burning questions of the moment: how easy would it be for a country with no nuclear expertise to build an A-bomb? Forty years ago, in a top-secret project, the US military set about finding out....

    Dave Dobson's past is not a secret. Not technically, anyway - not since the relevant US government intelligence documents were declassified and placed in the vaults of the National Security Archive, in Washington DC. But Dobson, now 65, is a modest man, and once he had discovered his vocation - teaching physics at Beloit College, in Wisconsin - he felt no need to drop dark hints about his earlier life. You could have taken any number of classes at Beloit with Professor Dobson, until his recent retirement, without having any reason to know that in his mid-20s, working entirely as an amateur and equipped with little more than a notebook and a library card, he designed a nuclear bomb. Today his experiences in 1964 - the year he was enlisted into a covert Pentagon operation known as the Nth Country Project - suddenly seem as terrifyingly relevant as ever. The question the project was designed to answer was a simple one: could a couple of non-experts, with brains but no access to classified research, crack the "nuclear secret"? In the aftermath of the Cuban missile crisis, panic had seeped into the arms debate. Only Britain, America, France and the Soviet Union had the bomb; the US military desperately hoped that if the instructions for building it could be kept secret, proliferation - to a fifth country, a sixth country, an "Nth country", hence the project's name - could be averted. Today, the fear is back: with al-Qaida resurgent, North Korea out of control, and nuclear rumours emanating from any number of "rogue states", we cling, at least, to the belief that not just anyone could figure out how to make an atom bomb. The trouble is that, 40 years ago, anyone did.

    The quest to discover whether an amateur was up to the task presented the US Army with the profoundly bizarre challenge of trying to find people with exactly the right lack of qualifications, recalls Bob Selden, who eventually became the other half of the two-man project. (Another early participant, David Pipkorn, soon left.) Both men had physics PhDs - the hypothetical Nth country would have access to those, it was assumed - but they had no nuclear expertise, let alone access to secret research.

    "It's a very strange story," says Selden, then a lowly 28-year-old soldier drafted into the army and wondering how to put his talents to use, when he received a message that Edward Teller, the father of the hydrogen bomb and the grumpy commanding figure in the US atomic programme, wanted to see him. "I went to DC and we spent an evening together. But he began to question me in great detail about the physics of making a nuclear weapon, and I didn't know anything. As the evening wore on, I knew less and less. I went away very, very discouraged. Two days later a call comes through: they want you to come to Livermore."

    Livermore was the Livermore Radiation Laboratory, a fabled army facility in California, and the place where Dave Dobson, in a similarly surreal fashion, was initiated into the project. The institution's head offered him a job. The work would be "interesting", he promised, but he couldn't say more until Dobson had the required security clearance. And he couldn't get the clearance unless he accepted the job. He only learned afterwards what he was expected to do. "My first thought," he says today, with characteristic understatement, "was, 'Oh, my. That sounds like a bit of a challenge.'"

    They would be working in a murky limbo between the world of military secrets and the public domain. They would have an office at Livermore, but no access to its warrens of restricted offices and corridors; they would be banned from consulting classified research but, on the other hand, anything they produced - diagrams in sketchbooks, notes on the backs of envelopes - would be automatically top secret. And since the bomb that they were designing wouldn't, of course, actually be built and detonated, they would have to follow an arcane, precisely choreographed ritual for having their work tested as they went along. They were to explain at length, on paper, what part of their developing design they wanted to test, and they would pass it, through an assigned lab worker, into Livermore's restricted world. Days later, the results would come back - though whether as the result of real tests or hypothetical calculations, they would never know.

    "The goal of the participants should be to design an explosive with a militarily significant yield," read the "operating rules", unearthed by the nuclear historian Dan Stober in a recent study of the project published in the Bulletin of the Atomic Sciences. "A working context for the experiment might be that the participants have been asked to design a nuclear explosive which, if built in small numbers, would give a small nation a significant effect on their foreign relations." ...

    Eventually, towards the end of 1966, two and a half years after they began, they were finished. "We produced a short document that described precisely, in engineering terms, what we proposed to build and what materials were involved," says Selden. "The whole works, in great detail, so that this thing could have been made by Joe's Machine Shop downtown."

    Agonisingly, though, at the moment they believed they had triumphed, Dobson and Selden were kept in the dark about whether they had succeeded. Instead, for two weeks, the army put them on the lecture circuit, touring them around the upper echelons of Washington, presenting them for cross-questioning at defence and scientific agencies. Their questioners, people with the highest levels of security clearance, were instructed not to ask questions that would reveal secret information. They fell into two camps, Selden says: "One had been holding on to the hope that designing a bomb would be very difficult. The other argued that it was essentially trivial - that a high-school science student could do it in their garage." If the two physics postdocs had pulled it off, their result, it seemed, would fall somewhere between the two - "a straightforward technical problem, but one that involves some rather sophisticated physics". ...

    Einstein was famously said to have commented that if he had only known that his theories would lead to the development of the atom bomb, he would have been a locksmith. Dave Dobson, having designed one, got a job as a teacher.

    Do You Know the Story Behind the Nina, the Pinta and the Santa Maria? (posted 6-23-03)

    Alfred Van Peteghem, writing in the Montreal Gazette (June 23, 2003):

    Everybody remembers the names of Columbus's ships were the Nina, the Pinta and the Santa Maria, for example, but few know that before the church censored the names, his rotting caravels were called the Nina (the girl, or rather, the "working" girl); the Pintada (or the "painted" one; i.e. the girl wearing make-up, in other words, the prostitute) and the Maria Galante (the surname of another "lady of leisure").

    In other words, we never really learned historical characters were just that - characters, and very colourful ones at that. We might learn of their greatness as an edifying example, but we almost never learned of their faults, foibles and oddities.

    It's only mentioned in passing, for example, that when Champlain married Helene Boulle (after whom St. Helen's Island is named), she was 11 years old and he was over 50.

    And you'd never know from high-school history books "the first truly Canadian hero," Pierre Le Moyne d'Iberville, was first and foremost a man who believed "the end justified the means." His career as a fighting man began just as the guardians of one Jeanne-Genevieve Picote de Belestre brought a paternity suit against him, indicating his heroics were not confined to the battlefield. When he died suddenly in Havana in 1706, he'd been trying to dispose of iron ore he'd taken from France for the purpose of illicit trade with the Spaniards. What better proof of this great man's belief in the spirit of free enterprise?

    Madeleine de Vercheres is well known for fighting off Indian attacks. Not so well known, however, is that she spent far more time fighting her neighbours in court. Archivist Andre Vachon wrote she and her husband had bad tempers and that they threatened their tenants and even beat them up.

    In 1730, their parish priest took the couple to court, claiming Madeleine had accused him of having composed burlesque litanies full of impious, obscene and defamatory terms.

    The priest lost his case, but appealed to the Superior Council and won. In 1732, Madeleine went to France to plead her case before the King's Council, but was rejected. Finally, the matter was settled amicably in 1733; both parties were ordered to pay their costs and to refrain from talking about the matter.

    One of my favourite heroes was Martine Messier, who had the odd nickname of "Parmanda" - or "I swear" in her dialect. She earned it in 1652 for something she said after she'd fought off three Iroquois who attacked her about 100 yards outside Montreal's city walls.

    "The woman defended herself like a lioness," wrote Montreal's parish priest and first historian, Dollier de Casson, "but as she had no weapons but hands and feet, at the third or fourth blow they felled her as if dead. Immediately one of the Iroquois flung himself upon her to scalp her and escape with this shameful trophy. But as our amazon felt herself so seized, she at once recovered her senses, raised herself and, more fierce than ever, caught hold of this monster so forcibly by a place which modesty forbids us to mention that he could not free himself. He beat her with his hatchet over the head, but she maintained her hold steadily until once again she fell unconscious to the earth, and so allowed this Iroquois to flee as fast as he could, that being all he thought of at the moment, for he was nearly caught by our Frenchmen, who were racing to the spot from all directions.

    "In addition this episode was followed by a most amusing thing. When these Frenchmen who came to her help had lifted up this woman, one of them embraced her in token of compassion and affection. But she, coming round, and feeling herself embraced, delivered a heavy blow to this warm-hearted helper, which made the others say to her: 'What are you doing? This man but wished to show his friendly feeling for you with no thought of evil, why do you hit him?'

    'Parmanda,' she answered, 'I thought he wanted to kiss me.'" (The French text reads: "Je croyois qu'il me vouloit baiser," which is not entirely covered, in my opinion, by Ralph Flenley's translation.)

    How Much Do States and Localities Spend? (posted 6-23-03)

    Paul Overberg, writing in USA Today (June 23, 2003):

    Economic historian Richard Vedder says the public has signaled how much government it wants. State and local spending has accounted for roughly 11% or 12% of gross domestic product for the past 30 years.

    "When we get to the top of the range, there's a tax revolt," says Vedder, of Ohio University. "When we get to the bottom, there's a push for more spending. We revert back to the middle."

    Last year, state and local spending reached 13% of gross domestic product, the highest since record-keeping began in 1929.

    That could indicate that states will now emphasize spending cuts more than tax increases. Or it could represent a fundamental shift in what people expect from state and local governments. Similar changes have occurred twice before: in the Depression and during the 1960s, when new social programs added to state and local spending.

    Over the next few years, states will determine whether a new era has begun: Will citizens pay permanently higher taxes in exchange for better-funded public schools and health care? Or will they demand that taxes return to a more comfortable level?

    The narrow question is: Tax increases or spending cuts? The broader question is: What do citizens want from government? In the boom, governors and legislatures gave people what they wanted. In the downturn, people will decide whether to pay for it.


    What War Since World War II Has Been the Deadliest? (posted 6-20-03)

    From the Atlantic Monthly (July/August, 2003):

    What conflict has taken more lives than any other since World War II? Don't look to Asia, the Balkans, the Middle East, or even Rwanda for the answer. According to a recent mortality study released by the International Rescue Committee, the record breaker -- by far -- is the ongoing and under-reported war in the Democratic Republic of Congo (formerly Zaire).

    The IRC estimates that since the conflict began, in 1998, some 3.3 million "excess deaths" have occurred -- that is, deaths from combat and from "easily treatable diseases and malnutrition, linked to displacement and the collapse of much of the country's health system and economy." (The rate of deaths in the second category rises and falls proportionally with the rate in the first.) Young children in particular have suffered: in three of the ten zones described in the IRC report, more than half of all children born since the conflict began have died by the age of two. As a result of the conflict the DRC now has a mortality rate of at least 2.2 people per thousand per month -- the highest in the world, according to UN figures, and twice the African average.

    [Source:] "Mortality in the Democratic Republic of Congo: Results From a Nationwide Survey," International Rescue Committee


    Is the American Economy Really the Best Off in the World? (posted 6-13-03)

    Philippe Legrain, chief economist of "Britain in Europe," the campaign for Britain to join the euro, writing in the New Republic (June 11, 2003):

    Pause for a second. Allow some awkward facts to intrude. Which economy has performed better in recent years--Europe's or America's? Surprise: According to the International Monetary Fund, an institution more often accused of imposing Washington's ways than of knocking them, Europe's has. Over the past three years, living standards, as measured by GDP per person, have risen by 5.8 percent in the European Union but by only 1 percent in the United States. An unfair comparison, perhaps, given America's recent recession? Then look at how the European Union and the United States size up since 1995, a period that includes the go-go late '90s, when America apparently advanced by leaps and bounds. While living standards in the United States have risen by a healthy 16.1 percent over the past eight years, they are up by 18.3 percent in the European Union. Another statistical sleight of hand? Not at all. Pick any year between 1995 and 2000 as your starting point, and the conclusion is the same: Europe's economy has outperformed America's.

    To be fair, on a different measure, the United States has outpaced Europe. Its economy has grown by an average of 3.2 percent per year since 1995, whereas Europe's economy has swelled by only 2.3 percent. These headline figures transfix pundits and policymakers alike. But this apparent success is deceptive. Not only are U.S. growth figures inflated because American number-crunchers have done more than their European counterparts to take into account improvements in the quality of goods and services, but America's population is also growing much faster than Europe's. It has increased by nearly one-tenth in the past eight years, whereas Europe's population has scarcely grown at all. So, although America's pie is growing faster than Europe's, so too is the number of mouths it has to feed. Most people, though, care about higher living standards, not higher economic growth. If size were all that mattered, the United States could simply annex Canada and, presto, its economy would be larger, whether people in Peoria felt any better or not.

    U.S. economic triumphalism is based on more than just GDP growth, of course. Boosters claim that it has enjoyed markedly faster productivity growth, too. Really? It is tough enough to measure how fast productivity is growing in the United States--remember all those wrangles about whether the step up in productivity in the late '90s was a giant leap, a modest bounce, or an illusion. International comparisons are harder still. Even so, the Conference Board, a New York-based business-research group that is hardly a fan of European ways, has taken a stab at it. Their figures show that, although the average U.S. labor-productivity growth of 1.9 percent per year since 1995 exceeds the EU average of 1.3 percent, five individual European countries have done better than the United States. Belgium managed 2.2 percent per year, Austria 2.4 percent, Finland 2.6 percent, Greece 3.2 percent, and Ireland 5.1 percent. If you take a longer time span, 1990 to 2002, not only does the European Union as a whole outpace the United States, so do ten of the 14 individual EU member states for which statistics are available. (The Conference Board does not include figures for Luxembourg.)

    Not only is productivity growth higher in several European countries than in the United States--so too are absolute productivity levels. The average American produces $38.83 of output per hour, measured in 1999 dollars, according to the Conference Board. Average productivity in the European Union is still 8 percent less, largely because of lower productivity in Britain, Spain, Greece, and Portugal--although the gap has closed over the past decade. But six European countries have overtaken the United States: Germany, the Netherlands, Ireland, France, Belgium, and Norway, where output per hour is $45.55, over one-sixth higher than in the United States.


    Just How Badly in Debt is the United States? (posted 6-9-03)

    Economist Dean Baker, commenting on an article in the Washington Post (June 9, 2003):

    This article reports on a new set of budget projections from the Democrats on the House Budget Committee. It refers to a projected deficit for 2003 of $416 billion as a new record, exceeding the $290 billion deficit of 1992 even after adjusting for inflation. The more relevant measure is the deficit-to-GDP ratio, which measures the economic impact of the deficit. By this measure, the post-World War II record was 6.0 percent of GDP in 1983, with the 2003 deficit coming in at approximately 3.9 percent of GDP. The present deficit would be closer to the record if one used the "on-budget" deficit, which excludes the Social Security surplus. This deficit in 2003 will be approximately $600 billion, or 5.6 percent of GDP.

    The article goes on to discuss projections for the national debt. It reports that the publicly held debt is projected to reach $7.9 trillion, or 44 percent of GDP in 2013. For purposes of assessing the nation's overall debt burden, the total federal debt (including the debt owned by the Social Security trust fund) would probably be more appropriate. This is projected to be close to $12 trillion in 2013, just under 69 percent of GDP.
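
    To make the ratio comparison concrete, here is a minimal sketch in Python of the deficit-to-GDP arithmetic Baker describes. The GDP figure of roughly $10.7 trillion is not given in the article; it is inferred from the $416 billion / 3.9 percent pairing above, so treat it as an assumption.

    # A rough check of the deficit-to-GDP percentages cited above.
    # GDP of ~$10.7 trillion (2003) is an assumption inferred from the
    # article's pairing of $416 billion with ~3.9 percent; it is not a
    # figure the article states directly.
    def deficit_share(deficit_billions, gdp_billions):
        """Return the deficit as a percentage of GDP."""
        return 100.0 * deficit_billions / gdp_billions

    GDP_2003 = 10_700  # billions of dollars (assumed)

    print(round(deficit_share(416, GDP_2003), 1))  # ~3.9, the unified deficit
    print(round(deficit_share(600, GDP_2003), 1))  # ~5.6, the "on-budget" deficit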


    What Kind of Gas Mileage Did the Model T Get? (posted 6-6-03)

    From Reuters (June 4, 2003):

    The Sierra Club, a leading U.S. environmentalist group, plans to run advertisements criticizing Ford Motor Co. for making vehicles that are less fuel-efficient now -- on its 100th birthday -- than when it began.

    The ads, scheduled to run in The New York Times and BusinessWeek, note that the Model T got 25 miles to the gallon nearly a century ago. The headline reads, "1903-2003 A Century of Innovation ... except at Ford." Dearborn, Michigan-based Ford's average vehicle now gets 22.6 miles per gallon, with its popular Explorer sports utility vehicle getting 16 miles per gallon, according to the Sierra Club ad.

    Ford, which will observe its 100th birthday on June 16, countered that it has three models that are best in class for fuel economy and another three that produce almost no emissions. The company also said its Ford Focus has almost half the tail-pipe emissions level of the Model T.


    The Gay Betsy Ross (posted 5-30-03)

    Steven W. Anderson, writing in PlanetOut.com, about the origins of the gay Rainbow Flag:

    Color has long played an important role in our community's expression of pride. In Victorian England, for example, the color green was associated with homosexuality. The color purple (or, more accurately, lavender) became popularized as a symbol for pride in the late 1960s -- a frequent post-Stonewall catchword for the gay community was "Purple Power." And, of course, there's the pink triangle. Although it was first used in Nazi Germany to identify gay males in concentration camps, the pink triangle only received widespread use as a gay pop icon in the early 1980s. But the most colorful of our symbols is the Rainbow Flag, and its rainbow of colors -- red, orange, yellow, green, blue and purple -- represents the diversity of our community.

    The first Rainbow Flag was designed in 1978 by Gilbert Baker, a San Francisco artist, who created the flag in response to a local activist's call for a community symbol. (This was before the pink triangle was popularly used as a symbol of pride.) Using the five-striped "Flag of the Race" as his inspiration, Baker designed a flag with eight stripes: pink, red, orange, yellow, green, blue, indigo and violet. According to Baker, those colors represented, respectively: sexuality, life, healing, sun, nature, art, harmony and spirit. In the true spirit of Betsy Ross, Baker dyed and sewed the material for the first flag himself.

    Baker soon approached San Francisco's Paramount Flag Company about mass producing and selling his "gay flag." Unfortunately, Baker had hand-dyed all the colors, and since the color "hot pink" was not commercially available, mass production of his eight-striped version became impossible. The flag was thus reduced to seven stripes.

    In November 1978, San Francisco's gay community was stunned when the city's first openly gay supervisor, Harvey Milk, was assassinated. Wishing to demonstrate the gay community's strength and solidarity in the aftermath of this tragedy, the 1979 Pride Parade Committee decided to use Baker's flag. The committee eliminated the indigo stripe so they could divide the colors evenly along the parade route -- three colors on one side of the street and three on the other. Soon the six colors were incorporated into a six-striped version that became popularized and that, today, is recognized by the International Congress of Flag Makers.

     


    How Did St. Petersburg Come into Existence? (posted 5-29-03)

    Neal Ascherson, writing in the Independent (London) (May 29, 2003):

    Most cities have a reason. St Petersburg only has a cause. Most capital cities - and this one was Russia's capital for over two centuries - grew up around a ford on a river, or a steep crag easy to defend. But St Petersburg did not grow up round anything. Instead, it was created by a sudden bayonet-thrust of will.

    The will was Peter's. On 16 May (in the Old Style calendar, 27 May in the new) 1703, he snatched a bayonet from one of his soldiers and made two cross-shaped cuts in the soggy turf of an island in the middle of the Neva river. "Here a city begins!" he is supposed to have said, but probably didn't. What he wanted, at this point, was a fort and then a harbour. It was later that he decided on a town as well.

    Why here? There was nothing to be seen but a huge, shallow, racing river, a flock of low islands, an endless scrubby pine forest on both banks. There were no people to speak of, only a handful of Finnish-speaking fishermen and some Swedish prisoners that he had just captured. But Peter the Great said "Here!" because this was where he was that day. In his war against the Swedes, he had reached the banks of the Neva near its outfall into the Baltic. He looked happily at the wide waters, breathed in the moist air, and said: "Now!" If he had waited a bit, he could have captured the ancient port-city of Riga, a few hundred miles to the west, whose harbour stays ice-free for much longer than the Neva. But Peter was not a man for waiting.

    So, the foundations for what would become the Peter-Paul fortress were dug. The Tsar lived in a log-cabin on the island, still preserved. Soon he moved his naval shipyard down from Lake Ladoga and built a naval base on the south bank of the river, the "Admiralty". Then, Peter wanted proper permanent stone buildings, including a cathedral, and he brought in the first of the foreign architects who designed St Petersburg over the centuries. Domenico Trezzini set to work, in the Dutch-baroque manner. The Peter-Paul cathedral began to rise, with its 400ft spire (more of a spike dipped in blinding gold). Other impressive buildings followed. Peter sent for his family and then for the court nobility from Moscow. They were ordered to settle in, and pay for the construction of their own mansions.


    Does the Stock Market Affect Presidential Elections (Or At Least Reflect Economic Conditions that Decide Elections)? (posted 5-1-03)

    Stephen Moore, president of the Club for Growth, writing in the Weekly Standard (May 5, 2003):

    Not long ago, I sat through a Ted Kennedy slideshow presentation on the economy. It was depressingly persuasive. To summarize a 20-minute talk in two sentences: Under Clinton, the budget deficit and unemployment went way down, while the GDP, jobs, and the stock market soared upward. Under Bush, the deficit and unemployment went up, while the GDP, jobs, and the stock market went down. The Democrats are preparing to Herbert Hooverize George W., and they've got a lot of ammunition to do it with.

    In particular, if the stock market doesn't recover soon, Bush will be running headlong against history in his reelection bid. Since Bush was inaugurated in January 2001, the Dow Jones has fallen 20 percent and the Nasdaq has tumbled 45 percent--though the mini-rally since the end of the Iraq war is helping to reverse these declines. Still, the stock market collapse has led to a liquidation of $5 trillion in wealth--some of which has been absorbed by foreigners, but most of it by American shareholders. These losses are bigger than the GDP of virtually every country in the world.

    So I got to wondering how many times in the last 100 years a president has been reelected when the stock market fell during his first term, as it has under George W. Bush. Not once has this happened. Twice a president came up for reelection after a term in which the stock market fell, and both incumbents got the boot. They were Herbert Hoover and Jimmy Carter--not the kind of company Bush wants to keep.

     


    How Did Arlington National Cemetery Come into Being? (posted 4-30-03)

    Sue Anne Pressley, writing in the Washington Post (April 27, 2003):

    Arlington was designated for the war dead. On June 15, 1864, when it must have seemed as if the deaths from the Civil War would never stop coming, President Abraham Lincoln's secretary of war named the original 200 acres as a cemetery for the military. Creation of the cemetery was also a slap in the face of Confederate Gen. Robert E. Lee, who had lived there before the war; the federal government had seized the property. In time, the cemetery grew to the present 625 acres. Soldiers from the American Revolution and the War of 1812 were reburied there about 1900. And today, any grieving family can be reassured that the standard-honors ceremony is little different from that given a soldier of World War I -- a uniformed casket team, a team that fires off a volley of shots, the flag over the coffin folded and presented to the family, the bugler playing taps. There is no cost for the ceremony, the grave site, the tombstone or maintenance.

    Over the years, famous Americans have been interred there: North Pole explorers Robert E. Peary and Matthew A. Henson; World War II Medal of Honor winner Audie Murphy; bandleader Glenn Miller; heavyweight boxing champion Joe Louis. Two of the most popular tourist sites at the cemetery are the Tomb of the Unknowns and the grave of President John F. Kennedy. Kennedy and William Howard Taft are the only presidents buried there.


    Is It Iraq or Irak? (And Is It E-raq or Eye-raq?) (posted 4-29-03)

    Juan Cole, writing in H-Diplo (April 23, 2003):

    Transliteration is an attempt to represent in one alphabet the notations of another. It is always inexact. In Arabic al-`Iraq is spelled with an `ayn and a short "i" (it is closer to eerak than to eye-rak) at the beginning. The last letter is a qaf, which is like a "k" but made deeper in the throat and with more of a click. (Arabic also has a "k" similar to that in English.) Nineteenth-century Orientalists represented the qaf as a K with a dot underneath. General sources would then spell it Irak without the dot.

    The more recent trend is to use a "q" instead. I believe the standard English orthography is now Iraq. In French it remains Irak, which may account for some of the confusion.

    The root `araqa has to do with perspiring, being well-watered or deeply rooted, which is probably the semantic field suggested to desert Arabs by the sight of the Tigris and Euphrates valley.


    When the U.S. Government Tested Chemicals on American Citizens (posted 4-28-03)

    Kevin Ogle, writing for KFOR.com (April 25, 2003):

    Earlier this year news that the army would conduct bio-terrorism tests in central Oklahoma sent a near panic through some communities.

    Clouds of clay dust and other substances were dropped to see if weather radar could detect a bio-terrorist attack. The army was up front and told those concerned what they were doing and promised there was nothing to worry about.

    But it hasn't always been that way when the army was testing the atmosphere.

    Oklahoma City and a local soldier endured a secret exposure.

    It was in the 1950s and America was at the height of the Cold War.

    The United States and the Soviet Union were locked in a deadly race to produce the most powerful nuclear bombs. A cartoon turtle taught the children what to do in case the Soviets attacked.

    Air raid drills were staged in every classroom and city across America. Little did Americans know they were already under attack -- by America.

    "The human populations didn't know, the local governments didn't know, this was a secret army project that went on for 20 years," said author Leonard Cole.

    The U.S. government was preparing for germ warfare by secretly spraying biological agents on its own citizens. The tests were conducted in 239 cities, including one of Oklahoma's most prominent communities.

    "Among the hundreds and hundreds of tests that the army did, Stillwater, Oklahoma was targeted," said Cole, an expert on the Army's development of biological weapons. In some cities reports indicate Americans actually died because of the testing.

    Government records show fluorescent particles of zinc cadmium sulfide were released in Stillwater in 1962.

    "Cadmium itself is known to be one of the most highly toxic materials in small amounts that a human can be exposed to," Cole said.

    Could Oklahomans have been made sick by that all those years ago? Could they still have lingering effects from it?

    "If there were concentrations of it enough to make one sick, you could have serious consequences a person over a period of time could have illnesses that could range from cancer to organ failures," Cole said.

    There was no medical monitoring of the population exposed to the particles, and Payne County health officials have no records to show the effect, if any, on the people in the Stillwater area.

    But different secret exposure tests would forever change the lives of other Oklahomans.

    Arnold Parks of Oklahoma City loves to work in his yard. But he does it on painful legs and with aching arms, not to mention a bad heart. In 1965 Arnold was in the army when he was told he was going to be a test subject for some new medications.

    But when he recently was given access to his medical records from 1965 he was stunned to learn those "medications" were anything but.

    "And it states right in there on this date they gave me VX, on this date they gave me Sarin, on this date they gave me LSD," Parks said. "I was angry. As a matter of fact, I came unglued."

    It hasn't been medically linked yet, but Parks now believes the small doses of the nerve agents Sarin and VX have affected his arms, legs and heart over the years.


    Did State Spending in the 1990s Get Out of Control? (posted 4-28-03)

    Economist Dean Baker, commenting on an article by Timothy Egan in the NYT, "States, Facing Budget Shortfalls, Cut the Major and the Mundane" (April 21, 2003):

    This informative article examines some of the cutbacks that have been instituted around the country as state and local governments have attempted to cope with massive budget shortfalls. At one point the article cites a study from the Cato Institute which attributes the budget crises to excess spending. According to the article, the study claims that state spending rose an average of 5.7 percent annually between 1990 and 2001, which it describes as "nearly double the inflation rate." The more obvious basis of comparison is the rate of economic growth. Other things equal, we would expect state spending to grow at roughly the same rate as the economy, implying that the states' share of economic output is neither increasing nor decreasing. From 1990 to 2001, the economy grew at an average annual rate of 5.2 percent, only slightly less rapidly than the growth of state spending.
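
    As a rough check on Baker's comparison, the sketch below (Python) compounds the two growth rates quoted above over the 1990-2001 period; the 11-year horizon and the illustrative starting share are the only added assumptions. The implied rise in the states' share of the economy works out to only about 5 percent in relative terms.

    # Compound the growth rates cited above over 1990-2001 (11 years)
    # to see how far state spending would drift relative to the economy.
    YEARS = 11
    spending_growth = 1.057  # 5.7 percent per year (Cato figure cited above)
    economy_growth = 1.052   # 5.2 percent per year (Baker's comparison)

    relative_shift = (spending_growth / economy_growth) ** YEARS
    print(round(relative_shift, 3))  # ~1.054: the spending share rises roughly 5%

    # Illustration: a share of 11% of GDP in 1990 would drift to about
    print(round(11 * relative_shift, 1))  # ~11.6% of GDP by 2001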


    Why Do Soldiers Shout HOO-AH? (posted 4-25-03)

    Steve Chawkins, writing in the LA Times (April 15, 2003):

    Military talk in the last few weeks has run to bunker-busters and daisy-cutters -- and, from the Beltway to Baghdad, a heck of a lot of hoo-ah.

    Or, more properly: HOO-AH!

    That's the all-purpose exclamation, affirmation and declaration of pride that started in the Army but has since made its way into the Air Force, and on occasion has even augmented the Navy's ancient aye-aye. The Marines have their own chest-thumping version -- OOH-RAH! -- but they'll tell you that an ooh-rah is no more to be confused with a hoo-ah than a caisson is with a quesadilla.

    Where these joyful noises come from nobody knows exactly. Theories run the gamut, from a toast in the Indian wars of the 1840s to an abridged version of "heard, understood and acknowledged," courtesy of eager acronym spinners in the U.S. Department of Defense. All that's really known is that, for years, the expression was barely heard outside of military bases.

    Then in 1992 came the famous volley of hoo-ahs from retired Army Lt. Col. Frank Slade, the blind, alcoholic, tango-dancing officer portrayed by Al Pacino in "Scent of a Woman." And now, thanks to nonstop coverage of the war in Iraq, it's all over the place.

    "It started out as kind of an exclamation point, and that was just fine," said retired Brigadier Gen. Creighton Abrams, director of the Army Historical Foundation in Arlington, Va. "Then it became something almost perfunctory, as in saying 'Hoo-ah!' instead of saying goodbye. Unfortunately, it's become a bit much."

    In San Diego, Marine Sgt. William Dullard recalled the thrill of his first ooh-rah, when he and his platoon graduated boot camp.

    "Our C.O. dismissed us, we did an about-face and everyone screamed, 'Ooh-rah!' " he said. "It was like a movie moment."

    As administrative chief of the drill instructors school at the Marine Corps Recruit Depot, he has since belted out, and gratefully received, his share of ooh-rahs. His enthusiasm for the expression is such that he even ooh-rahs at home: "When we got approved for a loan, it was, 'Wow! Ooh-rah!' "

    Like an Army entrenching tool, the expression has a multitude of uses. It means: "Yes, sir. I'm ready to do the job. Good to go." And: "Congratulations!" And: "Absolutely, I agree." And: "Howdy!" And: "Let's go cover ourselves with glory!" And even: "Have a nice day!"

    John Haire, a spokesman for the Air Force Flight Test Center at Edwards Air Force Base, can distinguish between front-line hoo-ahs and supply troop hoo-ahs, between the Army's rendition and the "fuller vowels" of the Air Force version.

    "It can be used the way the French use 'non' at the end of a sentence to mean 'Do you understand?' " said Haire, an Army veteran as well as a former Navy reservist. "After you've briefed somebody on something, you might ask, 'Hoo-ah?' "

    Where Does the Word Quarantine Come From? (posted 4-22-03)

    According to Abraham Verghese, writing in the NYT Magazine (April 20, 2003):

    ''Quarantine'' is a loaded word with metaphorical implications that many of us in America have fortunately forgotten. As the medical historian Howard Markel has noted, it derives from the Italian words quarantina and quaranta giorni, referring to the 40 days ships suspected of carrying plague were made to wait in the port of Venice before discharging their cargo. In America, the Federal Quarantine Legislation was passed in 1878, prompted by outbreaks of yellow fever. The power of quarantine was used vigorously, cruelly and arbitrarily during the New York City epidemics of typhus and cholera in 1892. (About 1,150 healthy people, mostly Russian Jews, were quarantined on North Brother Island, for example.) Fear of catching disease from immigrants (as well as fears of losing scarce jobs to them) caused citizens to rally against immigrants and immigration. Indeed, it is not far-fetched to think of race-based immigration bans as the ultimate form of quarantine.

    "Render to Caesar": What Jesus Meant (posted 4-22-03)

    Richard N. Ostling, writing for the AP (April 21, 2003):

    With Americans recovering from the annual angst of filing tax returns with the IRS, it's an apt season to reflect on history's most famous directive to taxpayers:

    "Render to Caesar the things that are Caesar's, and to God the things that are God's."

    Jesus' words were memorable enough to be included in three of the four Gospels (Mark 12:17, Matthew 22:21, Luke 20:25).

    But modern readers often distort what Jesus meant, according to David T. Ball, who holds degrees in both law and theology. Ball teaches at the Methodist Theological School in Ohio and works at the Ohio Legal Assistance Foundation to provide free legal clinics.

    Writing in Bible Review magazine, Ball recounts the context of Jesus' saying.

    Jesus is confronted by opponents hoping to trap him with his own words. For that purpose, they ask whether it is "lawful" to pay taxes to Caesar and the Roman regime that occupied Israel.

    That puts Jesus on the spot.

    If he says no, he will be guilty of civil disobedience or even revolt against Rome, which is a criminal offense, Ball explains.

    But if he says yes, that expresses unpopular submission to foreign domination. Also Jews might implicate Jesus in the sin of idolatry, because Rome's emperor is worshipped as divine. That violates the first commandment, "You shall have no other gods before me."

    The problem is exemplified by the silver denarius, the most common coin then in use. It is probably what Jesus asks someone to give him as the incident continues.

    In Jesus' day a denarius showed the profile of the Emperor Tiberius with a Latin legend meaning "Tiberius Caesar, son of the Divine Augustus." So this token of idolatry was found in the pockets of most Jews.

    After a question-and-answer session about the coin, Jesus utters the famous "render" saying.

    "It might seem like Jesus is telling us that we should cheerfully fulfill our annual obligation to the IRS," Ball writes, or more generally is advocating the fulfillment of civic obligations.

    Others have said the words mean citizenship exerts claims that stand apart from religious responsibilities, or that religion and politics should be kept separate, or even that Jesus forbids any kind of civil disobedience, including protests against governments that are unjust.

    Ball contends that such sweeping political interpretations are mistaken and result from modern readers' frequent tendency to ignore the second half of the saying. If that's all that Jesus meant, Ball contends, he would simply have talked about rendering to Caesar without the punch line about God.

    Jesus cleverly evades his opponents' trap and shifts the focus toward what people should "render" to God.

    To Ball, the key clue comes when Jesus refers to the image of Caesar on the coin and asks onlookers "whose likeness and inscription is this?"

    Here Jesus was practicing a classic form of byplay among first-century rabbis known as forensic interrogation, answering a question with a counter-question. Typically the challenge would include a specific reference to the Hebrew Scriptures.

    Ball believes "likeness" was meant to recall Genesis 1:26, which says human beings bear the "image" and "likeness" of God. And he thinks "inscription" referred to Exodus 13:9, which says God's law is inscribed as a sign.

    (Ball objects to modern translations that obscure these Old Testament links. For instance, the New Revised Standard Version says "head" and "title." Other versions use terms including "portrait," "face," "picture" or "name.")

    In Ball's interpretation, Jesus doesn't specify the precise nature of a Christian's duty regarding taxpaying or other civil obligations. But he suggests that "Christians shouldn't respond to civil issues without considering, first and foremost, their religious duty in the matter."

    So Jesus' rejoinder means that "one may owe taxes to Caesar, but one owes one's very being to God," so that all obligations must be understood in the context of responsibilities to God. Religious obligations are supreme.

    In terms of what is owed to God, "we are all in the very highest tax bracket."

    What's a Cakewalk? (posted 4-21-03)

    Brendan I. Koerner, writing in Slate (April 3, 2003):

    Defense Policy Board member Kenneth Adelman has taken heat for predicting that the war would be a "cakewalk" for the United States. Where does the term cakewalk come from, and why is it synonymous with "easy"?

    The cakewalk was originally a 19th-century dance, invented by African-Americans in the antebellum South. It was intended to satirize the stiff ballroom promenades of white plantation owners, who favored the rigidly formal dances of European high-society. Cakewalking slaves lampooned these stuffy moves by over-accentuating their high kicks, bows, and imaginary hat doffings, mixing the cartoonish gestures together with traditional African steps. Likely unaware of the dance's derisive roots, the whites often invited their slaves to participate in Sunday contests, to determine which dancers were most elegant and inventive. The winners would receive cake slices, a prize which gave birth to the dance's familiar name.

    After Emancipation, the contest tradition continued in black communities; the Oxford English Dictionary dates the widespread adoption of "cakewalk" to the late 1870s. It was around this time that the cakewalk came to mean "easy"—not because the dance was particularly simple to do but rather because of its languid pace and association with weekend leisure.

    The cakewalk's fame eventually spread northward, and it became a nationwide fad during the 1890s. Legendary performers Charles Johnson and Dora Dean were the dance's great popularizers, and cakewalk contests were a staple of Manhattan nightlife around the turn of the century, for whites as well as blacks. Early ragtime songs, with their trademark syncopated beats and brassy sounds, were often known as cakewalk music.

    How Does the Bush Tax Cut Compare with Reagan's and Kennedy's? (posted 4-18-03)

    U.S. Newswire (April 14, 2003):

    Congress's decision last week to pare back the President's $726 billion tax cut ignores economic as well as political history: that's the conclusion of a study released today from the non-partisan National Taxpayers Union (NTU). By virtually any fiscal measure, Bush's original package, even when added to the cut signed into law two years ago, is more moderate than reductions proposed by either John F. Kennedy or Ronald Reagan.

    "Self-professed deficit hawks in the House and Senate have forged a twin-scenario tax-cut compromise that's been called an historic first in Congressional budgeting, but history actually has more to teach us about tax cuts in times of economic and national security challenges," said NTU Director of Government Affairs and study author Paul Gessing. "A careful analysis of government data shows President Bush's tax cuts -- not to mention the paltry counter-proposals from Congress -- are modest compared to those of his predecessors." Among the findings:

    -- As a share of Gross Domestic Product (GDP-the nation's economic output), Bush's 2003 plan would reduce taxes by an annual average of 0.44 percent over its lifespan. In contrast, John F. Kennedy proposed to slash taxes by 2.0 percent of GDP, and Ronald Reagan by 3.3 percent.

    -- Even when taken together, the 2001 cut law and Bush's current proposal amount to less than those of former White House occupants. Bush's combined 2001 and 2003 cuts are still less than half the size of Reagan's 1981 cuts (3.3 percent vs. 1.6 percent) and are significantly smaller than Kennedy's tax cuts (again, 2.0 percent of GDP).

    -- Similar results occur when comparing Bush's combined tax cuts to the Kennedy and Reagan initiatives against projected federal revenues. By this measure, Bush's cuts are about two-thirds the size of Kennedy's (8.95 percent vs. 12.6 percent), and less than half the size of Reagan's (18.7 percent). The estimates use "static" revenue assumptions, and don't account for offsetting revenue growth due to any beneficial economic impact of cutting taxes.

    When Did Presidents Begin Releasing Their Tax Returns? (posted 4-17-03)

    From the NYT (April 12, 2003):

    No law requires the president or vice president to make their returns public. Presidents Ronald Reagan, George Bush and Bill Clinton all followed President Jimmy Carter in fully disclosing their tax returns.

    The tradition began in 1977, when President Carter released his tax returns to ensure public trust in the office of the presidency. He said he acted because of the tax evasion charges to which Vice President Spiro T. Agnew pleaded no contest in 1973 and the fraudulent $576,000 tax deduction that President Richard M. Nixon took in his first year in the White House.

    Mr. Nixon was never charged, but Edward L. Morgan, a White House lawyer and a Treasury assistant secretary in the Nixon administration, pleaded guilty to tax fraud in 1975 over the preparation of Mr. Nixon's return. He served four months in federal prison.

    Was"Americanism" Always the Preserve of the Right? (posted 4-16-03)

    Geoffrey Nunberg, a Stanford linguist, writing in the NYT (April 13, 2003):

    [Of American patriotism] Alexis de Tocqueville complained that "It is impossible to conceive a more troublesome or more garrulous patriotism; it wearies even those who are disposed to respect it."

    It's that need to justify national pride that has traditionally tied discussions of patriotism here to the notion of Americanism, as a name for the doctrines and qualities that make our nation exceptional. Americanism has been the refuge of quite as many scoundrels as patriotism itself. In "Main Street," Sinclair Lewis listed "One Hundred Per Cent Americanism" among the clichés of the patriotic stump orator, along with "Bountiful Harvest" and "Alien Agitators."

    But Americanism was also a touchpoint for progressives and radicals. Americanization programs for immigrants were often the benign twin of early 20th-century nativism; striking workers in the 1930's carried American flags on the picket lines. The American Communist Party chief, Earl Browder, famously declared that "communism is 20th-century Americanism."

    It was not until the cold war that Americanism became the exclusive property of the right, particularly when the House Committee on Un-American Activities made "un-Americanism" a synonym for every sort of left-wing activity. In the end, "Americanism" was an unintended victim of McCarthyism. After that period the word virtually disappeared from the American political lexicon.

    But American patriotism was most thoroughly transformed in the 60's, when antiwar radicals repudiated American exceptionalism in tantrums of flag-burning. True, the flag-burners were always a small minority in the antiwar movement, and in fact the American flags at antiwar rallies greatly outnumbered Vietcong ones. But from then on, patriotism became largely a matter of defending contested symbols like the flag and the Pledge of Allegiance; for the first time in modern history, the flag itself acquired an explicitly partisan meaning.

     

    What the Marshall Plan Cost (posted 4-16-03)

    According to John Micklethwait and Adrian Wooldridge, who write for the Economist, the Marshall Plan in today's dollars would amount to $120 billion. (NYT, April 13, 2003.)

    Terrorists and Pirates (posted 4-10-03)

    Alan Wood, writing in the Australian about the threat terrorism poses to global trade (April 1, 2003):

    A historical analogy offers an unexpected insight. Piracy, despite Hollywood's best effort to romanticise it, was ever a brutal and bloody business, like terrorism. What is not generally realised is the extent to which it hindered the development of international trade.

    US economic historian Douglass North won a Nobel Prize in 1993, and the Nobel committee cited an essay, Sources of Productivity Change in Ocean Shipping, 1600-1850. This essay showed that after the European powers eliminated piracy, international shipping costs fell by more than 80 per cent and the industry's productivity rose by about 500 per cent in the first half of the 19th century. This was an important factor leading to the previous episode of globalisation in the late 19th and early 20th centuries.

    In the 21st century, terrorism has the potential to do the reverse by imposing new costs and barriers to international trade, including shipping.

    The OECD cites an estimate made after September 11 that security measures applied in response could add 1 per cent to 3 per cent to the transaction costs of the US's international trade. A rise of 1 per cent would be enough to reduce international trade flows by 2 per cent to 3 per cent.

    Imbedded Journalists: Nothing New (posted 4-10-03)

    Katie Grant, writing in the Scotsman (March 31, 2003):

    Incidentally, this war is not the first experience of "embedded" journalism. A Norman clerk from Evreux called Ambroise was "embedded" -- or "embedded" himself -- into the Christian troops of the Third Crusade. He wrote an epic poem and it is fantastic. Luckily for him, he was one in a million, so he did not suffer the yawns I predict for his 1,000 modern-day counterparts, whose veritable plague of "my war" books will hit publishers' doormats as soon as they return.

    New Zealand: We Beat the Wright Brothers (posted 4-10-03)

    Kathy Marks, writing in the Independent (March 31, 2003):

    AS EVERY schoolchild knows, the world's first powered flight was made by the Wright brothers, taking to the skies above Kitty Hawk, North Carolina, in December 1903.

    Or was it? Nine months earlier, a little-known New Zealand farmer, Richard Pearse, climbed into a bamboo monoplane and flew for about 150 metres before crashing into a gorse hedge on his South Island property.

    The flight was witnessed by family members and a scattering of locals from nearby Waitohi, a small farming community. But no documentary evidence of it has survived. Pearse left no journal or diary. A picture of his plane stuck in the hedge, taken by a local photographer, was lost in a flood. Hospital records relating to a shoulder injury he suffered in the crash were destroyed in a fire. Nevertheless, New Zealanders are convinced that Pearse has been deprived of a place in the history books, and a group of aviation enthusiasts gathered at the weekend at Timaru, near Waitohi, to celebrate the centenary of his flight on 31 March 1903. The Pearse devotees had made two replicas of his plane, which they tried to get airborne with the help of a modern microlight engine.

    On the Relationship Between War and Money (posted 4-10-03)

    Gary Duncan and Antonia Senior, writing in the Times (London) (March 29, 2003):

    From before the Romans, the cost of war has been the engine for the evolution of money, tax and finance in all its forms.

    Even the most basic language of finance is rooted in the blood spilt in battles long ago. In his account of The History of Money, Glyn Davies notes that the word "pay" itself stems from the Latin "pacare", the original meaning of which was to make peace with another, usually through some form of compensation for injury.

    Much later, England was to coin the term "soldier". In the time of Henry II, the king scrapped the obligation on his barons to serve a period in his armies. This was replaced with a requirement to pay a levy known as "scutage". Henry then used this money to finance a professional standing army. As Davies relates, the men became known as soldiers, after the solidus, or king's shilling, which they earned.

    In the earliest times, emperors and princes would resort to the most brutal of methods to pay for their often fabulously expensive wars, although conflict could often pay for itself through plunder. At the height of Alexander the Great's campaigns in Asia, their cost has been estimated at half a ton of silver a day.

    But Alexander's seizure of enormous supplies of Persian gold and silver was later to make his adventures pay for themselves.

    Francis Drake's 16th-century defeat of the Spanish Armada was aided by up to £1.5 million in booty grabbed from the Spaniards between 1577 and 1580.

    Centuries earlier, Caligula, the Roman Emperor, had less success with the spoils of war after his abortive mission to invade Britain. Having falsely claimed to have seized the island, he returned to Rome with vast quantities not of precious metals but of seashells. Crazed, Caligula said these were "plunder from the ocean".

    By medieval times, when the power of British monarchs was immense but not absolute, kings started having to justify their wars and the cost of fighting them. Helen Castor, of Sidney Sussex College, Cambridge, says: "National state taxes developed in medieval England. The idea that the King would tax his subjects developed during the wars waged by Edward I and Edward III because the king needed funds. The principle was that if the king could justify the war, he could demand money from the nation to fund it, and the nation couldn't really refuse."

    But when the king could not justify war, the funds frequently were not forthcoming. Henry III decided to conquer Sicily for his second son, paying the Pope vast sums for the privilege. Dr Castor says: "He tried to get the money from the political community in England, which provoked massive opposition because they didn't see why the war was justified. This directly contributed to the civil war that followed." Similarly, the outbreak of the English Civil War was driven by Parliament's resistance to Charles I's right to levy taxes without its consent.

    In the 20th century, the immense destruction of the two world wars, which flowed from the development of military technology, was matched in scale by their economic fallout. If the Second World War boom in America transformed the US after the Great Depression, it also set the stage for decades of British economic underperformance. Vietnam would lead to the collapse of the Bretton Woods system of fixed exchange rates.

    Will the Bush Tax Cuts Prove Supply-Side Economics Works? (posted 4-8-03)

    Economist Dean Baker, commenting on an article by Daniel Altman in the NYT, "The End of Taxes as We Know Them" (March 30, 2003):

    This article discusses the merits of President Bush's proposals for large tax cuts directed primarily at the wealthy. The article refers to the record of the Reagan era tax cuts and argues that this experience left the merits of supply-side tax cuts open to debate, because military spending increases created large deficits, which "swamped whatever supply-side benefits the tax cuts might have had."

    Actually the military build-up of the Reagan years does not make it any more difficult for economists to assess whether the tax cuts had their intended effect. The alleged goal of supply-side tax cuts is to increase incentives to save. A simple way to determine whether the tax cuts were effective is to see what happened to the savings rate in the eighties. In the five years before the tax cut was implemented (1977-81), the savings rate averaged 9.6 percent of disposable income. In the five years after the tax cuts were fully phased in (1984-88), the savings rate averaged 8.6 percent of disposable income. By this most basic measure the supply-side tax cuts were a complete failure. It is worth noting that the deficits of this era should have increased incentives to save, by raising interest rates, so the decline in the saving rate is even more striking. (Corporations also increased their dividend payout rate, which should have increased the savings rate as well.)

    This article never discusses the plausible magnitude of the growth effects of supply-side tax cuts. Even in a best-case scenario, it is unlikely that the Bush tax cuts would increase the annual growth rate by even 0.05 percentage points, a gain in growth that would probably be too small even to be noticed by anyone. (The Congressional Budget Office recently estimated that the Bush tax cuts would reduce GDP under most scenarios. The only scenarios in which they lead to an increase in output over the next decade are those in which people assume that the deficits from the tax cuts will lead to higher taxes in future years. In those scenarios, people have an incentive to work more in the next decade -- a period of relatively low taxes -- rather than in later years when they expect taxes to be higher [http://www.cbo.gov/showdoc.cfm?index=4129&sequence=0].)

    This article also never discusses the possibility that the intention of the Bush tax cuts is simply to redistribute money to the wealthy – their one undisputed effect. This would be like discussing the steel tariffs without ever raising the possibility that their purpose might be to protect jobs of workers in the steel industry and to increase the profitability of steel manufacturers.

    At one point the article asserts that President Bush's plan to establish tax-free savings accounts "could quickly shelter most families' entire portfolios from taxation." Actually, the vast majority of families can already shelter their entire portfolios from taxation in the way this article describes. Only 2 to 3 percent of families reach the current limits on the amount that can be placed into tax-sheltered retirement accounts such as IRAs or 401(k)s.

    War-Speak (posted 4-8-03)

    Geoffrey Nunberg, a Stanford linguist, writing in the NYT (April 6, 2003):

    The first casualty when war comes is truth." With due respect to Hiram Johnson, the Progressive senator who made that famous remark in 1917, the first casualty of war is less often the truth itself than the way we tell it. Coloring the facts is always simpler and more effective than falsifying them.

    The modern language of war emerged in the Victorian age, when military planners first became concerned about public opinion. One linguistic casualty of that period was "casualty" itself, a word for an accidental loss that became a euphemism for dead and wounded around the time of the Crimean War, in the mid-19th century, the conflict that gave birth to the war correspondent.

    By World War I, the modern language of warfare was in its full euphemistic glory. The mutinies among French troops in 1917 were described in dispatches as "acts of collective indiscipline," and the writers of the daily communiqués from the Western Front were instructed to use the phrase "brisk fighting" to describe any action in which more than 50 percent of a company was killed or wounded.

    What's notable about the current war isn't the toll it's taking on language — all wars do that — but the obsessive attention we pay to the matter. There has never been an age that was so self-conscious about the way it talked about war. Barely two weeks into the conflict, more than a dozen articles have appeared in major newspapers speculating about what its effects on the language will be, as if that would reveal to us what story we would wind up telling about it.

    In part, that is simply a reaction to the jumble of images and reports we've been subjected to, and of the need to make sense of them. Last week, Defense Secretary Donald H. Rumsfeld complained that the abruptly shifting impressions of the war's progress were due to viewers seeing "every second another slice of what's actually happening over there." He waxed nostalgic for World War II newsreels that wrapped the week's war highlights in a stirring narrative.

    Mr. Rumsfeld's wistfulness is understandable. True, domestic support for World War II was never as solid or uncritical as we like to imagine -- as late as 1944, almost 40 percent of Americans said they favored a negotiated peace with the Germans. But there is no trace of those doubts in the language the war left us with, or in the artless enthusiasm of those newsreels: "Then, by light of the moon, a thousand mighty bombing planes take off, flying to their marks and releasing their fatal loads."

    That was the tail end of a purple thread that ran back to those Crimean dispatches about gallant British troops pouring fire on the terrible enemy. The effusive metaphors of the newsreels were already shopworn in 1969, when Mr. Rumsfeld joined President Richard M. Nixon's cabinet, and war reports had to be tailored to an increasingly skeptical and knowing public.

    Today, no journalist would hazard a reference to mighty bombers dropping fatal loads. Embedded reporters produce embedded language, the metallic clatter of modern military lingo: acronyms like TLAM's, RPG's and MRE's; catchphrases like "asymmetric warfare," "emerging targets" and "catastrophic success" — the last not an oxymoron, but an irresistibly perverse phrase for a sudden acceleration of good fortune.

    Iraq's Indebtedness Is as Deep as Its Oil Wells (posted 4-8-03)

    Alan B. Krueger, writing in the NYT (April 3, 2003):

    The fog of finance is almost as thick as the fog of war in Iraq, but analysts have pieced together a balance sheet that looks, well, wildly out of balance. Saddam Hussein borrowed heavily to finance his war with Iran, the invasion of Kuwait, and the first Persian Gulf war. On top of this, the United Nations Compensation Commission received $320 billion of claims for damages against Iraq related to its invasion of Kuwait.

    Iraq's total potential obligation — from war-related compensation claims, foreign debt and pending contracts — is $383 billion, according to Frederick D. Barton and Bathsheba N. Crocker of the Center for Strategic and International Studies in Washington. Their report, "A Wiser Peace: An Action Strategy for a Post-Conflict Iraq" (available at www.csis.org/isp/pcr/index.htm), should be required reading for anyone looking for a blueprint of how to rebuild Iraq successfully after the war.

    "In any reconstruction there is a make-or-break issue," said Mr. Barton, a veteran of United States and United Nations efforts to rebuild war-torn regions like Bosnia, Haiti and Rwanda. "Debt is the make-or-break issue for Iraq."

    To gain some perspective on the crushing financial burden facing the Iraqi people, note that with a population of 24 million, pending obligations work out to $16,000 for every man, woman and child. The Central Intelligence Agency estimates, probably optimistically, that Iraq's per capita gross domestic product is $2,500. So, for the average person, financial obligations exceed income by a ratio of more than six to one.

    If 50 percent of Iraq's future export income is diverted to paying down the debt — more than three times the percentage extracted for German World War I reparations — it would take more than 35 years to pay off current obligations fully, even after allowing for reasonable growth in oil exports.
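
    A minimal sketch, in Python, of the per-capita arithmetic behind these figures, using only the numbers cited above (the $383 billion in obligations, a population of 24 million, and the CIA's $2,500 per-capita GDP estimate):

    # Per-capita burden implied by the figures in the article.
    obligations = 383e9     # total potential obligations, in dollars
    population = 24e6       # people
    gdp_per_capita = 2_500  # CIA estimate, in dollars (likely optimistic)

    per_person = obligations / population
    print(round(per_person))                      # ~15,958, i.e. about $16,000
    print(round(per_person / gdp_per_capita, 1))  # ~6.4: obligations exceed income six to one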

    The ratio of debt to G.D.P. in Iraq is more than 10 times what it is in Argentina or Brazil.

    Even if the Barton-Crocker estimates are way off, the debt load facing future generations of Iraqis is overwhelming.

    What would happen if an occupied Iraq simply defaulted on the debt?

    Patrick Bolton, an expert on sovereign bankruptcy at Princeton University, said creditors could seize Iraq's assets in their countries or sue to block future creditors from collecting money owed to them, creating an obvious disincentive for anyone considering investing in Iraq. Creditors might even be able to block the United States from using Iraqi oil reserves for the Iraqi people, as the Bush administration has promised.

    Indeed, the United Nations has been siphoning off 28 percent of Iraq's oil export revenue to compensate claimants from the invasion in Kuwait — and so far has processed only half of all claims from that war. There could be additional claims after this war, and much of the country's infrastructure will need to be rebuilt or repaired. The United Nations has awarded 30 cents on the dollar for individual and family claimants; claims by companies, governments and international organizations remain to be settled.

    War of Words (posted 4-2-03)

    Phrases associated with different wars; Boston Globe (March 27, 2003):

    Iraq War (2003)

    "MOABs" ("massive ordnance air burst," also "mother of all bombs")
    "NBC assault" (referring to nuclear, biological, and chemical weapons)
    "target of opportunity"
    "embeds"
    "shock and awe"
    "catastrophic success" (a great success)
    "regime change"
    "debaathification"
    "decapitation"
    "weapons of mass destruction"

    Gulf War (1991)

    "the mother of all battles"
    "no-fly zone"
    "Humvees"
    "MREs" (Meals, Ready to Eat)

    Vietnam War

    "quagmire"
    "fragging"
    "friendly fire"

    Cold War

    "fallout"
    "plausible deniability"

    Korean War

    "brainwashing"

    World War II

    "jeep"
    "snafu"
    "blitz" (short for blitz krieg)
    "firestorm"

    World War I

    "bombardment"
    "trench warfare"
    "no man's land"
    "shell-shocked"

    Civil War

    "slacker"
    "unconditional surrender"

    Public Opinion in the 1930s When Danger Lurked Everywhere (posted 4-1-03)

    Mike Murphy, writing in the Weekly Standard (March 24, 2003):

    In the fall of 1939 Adolf Hitler had already started the Second World War. Austria and Czechoslovakia had been conquered. Poland was falling to German armies. Britain and France had just declared war.

    Against this, Gallup measured American public opinion on the European war. Perhaps unsurprisingly, 96 percent of Americans opposed joining the war against Hitler. But when asked if the United States should stay out of the war, even if that meant fascist Germany would conquer the democracies of England and France, 79 percent of Americans still said America should avoid the war.

    This was public opinion in the United States after a decade of Hitler's ranting, re-arming, and marching across his neighbors' borders. Even as late as 1941, with France defeated and England alone, a poll showed 79 percent of Americans still opposed involvement in the war.

    European public opinion was no wiser. Shortly after Chamberlain won peace in our time at Munich, only 39 percent of British public opinion opposed his policies. After losing millions in the slaughtering fields of the First World War, it is no surprise that France and England craved peace during the 1930s. Woodrow Wilson lured a reluctant America into that Great War with a promise that it would end all wars. The newspapers of the 1930s frequently terrified readers with stories of vast air armadas that would bomb crowded cities with poison gas. That public opinion would cling to peace at nearly any cost is easily understandable, and arguably commendable. No civilized society will ever embrace the horror of war if given any other option, even options that are illusions. But it is the duty of leaders to see through the illusions.

    Hitler made his riskiest initial move in March 1936 by remilitarizing the Rhineland, and thereby dramatically repudiating the Treaty of Versailles. France's vastly superior army of the time could easily have rolled into the Ruhr valley, upholding the treaty that ended World War I and stopping Hitler's ambitions by disarming his regime. Reacting to Hitler's gamble, France's caretaker premier Albert Sarraut made a snarling radio speech, weighed military action, and consulted his British allies in the Baldwin government, who told the French that Britain could not "accept the risk of war" and urged diplomatic action within the League of Nations.

    Facing elections in May and fearing a backlash from a powerful "pacifist tornado," the Sarraut cabinet quickly rejected military action. "If we had declared a general mobilization two months before the elections," wrote Sarraut's air minister in 1944, "we would have been swept out of parliament by the voters, if it did not happen beforehand through a revolution in the streets." France, the dominant land power in Europe during the 1930s, did nothing.

    Do Wars Bring About Prosperity? (posted 4-1-03)

    Virginia Postrel, writing in the NYT (March 27, 2003):

    "War brings prosperity. This was the conclusion drawn by Americans who watched the war economy of World War II bring an end to the Great Depression," the antiwar intellectual Seymour Melman wrote in "The Permanent War Economy," published in 1974. The book sought to reverse that perception.

    Few if any macroeconomic scholars would have endorsed the simple notion that "war brings prosperity." But two influences led that idea to percolate into public consciousness: the prominence of Keynesian theory and the experience of wartime production.

    Keynesians held that the Depression had persisted because New Deal spending had not been big enough to restart the economy. The war forced the government to abandon fears of large-scale spending.

    The war pumped up demand for unemployed workers and underused factories. Indeed, the Depression "made World War II possible," the macroeconomist Robert J. Gordon of Northwestern University argued in an interview. "We had vast wasted resources — unutilized factories, unemployed workers."

    "Our capacity to build automobiles was only being used at one-half," he added, so it was relatively easy to ramp up production of military equipment like tanks and ships.

    The wartime demand for military production was so obvious that after the war many people feared the economy would slip quickly back into recession. Where, if not from government, would the demand come from?

    Postwar demand came, of course, from consumers and their rapidly multiplying children, the baby boomers.

    But the cold war also kept military spending high by historic (though not World War II) standards. And the Depression left scars on the public psyche, producing the fear that without continuous large government outlays, the private economy would falter.

    Mainstream Keynesian theory held that government spending was needed to bolster the economy during recessions. Few macroeconomists argued that government should keep spending large sums even when the private economy was expanding.

    But that idea did have influential backing, particularly from Alvin Hansen of Harvard, the dean of prewar Keynesians. He promoted the idea of "secular stagnation," which held that inefficiencies in the economy were constantly threatening recession, even during booms, unless the government kept spending high.

    "Hansen, unlike the standard Keynesians, argued that you not only needed countercyclical management, you needed full-time pump priming," said John Nye, an economic historian at Washington University in St. Louis.

    Most macroeconomic scholars were more judicious, but Mr. Hansen's ideas were popular. Professor Nye noted that "many of his ideas got swallowed up" into the popular wisdom that the economy needs constant stimulus and, hence, that high military spending is good for the economy.

    From the stagflation of the 1970's to the prosperity of the 1990's, experience has mostly reversed these popular perceptions. The 1990's boom demonstrated that a strong economy can persist not only amid budget surpluses but also after substantial post-cold-war military cuts.

    As for the war in Iraq, it is tiny compared with previous wars.

    Blacks Were Not Over-Represented in the Ranks of the Vietnam Troops (posted 4-1-03)

    David Halbfinger and Steven Holmes, writing in the NYT (March 30, 2003):

    Among the many myths of Vietnam that persist today, experts say, is that it was a war fought by poor and black Americans, who died in greater proportions than whites.

    Although that was true in the early stages of the American ground war, in 1965 and 1966, when there were large numbers of blacks in front-line combat units, Army and Marine Corps commanders later took steps to reassign black servicemen to other jobs to equalize deaths, according to Col. Harry G. Summers Jr. in "Vietnam War Almanac."

    By the end of the war, African-Americans had suffered 12.5 percent of the total deaths in Vietnam, 1 percentage point less than their proportion in the overall population, Colonel Summers wrote.

    Servicemen from states in the South had the highest rate of battlefield deaths, 31 per 100,000 of the region's population, Mr. Kolb found. Soldiers from states in the Northeast had the lowest rates, 23.5 deaths per 100,000.

    Since the end of the draft, that geographic skew on the battlefield has extended to the services as a whole. The percentages of people from the Northeast and Midwest have dropped, while the proportion from the West has climbed and from the South has skyrocketed — even after accounting for southward and westward population shifts in society at large. For the year ending Sept. 30, 2000, 42 percent of all recruits came from the South.

    Over all, Mr. Kolb said, 76 percent of the soldiers in Vietnam were from working-class or lower-income families, while only 23 percent had fathers in professional, managerial or technical occupations.

    The disparity created by the Vietnam draft can be seen on the walls of Memorial Hall and Memorial Church at Harvard University, where the names of Harvard students and alumni who died for their country are inscribed. There were 200 Harvard students killed in the Civil War and 697 in World War II, but only 22 in Vietnam.

    For Stanley Karnow, the journalist and author of "Vietnam: A History," who began reporting from Vietnam in 1959, the contrast with World War II was personal. When he turned 18 in 1943, he dropped out of Harvard and enlisted in the Army. In 1970, when his son turned 18 and became eligible for the draft, he was also a Harvard student. "We did everything we could to keep him out of the draft," Mr. Karnow said.

    Science and War (posted 3-29-03)

    Liz Marlantes, writing in the Christian Science Monitor (March 25, 2003):

    Historians often describe World War I -- with its introduction of mustard and other poison gases -- as the chemists' war. World War II is widely regarded as the physicists' war, a triumph of laboratory discoveries culminating in the atomic bomb.

    How Long Do Our Wars Last? (posted 3-29-03)

    Table posted in USA Today (March 26, 2003):

    War | Date War Began | Length
    World War I | April 6, 1917 | 1 year, 7 months, 5 days
    World War II | Dec. 8, 1941 | 3 years, 8 months, 6 days
    Korea | June 27, 1950 | 3 years, 1 month
    Vietnam | Feb. 14, 1962 | 10 years, 11 months, 13 days
    Persian Gulf | Jan. 17, 1991 | 1 month, 10 days
    Kosovo | March 24, 1999 | 2 months, 27 days
    Afghanistan | Oct. 7, 2001 | 2 months

    Why Do We Put Up Yellow Ribbons During Wars? (posted 3-28-03)

    Cecil Adams, writing for the website StraightDope.com (March 2003):

    Yellow ribbons first emerged as a national symbol in January 1981, when they sprouted like weeds to welcome home the Americans held hostage in Iran. The whole thing was started by Penelope (Penne) Laingen, wife of Bruce Laingen, U.S. charge d'affaires in Teheran. Ms. Laingen says she was inspired by two things: (1) the song "Tie a Yellow Ribbon Round the Ole Oak Tree," written in 1972 by Irwin Levine and Larry Brown and made famous by Tony Orlando and Dawn, and (2) the prior example of one Gail Magruder. Ms. Laingen writes:

    "Gail Magruder, wife of Jeb Stuart Magruder of Watergate fame, put yellow ribbons on her front porch to welcome her husband home from jail. This event was televised on the evening news.

    "At this point ... I stepped in to change the legend and song from the return of a forgiven prodigal to the return of an imprisoned hero. Interestingly, I had remembered the Gail Magruder ribbons, but I had only a vague understanding of the Levine-Brown song lyrics, although I knew it involved a 'prisoner,' which my husband surely was in Iran."

    Penne's aim, and that of the other hostage families she was in contact with, was to keep public attention focused on the prisoners. Various ideas had been proposed or tried early on, including asking people to turn on their porch and car lights, honk their horns, ring church bells, display the flag, wear Vietnam-type POW bracelets, etc. But none of these schemes proved satisfactory.

    Finally Penne hit on yellow ribbons. She hung one made from yellow oilcloth on an oak tree in her front yard in December 1979, and mentioned it to a Washington Post reporter who was doing a story on how hostage families were dealing with stress. The reporter described what Penne had done in her article and yellow ribbons soon were appearing nationwide. When the buildup for the Persian Gulf war began the ribbons appeared anew and now appear to be firmly established as a symbol of solidarity with distant loved ones in danger.

    OK, but where did the song "Tie a Yellow Ribbon Round the Ole Oak Tree" come from? At this point the ribbon story starts to get a little tangled.

    Larry Brown claimed he heard the returning-convict story on which the song was based in the army. Apparently it was a widely circulated urban legend--so widely circulated, in fact, that it got the songwriters into a bit of hot water. New York Post writer Pete Hamill had related the story in a 1971 column with a few different details--for one thing, the convict told his story not to a bus driver but to some college students headed for Fort Lauderdale.

    Hamill claimed he'd heard the story from one of the students, a woman he'd met in Greenwich Village. He sued Brown and Levine for stealing his work, but the defense turned up still earlier versions of the tale (Penne Laingen quotes a version from a book published in 1959) and the suit was dropped.

    A big difference in many of the earlier stories was that the centerpiece wasn't a yellow ribbon, it was a white ribbon or kerchief. But Levine claimed "white kerchief" wouldn't fit the meter, so yellow ribbon it became. In addition to being trochaic, yellow seemed "musical and romantic," he reportedly said.

    But it wasn't quite that simple. The 1949 John Wayne movie She Wore a Yellow Ribbon featured a hit song of the same name, and the line appears in a 1961 Mitch Miller songbook. A source who knows Brown and Levine says they (or at least Levine) privately admit they got the concept of yellow ribbons from the 1949 song.

    The movie tune was a rewrite of a song copyrighted in 1917 by George A. Norton titled Round Her Neck She Wears a Yellow Ribbon (For Her Lover Who Is Fur, Fur Away). This in turn was apparently based on the popular 1838 minstrel-show song All Round My Hat (surely you remember it), which sported the line, "All round my hat I [w]ears a green willow [because] my true love is far, far away." Doesn't scan (or parse) very well, which no doubt explains the switch to yellow ribbons in the twentieth century. Songs with green willows and distant lovers go back at least to 1578.

    It's interesting that the ribbons and willows in these songs simply serve as a reminder of a distant loved one, since that's pretty much the only significance of yellow ribbons today. There is no suggestion of the returning prodigal such as we find in the Levine-Brown song, or even of imprisonment, as was the case during the Iran hostage crisis. So I guess we can say that yellow ribbons do have some grounding in tradition, although it's ribbons rather than green willows chiefly as a metrical convenience.

    Contrary to popular belief, there is no indication that yellow ribbons had any symbolic value during the American Civil War. The notion that they did stems from the aforementioned John Wayne movie, which featured soldiers in Civil War-era uniforms.

    Will Iraq Begin Selling Oil in Dollars Again? (posted 3-26-03)

    Jerry West, editor of the Record (March 25, 2003):

    With the collapse of the USSR the US became the most powerful economic and military force in the world, a position Bush's team, according to their own writings, intends to keep, and to tolerate no challenges to. The moves into Afghanistan and Iraq are the opening scenes in an unfolding drama.

    The US economy is based on a demand for the dollar and that dollar is kept in demand because there is an agreement that OPEC will sell its oil in US dollars. This sets up dollars to be the standard for international exchange, and requires most countries in the world to maintain US dollar reserves. These reserves in turn are invested in the US economy. Thus that economy is kept afloat. This is a good thing for the US which is running a huge deficit and would be in deep trouble should the dollar plunge in value.

    A few years back the European Union created a common currency, the Euro, to replace the national currencies of its members. In the last few years the value of the Euro has gone from around 85 cents to about $1.10 US. As a standard of international exchange it is now challenging the US dollar, a challenge that could have disastrous consequences for the US economy. By the end of this year the EU community will have grown to the point where it is a third larger than the US and will be buying over half of OPEC's oil. It may want to do so in Euros.

    Add to this picture the fact that in 2000 Saddam Hussein stopped using the dollar and priced Iraq's oil, the second largest reserve in the world, in Euros. Iran may do so. Russia is moving more to Euros everyday. And on top of this, Iraq nationalized its oilfields in 1972. A US governed Iraq would privatize them and bring back pricing in US dollars. I wonder what this war is about?

    The History of the Marine Units Fighting in Iraq (posted 3-24-03)

    What other famous battles and wars did the Marine units engaged in Iraq fight in? According to Col. James Toth, speaking on behalf of the Marines, the First Marine Division, which is currently fighting in Iraq, also fought in:

    Guadalcanal and Tulagi (1942)
    Cape Gloucester (1944)
    Peleliu (1944)
    Okinawa (1945)
    Inchon Landing and Chosin Reservoir (1950)
    Vietnam (1965-1971)
    Gulf War (1990-1991)

    Why do Soldiers Wave a White Flag When Surrendering? (posted 3-22-03)

    Brendan I. Koerner, writing in Slate (March 21, 2003):

    Hundreds of Iraqi soldiers are surrendering by waving white flags, the international symbol of capitulation. How did this tradition originate?

    Ancient historians from both China and Rome noted the use of white flags to signal surrender. In the former empire, the tradition is believed to have originated with the reign of the Eastern Han dynasty (A.D. 25-220), though it may be somewhat older. The Roman writer Cornelius Tacitus mentions a white flag of surrender in his Histories, first published in A.D. 109. His reference concerns the Second Battle of Cremona, fought between the Vitellians and the Vespasians in A.D. 69; at the time, the more common Roman token of surrender was for soldiers to hold their shields above their heads. It is believed that the tradition developed independently in the East and West.

    As for the bland color selection, it was likely just a matter of convenience in the ancient world. Artificial colors were still centuries away, so white clothes were always handy—not to mention highly visible against most natural backgrounds. Vexillologists (those who study flags) also opine that plain white provided an obvious contrast to the colorful banners that armies often carried into battle.

    SAY GOODBYE TO THE COLD WAR PEACE DIVIDEND (posted 3-18-03)

    Alan Beattie, writing in the Financial Times (London) (March 15, 2003):

    George Magnus, chief economist at UBS Warburg, and his colleagues have sketched out a scenario in which the need to combat the threat of terror sees US aid and personnel deeply involved in countries as varied as Turkey, North Korea, Colombia, Iraq, Afghanistan, the Philippines, Djibouti, Yemen and Bosnia. "Our admittedly crude guess is that the broadly defined military budget, encompassing homeland security, foreign aid and other nation-building programmes, could more than double from 3.5 per cent of GDP to as much as 8-9 per cent over the coming years," their report says.

    This would mean that, after the honeymoon of the 1990s, the US public would have to pay back the cold war peace dividend. US defence spending, as high as 10 per cent of national income in the 1950s, dropped to 5-6 per cent by the 1980s, fell to 3 per cent by 2000 and, state CBO estimates, was set to fall further. Ensuring US security could throw this trend into reverse.

    JUST HOW BIG IS THE PROJECTED DEFICIT? (posted 3-17-03)

    Economist Dean Baker, commenting on articles published on March 8, 2003 in the Washington Post and the NYT (March 17, 2003):

    These articles report on new estimates from the Congressional Budget Office on the cost of a war with Iraq and on the size of the deficit in coming years. Both articles only express these numbers in dollar terms; it would be more helpful if they were expressed as shares of the budget or relative to GDP. For example, the $25 billion estimate of the cost of sending troops to the Middle East is equal to approximately 1.1 percent of total spending in 2003. The projected deficits of $287 billion for 2003 and $338 billion for 2004 (not counting the costs of a war or additional tax cuts) are equal to approximately 2.7 and 3.2 percent of GDP, respectively.

    The deficit figures reported in this article include the Social Security surplus. For some purposes it is appropriate to present the deficit without including the Social Security surplus, since this is money borrowed from the Social Security trust fund, which must be repaid. Without the Social Security surplus, the deficits for 2003 and 2004 would be approximately $470 billion (4.6 percent of GDP) and $530 billion (5.2 percent of GDP), respectively. Since many people seem to view this as the more important measure of the budget deficit [it is often claimed that politicians use the Social Security surplus to "hide" the true size of the deficit], the deficit numbers should be reported without including the Social Security surplus.
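    The conversions Baker recommends are simple ratios. A minimal sketch follows; the dollar amounts come from the paragraphs above, while the GDP and total-spending figures below are assumptions chosen only to reproduce the quoted percentages.

# Express budget numbers as shares of GDP or of total spending.
def share(amount, base):
    return 100 * amount / base

GDP = 10.6e12        # assumed nominal GDP (illustrative)
SPENDING = 2.2e12    # assumed total federal spending for 2003 (illustrative)

print(f"2003 deficit: {share(287e9, GDP):.1f}% of GDP")                    # ~2.7%
print(f"2004 deficit: {share(338e9, GDP):.1f}% of GDP")                    # ~3.2%
print(f"Troop deployment: {share(25e9, SPENDING):.1f}% of 2003 spending")  # ~1.1%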

    At one point the Post article refers to Medicare's "slide towards insolvency." The most recent projections from Medicare's trustees show that the program will be able to pay all scheduled benefits through the year 2027 with no changes whatsoever. There has never been a period in which Medicare has been able to go such a long period without a tax increase. Therefore, if Medicare can currently be described as "sliding towards insolvency," then it has been sliding towards insolvency through its entire existence.

    ARTISTS AND WAR (posted 3-13-03)

    Denise Rompilla, an assistant professor of art history at St. John's University in New York, writing in the Chronicle of Higher Education about an exhibition, "Images From the Atomic Front" (March 13, 2003):

    Although the imagery of World War II is often thought of in terms of its haunting photographic legacy, artists served as high-profile correspondents throughout much of the war, vividly documenting its conflicts in drawings, watercolors, and oil sketches executed on the spot and often under harrowing conditions....

    On January 11, 1946, a press conference was called to announce a series of atomic tests that would be conducted in the Pacific with a mock naval fleet, to "promote the understanding, development, and use of America's scientific discoveries." While only one official combat artist, Grant Powers, was assigned to the two tests conducted at the Bikini Atoll in July 1946, two former servicemen -- Arthur Beaumont and Charles Bittinger -- received special commissions from the U.S. Navy to attend the operation, and created memorable images of the explosions and damage rendered to the fleet. Beaumont would have the special distinction of being irradiated after he was accidentally left for hours chained to a buoy in the middle of the battered target array and was forced to drink the contaminated water he used to mix his paints. In spite of the well-documented environmental catastrophe resulting from Bikini and potential dangers charted for human participants in nuclear testing, artists continued to be sent into training maneuvers involving atomic explosions with little more than a pair of military-issued dark goggles for protection.

    HOW OFTEN HAS THE U.S. USED THE VETO AT THE UN? (posted 3-12-03)

    David Rider, writing in the Ottawa Citizen:

    The United States, marshalling all its diplomatic might to prevent a veto of its planned UN Security Council resolution authorizing war on Iraq, is the most frequent user of the controversial veto power.

    The U.S., one of five permanent members on the 15-member council with the power to kill a resolution with its single vote, has done so 76 times since the United Nations was founded in 1945, according to an analysis by Global Policy Forum, a New York-based UN social policy watchdog.

    (The UN itself does not tally veto figures and yesterday referred the Citizen to Global Policy numbers.)

    The total is more than that of Britain, France and China -- three other permanent members -- combined. The fifth member -- the Soviet Union -- led the way with 119 but, since the Soviet Union's collapse in 1991, Russia has added only two vetoes to that total.

    In the past decade, six of the seven veto votes cast by the U.S. have been used to protect Israel from criticism over its actions in the Mideast conflict. The other permanent members invoked a total of three in the same period.

    BUSH'S INFREQUENT PRESS CONFERENCES (posted 3-11-03)

    Mike Allen, Washington Post Staff Writer (March 7, 2003):

    [President] Bush went before 94 reporters for his eighth solo news conference last night [March 6, 2003] as part of his effort to prepare Americans for a likely war against Iraq as increasingly insistent opposition from allies and skepticism at home grow.

    At the same point in their presidencies, President Bill Clinton had held 30 solo news conferences (that is, without a foreign leader at a twin lectern) and Bush's father had held 58, according to research by Martha Joynt Kumar, a Towson University political science professor who specializes in presidential communication.

    After two years and 45 days in office, President Ronald Reagan had held 16 solo news conferences, President Jimmy Carter had held 45, President Gerald Ford had held 37, President Richard M. Nixon had held 16 and President Lyndon B. Johnson had held 52.

    ... The news conference was Bush's second in the East Room or in prime time. The last was Oct. 11, 2001 -- four days after allied cruise missiles and bombers began dismantling the Taliban. Bush's last solo news conference, held in a more casual setting, was Nov. 7, two days after the Republican triumphs in the midterm elections.

    Bush's aides point out that he frequently takes short bursts of questions from reporters in other settings -- most often, when cameras are allowed in at the beginning or end of a presidential event. The White House said that counting those, Bush has taken questions 216 times, not including one-on-one interviews. Aides said Bush disdains what they call the "preening" by correspondents that he considers an inescapable part of televised news conferences.

    HOW THE NYT MISSED THE NEWS THAT CRICK AND WATSON HAD DISCOVERED DNA (posted 2-28-03)

    Dennis Overbye, writing in the NYT (February 25, 2003):

    If journalism is the first draft of history, as the saying goes, then it's often a terrible draft. A case in point happened in 1953, when Francis Crick, a graduate student at Cambridge University, and Dr. James D. Watson, a young biochemist, published a short paper in the journal Nature proposing that DNA, or deoxyribonucleic acid, the molecule seemingly responsible for heredity, had a double helix structure. By suggesting that DNA could split into complementary strands, the two men had established the first plausible physical basis for the encoding and transmission of genes, literally the secret of life. It was biology's biggest moment in the 20th century.

    One might expect that such an accomplishment would be trumpeted in newspaper headlines around the world. But this was before the days when every advance in science, marginal or not, was preceded by a drumroll of missives from press agents. In fact, the double helix was a dog that did not bark, at least not at first, in this or any other newspaper.

    The two men made their discovery on Feb. 28. Their paper appeared April 25. Major newspapers in Britain did not notice until May 15, when Sir Lawrence Bragg, the director of the Cavendish Laboratory, where Dr. Watson and Dr. Crick did their work, gave a talk in London. That occasioned an article in The News Chronicle of London.

    The news reached readers of The New York Times the next day -- maybe. Victor K. McElheny, in researching his new biography, "Watson and DNA: Making a Scientific Revolution," found a clipping of a six-paragraph Times article written from London and dated May 16, with the headline "Form of 'Life Unit' in Cell Is Scanned."

    Yet a search of The Times's databases could find no trace of it. The logical, if galling, conclusion is that the article ran in an early edition and was then pulled to make space for news deemed more important.

    On June 13 The Times did run an article that called DNA "a substance as important to biologists as uranium is to nuclear physicists." Datelined London, the article missed the fact that Dr. Watson had given a double helix talk a week before only a train ride from New York, at Cold Spring Harbor on Long Island, Mr. McElheny pointed out.

    Although Dr. Crick was asked to be on a BBC program that fall, the double helix received scant mention for the rest of the year, according to Mr. McElheny.

    OUT OF AFRICA (posted 2-26-03)

    Hillary Mayell, writing in National Geographic.com about the origin of the phrase "out of Africa":

    Out of Africa. The phrase is everywhere; used to title movies, books, magazine articles, art exhibits, conferences, lectures, and travel tours. It's used as shorthand in newspaper headlines and to describe anthropological and medical theories related to Africa. But where did it come from?

    Somewhat surprisingly, the phrase stems from an ancient Greek proverb. "There is always something new coming out of Africa," wrote Aristotle more than 2,300 years ago in his book on natural history.

    Writing in The Journal of African History, Harvey Feinberg and Joseph B. Solodow trace the history and meaning of the proverb from its ancient beginnings to contemporary usage. "It's a phrase even Africanists don't know the origin of, so we were interested in tracing how it got from the ancient world to our world," said Feinberg, who teaches African history at Southern Connecticut State University (SCSU).

    Over the millennia, the meaning of "Out of Africa" has changed significantly. "The Greek word that means 'new' had a different connotation than it does today," said Solodow, a professor of foreign languages at SCSU. "For us, if we see a product advertised as 'New and Improved', we don't need the word improved to gather the right meaning. For the ancient Greeks, and Latins as well, the word 'new' tended to have negative connotations, associated with something strange or undesirable."

    SO HOW LONG WILL WE STAY IN IRAQ? (posted 2-18-03)

    Chart showing the number of U.S. soldiers located in foreign countries; in the Wall Street Journal, citing as a source the Department of Defense (February 13, 2003):

    Country | U.S. Troops | Length of Deployment
    Afghanistan | 9,000 | 16 months
    Bosnia | 1,700 | 8 years
    Kosovo | 3,000 | 4 years
    Korea | 37,000 | 50 years

    IT TOOK LESS TIME TO TRY EICHMANN (posted 2-12-03)

    Tom Hundley, writing in the Australian (February 13, 2003):

    The Nuremberg War Crimes Tribunal needed 11 months to try, convict, sentence and hang 10 of Adolf Hitler's top lieutenants.

    Swifter justice met Adolf Eichmann, the bureaucrat behind the Holocaust. His 1961 Jerusalem trial lasted eight months. He was hanged in May 1962.

    But the wheels of justice turn in slow motion for Slobodan Milosevic, the disgraced former Yugoslav leader. His trial before the International War Crimes Tribunal began a year ago today, and the prosecution still has not finished laying out its case.

    The prosecutors want another year or two, but chief judge Richard May, who can scarcely conceal his impatience with the snail-like pace, has imposed a May deadline. Mr Milosevic, defending himself, will then get equal time in court, which should take the trial to the end of 2004.

    EJECTOR SEATS ABOARD THE SHUTTLE? (posted 2-12-03)

    Roger Launius, commenting on the use of ejector seats aboard shuttles, in an interview on CNN (February 8, 2003):

    ROGER LAUNIUS, AIR AND SPACE MUSEUM HISTORIAN: There's been an acceptance that you really can't bail out of these things very readily.

    KOCH: Air and Space Museum historian Roger Launius says the seats were taken out after just four flights because of their weight and limited usefulness.

    LAUNIUS: They're not much good if you're above about 50,000 feet, and clearly if you're going at hypersonic speeds, they're not much good for you either. The individuals would be going too fast to survive the ejection.

    KOCH: After the 1986 Challenger explosion, NASA devised a landing escape system. But it, again, only works at low altitudes.

    25 PERCENT OF GULF WAR VETS DISABLED (posted 2-10-03)

    Kim Cobb, writing in the Houston Chronicle:

    Of approximately 575,000 Gulf War veterans eligible for VA benefits, about 25 percent are certified as disabled to some degree.

    The Department of Defense asserts it's made dramatic progress in its ability to protect American troops against chemical and biological warfare in the past 12 years. The protective gear being issued to troops deploying now is vastly improved, the agency says, and "without a doubt the best that is available in the world today."

    But some veterans and their advocates aren't buying those assurances. ...

    ... [thousands of Persian Gulf veterans] came home with a baffling assortment of symptoms that first were dismissed by Veterans Administration doctors as stress-related. Although the VA now concedes that many Gulf War veterans share ailments such as chronic fatigue, balance and cognitive problems, the cause of those illnesses remains in dispute.

    The Pentagon denied for several years after the Gulf War that American troops had been exposed to toxic chemicals, but in 1996 conceded that as many as 100,000 Gulf War veterans may have been exposed to low levels of sarin gas when U.S. troops destroyed an Iraqi munitions depot in Khamisiyah in March 1991.

    IS THE STOCK MARKET DOOMED THIS YEAR? (posted 2-6-03)

    An exchange on CNBC's "The News with Brian Williams" (January 31, 2003):

    FORREST SAWYER: CNBC's Sharon Epperson is with us on what's called the January factor. Which means what, Sharon?

    SHARON EPPERSON, CNBC: Well, it's actually a January barometer, and it's been around for many years. Stock historian Neil Hirsch (ph) came up with the term. And what it means is, he's looked at the past Januarys since 1950, and every time there's been a down January, the year has followed suit, and it's been a down market for the entire year. He's looked at the S&P 500 and found, actually, that it's been down on average about 13 percent by the end of the year, when you've had a down January.

    SAWYER: But it's not every time. I mean, it's just most of the time, right?

    EPPERSON: It's most of the time. But there have only been a couple of exceptions, and those were wartime, actually, the Vietnam War, 1966 and 1968. In 1966, January, the market was down but finished -- rather, was up, but finished the year down after the war had started. And then in 1968, the market was down at the beginning, and then actually turned around.

    HARD TO GET ELECTED IN A RECESSION (posted 1-29-03)

    Presidential historian Allan Lichtman, appearing on MSNBC (January 20, 2003):

    No president in the history of the republic has ever been reelected during an election-year recession. George Bush knows that he's got to sell an economic program to the American people.

    ANTI-SEMITISM ON THE RISE AMONG THE YOUNG (posted 1-27-03)

    A news brief in the Washington Post (reprinted in frontpagemag.com, January 27, 2003):

    Anti-Semitism may be increasing in the United States as more young adults express bigoted views about Jews than do middle-aged Americans, according to a national poll by the Institute for Jewish and Community Research in San Francisco.

    On question after question, researchers found that the proportion of Americans ages 18 to 35 who held anti-Semitic views was consistently higher than the percentage of middle-aged Americans who shared those attitudes.

    For example, nearly one in four young adults - 23 percent - agreed with the statement that Jews were a "threat" to the country's "moral character," a view shared by 15 percent of Americans between ages 45 and 54. And 20 percent of young adults agreed that Jews "care only about themselves," compared with 12 percent of middle-aged Americans.

    Gary Tobin, president of the group that commissioned the survey, suggested that the disquieting results may reflect "the blurring of anti-Israelism and anti-Semitism on college campuses" and that "the social norms against anti-Semitism that took root following the Holocaust have worn off."

    The survey of about 1,000 randomly selected adults was conducted in May. The margin of sampling error was plus or minus 3 percentage points.

    THE MAN WHO CLAIMED TO FLY AN AIRPLANE IN 1901 (posted 1-20-03)

    Prepare yourself. It's the 100th anniversary of the Wright Brothers flight. There will be lots of stories like this one from Paul Marks, writing in Connecticut Today:

    On a hot August night in 1901, so the story goes, a Bridgeport man climbed aboard a strange, bat-winged aircraft and rose into the sky above Long Island Sound. He said he flew as high as 200 feet.

    Twenty-seven-year-old Gustave Whitehead, an eccentric turn-of-the-century tinkerer, thrilled to the view over the moonlit waves. His pride swelled over the 16-foot-long monoplane, puttering along on acetylene-powered engines he built himself.

    "I was soaring up above my fellow beings in a thing my own brain had evolved," he wrote euphorically afterward."I could fly like a bird."

    So the story goes. But despite the vivid account, aviation historians say Whitehead's flight was merely one of fancy.

    Never did the German immigrant - whose trip aloft would have beat the Wright brothers' historic flight at Kitty Hawk by more than two years - furnish photographs or other reliable evidence. Nor did his primitive plane ever fly for the public.

    Witnesses he produced - mainly youthful helpers or friends - were considered biased or unreliable.

    One of the more generous assessments came from the late Harvey Lippincott, archivist in the 1960s for United Aircraft Corp., forerunner of United Technologies Corp. He wrote that Whitehead "for all his eccentricities, should be acknowledged as a great and true pioneer."

    The design of his craft was no more preposterous, Lippincott said, than that of Samuel Langley's "Aerodrome," a winged craft built with federal backing that flopped dramatically into the Potomac River. Whitehead's streamlined fuselage, wheeled landing gear and tractor propellers out in front of the pilot all broke aeronautical ground.

    But the inventor's grandiose claim is beyond proof, Lippincott said. "There seems to be evidence that his airplanes made short flights or hops. But the proof required by current authorities to sustain his claims of flights of substantial length has not come forth."

    Michael Speciale, executive director of the New England Air Museum, is less circumspect.

    "Whitehead supposedly built this thing, flew over Long Island Sound one night and came back," he said, arching his brows."There were no witnesses. My question is: If it was so good, how come he never did it again?"

    A German immigrant who arrived in the United States in 1895 at age 21, Whitehead settled in Bridgeport five years later. He arrived with a reputation, having already made several unsuccessful attempts at flight. He was reported to have launched some sort of steam-powered airplane near Pittsburgh in 1899, almost killing a passenger who was scalded when it crashed.

    It was enough to prompt Gov. John Dempsey, prodded by partisans from the state's largest city, to confer the title "Father of Aviation in Connecticut" on Whitehead in 1964 in recognition of "his inventive genius."

    But even the gubernatorial proclamation allowed that Whitehead's flights might have been only in his mind.

    DID YOU MISS JOHN HANCOCK'S BIRTHDAY? (posted 1-13-03)

    You may have just missed John Hancock Day ... or maybe not. According to an article in the Christian Science Monitor, some place the date of his birth on January 12, others on January 23. The consequence is that those who admire him for his handwriting--and there are many in this category--celebrate his contribution to American penmanship on different days.

    Can no one out there settle the question once and for all of the birth date of Founding Father John Hancock? Please drop a note to the editor if you can. (editor@historynewsnetwork.org)

    NOTE: A reader has suggested that the confusion over his birthdate may stem from the use of different calendars. In the eighteenth century the calendar was changed; the change resulted in the shift of dates by eleven days. Thus, George Washington was born on February 11 (old style calendar) but February 22 (new style).

    2 MORE BUBBLES TO POP? (posted 1-13-03)

    Economist Dean Baker, commenting on an article by John Berry in the Washington Post, "For '03 Economy, Cautious Hopes" (January 5, 2003):

    This lengthy article discusses the economy's prospects in 2003. It makes no mention of either the housing bubble or the dollar bubble. The housing bubble has led home sale prices to outpace the overall rate of inflation by more than 30 percentage points over the last seven years, creating nearly $3 trillion of bubble wealth. The dollar bubble has led to a large increase in the current account deficit, which is now running at a rate of more than $500 billion a year. The bursting of one or both of these bubbles would have an enormous impact on the economy. Ignoring these bubbles at the start of 2003 is comparable to ignoring the stock bubble when discussing the economy's prospects in 2000 or 2001.

    14 WOMEN IN THE SENATE--A RECORD (posted 1-9-03)

    The Christian Science Monitor, taking note of reasons why the 107th Congress was historic (December 30, 2002):

    On Dec. 20 [the Senate] swore in its 14th woman - another record. Alaska Gov. Frank Murkowski's decision to name his daughter, Lisa, to fill his unexpired term also marked the first time a US senator was appointed by her father.

    "The shift to 14 woman is a very big historical footnote," says Larry Sabato, a political scientist at the University of Virginia."It means that slowly but surely we're moving toward eventual parity between genders in both the House and the Senate. We're making a lot more progress on gender equity than racial equity."

    DOW IN THE DUMPS (posted 1-8-03)

    USA Today, summing up the decline in the stock market (January 2, 2003):

    Shell-shocked investors now know all too well that stocks fell a third year in a row in 2002, a dubious achievement not seen in more than 60 years.

    As if that's not bad enough, there's a distinct possibility of a four-peat in which the downward spiral drags on and sends stocks down again in 2003. Merely entertaining the thought is frightening: The only time stocks have fallen a fourth year in a row was in 1932, when the USA was in the grips of the Great Depression. Some market historians insist that bleak possibility shouldn't be ruled out.

    "The market is still overvalued and still in the aftermath of the biggest bubble since 1929," says Robert Shiller, professor of economics at Yale University, who adds there's a greater than 50-50 chance stocks will fall for the fourth straight time this year.

    Shiller says investors are acting like gamblers, betting on a coin toss. "If you toss a coin and get three tails, people think the next toss will be a head," he says. But that's a fallacy, Shiller says, because statistically there's still an equal chance for both heads and tails to turn up on the fourth toss. Likewise, the fact that stocks have fallen for three years has no bearing on 2003. "There's nothing to prevent the market from falling four years in a row," he says.

    Others agree, saying the excesses from the bull market haven't been unwound yet. After all, the bull market from 1982 to 2000 far exceeded the one in the 1920s, so the hangover should be worse, says Gibbons Burke, editor of MarketHistory.com.

    That hasn't been the case, though. The Dow Jones industrials have fallen only 39% from the 2000 high to the October low, well below the 85% inflation-adjusted loss from 1929 through 1932 and the 75% drop from the 1966 peak to the 1982 bottom, he says.

    Others insist the chances of stocks falling for four consecutive years are slim if you consider:

    * Three-peats are rare. The Dow Jones industrial average has fallen three years in a row only four times, says James Stack, president of InvesTech Research. It's happened only two times before with the S&P and never with the Nasdaq. Even Japan's struggling Nikkei stock average has never fallen four years in a row, Stack says. "Odds are in favor of the bulls heading into 2003," he says.

    * Four-peats are even rarer. The Dow and S&P are the only major indexes to ever fall four years in a row, and they only did it once.

    * Stocks tend to rally after being down three or more years. The S&P rose 12% in 1942 after being down for three years. Even better, it skyrocketed 47% in 1933 after falling for four consecutive years.

    * Investors have been adequately punished for excesses. The Nasdaq composite has suffered three of its worst five years ever in 2000, 2001 and 2002. And 2002 was even worse than 2001.

    Market-watchers also point to seasonal trends that make a fourth down year less likely. For instance, 2003 is the third year of a presidential term, and stocks tend to do best in such pre-election years, says Jeff Hirsch, editor of the Stock Trader's Almanac. Falling again would "be a tall order," he says.

    THE DOW IN PERSPECTIVE (posted 1-2-03)

    Editorial, in the Wall Street Journal (January 2, 2003):

    Today's Dow is still nearly 2,000 points higher than when Alan Greenspan uttered the words "irrational exuberance." For that matter, stocks are up eight-fold since 1982, when President Reagan initiated the long boom with policies resembling those now being championed again by the Bush Administration.

    THE SENATOR WHO BECAME A GOVERNOR AND APPOINTED HIS DAUGHTER IN HIS PLACE (posted 12-30-02)

    Editorial in the Columbus Dispatch, commenting on Frank Murkowski's decision to resign from the Senate upon his election as governor of Alaska and appoint his daughter to replace him (December 27, 2002):

    The fact that she is the daughter of the governor who appointed her has some Alaskans upset, and rightly so. As one critic put it, "It rubs against people's sense of propriety." And it undermines the public's trust in elected officials.

    About two dozen senators have resigned to become governors over the years, and four of those have had the opportunity to appoint their successors, assistant Senate historian Betty Koed told the Anchorage Daily News. But none has appointed a daughter or a son, she said. Two did appoint their wives to the job. That's not a good idea, either.

    Some people might argue that there have been several instances where the wives of lawmakers have been appointed to fill their husbands' unexpired terms after the husbands' deaths. While some of those women may not have been the most qualified candidates, the important difference in those cases is that the wives were appointed by someone to whom they were not related.

    Frank Murkowski, who was sworn in as governor this month, says he considered 26 candidates and interviewed 11. He says he thought long and hard about appointing his daughter. He should have mulled it over a bit longer.

    WHEN THE SMITHSONIAN SNUBBED THE WRIGHT BROTHERS (posted 12-27-02)

    Miles O'Brien, CNN correspondent, reporting on the upcoming 100th anniversary of the Wright Brothers flight (December 21, 2002):

    Lost amid the high praise [of the Wright Brothers] is a stubborn embarrassment of history. Ninety-nine years ago, the Smithsonian refused to give proper credit to the Wrights and was not even interested in displaying the Flyer. Piqued with anger, Orville shipped the historic craft to a museum in Great Britain. The Smithsonian did not see the error of its ways and bring the Flyer home until 1948.

    THE BOOM IN NUCLEAR BOMBS (posted 12-20-02)

    John Else, writing in the San Francisco Chronicle (December 15, 2002):

    Since 1945, something like 100,000 nuclear bombs have been manufactured. Long ago we passed that milestone in history when it became possible to exterminate all life on Earth. That probably won't happen in our lifetimes. But if you can get together a coffee can of enriched uranium, the Hiroshima-sized bombs designed at Los Alamos are relatively cheap and easy to build -- in Iraq, in Israel, probably in Idaho or Paraguay if you're passionate and determined enough. Getting the fuel is the only really hard part.

    SUPERSTITIONS AREN'T AS OLD AS YOU THINK (posted 12-18-02)

    Richard Morrison, writing in the Times (London), about the origins of superstitions concerning Friday the 13th (December 13, 2002):

    Most think it is something to do with there being 13 people at the Last Supper. But the Last Supper happened on a Thursday! What’s more, although the inestimable Oxford Dictionary of English Folklore traces a belief about Friday being unlucky to Chaucerian days, and “13 phobia” to the 17th century, its editors can find no superstitious conjunction of Friday and 13 earlier than 1913! In other words, it’s a 20th-century fraud. Quite possibly some Hollywood publicist was behind it from the start.

    The same is true, it seems, of many other supposedly ancient superstitions that linger like shreds of old wallpaper in the pristine, rationalist rooms of the 21st-century mind. Like many people, I say “touch wood” when I mean “I hope”. I thought I was making some faint connection with the Celtic Druids who believed that trees contained benign spirits, or perhaps with the Catholic Good Friday rite of kissing “the wood of the Cross”. But the Oxford Dictionary is scathing. “There is no basis whatsoever for these explanations, beyond guesswork,” it says.

    And the experts are equally disdainful about the antiquarian claims of Britain’s other “Top Ten Superstitions”. Only one, that spilling salt brings bad luck, is more than 400 years old. True, the widespread superstition about black cats dates from the 17th century (though there is still no agreement about whether the felines in question bring good luck or bad). But the rest — broken mirrors, magpies, dropped scissors, walking under ladders, umbrellas indoors, new shoes on a table, etc, etc — turn out to be nonsenses concocted in the 18th, 19th or 20th century. And most of the “lucky rituals” that saturate the world of sport, gaming and entertainment are no older than my patio doors.

    KILLER FOG OF 1952 (posted 12-18-02)

    NPR (December 11, 2002):

    Dec. 11, 2002 -- Fifty years ago this month, a toxic mix of dense fog and sooty black coal smoke killed thousands of Londoners in four days. It remains the deadliest environmental episode in recorded history.

    The so-called killer fog is not an especially well-remembered event, even though it changed the way the world looks at pollution. Before the incident, people in cities tended to accept pollution as a part of life. Afterward, more and more, they fought to limit the poisonous side effects of the industrial age.

    WHAT THE US NEVER KNEW ABOUT SOVIET NUKES (posted 12-3-02)

    Richard Perle, in the course of a debate about "The State of the West" (December 2, 2002):

    It might be worth some time looking back at the history and results of the Arms Control agreements of the Cold War. We now know that the Soviet Union had 50,000 nuclear weapons, 20,000 more than we ever knew. They hid far more weapons than were ever subject to limitation in the course of those negotiations.

    IMMIGRANTS CONSTITUTED HALF THE NEW WORKFORCE IN 1990S--A RECORD (posted 12-3-02)

    Summary of the findings of a new report released by Northeastern University (December 4, 2002):

    During the decade of the 1990s, foreign immigration reached an all-time historical high in the U.S. when between 13 and 14 million net new foreign immigrants flowed into the country and contributed some 40 percent of the net growth in the resident population over the decade.

    As a consequence of both these immigrants’ relative youth and strong labor market attachment, they had an even more substantial – and heretofore unexamined – impact on the growth of the U.S. labor force, impacting the private sector to an even greater degree. New immigrants made up more than half of the growth of the nation’s entire civilian workforce between 1990 and 2001, but their impacts on labor force growth varied markedly by age, gender, region and state. Among males, new immigrants were responsible for 80 percent of the nation’s labor force growth, and within the New England and Middle Atlantic divisions they generated all of the labor force growth between 1990 and 2001.

    According to a new analysis of 2000 Census data and 2001 monthly CPS surveys by Northeastern University’s Center for Labor Market Studies prepared for The Business Roundtable’s Education and the Workforce Task Force in Washington, D.C., neither the dimensions of immigrant participation in the labor force nor the role they played in the 90s boom has been fully appreciated. No longer solely relegated to low-level service and manufacturing jobs, new immigrants increasingly made up large portions of those who worked in retail, trade, and business, high-tech, personal and professional services.

    “At no point during the past century did new immigrants ever contribute so substantially to the labor market growth of the country,” said Andrew Sum, director of the Center for Labor Market Studies at Northeastern and one of the authors of the study. “New immigrants’ role in contributing to the 1990s job boom in both sheer magnitude and breadth can no longer be ignored.”

    STERILIZATION 11-27-02

    A story in the Boston Globe, exposing the history of sterilization in Massachusetts (November 27, 2002):

    In the late 1920s, when eugenics was a respectable science discussed in liberal drawing rooms, researchers looking to prove that rural bloodlines had become tainted hit upon the perfect case study in the town of Shutesbury.

    Without revealing their purpose to residents, eugenicists spent many months gathering information about families in the Western Massachusetts town, drawing genetic charts that showed "what may be expected when good pioneer stock is mixed with bad immigrant stock." The families of Shutesbury were used as a case study in Leon Whitney's 1934 book "The Case for Sterilization," which argues that "the useless classes" should not be allowed to reproduce.

    Many of the descendants of those families are learning of the study for the first time this week after a Boston Magazine reporter gathered hundreds of long-forgotten documents from the offices of the former American Eugenics Society in Philadelphia. Her article, which appears today, sheds light on the obscure part that Massachusetts played in the selective-breeding craze that culminated horribly in the Nazi plan to eliminate Jews. Other papers found in Philadelphia document the forced castration of 26 teenage boys at the state-run Hospital for Epileptics in Palmer. The doctor who sterilized the boys - who were diagnosed with epilepsy, kleptomania, masturbation, or "solitary behavior" - described his actions as "an effective means of race preservation."

    The emerging documents made the biggest stir yesterday among the residents of Shutesbury, who racked their brains to imagine how their bloodlines could have been studied and presented to national specialists without anyone's knowledge.

    "It's just scary to think where that might have gone, that kind of report, if it fell into the wrong hands" said Roberta Hunting, whose father-in-law was Shutesbury's town clerk for 60 years. The Hunting family's genealogy, carefully penned out for three generations, was among the papers found in Philadelphia.

    Massachusetts was not an exception in its forays into eugenics. Historians estimate that as many as 60,000 Americans were sterilized without their consent in state institutions because they were alcoholic, epileptic, mentally retarded, or "morally defective." In a landmark decision in 1927, Justice Oliver Wendell Holmes defended the practice, writing that "instead of waiting to execute degenerate offspring for crime, or letting them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind."

    THE HOUSING BUBBLE 11-5-02

    Economist Dean Baker, questioning the conventional wisdom that the boom in real estate prices is a positive development:

    Home prices have risen by more than 30 percentage points in excess of the overall rate of inflation over the last seven years. This is an unprecedented run-up in housing prices, which has added more than $3 trillion to household wealth. The collapse of this bubble is likely to seriously dampen consumption and economic growth.

    HALLOWEEN: DEBUNKER'S VIEW 10-28-02

    Laura Miller, in Salon.com:

    Of all today's holidays, Halloween seems like the most primeval. Its bats, witches, spooks, skeletons and monsters surely indicate roots reaching back before the dawn of science and Christianity; the whiff of prehistoric campfires clings to its sable robes. Well, guess again.

    Halloween has been creeping up on Christmas to become the second biggest annual bonanza for U.S. retailers, a Grim Reaper that harvests $6.8 billion per year in exchange for candy, costumes, cards and party supplies. That success sets it up for the kind of debunking that Christmas has endured recently, as historians have shown that what we think of as time-honored Yuletide traditions are actually only about 100 years old. Likewise, as two new books document, the seemingly ancient customs of Halloween turn out to be recent embellishments to a holiday that used to be a pretty low-key affair. And forget those Transylvanian villagers and superstitious medieval peasants -- Halloween is as American as the Fourth of July. ...

    Despite the fact that conservative Christians in America have protested the "pagan" revelry of Halloween, the holiday owes its name and many of its trappings to Christianity. "Halloween" derives from All Hallows Even, the night before All Saints' Day (Nov. 1), which is in turn followed by All Souls' Day (Nov. 2), an occasion for praying for and visiting with the dead. In Mexico, the celebration of Los Dias de Los Muertos, or the Days of the Dead, closely resembles the old All Souls rites of the Middle Ages. The most extravagantly Catholic places had the grisliest practices: "In Naples," writes Rogers, "the charnel houses containing the bones of the dead were opened on All Souls' Day and decorated with flowers. Crowds thronged through them to visit the bodies of their friends and relatives. Sometimes the cadavers were dressed in robes and placed in niches along the walls." Leaving food out for the spirits was a fairly common ritual, as it still is in Mexico today.

    JAPAN: BASKET CASE? 10-25-02

    Economist Dean Baker, commenting on an article in the NYT by Howard French on October 13, 2002:

    This article discusses the progress of economic restructuring programs in Japan. At one point it comments that"for decades, the traditional approach to economic management has been to prevent corporate failure through heavy regulation and costly subsidies, passing along the burden to consumers and taxpayers." It is worth noting that Japan's per capita GDP rose at a 4.8 percent annual rate over the forty-year period from 1960 to 2000. This is one of the most rapid sustained growth rates by any nation in the history of the world. It suggests that Japan's system was enormously successful. This growth rate is far faster than any nation has been able to maintain following the policies advocated by the I.M.F. or World Bank. It would be appropriate to note the past success of Japan's economic model, and the possibility that, given this success, the forms of government intervention criticized in this article may have served a positive economic purpose.
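
    To see what a 4.8 percent annual rate sustained over forty years implies, here is a minimal compound-growth sketch in Python (illustrative only, not part of Baker's commentary); the rate and the 1960-2000 window come from the passage above.

        # Compound-growth check, using only the figures quoted above.
        rate = 0.048            # annual per capita GDP growth
        years = 2000 - 1960     # forty years

        multiple = (1 + rate) ** years
        print(f"Implied rise in per capita GDP: roughly {multiple:.1f}x over {years} years")
        # Prints roughly 6.5x, which is the magnitude behind Baker's claim that this
        # was one of the most rapid sustained growth records in history.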

    GLOBALIZATION IS NO MYTH 10-14-02

    Jay R. Mandle, in a recent article in the Historical Society's periodical:

    In 1970, 3% of manufactured goods originated in developing countries. Twenty years later that share was 18%, and today it is even higher. The image of poor countries as confined to supplying agricultural goods or raw materials no longer corresponds to reality. With this shift has come an acceleration of economic growth. Between 1990 and 1998 economic growth in the thirteen largest poor countries averaged 7.3% per year, higher than in any prior corresponding period—rapid growth by any standard.

    REQUIRING THE PEOPLE TO VOTE ON A DECLARATION OF WAR 9-16-02

    In 1937, on the eve of World War II, a Gallup Poll found that 80 percent of the American people approved of a measure requiring a national referendum on war. Proposals to require such a referendum had first been advanced in the 1910s, when pressure mounted to draw the United States into World War I. In 1916 Senator Robert La Follette introduced a bill to require an advisory referendum on war in the event the United States broke diplomatic relations with a European power. In the 1920s and 1930s nearly a score of resolutions were introduced in Congress backing a constitutional amendment to require a national referendum on war. In 1935 Congressman Louis Ludlow (D-Indiana) gained widespread national acclaim for a constitutional amendment requiring a national war referendum except in cases of invasion or attack. The proposal never made it out of the Judiciary Committee. After the Japanese attack on the gunboat Panay in December 1937, Ludlow mustered enough signatures on a discharge petition to force the House to vote on bringing the measure out of committee. Only FDR's strong opposition led to the measure's narrow defeat on the House floor.

    Source: Alexander DeConde, Presidential Machismo (2000), pp. 125-27.

    IMMIGRATION STATS 8-15-02

    Cal Thomas, in a recent column:

    In his 1992 book, "The Tyranny of Change: America in the Progressive Era: 1890-1920," John Whiteclay Chambers wrote of the great immigration wave of a century ago, noting that a majority of arrivals in this country never intended to stay. Many hoped that "after a few years of work, they could save enough money to return home to an improved position for themselves and their families."

    "Although the majority of new immigrants permanently settled in America, a significant number left (with a departure rate of 35 percent for Croatians, Poles, Serbs and Slovenes; 40 percent for Greeks; and more than 50 percent for Hungarians, Slovaks and Italians; the rate among Asian immigrants was much higher, more than two-thirds)," Chambers wrote. Today the departure rate is only about 15 percent and anyone who gets here, even illegally, can now expect his or relatives to legally follow.

    ARE WE SAVING ENOUGH? 8-13-02

    The media recently reported that the savings rate in the U.S. in June was 4.2 percent. This was regarded as a major development, since the savings rate has usually been lower since 1999. But should we be celebrating? Economist Dean Baker points out that before the 1990s the savings rate usually approached 10 percent. He adds: "With most of the baby boomers in their peak saving years, the U.S. would be expected to have a savings rate today that is higher than the historic average."

    BUSH AND THE STOCK MARKET 7-22-02

    According to the New York Times, President Bush "is off to the worst start of any president in the last 75 years. At least, that is, as measured by the performance of the Standard & Poor's index of 500 stocks. With the plunge in stock prices over the last nine weeks, the S.& P. 500 has now fallen 36.9 percent since Mr. Bush was sworn in on Jan. 20, 2001. That is the worst record for any president, as measured by the S.& P., which dates back to 1927, and is nearly twice as bad as the record compiled over the first 18 months of Herbert Hoover's administration."

    The paper points out that the market also dropped during the first 18 months of the administrations of Presidents Reagan and Nixon and both went on to win re-election.

    The first President Bush benefited initially from a high stock market, the S & P 500 rising 26 percent during the first eighteen months of his term. He lost when a recession occurred during the second half of his presidency. The paper concludes that presidents are probably better off having a recession in the first part of their term.

    Only three presidents since the Civil War, according to the paper, never faced a recession during their time in office: James Garfield (who died after six months), Lyndon Johnson and Bill Clinton.

    BUSH AND POLLS 7-17-02

    President Bush, according to a new Washington Post-ABC News poll, remains popular with 72 percent of the public. He has now remained hugely popular for ten months. How does his public opinion rating compare with that of other presidents whose popularity received a boost following a national crisis? Only one president succeeded in maintaining high poll numbers as long as President Bush; that was John Kennedy. His numbers remained high for twelve months following the Berlin crisis in 1961. By comparison:

    Date   President   Event                   Before   After   Change   Duration (months)
    1941   FDR         Pearl Harbor              72       84     +12        8
    1948   Truman      Berlin Blockade           36       39      +3        1
    1950   Truman      N. Korean Invasion        37       46      +9        5
    1961   JFK         Berlin Crisis             71       79      +8       12
    1962   JFK         Cuban Missile Crisis      61       73     +12        8
    1967   LBJ         Six Day War               44       52      +8        1
    1975   Ford        Mayaguez Incident         40       51     +11        8

    Source: J. Lee, "Rallying Around the Flag," Presidential Studies Quarterly (1978).

    UNEMPLOYMENT RATE 1-3-02

    At the height of the Great Depression the unemployment rate, it is estimated, was about 25 percent. The latest unemployment statistics are out. They indicate that the unemployment rate among black teenagers is 32.2 percent. Economist Dean Baker notes that this is ten points higher than a year ago. (The unemployment rate includes only those people who are actively seeking employment.)

    FIRST WORLD WAR 12-18-01

    ... from Cal Christman, as related on H-Net:

    Laurence Stallings edited a photographic history of the "Great War" in 1933, which he titled THE FIRST WORLD WAR, clearly implying that he expected a Second World War. He ended his photographic history by including pictures of Hitler, Mussolini, and Stalin on the last page, again implying that there would be another war to come. His may have been the first published work to name the conflict of 1914-1918 the First World War.

    9-11 GALLUP POLL 12-5-01

    According to the Gallup Poll, 72% of Americans predict that 100 years from now 9/11 will loom more important than Pearl Harbor.



    More Comments:


    Kim Epton - 1/3/2004

    "But the hole had its dangers; if the pot broke or cracked, the guerrilla could be attacked by poisonous spiders or snakes. Hence, 'spider hole'."

    Spiders and snakes are not poisonous; they are venomous (well, some, anyway). Toadstools are poisonous. There is a difference.


    Susan Karina Dickey, OP - 10/20/2003

    Paul Collins of the Australian Financial Review (10-17-03) did a fine job of summarizing the process for electing a pope in the story posted by HNN on Oct. 20. In the final paragraph he comments, "It is often forgotten that the pope's primary title is bishop of Rome and it could be argued that it is appropriate that he be an Italian, or that at least that he be able to speak excellent, idiomatic Italian, and be completely at home in western European culture."

    True, the pope's primary title is Bishop of Rome, but the primary ministry is the spiritual leadership of the world's Roman Catholics. Furthermore, many non-Catholics--Christian and otherwise--take note of the cultural critique offered by this figure. I speak not only of John Paul II, but of anyone holding this office. Even those who disagree with the pope would concede that the Holy Father helps to shape the international discussion of various cultural, economic, and social issues.

    One could argue that in an increasingly globalized world the pope should be from a non-western country. As for familiarity with the Italian language and European culture, many of the cardinals around the world were educated in Rome. Most of the "candidates" for the papacy are not strangers to the culture, yet a pope from Latin America or Africa would certainly bring a fresh perspective.

    Finally, the cardinals in their deliberations try to leave room for the influence of the Holy Spirit. Regrettably, some let personal ambition and interests interfere. But remember John XXIII? I daresay there are still a few surprises in store.



    William P. MacKinnon - 10/18/2003

    In your article "How Many Generals Have Been Elected President?" you also list generals who were nominated by their party but were not elected. You missed at least one: Brig. Gen. John W. Phelps (West Point, 1836) of Vermont, who ran in 1880 against another general (Hancock) as the American Party's standard-bearer. Phelps lost with only about 800 votes nation-wide.


    Rachael Focht - 10/14/2003

    Do you know where Massachusetts got its name from?


    Dave Livingston - 9/27/2003

    One of the best, among many, examples of deceitful anti-war propaganda served up as news during the war was the admittedly dramatic photo of Saigon Police General Loan executing a V.C. on the streets of Saigon during Tet 1968.

    The photojournalist took the photo of Gen. Loan killing a V.C. with his Smith & Wesson revolver. As said, it is a dramatic photo. The only caption to go with the photo said just that, General Loan executing a captured V.C., but the implication was that here is this nasty South Vietnamese general brutally killing, without benefit of proper legal process, some poor Vietnamese farmer.

    Humbug! What the jerk of a journalist did not trouble to learn, or if he did, did not bother to share with the photo, was that the supposedly simple, gentle Vietnamese farmer executed was in fact the commander of a Communist murder squad, captured immediately after having murdered General Loan's best buddy. Not only that, he had also murdered his best buddy's wife AND all nine, 9, of their children, including a babe in arms.

    A Jesuit once told me that instead of hating that murderous V.C. whom Loan executed, if our faith means anything I must instead pray for his soul. But I have not yet been able to bring myself to do that.

    Another illustration that we were the good guys was brought home to me by the story of Fr. Charles Watters, one of two chaplains, both Catholic priests, to be awarded the Medal of Honor for valor on the field of battle in Viet-Nam, but that is a story for another time.


    Dave Livingston - 9/27/2003

    Referring to the Republic of Viet-Nam, but not the Democratic Republic (Communist tyranny) of Viet-Nam, i.e., South Viet-Nam but not North Viet-Nam, the estimation that our bombing killed a million Indochinese is so much horse pucky. Unlike the chair-polishing Leftist America-haters relating this propaganda, I was in Indochina during the war: Lieutenant, 1st Infantry Division, 1966-7; Captain, 101st Airborne, March 1969 until 22 January 1970, when WIA during a firefight with Little Brown Brother.

    The only reason I was hit was because LBB brought along more of his neighbors and cousins from North Viet-Nam than one G.I. could handle.

    If a lot of North Viets were killed, that gives me no pleasure, but collateral damage is a fact of war, particularly modern warfare with its very destructive weapons. Despite what the Bleed'n Heart Left presupposes, U.S. military doctrine does not sanction the killing of civilians. We avoid it whenever practical, if possible, without risking the completion of the mission, striking a military target, for instance, or unduly risking American or allied lives. There is a practical reason in addition to the moral one for our attempting to avoid collateral civilian deaths: generally it is counter-productive because, contrary to uninformed opinion, rather than beating an enemy populace into submission it serves to stiffen enemy resolve--just look at the London Blitz. But of course, if the civilian populace is deliberately targeted to be murdered, there aren't enough people left for their resolve to be stiffened.

    But we never, as far as is known to me, deliberately attacked civilians as civilians in 'Nam. Indeed, on my second tour I was as much as ordered by a Colonel of the South Vietnamese Army to kill civilians out on the border with Laos, because he knew better than I that nearly all of those civilians were supporters of the Communists, whether willingly or unwillingly, via paying taxes and providing military recruits. But just because he wanted them killed and I had the opportunity to do so doesn't mean it was done. In fact, never once in two tours very frequently engaged with the enemy did I ever even consider harming, let alone killing, a non-combatant.

    To which I attribute, perhaps, part of the reason our Lord preserved my life when I was seriously WIA in one last firefight. Satisfied that my honor is unbesmirched and that my hands are clean of murder, or even of intentional harm to a non-combatant, in at least that one regard I know I am prepared to face my Lord on Judgement Day. It is beside the point that my soul probably will spend a long, very long, time in Purgatory as a consequence of the rest of my sins.

    It is very tiresome to read bleating anti-American propaganda about the supposed terrible things we did in 'Nam, but why is it that those among us weeping crocodile tears over our sins never, or at least hardly ever, mention the Boat People, the million plus, approximately 1/8th of the then people of South Viet-Nam, who fled the country in an extremely risky way, with unknown thousands dying at sea, to escape the Communist dictatorship that was about to be inflicted upon them? Then there are the estimated, no one in the West will ever know the true figures, tens of thousands who were murdered by the Communists, not only in their concentration-cum-murder camps, but also in the streets of the cities. It is recorded that the Hanoi government had to send special authorities to the South to rein in their own murder squads because the willful & deliberate murders of South Viets had gotten so out of hand it threatened to depopulate the South. Our fat-fannied, limp-wristed critics never bother to mention those uncomfortable facts.

    If you doubt my word, it is suggested you go to Westminster, California, where Boat People have established the largest Vietnamese community outside Viet-Nam. Their disdain, to put it mildly, for Communists and for Pinko fellow-travelers here in the States is illustrated by the bronze statue in a park, a statue bought & paid for by the Viet community, of an American G.I. and an ARVN soldier standing together in comradeship. Moreover, in the park are two kiosks, one listing the American KIAs, the other ARVN KIAs & missing.


    benjamin r. beede - 9/27/2003

    As usual, the date World War II ended depends upon one's perspective. Legally, for the U.S., World War II ended on December 31, 1946, because President Truman needed more time for economic controls to work in the post-World War II period. Extending World War II also extended G.I. benefits, thereby promoting recruitment of new soldiers who wanted the benefits. One could well argue, moreover, that World War II ended only with the signing of peace treaties, some of which were not negotiated, much less signed, until long after 1945. If one wants to restrict the end of World War II to combat operations, then it should be realized that scattered fighting continued for some time. There was significant combat in the Philippines early in 1946, for example.


    John Stobo - 8/19/2003

    Fighting did not end on August 14/15, 1945. The Soviets and Japan continued fighting until August 31/September 1. In the second half of August 1945 the Red Army completed its invasion of Manchuria, retook the southern half of Sakhalin, and occupied the Kuriles.


    David L. Carlton - 8/8/2003

    This item, I must say, flummoxes me. The foundation of the Duke fortune was the American Tobacco Company; from about 1905 forward much of that fortune was funneled into what is now Duke Energy. But does money made in tobacco stop being "tobacco money" if it's invested (laundered?) in another industry? Did the family keep its tobacco money and its utility money scrupulously segregated? Can one tell by the smell if the money is "tobacco money" or "power money"?

    The item makes (some) sense if one considers that the Duke Endowment--an entity that has underwritten Duke University and other institutions in the Carolinas since the 1920s--was funded with securities of what was then called the Southern Power Company. Indeed, many Carolinians at the time regarded the creation of the Endowment as a device to buy political support for the utility--not exactly a universally beloved institution locally. But the fact remains that the foundation of the Duke fortune was tobacco; the family would scarcely have had the means to underwrite a massive regional power generating and transmission company had it not been for its success in monopolizing the American cigarette market for some twenty years.


    Catherine Aitkenhead - 4/29/2003

    Ms. Miller states that Halloween owes much of its trappings to Christianity, as if Christianity were the source of Halloween traditions. In fact, Halloween comes from much older religions than Christianity. What is confusing Ms. Miller is that it was the Catholic Church that incorporated local pagan customs into its practices in the areas where it spread its doctrine. The fall festival of the northern Europeans became Halloween, which the Catholic Church associated with All Hallows' Eve. The spring festival of the goddess Eostre (a fertility goddess) was changed to Easter, which the Catholic Church associated with the resurrection of Christ. In fact, if you read the New Testament scriptures, nowhere are Christians told to keep certain days or feasts (except the Lord's Supper), and nowhere is there a date given for any of the events of Christ's life. There is a close correlation between many of the Catholic feast days and local ancient festivals. This was a device of the Catholic Church to bring peoples under its control.