Roundup: Talking About History
This is where we excerpt articles about history that appear in the media. Among the subjects included on this page are: anniversaries of historical events, legacies of presidents, cutting-edge research, and historical disputes.
From Bill Clinton's use of Fleetwood Mac's "Don't Stop" to Bob Dole's unlikely choice of Gary Glitter's "Rock'n'Roll Part Two", a key feature of every US presidential campaign is the pop or rock tune adopted to gee up supporters. But there's only ever been one candidate who wrote his own tune and had the great bebop vocalist Jon Hendricks create the lyrics: "Your politics oughta be a groovier thing, so get a good president who's willing to swing. Vote Dizzy! Vote Dizzy!" And he's the same candidate who wanted to change the name of the White House to the Blues House, install Duke Ellington as Secretary of State and revoke the citizenship of the racist southern governor George Wallace and deport him to Vietnam.
Forty years ago Lyndon Johnson didn't just beat Barry Goldwater for the presidency. He faced another challenger, a man whose puffed cheeks blew a musical revolution through his 45-degree-tilted trumpet: John Birks "Dizzy" Gillespie.
Perhaps because they didn't think his programme worth studying, political historians have consistently overlooked Dizzy's run for the presidency. But, in so far as he was ever serious about anything, Gillespie took the campaign seriously enough to come up with a platform and nominate his cabinet. The John Birks Society, formed to organise on his behalf, was active in 25 states and petitioned to place Dizzy's name on the ballot in California. In the end, Gillespie withdrew. As he put it in his autobiography, To Be, Or Not... To Bop: "I never thought the time would come when I'd vote for Lyndon B. But I'd rather burn in hell than vote for Barry G." On the way, however, he drew extra attention to the issue of race and provided disaffected voters with something markedly different from the mainstream duopoly; and, of course, had a lot of fun.
The idea for the campaign was dreamt up by Jean Gleason, wife of the jazz critic Ralph Gleason, and Ramona Crowell, a longtime fan of Dizzy's who first met him while seeking his permission to use his image on a T-shirt. "It was at the Black Hawk in San Francisco," recalls Crowell. "He tried to hit on me because he didn't know that I was married." Gillespie asked her to meet him at his hotel the next day to discuss the T-shirt. "I said I would," says Crowell, "but Ralph Gleason warned me not to go up to his room, because Dizzy was a notorious womaniser." Assured that there were other people present, Crowell eventually did go to the room, where she found a group of people drinking "gallon jugs of wine", and after an afternoon of drinking and eating, Gillespie gave his permission.
In the summer of 1963 the Gleasons began the campaign with a rally in Chicago, and soon "Dizzy for President" badges were to be seen at Core (Congress of Racial Equality) rallies around the country. That September the operation gathered momentum at the Monterey Jazz Festival, where Hendricks wrote the lyrics to the campaign song - it was sung to the tune of an old Gillespie number, "Salt Peanuts" - and performed it with Dizzy's quintet. It was also the first time that Dizzy met Ramona Crowell's husband, Kenney. "He hugged me and kissed me," says Ramona Crowell, "and then he turned to Kenney and said, 'How are you?' Kenney said: 'I'm okay, but I don't like you kissing my wife.' So Dizzy said, 'Oh, are you jealous?' And he went over, grabbed both of Kenney's cheeks and gave him a big kiss on the mouth."
Ramona Crowell was by this point Dizzy's vice-presidential running-mate. Miles Davis was pencilled in as director of the CIA, Louis Armstrong as Minister of Agriculture, Thelonious Monk was to be Roving Ambassador Plenipotentiary, and other cabinet members were to include Ella Fitzgerald, Peggy Lee, Woody Herman and Count Basie. According to Dizzy, the drummer Max Roach wanted to be Minister of War, but was overruled, because, said the candidate, "We're not going to have any." The Library of Congress was to be in the charge of Ray Charles, and Charles Mingus was to be Minister of Peace, "because he'll take a piece of your head faster than anybody I know".
Dizzy's campaign promised that if he was elected, he would fight for civil rights and equal opportunity in the job market. To ensure that employers were truly blind to race, Dizzy proposed that those applying for jobs would "have to wear sheets over their heads so bosses won't know what they are until after they've been hired". He promised to end the war in Vietnam and to give full diplomatic recognition to China (which the US was not to do until 1979). Healthcare and education were both to be free.
In recognition of his most loyal constituency, Dizzy said he would push for the creation of civil service nightclubs where jazz musicians would be guaranteed work as government employees. Nasa was also to be instructed to send a black astronaut to the moon. When the Gillespie campaign couldn't find any qualified applicants, the candidate volunteered to go himself.
Dizzy spent time on the campaign into early 1964, during his residency at Birdland in New York, although after he failed to get on the ballot in California (he claimed he "almost" got on), as Ramona Crowell puts it: "It sort of fizzled out." But Dizzy always insisted that it had not been just a publicity stunt. "Anybody coulda made a better president than the ones we had in those times," he wrote, "dilly-dallying about protecting blacks in their civil and human rights and carrying on secret wars against people around the world. I didn't think there was any choice. I had a real reason for running because the proceeds from the sale of the buttons went to Core and the SCLC - the Southern Christian Leadership Conference, whose president was Dr Martin Luther King, Jr - and I could threaten the Democrats with a loss of votes and swing them to a more reasonable position on civil rights."
"It shone a light on the whole thing," says Hendricks. "Like, what about a black person running for president? It had never happened before. At the same time, black people were saying to the Democrats, 'We don't have to vote for you.' It was to give both political parties, all those poseurs and jive-talkers, a kick in the butt."
Ramona Crowell thinks the political process in America has been "downhill ever since" 1964. Like many involved in the "Dizzy for President" campaign, she is not a fan of the White House's current incumbent. "We've all," she warns, "got to get our mojos working."
Ulysses S. Grant is widely remembered as a great general and a poor president. In his new biography of the 18th president, historian Josiah Bunting III notes that pundits and historians have tended to write of Grant the president with "condescension." Bunting sums up this view of Grant: "There may be strength in his soul, but no fineness in it, no grace; little culture, small learning . . . meager evidence of the capacity to learn and reflect; no felt obligation to explain himself; no evidence of self-doubt."
If those words strike you as remarkably similar to the view of President Bush that is dominant in academia, much of the media and urban-oriented blue-state America, then you see one of the parallels between Grant and Bush that fairly leap off the pages of Bunting's fascinating book.
These similarities struck me with particular clarity during the recent round of debates between the president and his Democratic opponent, Sen. John Kerry, that polar opposite of Bush in ideology, culture and worldview.
It's worth pondering the parallels between Grant and Bush because the American people are about to decide what qualities they want in their president, and the choice has rarely been starker. Bush, for good or ill, shows many of the same character traits, strengths and weaknesses as the Union general who became known as "Unconditional Surrender" Grant -- whose single-minded pursuit of victory at whatever the cost ultimately won the Civil War, but whose naive trust in and loyalty to his friends resulted in a scandal-marked presidency.
Man of few words
Grant was neither cultured nor eloquent. Bunting calls him a "stumpy, awkward, bashful man" of few words, who won a reputation in the Civil War as "a fighter not a talker." He was a rough-cut Westerner, a cigar-chewing general who would sit on a stump during battles, oblivious to personal danger, issuing terse orders.
As the casualties piled up and the Northern press called him a "butcher," he bulldozed ahead, never second-guessing himself through the blood-soaked horrors of 1864 to the final surrender at Appomattox, Va., in 1865. His tenacity, like Bush's regarding Iraq, was widely questioned during the war, but later cited as crucial to victory.
As a general-turned-politician, Grant was a poor public speaker. He could write clear and precise battle orders but couldn't give a decent speech. Bunting believes that was no handicap to Grant as president. "America has always loved a tongue-tied hero, a Charles Lindbergh, a Lou Gehrig," Bunting writes. "Here was the very incarnation of Act not Talk."
That, too, is what millions of Americans admire in George W. Bush: He is Act, not Talk. The contrast between Bush and Kerry on this count is telling. The president clearly was outclassed on eloquence in the debates, where Kerry's mastery of language and argument was evident. But Bush's record as president shows a willingness to make big decisions, take big risks, stick with a course of action and play for victory. This is the essence of strong leadership.
The doubts about Kerry run to the opposite qualities: Is he too indecisive, too politically calculating and too quick to change positions under pressure -- all too much Talk not Act?
This is Bunting's take on Grant's decisiveness: "Grant was willing to make decisions and live with their consequences, sustained, as William Tecumseh Sherman once said, by a constant faith in victory." Bush was willing to gamble his presidency on war with Iraq and seems sustained by an unshakable faith that freedom will triumph over terrorism. He will live with the consequences, which could arrive on Nov. 2.
Letters and photographs recalling the lasting bond between Oscar Wilde and his Canadian patron Robert Ross -- the man believed to have sparked Wilde's embrace of a homosexual life and the eventual saviour of his literary reputation -- are to be auctioned next week in Britain.
Ross was a grandson of Robert Baldwin, co-founder of responsible government in Canada, and was born in 1869 into a distinguished Toronto family.
He was educated in Britain and by 1887 had befriended Wilde, 13 years his senior and one of English literature's leading talents of the day. Ross is widely assumed to have been the writer's first male lover after Wilde -- who was married with two children -- began feeling drawn to gay relationships.
Ross loyally defended Wilde when his homosexual activity later landed him in jail under Britain's 19th-century anti-sodomy laws.
The Canadian also comforted Wilde during his final illness in 1900, and an extremely rare photo of the dead author arranged by Ross -- valued at about $20,000 -- is among the objects being sold by Sotheby's.
Ross, who is also credited with nurturing the career of British poet Siegfried Sassoon and a host of other artists, was almost single-handedly responsible for rescuing Wilde's estate from bankruptcy and re-establishing public interest in his work in the 20th century.
Ross died in 1918, and his ashes were later placed in Wilde's tomb in Paris.
Among the artifacts to be sold next week in London are an original photograph of Wilde signed to "Bobbie" (estimated value $35,000) and the first letter written by Wilde to Ross when his Canadian friend decided to leave London to attend the University of Cambridge ($22,000).
"My dear Bobbie," Wilde wrote in the October 1888 letter. "I congratulate you -- university life will suit you admirably -- tho' I shall miss you in town."
Ross was a leading man of letters and an influential art gallery owner in turn-of-the-century London.
Sotheby's notes that as early as 1887 "Ross may have seduced Wilde, and it has generally been assumed that they were lovers at some time. ... In any event, Ross would prove to be the most faithful and most trustworthy friend Wilde ever had."
Authorities in the Spanish region of Aragon, whose kings helped evict the Moors from Spain 500 years ago, have stirred controversy by suggesting that the severed heads of four Moors should be removed from its heraldic shield.
The heads have upset the semi-autonomous region's growing population of Muslim immigrants, provoking its socialist administration to propose that the heads be erased from its bottom left-hand quarter.
"This is the ideal moment . . . to revise our symbols," the regional government's president, Marcelino Iglesias, said. "We should think about one of the quarters that are on the current shield."
The Moors' heads are on the shield to mark the conquest of the northern Aragonese city of Huesca by one of the region's first Christian kings, Pedro I, in 1096.
That was in the early stage of the Christian reconquista that eventually saw the last Moorish king driven out of Spain in 1492, 700 years after they conquered most of the Iberian peninsula.
Opposition politicians yesterday accused Mr Iglesias of trying to airbrush the event out of Spanish history.
"We don't really think that the people of Aragon care about this," said Antonio Suarez, of the conservative People's party opposition. "Should we also take the crosses off our shield because they might offend other religions?"
"Changing an emblem of this category should not be done lightly," Guillermo Redondo, a historian and expert in Spanish heraldry, told the local Heraldo de Aragon newspaper. "There is a fair amount of ignorance and ingenuousness (in this idea)."
Mr Redondo pointed out that the four heads were similar to those carried on the shields of Corsica and Sardinia and said that they reflected the region's past as a land of both Christian and Islamic cultures.
"It is the sum of the Christian culture, because of the crosses, and Islamic culture, because of the heads. There is no blood on them - as there is on other pieces of iconography," he said.
But the proposal has won support from others.
"It is about time," said Adolfo Barrena of the local far-left United Left-Aragon party. "Symbols like this do nothing to favour integration and multiculturalism."
"It is a positive thing that would favour coexistence (between religions), in Aragon," agreed the general secretary of the Union of Islamic Communities in Spain, Riay Tatary.
At least four skeletons, including a child, were discovered on a 20-hectare block between Mazengarb Rd and the Southwards Complex two weeks ago. The find stopped earthworks at the site.
The skeletons were first thought to be European because they were found buried in coffins constructed with nails and there was no iwi knowledge of burials in the area.
However, consultant archaeologist Mary O'Keeffe said that, based on an initial archaeological investigation, the skeletons appeared to show Maori physical characteristics.
The findings still had to be verified by a physical anthropologist.
Ms O'Keeffe said there was a potential for other archaeological sites in the immediate vicinity based on the Kapiti Coast's rich and diverse environment at the time.
"It was a very desirable place to live," she said.
The skeletons were believed to date back to the early 19th century. Ms O'Keeffe based the date on the fact that the bodies had been buried using European burial customs.
"European influence from missionaries and traders was increasing on the Kapiti Coast from the early 19th century.
"A piece of cloth was found in association with one of the burials which may have been a blanket wrapped around the body -- again suggesting European influence," she said.
In a dark wood in northern Poland, up near the Russian border, something dreadful is happening: Hitler is being brought back to life. There have been resurrected Hitlers before, of course, on screen and in novels, but nothing remotely like this. Using "the most innovative computer animation techniques ever seen on TV", Hitler's face has in effect been superimposed on to an actor's body.
Although this hybrid Hitler is only partially flesh and blood, he struts, rants and gesticulates just like the real thing. But having breathed new digitalised life into the Fuhrer, what do you do with him? This being television, the answer is simple: you give him his own show.
Virtual History: The Plot to Kill Hitler is a cod documentary about the attempt to kill him in 1944 - "The July Plot". On July 20, 1944, a group of high-ranking German officers, led by Colonel Claus von Stauffenberg, tried to blow Hitler up at his Polish headquarters, the Wolf's Lair. The plot failed, but only just: a briefcase full of explosives detonated as planned, but left Hitler with only minor injuries. He soon recovered, the plotters were shot, and the war continued for another year....
Now, just a few yards away from where Stauffenberg's bomb exploded, The Plot to Kill Hitler is being screened in the Fuhrer's former communications bunker. So what exactly is the future of history? Well, on the evidence here, it looks very much like being fabrication.
What the film-makers have done is to shoot "new" newsreel footage that resembles the real thing but in fact shows events that were never filmed at the time. Thus Hitler is seen having his breakfast and being injected with his daily "pick-me-up" by his physician, while Churchill loafs about in his dressing-gown until mid-morning reading the papers.
In order to obtain the desired level of would-be veracity, the actors involved had to wear specially designed frames on their heads while they were filmed. These recorded each of their movements, however tiny. Meanwhile, images of the three historical faces were scanned into a computer. Once filming was completed, the actors were digitally decapitated and their heads replaced by those of their computer-generated counterparts.
The results may not be faultless - Churchill's head, in particular, seems to be moving in a different plane to the rest of him - but they were certainly good enough to send a ripple of astonishment through the assembled hacks and historians.
According to David Abraham, head of Discovery Networks Europe, "Audiences will be able to see historical figures as they never have done before. In future, if it's done responsibly, this technique will allow us to tell history's greatest stories in a completely new way."
The key word here, of course, is "responsibly". Everyone involved is very aware that in the wrong hands this technique could cause an even greater blurring of the line between fact and fiction than exists at the moment. For instance, you could create convincing broadcasts from despots or crackpots who had either died years beforehand or didn't want to run the risk of actually being filmed.
As the historian Andrew Roberts, one of the advisers on The Plot to Kill Hitler, concedes, "Yes, it is possible to create false history like this. But here we've been very careful only to construct scenes that are non-contentious, and we haven't tried to reconstruct dialogue. I think that as long as you make it perfectly clear what's going on, then there isn't an ethical problem."
Also in the audience is a dapper, black-haired man with something worryingly familiar about his features. He is Clive Brooks, a former official in the immigration department of the Metropolitan Police, who plays Hitler - at least he does from the neck down. But in order that the visual effects team should have a suitable base on which to superimpose Hitler's face, it was necessary that there should be a close resemblance between actor and dictator.
"People have told me several times in the past that I look like Hitler," admits Brooks, trying to sound as cheerful about it as one can under the circumstances. "When they approached me, I thought it would be fascinating to have a go. However, I wasn't under any illusions. Basically, I was there as a body actor. I had to hit my marks and turn my head at the right moment. But in a sense," says Brooks, valiantly if none too plausibly, "you could argue that there was more acting involved; that it's easier to give your face to the camera than it is to keep it away."
With that, we gathered up our kitbags and tin mugs and were loaded back on to the trucks. On the way to the airfield, accompanied by motorcycle outriders in WW2 uniforms, we passed a line of cars waiting at a level crossing. Inside the cars people stared in astonishment as history, briefly replayed as farce, swept past their gaze.
Statewide textbook adoption, the process by which 21 states dictate the textbooks that schools and districts can use, is fundamentally flawed. Textbook adoption distorts the market, entices extremist groups to hijack the curriculum, enriches the textbook cartel, and papers the land with mediocre instructional materials that cannot fulfill their important education mission. The adoption process cannot be set right by tinkering with it, concludes The Mad, Mad World of Textbook Adoption, the latest release from the Thomas B. Fordham Institute. Rather, legislators and governors in adoption states should eliminate the process and devolve funding for and decisions about textbook purchases to individual schools, individual districts, even individual teachers.
The Mad, Mad World of Textbook Adoption is the first of a new Fordham Institute series, "Compact Guides to Education Solutions," that provides practical solutions to K-12 education problems for policy makers, legislators, school leaders, and activists. These concise guides are meant to help drive reforms at the local, state, and national levels by offering actionable policy recommendations.
Textbook Adoption: The process, in place in twenty-one states, of reviewing textbooks according to state guidelines and then mandating specific books that schools must use, or lists of approved textbooks that schools must choose from.
Textbook Adoption Is Bad for Students and Schools
It consistently produces second-rate textbooks that replicate the same flaws and failings over and over again. Adoption states perform poorly on national tests, and the market incentives caused by the adoption process are so skewed that lively writing and top-flight scholarship are discouraged. Every individual analyst and expert panel that has studied American K-12 textbooks has concluded that they are sorely lacking and that the adoption process cries out for reform.
Textbook adoption has been hijacked by pressure groups. The textbook adoption process has been a feature of American education since Reconstruction, when former Confederate states issued guidelines for school materials that reflected their version of the Civil War. In the present day, special interest pressure groups from the politically correct left and the religious right exert enormous influence on textbook content through bias and sensitivity guidelines and reviews that have dumbed down textbook content in an attempt to render them inoffensive to every possible ethnic, religious, and political constituency.
Textbooks are now judged not by their style, content, or effectiveness, but by the way they live up to absurd sensitivity guidelines. Do literary anthologies have more male than female story characters? Do textbooks portray stereotypes such as female nurses or male mechanics? Do history textbooks suggest that religious strife has been a cause of conflict in human history? Do they mention junk food, magic, or prayer; suggest that the old are wise or the young are vigorous; or leave out any ethnic, racial, or religious group, no matter how small? If they do, that is grounds to have a textbook rejected.
The adoption process encourages slipshod reviews of textbooks written by anonymous development houses, according to paint-by-numbers formulas. Textbooks are not actually carefully reviewed—and sometimes are not read at all by those who act as "reviewers." They are scrutinized instead with a superficial "checklist" approach that identifies whether textbooks have presented key words and phrases without viewing the entire textbook for quality, accuracy, and content. States often apply "readability" formulas to ensure that textbooks use simpler words and phrases, resulting in a lowest-common-denominator approach. Reviewers almost never have to sign their reviews, and the entire process is cloaked in secrecy laws. Meanwhile, textbooks are almost never field tested to gauge whether they are effective in raising student achievement.
Finally, textbook adoption created a textbook cartel controlled by just a few companies. Requiring publishers to post performance bonds, stock outmoded book depositories, and produce huge numbers of free samples has raised the costs of producing textbooks. This has frozen smaller, innovative textbook companies out of the adoption process and put control of the $4.3 billion textbook market in the hands of just four multi-national publishers.
The Bottom Line
There is no evidence that textbook adoption contributes to increased student learning. In fact, the vast majority of adoption states are also in the bottom half of all states when it comes to NAEP reading and math scores.
It's taken months of removing soot, tackling water damage, and reorganizing, but readers and researchers are back at Iraq's National Library.
Nearly a year and a half after one of Iraq's chief repositories of historical record was looted and burned, surviving archives and manuscripts are being cleaned and catalogued - while the director ventures out occasionally to scour book markets for lost treasures.
At the same time, the Iraq Museum remains closed. Its location near a hotbed of resistance puts it in the crossfire of frequent attacks on US forces. But its directors express high hopes of reopening a museum - perhaps within a year - that far outshines that of the Hussein era.
Today both institutions, early symbols of postwar troubles, are looking toward a fresh start.
"We want to be not just a part of Iraq's new democratic and liberal culture, but a leader in it," says Saad Eskander, a Kurdish historian who was appointed library director last December. "There's still a lot of work to do and we could use much more help, but the library has come a long way since those dark days after the war."...
At the National Library, director Eskander says the blame for cultural losses must be laid at the feet of Iraqis and Americans alike. Receiving guests in an office that before the war was the kitchen of the library's theater, Eskander says, "There is no question the Americans neglected their duty as military occupiers. But what happened to this library was still primarily the fault of the former director general."
About 60 percent of the records and documents of modern Iraq were lost, along with virtually all historical maps and photos, and perhaps 95 percent of rare books, Eskander says. Almost all equipment was destroyed or carried away as well.
The wrong relocation
The former director - once the preferred poet of Saddam Hussein - was dismissed after accusations that he removed rare books from the collection. But Eskander faults the former director for a different decision: moving the library's rare books and national archives to the basement of the nearby ministry of tourism in the prewar frenzy.
"The best thing would have been to move those collections to nearby mosques," he says, "but there was a reason for choosing that ministry: It was a fortress of support of the Baathist regime and housed officials" from Mr. Hussein's intelligence forces.
Eskander says the move meant the books and archives in that basement survived the burning and looting. But about two months after Baghdad's fall, he says, "someone entered the basement, took what they wanted, and opened the water taps."
The objective, some speculate, was to obliterate the Republican Guards' archives, which were among the documents. But about 40 percent of Iraq's archives from the Ottoman Empire, along with rare books and manuscripts, were also destroyed.
The threatened total loss of documents prompted swift action from the US military, Eskander says. When it was determined that the best response would be to freeze the soaked documents for later restoration, officials quickly came up with $70,000 to purchase special freezers.
Still, Eskander barely hides his disappointment in other US institutions as he tours the library's gutted shell. Reaching a collection of vacuum cleaners, he says, "This is what the Library of Congress came up with to help us out - and then they wanted pictures of them in use, like they thought we were going to steal them for personal benefit."
Eskander says the US has committed to placing several library employees in archival restoration programs in the US - but as yet has refused to issue them visas.
The US official says "fears of terrorism" are holding up the visa process in general and not just for Iraqis. But, he adds, "we will get the library those visas" - if they wait long enough. Meanwhile, Eskander says he is pursuing restoration programs in European countries.
Tim Radford, in the Guardian (Oct. 14, 2004):
[What has medicine learned from the Nazis?] In the 1920s, German scientists correctly picked up on x-rays as a possible source of genetic damage. In the same decade they also launched a huge campaign against tobacco, condemning it as a "plague" and "lung masturbation", according to Robert N Proctor, the historian, in his book The Nazi War on Cancer. The catch is that these scientists were eugenicists and were worried about the corruption of German germplasm. Smoking, for instance, was "unGerman" and a vice propagated by Jews.
A decade later, Nazi scientists identified the dangers of organochlorine pesticides such as DDT before anyone else, and launched campaigns to discourage alcoholism. German scientists of the period made the link between asbestos and lung cancer and developed the first high-powered electron microscope. They also promoted breast self-examination to detect tumours at an early stage. Nazi leaders backed all these campaigns. Hitler was a vegetarian. Heinrich Himmler lectured the Waffen-SS on the importance of vitamins, minerals, whole foods and fibre in their diet.
The question arises because the American Medical Association yesterday announced a series of lectures on Nazi medicine at medical schools around the US. The lectures have been organised in association with the US Holocaust Memorial Museum and the collaboration coincides with an American exhibition called Deadly Medicine: Creating the Master Race.
Alan Wells, of the American Medical Association, says: "During the 1930s, the German medical establishment was admired as a world leader in innovative public health and medical research. The question we want to examine is: 'How could science be co-opted in such a way that doctors as healers evolved into killers and medical research became torture?'"
Adolf Hitler spoke of Germany as a body and himself as the doctor who wanted to make the nation healthy by eliminating the diseased parts. This programme of national health began with sterilisation and ended, of course, with six million deaths in the concentration camps. These events left their mark on all medicine.
"We can never forget this history because it continues to affect medical ethics today," Wells says.
"For example, one reason doctors today are so concerned about racial and ethnic health disparities is because our codes of ethics demand that we treat every person equally, without regard to race or ethnic background. This ethical obligation is a direct outgrowth of the horrors of Nazi medicine."
... contemporary attitudes toward inebriation mean that "it's hard for people to take the study of alcohol consumption seriously," says Richard W. Unger, a professor of history at the University of British Columbia and the author of Beer in the Middle Ages and the Renaissance (University of Pennsylvania Press).
Mr. Unger says such attitudes stem from a failure to realize that brewed beverages were a necessity before the advent of dependable, clean water supplies or soft drinks.
In early modern Europe, "beer was a normal part of daily life," says Mr. Unger. With food often scarce, it was a nutritional godsend. It was also an all-purpose social lubricant, regarded with "neither suspicion nor awe," he says. Consumption per person far exceeded today's levels. For children as young as 4 years old, too, beer was a staple. Presumably they were fed weak brews, says Mr. Unger, but not too weak: "If the alcohol levels are low, the nutrition level is low, too."
So was medieval life like living in a college fraternity today? Perhaps so, for beer was brewed in one in 10 houses, all by trial and error, as modern methods of chemistry had not yet been developed.
"You couldn't walk down the street without smelling beer," says Mr. Unger.
The historical sources on beer are vast, Mr. Unger says, because "governments have regulated the production of alcohol since at least 3500 BC, and governments keep records."...
President Bush says the presidency is hard work and that the most important decisions are often not politically popular.
Sen. John F. Kerry replies that the president is devious and stubborn, most notably and tragically when he is wrong.
Both combatants should find comfort in "Decisions That Shook the World," a three-part Discovery Channel series that, starting tonight, dissects presidential decision-making at its most crucial and, at times, most devious and controversial.
Presidential historian Michael Beschloss, also co-producer of the series, notes in his introduction that in this presidential election season questions about the presidency and its enormous power are paramount.
"Decisions" looks at three cases of presidential leadership: Lyndon B. Johnson's decision to champion civil rights despite opposition from his fellow Southerners; Ronald Reagan's insistence on the "Star Wars" missile defense system when many people thought he was daft; and Franklin D. Roosevelt's determination to bootleg help to the British even before the U.S. entered World War II.
American presidents are finaglers, "Decisions" tells us. They sometimes ignore public opinion or uncomfortable facts when they think that the survival of the nation is at stake. They're politicians. They bully, seduce and flatter the opposition, foreign or domestic. They are routinely vilified by opponents as the most dire threat to the republic since its creation. One film clip shows a 1940 protester outside the White House with a sign, "Hitler Has Not Attacked Us -- Why Attack Hitler?"
If the much-praised "American Experience" on PBS is a graduate seminar on American history, "Decisions" is an undergraduate survey course -- but a good one with compelling film clips, a well-argued thesis and a sprinkling of insider interviews.
The series starts with Johnson's decision, almost immediately after assuming office, to take up the civil rights legislation of his fallen predecessor and make it the high point of his administration.
Although Johnson had shown little interest in civil rights during his years in the Senate -- and routinely used the N-word -- he changed virtually overnight. Washington power broker Vernon E. Jordan Jr. tells viewers that when it comes to civil rights he would rather have a "converted Southerner than a wobbly Northern liberal."
It is a sympathetic portrayal of Johnson. Vietnam, where the same presidential steadfastness marched the nation into a quagmire, is given only a glancing reference. But survey courses are broad brush, and the focus of "Decisions" is on the civil rights and voting rights bills.
Next week's episode, on Reagan and "Star Wars," may be the least compelling. On this one, "Decisions" could have used some dissenting or contrasting views. The argument that the threat of "Star Wars" pushed the Soviet empire to collapse is just that: an argument.
Still, there is a good, if somewhat hagiographic, look at Reagan's personality. "There was a stubbornness to Reagan when he thought he was right," says Michael K. Deaver. It's an insight that contradicts the widely accepted idea that Reagan was easily manipulated by his staff.
The best of the three episodes details Roosevelt's silent vow to help the British in the face of American isolationism and a strong challenge in the 1940 election by Wendell Willkie. FDR may have been the wiliest man in the Oval Office.
"In 1940, Franklin Roosevelt knew he might have to skate to the edge of telling untruths or even breaking the law," Beschloss says. "He didn't want to do it, but if it was necessary, he was ready." Roosevelt promised voters, "I will never send your boys to a foreign war," knowing full well he would someday do just that.
He had "an enormous self-assurance that allowed him to feel he always knew what was best, even when he was wrong," Beschloss says. He stepped on civil liberties. He quietly authorized wiretapping when it was still outlawed by the Supreme Court.
"Whenever he faced a question of liberty vs. security, he would choose security," said historian and Roosevelt biographer James MacGregor Burns.
If there is one thing that all three presidents in "Decisions" shared, it is a mastery of the stage. Which brings us back to where the series begins: politics and the national stage, with two men passionately promising to make different decisions on our behalf. On Nov. 2, the public will make its own decision.
The United States is always urging other nations to embrace its form of government. But although it is the world's oldest constitutional democracy - formed more than 200 years ago when some important countries of today did not even exist - not one foreign state has ever adopted the American system.
The reasons for this are simple, though they require a knowledge of history. Because the US form of democracy was decided in the late 18th century, important aspects are outmoded today. It is also extraordinarily complicated, so much so that many Americans still fail to understand it.
These citizens were shocked after the last presidential election in late 2000, when it became clear that although Vice-President Al Gore, the Democrat, had 530,000 more votes than his opponent, the Republican George W Bush, the latter was headed for the White House.
This November, a similar outcome is possible, perhaps with the parties' fortunes actually reversed. The final spectacle on Jan 6, 2001, when Mr Gore presided in Congress over the formal registration of his own defeat, exposed another shocker: Americans do not directly vote for their president (in fact the Constitution offers no federal right to vote at all).
Instead, something called the Electoral College had taken over. This was not a group of academic political scientists, but a few hundred loyal party members, or 'electors', from every state, pledged, in all but two states, to cast their votes for the president on a winner-take-all basis. That was why the election in Florida was vital; its electoral college votes decided the result.
There are deep flaws in this system, and it has been challenged unsuccessfully in countless proposed constitutional amendments over two centuries. Twice in the 19th century, in 1876 and 1888, the winner lost the popular vote, and the disputed elections of 1800 and 1824 had to be decided in the House of Representatives.
The US was embarrassed by the 2000 debacle, but rarely discussed is something even more mortifying: That the basic reasons for the electoral college are anti-democratic and racist.
The founding fathers invented it to block what they feared as 'mob rule' in choosing the president, but it also functions unfairly due to a reason rooted in slavery.
Only in November last year did a scholarly book appear devoted solely to this unpleasant complication. The distinguished historian Garry Wills published Negro President: Jefferson and the Slave Power, about the 1800 election and how Thomas Jefferson's victory depended on the nation's population of disenfranchised slaves. Hence the derisory name given him that provided Mr Wills with his title.
In a notorious decision in the 1787 Constitutional Convention, the founding fathers decided that each slave would count as three-fifths of a white person. This was not because they literally believed that's all the humanity African-Americans possessed.
It was instead a compromise fraction, adopted to head off threatened rebellion by the Southern states, whose free populations were smaller: counting slaves swelled each Southern state's official population, which in turn determined the allocation of its congressional representatives - and the electoral college votes to which each state was entitled.
In the words of John Quincy Adams, America's sixth president, it was 'the triumph of the South over the North; of slave representation over the purely free'. Its effects also gave the South disproportionate dominance over much of US political history, even up to today.
The three-fifths rule granted the South greater influence on federal affairs, because it shaped not only presidential elections but also the structure of the Senate.
Again, with Southern support, smaller states got two US senators each, the same as the larger ones, despite the latter's objections. The result is that today Wyoming, with a population under one million, has the same number of senators as California, with more than 30 million. Both of each state's senators are also added to its electoral college vote total, once more giving smaller states an advantage. Had the US senators not been counted, Mr Gore would have won in 2000.
In his book, Mr Wills points out that historians have ignored the three-fifths clause because it directly influenced only one presidential election, but disproportionate Southern political power - based on slave ownership - helped shape America.
For decades slave states held one-third more congressional seats than their free populations warranted. Before 1850, slaveholders also controlled the presidency for 50 years, held the speakership for 41 years, and provided 18 of 31 Supreme Court justices.
This imbalance undoubtedly prolonged slavery as an institution, and it certainly allowed its establishment in Missouri, as well as permitted the expulsion of Native Americans from their ancestral lands. Altogether, more than a quarter of US presidents in its entire history owned slaves.
Adams again, in 1843, told the House of Representatives: 'Your country is no longer a democracy, it is not even a republic. It is a government of 2,000-3,000 holders of slaves, to the utter exclusion of the remaining part.'
Not exactly the proud history of a nation devoted to government of the people, by the people and for the people; or of one person, one vote. The racist and undemocratic elements of the three-fifths clause continue to influence the US Senate today. By deliberately creating an institution more along the lines of the British House of Lords than a democratic assembly for the people, the founders ensured that the Senate is today overwhelmingly composed of rich white men.
The senate's own majority is held by members from the 26 smallest states, representing only 18 per cent of the nation's population. It has no black senator and over half of the 100 are millionaires. The nine largest states, containing a majority of Americans and including New York and California, are represented by just 18 of the 100.
This means that these few men - the senate's percentage of women is still well below a fifth - directly control what becomes law in the US. It is, according to American historian and author Richard Rosenfeld, 'a grotesque monument to (an) anti-democratic legacy'.
In an essay calling for the abolition of the US senate, he adds that the institution, 'remains largely a preserve of wealthy white male aristocrats drawn from an entirely different economic class than the people they purport to represent. . . disadvantaging women, blacks, the young, and other groups with smaller percentages of millionaires.'
The combined impact of the electoral college system and the continuing bias towards small, rural populations as against large cities contributes to voter apathy. Fewer than half of eligible Americans cast a ballot, in many cases because they believe it does not count.
And because of the electoral college requirement, unless they live in a swing state - of which there are only about 17 of the 50 this time - their presidential vote indeed does not count.
Of all the Western democracies, the US is the most conservative. If its electoral system were fairer, the Democrats would more often have held the presidency in recent decades, instead of increasingly conservative Republicans. Its Senate would also be more liberal.
American democracy, it can be argued, is broken, but it will not be fixed. Why? Because the small states have to agree to constitutional changes that would cost them power. They are not about to surrender that.
It probably should have come as no surprise that Democratic candidates John Kerry and John Edwards would more than hold their own in the recent debates against President Bush and Vice President Cheney. Kerry and Edwards are both lawyers, and the thrust and parry of debating, challenging, questioning and defending should be second nature to them. Yes, among the stark differences between the two tickets in the election of 2004, here is one you might not have thought of: Two lawyers are running against two non-lawyers. That contrast means quite a bit, though not in the ways you might expect.
You probably already knew Edwards was a lawyer; he built his career -- and made millions -- suing doctors and others on behalf of seriously injured individuals. He earned his law degree at the University of North Carolina at Chapel Hill. But as Kerry said during the debate on Friday, "I'm a lawyer, too." After his service in the Vietnam War and before he was elected to the Senate, Kerry went to Boston College Law School and served as a prosecutor in Massachusetts.
Bush and Cheney both ran businesses and held public office with varying degrees of success, but they never once hung out a shingle that read, "attorney at law." That is a gap in their résumés of which they are proud, no doubt.
On the campaign trail, Bush and Cheney have made cracks about lawyers -- easy targets, given that lawyers are generally held in low esteem. Last December, a Gallup Poll found only 16% of the public ranked lawyers' ethics as high; among professions, lawyers were toward the bottom along with journalists and members of Congress.
But the lawyerly past of the Democratic ticket has not emerged as a major battle cry. When taking a swing at trial lawyers during their respective debates, both Bush and Cheney linked them -- but not Edwards personally -- to the rise in medical malpractice insurance costs. Kerry and Edwards both shrugged off the charge, and Edwards told the story of one of his injured clients, coming across as a fighter for the little guy, not an economy-wrecking evildoer. When USA TODAY polled voters in July about Edwards' background as a trial lawyer, two-thirds said it was a plus, not a minus.
A good track record
History tells us not to fear lawyers as presidents. A fascinating new book looks at the 25 out of 43 presidents who were also lawyers. America's Lawyer-Presidents, edited by Norman Gross, details the legal careers of all who made it from law offices to the Oval Office.
As U.S. Supreme Court Justice Sandra Day O'Connor said in her foreword, "Our constitutional democracy has always relied on the talents and hard work of lawyers in private practice, public service and political office."
Before he set about drafting the Declaration of Independence, Thomas Jefferson handled legal paperwork on real estate transactions in Virginia. "Honest Abe" Lincoln had a long and distinguished legal career, and he offered this advice to people contemplating a life in the law: "Discourage litigation. Persuade your neighbors to compromise whenever you can. As a peacemaker, the lawyer has a superior opportunity of being a good man."
Franklin Delano Roosevelt, on the other hand, was a Columbia Law School dropout. But he still practiced law for a time, mainly as a prelude to politics. In fact, that is how most lawyer-presidents viewed their legal careers.
Bill Clinton, for example, was a professor at the University of Arkansas Law School and attorney general of the state, but he viewed those positions as steppingstones to becoming governor, then president.
Which brings up one undeniable drawback of some of our lawyer-presidents. Clinton, believing himself to be a legal expert, may have felt emboldened to skate close to the line that led to his impeachment woes. And the last time we had a successful two-lawyer ticket -- Richard Nixon and Spiro Agnew in 1972 -- both left office in disgrace. A law license clearly is not a ticket to sainthood.
The U.S. Supreme Court
There is one important way in which a background as a lawyer means a great deal, and it relates to a task the next president almost surely will face: picking U.S. Supreme Court justices. From their early training, lawyers know how important the high court is in the life of the nation, and they tend to act accordingly. Nearly two-thirds of the Supreme Court justices generally viewed by historians as "great" or "near-great" were appointed by presidents who were also lawyers, according to Gross' book.
John Adams, the first lawyer-president, appointed John Marshall, viewed as our greatest chief justice. William Taft named Charles Hughes to the bench, and Woodrow Wilson appointed Louis Brandeis. FDR, bored by the law himself, nonetheless named Hugo Black, Felix Frankfurter, William Douglas, Robert Jackson and Wiley Rutledge, each stellar in his own way.
After 10 years without vacancies on the current Supreme Court, it is almost certain that one or more justices will depart in the next president's term of office. President Ford once said, "Few appointments a president makes can have as much impact on the future of the country as those to the Supreme Court."
Ford should know. Justice John Paul Stevens, Ford's only appointment to the Supreme Court, is still on the court today, a powerful force nearly 30 years after Ford -- himself a lawyer-president -- appointed him.
Germany: The Berlin Wall returned to the German capital last week like a bad sequel. Berliners watched in disbelief as the wall rose from the dead before their eyes, days before they celebrate the 15th anniversary of its demise, writes Derek Scally in Berlin.
A 200-metre stretch of the wall has been reinstated at the notorious former border crossing, Checkpoint Charlie - this time for the tourists.
"We want to remember the victims of the wall and the efforts of the Allies," said Ms Alexandra Hildebrandt, director of the nearby Wall Museum and initiator of the project.
Her Berlin Wall II has whipped up a lively controversy in the German capital, a city well used to heated historical debates.
The original 166km-long "anti-fascist protection wall", as it was officially known, was erected at dawn on August 13th, 1961, turning West Berlin into an Allied-controlled island in East Germany.
The wall halted the flow of refugees to the west that had reached 160,000 in the first eight months of 1961.
Some 215 people were killed trying to cross in the 28 years of the wall's existence, with another 800 dying on the border with West Germany.
Most of the wall has ended up in motorway foundations, with only three sections still standing in the city centre.
Like most sequels, the Berlin Wall II is a shadow of the original and many see it as a crass attempt to cash in on the notoriety of its predecessor.
Dr Hubertus Knabe, a leading historian of the period, criticises what he calls the "trivialisation" of the wall - the reconstruction without the razor wire, automatic machine guns and other deadly methods used to seal the border.
"The wall wasn't a tourism project but an instrument of murder," said Mr Walter Momper, mayor of West Berlin in 1989, to Die Welt newspaper. "The authentic locations with their wall remainders are sufficient. Anything else would be Disneyland."
Puzzled Berliners and tourists weren't sure how to react. "It's so difficult to imagine that there used to be anything here, so in that sense it's a good idea," said Ms Pauline Morris, visiting from Bristol. Ms Greta Schiller from Bonn said: "I think it's a good idea. For my grandchildren the wall's just a boring historical detail." An elderly Berlin lady said: "Don't they know how much we hated that thing?" before stalking off with a stricken expression.
A spokesman for the city government admitted yesterday that they badly need new ideas for ways of keeping alive the memory of the wall, but said this project was of "little help".
Anyone who wants to see the reconstructed Berlin Wall will have to hurry, though: the city has approved the project only until the end of the year. As construction workers put the finishing touches to the wall yesterday, an impatient British tourist asked: "What time does the Berlin Wall open?"
The Netherlands has been plunged into a painful re-examination of its past following calls to award Dutch citizenship posthumously to Anne Frank, the teenager whose war-time diary is the most widely read document to emerge from the Holocaust.
Politicians, historians and the media are struggling to address the issue following her nomination for a television vote next month to decide the greatest Dutch person of all time.
She is among 200 candidates put forward by KRO, a television broadcaster, including Vincent van Gogh and Johan Cruyff.
Anne Frank was not Dutch, however. She was born in Frankfurt, Germany, in 1929 and came to the Netherlands in 1934. Along with countless thousands of Jews who had fled the Nazi regime, she was stripped of German citizenship in 1941.
In June 1942 she and her family went into hiding in a secret room in an Amsterdam canal house. They were betrayed to the Nazis and Anne died from typhus in Bergen-Belsen concentration camp in 1945, months before the war ended.
Her diary was published in 1947 and has been translated into more than 60 languages, selling millions of copies worldwide. The house where she hid is now a museum, attracting thousands of visitors annually.
Dutch media have been quick to highlight a troubling fact. De Volkskrant, the leading leftwing daily, wrote: "Anne Frank is a symbol not only of the Nazis' destructive machine, but also of the Netherlands' powerlessness to protect the Jewish people."
Anti-Semitism was "an unacknowledged phenomenon" in the Netherlands in the 1930s, the paper noted.
NRC Handelsblad, the Dutch evening paper, said granting citizenship would be akin to the "cynical retouching of photographs from the Soviet era".
Rome: An anatomy professor yesterday gave Michelangelo an A-plus for his masterpiece David, despite one small flaw: a missing muscle in the statue's back.
University of Florence anatomy professor Massimo Gulisano, working with haematology professor Pietro Bernabei, said their measurements of the statue also debunked long-held notions that the 4.1m-high hunk of marble was out of proportion.
"Some say the feet, the hands are out of proportion. It's not true," Professor Gulisano said during a two-day conference in Florence of art historians, restorers and scientists to mark the 500th anniversary of the statue's unveiling.
"Michelangelo knew anatomy very well. There was only one error. There was a hollow where there should have been a muscle on the right side of the back."
But Michelangelo, who learned anatomy by dissecting cadavers, was aware of the flaw, Professor Gulisano added, writing in a letter that a defect in the marble block forced him to leave out that muscle.
The professors also admired the technical correctness with which Michelangelo depicted the tensions in the muscles of the biblical hero, who, with his slingshot, was about to challenge Goliath.
Some observers have commented that David's genitals seemed a little small, but the professors told the conference that measurements indicated all was normal.
"Taking in consideration this is a statue of a young man who was thinking about a battle that was about to happen, everything is normal," Professor Gulisano said.
More than 300 years before the Soviet Union launched its Sputnik satellites and American astronaut Neil Armstrong stepped on to the Moon, England had its own ambitious space programme.
It came in the shape of a 17th-century clergyman who drew up plans for a spaceship powered by wings, springs and gunpowder, a leading science historian will reveal this week. According to Professor Allan Chapman of Oxford University, it was the first serious attempt at a manned flight to the Moon.
The man behind the lunar mission was Dr John Wilkins, scientist, theologian and brother-in-law of Oliver Cromwell. In 1640, as a young man of 26, Dr Wilkins wrote a detailed description of the machinery needed to communicate and even trade with beings from another world.
"It was the first serious suggestion of space flight based on the best documentary evidence available to them at the time," said Professor Chapman, who will present his findings tomorrow night at a public lecture at Gresham College, London.
Although earlier philosophers and poets had written about visiting the Moon, the writings of Dr Wilkins were in an altogether different league, Professor Chapman believes. Wilkins lived in what he describes as the"honeymoon period" of scientific discovery, between the astronomical revelations of Galileo and Copernicus, who showed a universe with other, possibly habitable worlds, and the later realisation that much of space was a vacuum and therefore impassable.
According to Dr Wilkins, the gravitational and magnetic pull of the Earth extended for only 20 miles into the sky. If it were possible to get airborne and pass beyond this point, it would be easy to continue on a journey to the Moon. Inspired by the discovery of other continents and the great sea voyages of explorers such as Francis Drake and Walter Raleigh, Wilkins conceived an equally ambitious plan to explore space.
"Partly the argument was religious. As well as being a scientist, Wilkins was a theologian, and the argument was that if God had made worlds then it's within divine providence to put beings on them," Professor Chapman said.
Dr Wilkins drew up plans for what he called a flying chariot powered by clockwork and springs, a set of flapping wings coated with feathers and a few gunpowder boosters to help send it on its way.
"Of course his approach did not work because he based it on the premise that the Earth's pull only went up 20 miles and if you crossed that 20 miles, you could float after that," he said.
By the 1660s, the idea began to fall apart with the work of Robert Boyle and Robert Hooke, who demonstrated the nature of the vacuum that would stretch between the Earth and the Moon.
Teachers often forget successful classes but find their failures indelibly imprinted in their minds. I clearly recall one first-year seminar from nine years ago, on masterpieces of the human imagination.
"What," I asked, "are we to make of Plato's attempts to define justice?" A chill descended. Noses burrowed into The Republic. One student hesitantly volunteered a comment; another offered a passing observation. Something resembling a discussion followed, but most of the remarks betrayed the superficiality of the students' engagement.
They were eager to discuss their favorite movies and books, censorship, or the problem of date rape, but they shrank from the seeming irrelevance of Plato to their lives. Often the brightest students were the most subdued. Their occasional remarks showed intelligence and sophistication, yet every gesture and tone of voice conveyed boredom.
The more I thought about that failed seminar, the more I realized that it had not been very different from other discussions of classic texts that I had led at Barnard and Columbia. I could recall particular sessions when a student's insight blazed into real illumination and rekindled my own enthusiasm; in the better classes a steady, intelligent patter made the time pass quickly. But those were exceptions. And when I confessed my sense of failure to colleagues, most recounted similar frustrations.
What was wrong?
Early the following semester I met individually with students from that seminar. They were initially surprised by my sense of its inadequacy, but as we talked they opened up. They explained that what I had taken to be sullenness on their part was actually a manifestation of deep anxiety. They knew that my knowledge of the texts exceeded their own; they chafed at being obliged to reveal the insufficiency of their understanding.
Another revelation followed. If students were uncomfortable with me, they were even more worried about their peers' reactions. Their sophisticated indifference masked a fear of saying something foolish, inappropriate, or -- even worse -- revealing about their fragile sense of self. The more I pushed them to the brink of otherness -- other ideas, cultures, and societies -- the more they clung to familiarity or simply clammed up.
Students were also put off by the purely cerebral character of the undertaking. They regarded classic texts as abstract mental games: intellectual hurdles to be cleared, rather like math or language requirements, before they could dash off to the courses whose relevance to their lives was obvious. They had little sense that the ideas behind the texts had been forged in the heat of human passions. They knew that Socrates had been put to death for his views, but that appalling fact merely proved how distant his world was from their own.
I concluded that if my role as mentor impeded my students' engagement with the texts, it should be minimized. If students' insecurity hampered their ability to engage fully with otherness, they should assume an alternative identity. If students regarded important texts as vague and abstract, they should examine the texts within the context of the impassioned debates and dramas from which they had emerged.
Instead of teacher-generated discussions, I decided that the class would play games, each set in an intellectually charged moment in history. Because I didn't want students to behave like callow novices, inferior to a grand inquisitor, I would have them assume the roles of powerful adults: mighty emperors, influential scholars, religious zealots. I would become mere tutor or scribe....
Juan Cole, at his blog (Oct. 4, 2004):
Graham Larkin of Stanford has penned an important article arguing against David Horowitz's sinister proto-Stalinist social engineering project of "balancing" universities ideologically. See also my "Are Professors Too Liberal?".
Larkin's essay is especially good in pointing out that there are no obvious evaluation mechanisms for ideological balance.
If we go by opinion polls, about half of Americans reject Darwin, so Horowitz's proposal would require that half of all biologists be creationists. Then, with regard to party preference, opinion polls show that at some points in the past 8 months Ralph Nader has been favored by 6% of the electorate. At other points it has been 2%. So presumably between 2% and 6% of the professors would have to be Nader supporters. Indeed, we might have to put people on monthly contracts so that we can adjust the percentage in accordance with the latest polls. About 10% of Americans support radical fringe groups, so of course there would have to be a place for the American Nazi Party on the faculty, Horowitz seems to be arguing. Maybe we could have the supremacist teach modern German history; that seems to be the sort of thing that would make Horowitz happy. How sad that at present the Nazi period in Germany is usually taught by some wimpy liberal, Horowitz seems to be saying.
Moreover, there is no obvious reason that "balance" should be conceived only along the narrow US spectrum. A fifth of human beings live under Chinese Communism, so the logical conclusion is that Horowitz is insisting that 1/5 of all US university professors be believers in Chinese communism. And, of course, the Muslim Brotherhood and Jama'at-i Islami would have to have their faculty representatives in proportion to the popularity of those fundamentalist parties in the world.
If we limited the political spectrum to just US Republicans and Democrats, then hiring faculty 50/50 by party affiliation would have ethnic implications as Horowitz envisions it. He argues that virtually all faculty in the liberal arts are Democrats. Outside the academy, we know that most Jews vote Democrat, and that 66% of Latinos, and almost all African-Americans do. If, as Horowitz desires, we must ensure that half of all university posts go to members of the Republican Party, then we'd have to fire those nasty Democrat-leaning minority members and hire Republicans instead, most of whom would, proportionally speaking, be white Protestants. That is, the full implications of Horowitzism are a purge of higher education in the United States of African-American, Catholic and Jewish faculty members.
And, as I argued in my piece for the History News Network, why shouldn't the same rules apply to other professions? Military officers and CEOs, for instance?
The eyes of Abdul Aziz al-Babtain filled with tears as he walked beneath the horseshoe arches of the Mezquita, one of Islam's greatest architectural jewels, and said: "When the West consider the Arabs, they should think of this."
Six months after the Madrid bombs, when Islamist extremists murdered 191 train passengers, Spain is reopening the debate over its rich Moorish history in a search for answers to why it became a target.
In the old capital of al-Andalus, the Islamic name for the Spain that the Moors occupied, the Spanish Royal Family and the Socialist Government are sponsoring a week of dialogue with the descendants of the country's former rulers.
The inspiration for the event came from Mr al-Babtain, a Kuwaiti businessman with a poetic bent, who holds biannual conferences to promote the Arab world's literary heritage.
Strolling through the famous orange tree patio of Cordoba's former mosque (a cathedral since 1523), he admitted that the encounter took on a new sense of urgency in the wake of the September 11 and March 11 al-Qaeda terror attacks.
Mr al-Babtain's literary hero is Ibn Zaydun, whose love letters to Princess Wallada, written beneath the walls of the Mezquita, still inspire modern Arabic poets. "We were going to hold a conference in Zaydun's honour in Damascus, but after the terrorist attacks I decided that it was vital that we came to his birthplace and try to bridge the gap between our cultures, to show the true face of Islam."
His initiative has thrived because it touches still-raw wounds in Spain, where rising unease over Muslim immigrants has fused with the country's sudden sense of vulnerability.
Extra police officers have been dispatched to Spanish cities with larger than average Muslim populations, most of whose members arrive from Morocco. Jose Antonio Alonso, the Interior Minister, has announced plans to crack down on the proliferation of mosques and to control their preachers, some of whom preach jihad. This was the same message that al-Qaeda gave after the Madrid bombs, calling for a "holy war" to "liberate al-Andalus".
Spaniards are now torn between clashing ideas. On the one hand, Jose Luis Rodriguez Zapatero, the Prime Minister, has called for "an alliance of civilisations" between the West and the Islamic world and remains strongly critical of the United States-led war in Iraq.
His predecessor, Jose Maria Aznar, the conservative leader, however, is urging the West, and particularly Spain, to wake up to the ambitions of the religious extremists.
"There are those who think that the Madrid attacks are related to the support given by the Spanish Government to the Iraq war," he said at Georgetown University in Washington this month.
"The problem with al-Qaeda came from before that -as long ago as 1,300 years," he said, referring to the arrival of Moorish invaders in Spain.
Mr al-Babtain regretted Senor Aznar's reading of his country's history. "I was very sorry to hear that; I felt deeply disappointed because it is well known that we Arabs lived here in peace for centuries. All three great religions — Islam, Christianity and Judaism — lived together under our rule."
His claim is disputed by Spanish historians attempting to separate fact from myth in the period known as la convivencia ("living together"): was the 700-year span of Muslim rule really such a paradise?