Wednesday, February 25, 2009

Obama's big mistake that will only get bigger with time

What Obama's Risking in Afghanistan

by John Bruhns

President Obama's campaign slogan was "Change we can believe in." Americans, desperate for change, gave Mr. Obama a clear victory. Now in power, he's realizing that he can't deliver that change - at least in Iraq and Afghanistan.

Obama said during his campaign: "I will meet with military leaders and the secretary of Defense and give them a new mission - bring our troops home safely and responsibly from Iraq within 16 months."

But there is now no plan to fully withdraw our troops from Iraq within that time frame. After meeting with military leaders, Mr. Obama is considering a 23-month withdrawal.

Violence in Iraq has been in decline. But the situation remains dicey. We're very likely to see a spike in violence at some point in the near future. When that happens, the new 23-month withdrawal plan will be extended.

On the Afghan front, Obama has just ordered 17,000 more troops into the effort. The timing is telling. The Pakistanis, our supposed allies, have agreed to a truce with the Taliban, granting them sanctuary in Swat, an area roughly the size of Delaware. The Taliban's version of sharia law has been imposed, and the Pakistanis have suspended all military operations against them. The Pakistani government has denied our troops access to this area. Pakistani troops have even shot at our helicopters flying reconnaissance missions.

Al Qaeda and the Taliban have been in resurgence for years. And this concession by Pakistan will give the Taliban ample time to prepare for our new troops who'll be just walking in the door.

Why just 17,000? Because that's all we can spare due to the number of troops in Iraq and Afghanistan, and the turnaround time needed for those returning from combat.

The Afghan surge is a terrible idea. But Obama can initiate it because of his high approval ratings. Sound familiar? George W. Bush's ratings were quite similar before the invasion of Iraq. Imagine if Bush were still in office and had pulled this stunt. The crowd I run with on the left would be screaming in protest.

Now it's just me who's screaming, waiting for the rest of the anti-war gang to voice disagreement with this certain fiasco.

Bush bet the farm on Iraq and let Afghanistan fall by the wayside. We missed our window of opportunity to defeat al Qaeda in the early years of the war and nab Bin Laden. So who are we going to be fighting? Obama needs to define the mission.

The Soviets got bogged down in Afghanistan with 100,000 boots on the ground. The United States backed the Mujahedeen for Cold War reasons. After we helped the Mujahedeen defeat the Soviets, they turned on us.

Will we now rely on the Northern Alliance in a similar fashion? After all, they're an offspring of the Mujahedeen. They fought the Taliban, but their ideologies are almost identical.

And despite Obama's commitment of troops, Gen. David McKiernan, overseeing the war in Afghanistan, can't ensure success. He has emphasized the difference between the troop surge in Iraq and the one in Afghanistan, saying the Afghanistan surge won't be short-lived - we'll be there for years.

Most U.S. combat troops arriving in this first wave will be sent to southern Afghanistan, an area McKiernan describes as a stalemate, at best.

If our troops get bogged down in the Kandahar area, what will become of the rest of the country? Al Qaeda and the Taliban will surely be enticed to regain ground in other areas. All while war-weary NATO troops work on their plans to extricate themselves from the situation.

I see a dilemma. We have to do something. But 17,000 troops won't make a difference, except to cost more lives and money.

What does Afghan president Hamid Karzai have to contribute? Not a whole lot.

I'm not sure we can trust Karzai, a former member of the Mujahedeen, early supporter of the Taliban and former lobbyist for the U.S. oil company Unocal.

After eight years of U.S. involvement, the Afghan army and defense ministry still rely heavily on U.S. troops for most security operations.

And don't forget the terrain, which is extremely challenging, especially in a war: the country is mountainous, with limited water, blistering summers and frigid winters. Then there are the borders with China, Iran, Pakistan and the Central Asian republics, which raise worries about regional conflict.

Americans are fed up with war. History shows that no war can be won without the support of the people. And the economy is so dire it's hard to understand why Obama would allow such an expensive military commitment.

I know the president is tasked with protecting the American people. But Mr. Obama is moving too quickly without doing his homework.

Published on Tuesday, February 24, 2009 by the Philadelphia Daily News

Monday, February 23, 2009

Taking a much needed swipe at America's dangerous denial culture

What We Don’t Know Will Hurt Us

By Frank Rich

And so on the 29th day of his presidency, Barack Obama signed the stimulus bill. But the earth did not move. The Dow Jones fell almost 300 points. G.M. and Chrysler together asked taxpayers for another $21.6 billion and announced another 50,000 layoffs. The latest alleged mini-Madoff, R. Allen Stanford, was accused of an $8 billion fraud with 50,000 victims.

“I don’t want to pretend that today marks the end of our economic problems,” the president said on Tuesday at the signing ceremony in Denver. He added, hopefully: “But today does mark the beginning of the end.”

Does it?

No one knows, of course, but a bigger question may be whether we really want to know. One of the most persistent cultural tics of the early 21st century is Americans’ reluctance to absorb, let alone prepare for, bad news. We are plugged into more information sources than anyone could have imagined even 15 years ago. The cruel ambush of 9/11 supposedly “changed everything,” slapping us back to reality. Yet we are constantly shocked, shocked by the foreseeable. Obama’s toughest political problem may not be coping with the increasingly marginalized G.O.P. but with an America-in-denial that must hear warning signs repeatedly, for months and sometimes years, before believing the wolf is actually at the door.

This phenomenon could be seen in two TV exposés of the mortgage crisis broadcast on the eve of the stimulus signing. On Sunday, “60 Minutes” focused on the tawdry lending practices of Golden West Financial, built by Herb and Marion Sandler. On Monday, the CNBC documentary “House of Cards” served up another tranche of the subprime culture, typified by the now defunct company Quick Loan Funding and its huckster-in-chief, Daniel Sadek. Both reports were superbly done, but both could have been reruns.

The Sandlers and Sadek have been recurrently whipped at length in print and on television, as far back as 2007 in Sadek’s case (by Bloomberg); the Sandlers were even vilified in a “Saturday Night Live” sketch last October. But still the larger message may not be entirely sinking in. “House of Cards” was littered with come-on commercials, including one hawking “risk-free” foreign-currency trading — yet another variation on Quick Loan Funding, promising credulous Americans something for nothing.

This cultural pattern of denial is hardly limited to the economic crisis. Anyone with eyes could have seen that Sammy Sosa and Mark McGwire resembled Macy’s parade balloons in their 1998 home-run derby, but it took years for many fans (not to mention Major League Baseball) to accept the sorry truth. It wasn’t until the Joseph Wilson-Valerie Plame saga caught fire in summer 2003, months after “Mission Accomplished,” that we began to confront the reality that we had gone to war in Iraq over imaginary W.M.D. Weapons inspectors and even some journalists (especially at Knight-Ridder newspapers) had been telling us exactly that for almost a year.

The writer Mark Danner, who early on chronicled the Bush administration’s practice of torture for The New York Review of Books, reminded me last week that that story first began to emerge in December 2002. That’s when The Washington Post reported on the “stress and duress” tactics used to interrogate terrorism suspects. But while similar reports followed, the notion that torture was official American policy didn’t start to sink in until after the Abu Ghraib photos emerged in April 2004. Torture wasn’t routinely called “torture” in Beltway debate until late 2005, when John McCain began to press for legislation banning it.

Steroids, torture, lies from the White House, civil war in Iraq, even recession: that’s just a partial glossary of the bad-news vocabulary that some of the country, sometimes in tandem with a passive news media, resisted for months on end before bowing to the obvious or the inevitable. “The needle,” as Danner put it, gets “stuck in the groove.”

For all the gloomy headlines we’ve absorbed since the fall, we still can’t quite accept the full depth of our economic abyss either. Nicole Gelinas, a financial analyst at the conservative Manhattan Institute, sees denial at play over a wide swath of America, reaching from the loftiest economic strata of Wall Street to the foreclosure-decimated boom developments in the Sun Belt.

When we spoke last week, she talked of would-be bankers who, upon graduating, plan “to travel in Asia and teach English for a year” and then pick up where they left off. Such graduates are dreaming, Gelinas says, because the over-the-top Wall Street money culture of the credit bubble isn’t coming back for a very long time, if ever. As she observes, it took decades after the Great Depression — until the 1980s — for Wall Street to fully reclaim its old swagger. Not until then was there “a new group of people without massive psychological scarring” from the 1929 crash.

In states like Nevada, Florida and Arizona, Gelinas sees “huge neighborhoods that will become ghettos” as half their populations lose or abandon their homes, with an attendant collapse of public services and social order. “It will be like after Katrina,” she says, “but it’s no longer just the Lower Ninth Ward’s problem.” Writing in the current issue of The Atlantic, the urban theorist Richard Florida suggests we could be seeing “the end of a whole way of life.” The link between the American dream and home ownership, fostered by years of bipartisan public policy, may be irreparably broken.

Pity our new president. As he rolls out one recovery package after another, he can’t know for sure what will work. If he tells the whole story of what might be around the corner, he risks instilling fear itself among Americans who are already panicked. (Half the country, according to a new Associated Press poll, now fears unemployment.) But if the president airbrushes the picture too much, the country could be as angry about ensuing calamities as it was when the Bush administration’s repeated assertion of “success” in Iraq proved a sham. Managing America’s future shock is a task that will call for every last ounce of Obama’s brains, temperament and oratorical gifts.

The difficulty of walking this fine line can be seen in the drama surrounding the latest forbidden word to creep out of the shadows for months before finally leaping into the open: nationalization. Until he started hedging a little last weekend, the president has pointedly said that nationalizing banks, while fine for Sweden, wouldn't do in America, with its "different" (i.e., non-socialistic) culture and traditions. But the word nationalization, once mostly whispered by liberal economists, is now even being tossed around by Lindsey Graham and Alan Greenspan. It's a clear indication that no one has a better idea.

The Obama White House may come up with euphemisms for nationalization (temporary receivership, anyone?). But whatever it’s called, what will it mean? The reason why the White House has been punting on the new installment of the bank rescue is not that the much-maligned Treasury secretary, Timothy Geithner, is incapable of getting his act together. What’s slowing the works are the huge political questions at stake, many of them with consequences potentially as toxic as the banks’ assets.

Will Obama concede aloud that some of our “too big to fail” banks have, in essence, already failed? If so, what will he do about it? What will it cost? And, most important, who will pay? No one knows the sum of the American banks’ losses, but the economist Nouriel Roubini, who has gotten much right about this crash, puts it at $1.8 trillion. That doesn’t count any defaults still to come on what had been considered “good” mortgages and myriad other debt, whether from auto loans or credit cards.

Americans are right to wonder why there has been scant punishment for the management and boards of bailed-out banks that recklessly sliced and diced all this debt into worthless gambling chips. They are also right to wonder why there is still little transparency in how TARP funds have been spent by these teetering institutions. If a CNBC commentator can stir up a populist dust storm by ranting that Obama’s new mortgage program (priced at $75 billion to $275 billion) is “promoting bad behavior,” imagine the tornado that would greet an even bigger bank bailout on top of the $700 billion already down the TARP drain.

Nationalization would likely mean wiping out the big banks’ managements and shareholders. It’s because that reckoning has mostly been avoided so far that those bankers may be the Americans in the greatest denial of all. Wall Street’s last barons still seem to believe that they can hang on to their old culture by scuttling corporate jets, rejecting bonuses or sounding contrite in public. Ask the former Citigroup wise man Robert Rubin how that strategy worked out.

We are now waiting to learn if Obama’s economic team, much of it drawn from the Wonderful World of Citi and Goldman Sachs, will have the will to make its own former cohort face the truth. But at a certain point, as in every other turn of our culture of denial, outside events will force the recognition of harsh realities. Nationalization, unmentionable only yesterday, has entered common usage not least because an even scarier word — depression — is next on America’s list to avoid.

Originally published in the New York Times on February 21, 2009

reposted from:

http://www.nytimes.com/2009/02/22/opinion/22rich.html?_r=1

Saturday, February 21, 2009

Change we can believe in? I daresay not...

Obama’s War on Terror May Resemble Bush’s in Some Areas

by Charlie Savage

WASHINGTON - Even as it pulls back from harsh interrogations and other sharply debated aspects of George W. Bush's "war on terrorism," the Obama administration is quietly signaling continued support for other major elements of its predecessor's approach to fighting Al Qaeda.

In little-noticed confirmation testimony recently, Obama nominees endorsed continuing the C.I.A.'s program of transferring prisoners to other countries without legal rights, and indefinitely detaining terrorism suspects without trials even if they were arrested far from a war zone.

The administration has also embraced the Bush legal team's arguments that a lawsuit by former C.I.A. detainees should be shut down based on the "state secrets" doctrine. It has also left the door open to resuming military commission trials.

And earlier this month, after a British court cited pressure by the United States in declining to release information about the alleged torture of a detainee in American custody, the Obama administration issued a statement thanking the British government "for its continued commitment to protect sensitive national security information."

These and other signs suggest that the administration's changes may turn out to be less sweeping than many had hoped or feared - prompting growing worry among civil liberties groups and a sense of vindication among supporters of Bush-era policies.

In an interview, the White House counsel, Gregory B. Craig, asserted that the administration was not embracing Mr. Bush's approach to the world. But Mr. Craig also said President Obama intended to avoid any "shoot from the hip" and "bumper sticker slogans" approaches to deciding what to do with the counterterrorism policies he inherited.

"We are charting a new way forward, taking into account both the security of the American people and the need to obey the rule of law," Mr. Craig said. "That is a message we would give to the civil liberties people as well as to the Bush people."

Within days of his inauguration, Mr. Obama thrilled civil liberties groups when he issued executive orders promising less secrecy, restricting C.I.A. interrogators to Army Field Manual techniques, shuttering the agency's secret prisons, ordering the prison at Guantánamo Bay, Cuba, closed within a year and halting military commission trials.

But in more recent weeks, things have become murkier.

During her confirmation hearing last week, Elena Kagan, the nominee for solicitor general, said that someone suspected of helping finance Al Qaeda should be subject to battlefield law - indefinite detention without a trial - even if he were captured in a place like the Philippines rather than in a physical battle zone.

Ms. Kagan's support for an elastic interpretation of the "battlefield" amplified remarks that Attorney General Eric H. Holder Jr. made at his own confirmation hearing. And it dovetailed with a core Bush position. Civil liberties groups argue that people captured away from combat zones should go to prison only after trials.

Moreover, the nominee for C.I.A. director, Leon E. Panetta, opened a loophole in Mr. Obama's interrogation restrictions. At his hearing, Mr. Panetta said that if the approved techniques were "not sufficient" to get a detainee to divulge details he was suspected of knowing about an imminent attack, he would ask for "additional authority."

To be sure, Mr. Panetta emphasized that the president could not bypass antitorture statutes, as Bush lawyers claimed. And he said that waterboarding - a technique that induces the sensation of drowning, and that the Bush administration said was lawful - is torture.

But Mr. Panetta also said the C.I.A. might continue its "extraordinary rendition" program, under which agents seize terrorism suspects and take them to other countries without extradition proceedings, in a more sweeping form than anticipated.

Before the Bush administration, the program primarily involved taking indicted suspects to their native countries for legal proceedings. While some detainees in the 1990s were allegedly abused after transfer, under Mr. Bush the program expanded and included transfers to third countries - some of which allegedly used torture - for interrogation, not trials.

Mr. Panetta said the agency is likely to continue to transfer detainees to third countries and would rely on diplomatic assurances of good treatment - the same safeguard the Bush administration used, and that critics say is ineffective.

Mr. Craig noted that while Mr. Obama decided "not to change the status quo immediately," he created a task force to study "rendition policy and what makes sense consistent with our obligation to protect the country."

He urged patience as the administration reviewed the programs it inherited from Mr. Bush. That process began after the election, Mr. Craig said, when military and C.I.A. leaders flew to Chicago for a lengthy briefing of Mr. Obama and his national security advisers. Mr. Obama then sent his advisers to C.I.A. headquarters to "find out the best case for continuing the practices that had been employed during the Bush administration."

Civil liberties groups praise Mr. Obama's early executive orders on national security, but say other signs are discouraging.

For example, Mr. Obama's Justice Department last week told an appeals court that the Bush administration was right to invoke "state secrets" to shut down a lawsuit by former C.I.A. detainees who say a Boeing subsidiary helped fly them to places where they were tortured.

Margaret Satterthwaite, a faculty director at the human rights center at the New York University law school, said, "It was literally just Bush redux - exactly the same legal arguments that we saw the Bush administration present to the court."

Mr. Craig said Mr. Holder and others reviewed the case and "came to the conclusion that it was justified and necessary for national security" to maintain their predecessor's stance. Mr. Holder has also begun a review of every open Bush-era case involving state secrets, Mr. Craig said, so people should not read too much into one case.

"Every president in my lifetime has invoked the state-secrets privilege," Mr. Craig said. "The notion that invoking it in that case somehow means we are signing onto the Bush approach to the world is just an erroneous assumption."

Still, the decision caught the attention of a bipartisan group of lawmakers. Two days after the appeals court hearing, they filed legislation to bar using the state-secrets doctrine to shut down an entire case - as opposed to withholding particular evidence.

The administration has also put off taking a stand in several cases that present opportunities to embrace or renounce Bush-era policies, including the imprisonment without trial of an "enemy combatant" on domestic soil, Freedom of Information Act lawsuits seeking legal opinions about interrogation and surveillance, and an executive-privilege dispute over Congressional subpoenas of former White House aides to Mr. Bush over the firing of United States attorneys.

Addressing the executive-privilege dispute, Mr. Craig said: "The president is very sympathetic to those who want to find out what happened. But he is also mindful as president of the United States not to do anything that would undermine or weaken the institution of the presidency. So for that reason, he is urging both sides of this to settle."

The administration's recent policy moves have attracted praise from outspoken defenders of the Bush administration. Last Friday, The Wall Street Journal's editorial page argued that "it seems that the Bush administration's antiterror architecture is gaining new legitimacy" as Mr. Obama's team embraces aspects of Mr. Bush's counterterrorism approach.

Anthony D. Romero, executive director of the American Civil Liberties Union, said the sequence of "disappointing" recent events had heightened concerns that Mr. Obama might end up carrying forward "some of the most problematic policies of the Bush presidency."

Mr. Obama has clashed with civil libertarians before. Last July, he voted to authorize eavesdropping on some phone calls and e-mail messages without a warrant. While the A.C.L.U. says the program is still unconstitutional, the legislation reduced legal concerns about one of the most controversial aspects of Mr. Bush's antiterror strategy.

"We have been some of the most articulate and vociferous critics of the way the Bush administration handled things," Mr. Craig said. "There has been a dramatic change of direction."

Published on Wednesday, February 18, 2009 by the New York Times

reposted from:

http://www.commondreams.org/headline/2009/02/18-4

Friday, February 20, 2009

Born to be intellectually unfree, it would seem...

Born believers: How your brain creates God

by Michael Brooks

WHILE many institutions collapsed during the Great Depression that began in 1929, one kind did rather well. During this leanest of times, the strictest, most authoritarian churches saw a surge in attendance.

This anomaly was documented in the early 1970s, but only now is science beginning to tell us why. It turns out that human beings have a natural inclination for religious belief, especially during hard times. Our brains effortlessly conjure up an imaginary world of spirits, gods and monsters, and the more insecure we feel, the harder it is to resist the pull of this supernatural world. It seems that our minds are finely tuned to believe in gods.

Religious ideas are common to all cultures: like language and music, they seem to be part of what it is to be human. Until recently, science has largely shied away from asking why. "It's not that religion is not important," says Paul Bloom, a psychologist at Yale University, "it's that the taboo nature of the topic has meant there has been little progress."

The origin of religious belief is something of a mystery, but in recent years scientists have started to make suggestions. One leading idea is that religion is an evolutionary adaptation that makes people more likely to survive and pass their genes onto the next generation. In this view, shared religious belief helped our ancestors form tightly knit groups that cooperated in hunting, foraging and childcare, enabling these groups to outcompete others. In this way, the theory goes, religion was selected for by evolution, and eventually permeated every human society (New Scientist, 28 January 2006, p 30).

The religion-as-an-adaptation theory doesn't wash with everybody, however. As anthropologist Scott Atran of the University of Michigan in Ann Arbor points out, the benefits of holding such unfounded beliefs are questionable, in terms of evolutionary fitness. "I don't think the idea makes much sense, given the kinds of things you find in religion," he says. A belief in life after death, for example, is hardly compatible with surviving in the here-and-now and propagating your genes. Moreover, if there are adaptive advantages of religion, they do not explain its origin, but simply how it spread.

An alternative being put forward by Atran and others is that religion emerges as a natural by-product of the way the human mind works.

That's not to say that the human brain has a "god module" in the same way that it has a language module that evolved specifically for acquiring language. Rather, some of the unique cognitive capacities that have made us so successful as a species also work together to create a tendency for supernatural thinking. "There's now a lot of evidence that some of the foundations for our religious beliefs are hard-wired," says Bloom.

Much of that evidence comes from experiments carried out on children, who are seen as revealing a "default state" of the mind that persists, albeit in modified form, into adulthood. "Children the world over have a strong natural receptivity to believing in gods because of the way their minds work, and this early developing receptivity continues to anchor our intuitive thinking throughout life," says anthropologist Justin Barrett of the University of Oxford.

So how does the brain conjure up gods? One of the key factors, says Bloom, is the fact that our brains have separate cognitive systems for dealing with living things - things with minds, or at least volition - and inanimate objects.

This separation happens very early in life. Bloom and colleagues have shown that babies as young as five months make a distinction between inanimate objects and people. Shown a box moving in a stop-start way, babies show surprise. But a person moving in the same way elicits no surprise. To babies, objects ought to obey the laws of physics and move in a predictable way. People, on the other hand, have their own intentions and goals, and move however they choose.

Mind and matter

Bloom says the two systems are autonomous, leaving us with two viewpoints on the world: one that deals with minds, and one that handles physical aspects of the world. He calls this innate assumption that mind and matter are distinct "common-sense dualism". The body is for physical processes, like eating and moving, while the mind carries our consciousness in a separate - and separable - package. "We very naturally accept you can leave your body in a dream, or in astral projection or some sort of magic," Bloom says. "These are universal views."

There is plenty of evidence that thinking about disembodied minds comes naturally. People readily form relationships with non-existent others: roughly half of all 4-year-olds have had an imaginary friend, and adults often form and maintain relationships with dead relatives, fictional characters and fantasy partners. As Barrett points out, this is an evolutionarily useful skill. Without it we would be unable to maintain large social hierarchies and alliances or anticipate what an unseen enemy might be planning. "Requiring a body around to think about its mind would be a great liability," he says.

Useful as it is, common-sense dualism also appears to prime the brain for supernatural concepts such as life after death. In 2004, Jesse Bering of Queen's University Belfast, UK, put on a puppet show for a group of pre-school children. During the show, an alligator ate a mouse. The researchers then asked the children questions about the physical existence of the mouse, such as: "Can the mouse still be sick? Does it need to eat or drink?" The children said no. But when asked more "spiritual" questions, such as "does the mouse think and know things?", the children answered yes.

Default to god

Based on these and other experiments, Bering considers a belief in some form of life apart from that experienced in the body to be the default setting of the human brain. Education and experience teach us to override it, but it never truly leaves us, he says. From there it is only a short step to conceptualising spirits, dead ancestors and, of course, gods, says Pascal Boyer, a psychologist at Washington University in St Louis, Missouri. Boyer points out that people expect their gods' minds to work very much like human minds, suggesting they spring from the same brain system that enables us to think about absent or non-existent people.

The ability to conceive of gods, however, is not sufficient to give rise to religion. The mind has another essential attribute: an overdeveloped sense of cause and effect which primes us to see purpose and design everywhere, even where there is none. "You see bushes rustle, you assume there's somebody or something there," Bloom says.

This over-attribution of cause and effect probably evolved for survival. If there are predators around, it is no good spotting them 9 times out of 10. Running away when you don't have to is a small price to pay for avoiding danger when the threat is real.

Again, experiments on young children reveal this default state of the mind. Children as young as three readily attribute design and purpose to inanimate objects. When Deborah Kelemen of the University of Arizona in Tucson asked 7- and 8-year-old children questions about inanimate objects and animals, she found that most believed they were created for a specific purpose. Pointy rocks are there for animals to scratch themselves on. Birds exist "to make nice music", while rivers exist so boats have something to float on. "It was extraordinary to hear children saying that things like mountains and clouds were 'for' a purpose and appearing highly resistant to any counter-suggestion," says Kelemen.

In similar experiments, Olivera Petrovich of the University of Oxford asked pre-school children about the origins of natural things such as plants and animals. She found they were seven times as likely to answer that they were made by god as by people.

These cognitive biases are so strong, says Petrovich, that children tend to spontaneously invent the concept of god without adult intervention: "They rely on their everyday experience of the physical world and construct the concept of god on the basis of this experience." Because of this, when children hear the claims of religion they seem to make perfect sense.

Our predisposition to believe in a supernatural world stays with us as we get older. Kelemen has found that adults are just as inclined to see design and intention where there is none. Put under pressure to explain natural phenomena, adults often fall back on teleological arguments, such as "trees produce oxygen so that animals can breathe" or "the sun is hot because warmth nurtures life". Though she doesn't yet have evidence that this tendency is linked to belief in god, Kelemen does have results showing that most adults tacitly believe they have souls.

Boyer is keen to point out that religious adults are not childish or weak-minded. Studies reveal that religious adults have very different mindsets from children, concentrating more on the moral dimensions of their faith and less on its supernatural attributes.

Even so, religion is an inescapable artefact of the wiring in our brain, says Bloom. "All humans possess the brain circuitry and that never goes away." Petrovich adds that even adults who describe themselves as atheists and agnostics are prone to supernatural thinking. Bering has seen this too. When one of his students carried out interviews with atheists, it became clear that they often tacitly attribute purpose to significant or traumatic moments in their lives, as if some agency were intervening to make it happen. "They don't completely exorcise the ghost of god - they just muzzle it," Bering says.

The fact that trauma is so often responsible for these slips gives a clue as to why adults find it so difficult to jettison their innate belief in gods, Atran says. The problem is something he calls "the tragedy of cognition". Humans can anticipate future events, remember the past and conceive of how things could go wrong - including their own death, which is hard to deal with. "You've got to figure out a solution, otherwise you're overwhelmed," Atran says. When natural brain processes give us a get-out-of-jail card, we take it.

That view is backed up by an experiment published late last year (Science, vol 322, p 115). Jennifer Whitson of the University of Texas in Austin and Adam Galinsky of Northwestern University in Evanston, Illinois, asked people what patterns they could see in arrangements of dots or stock market information. Before asking, Whitson and Galinsky made half their participants feel a lack of control, either by giving them feedback unrelated to their performance or by having them recall experiences where they had lost control of a situation.

The results were striking. The subjects who sensed a loss of control were much more likely to see patterns where there were none. "We were surprised that the phenomenon is as widespread as it is," Whitson says. What's going on, she suggests, is that when we feel a lack of control we fall back on superstitious ways of thinking. That would explain why religions enjoy a revival during hard times.

So if religion is a natural consequence of how our brains work, where does that leave god? All the researchers involved stress that none of this says anything about the existence or otherwise of gods: as Barrett points out, whether or not a belief is true is independent of why people believe it.

It does, however, suggest that god isn't going away, and that atheism will always be a hard sell. Religious belief is the "path of least resistance", says Boyer, while disbelief requires effort.

These findings also challenge the idea that religion is an adaptation. "Yes, religion helps create large societies - and once you have large societies you can outcompete groups that don't," Atran says. "But it arises as an artefact of the ability to build fictive worlds. I don't think there's an adaptation for religion any more than there's an adaptation to make airplanes."

Supporters of the adaptation hypothesis, however, say that the two ideas are not mutually exclusive. As David Sloan Wilson of Binghamton University in New York state points out, elements of religious belief could have arisen as a by-product of brain evolution, but religion per se was selected for because it promotes group survival. "Most adaptations are built from previous structures," he says. "Boyer's basic thesis and my basic thesis could both be correct."

Robin Dunbar of the University of Oxford - the researcher most strongly identified with the religion-as-adaptation argument - also has no problem with the idea that religion co-opts brain circuits that evolved for something else. Richard Dawkins, too, sees the two camps as compatible. "Why shouldn't both be correct?" he says. "I actually think they are."

Ultimately, discovering the true origins of something as complex as religion will be difficult. There is one experiment, however, that could go a long way to proving whether Boyer, Bloom and the rest are onto something profound. Ethical issues mean it won't be done any time soon, but that hasn't stopped people speculating about the outcome.

It goes something like this. Left to their own devices, children create their own "creole" languages using hard-wired linguistic brain circuits. A similar experiment would provide our best test of the innate religious inclinations of humans. Would a group of children raised in isolation spontaneously create their own religious beliefs? "I think the answer is yes," says Bloom.

Michael Brooks is a writer based in Lewes, UK. He is the author of 13 Things That Don't Make Sense (Profile)

reposted from:

http://www.newscientist.com/article/mg20126941.700-born-believers-how-your-brain-creates-god.html?full=true&print=true

The Bolivarian revolution marches on while Obama simply parrots Bush

In an article which originally appeared in the UK's Guardian newspaper, Mark Weisbrot takes Obama to task over his continuation of Bush Administration rhetoric relating to Latin America:

Venezuela, An Imaginary Threat

by Mark Weisbrot

US-Latin American relations fell to record lows during the George Bush years, and there have been hopes - both north and south of the border - that President Barack Obama will bring a fresh approach. So far, however, most signals are pointing to continuity rather than change.

Obama started off with an unprovoked verbal assault on Venezuela. In an interview broadcast by the Spanish-language television station Univision on the Sunday before his inauguration, he accused Hugo Chávez of having "impeded progress in the region" and "exporting terrorist activities".

These remarks were unusually hostile and threatening even by the previous administration's standards. They are also untrue and diametrically opposed to the way the rest of the region sees Venezuela. The charge that Venezuela is "exporting terrorism" would not pass the laugh test among almost any government in Latin America.

José Miguel Insulza, the Chilean secretary general of the Organisation of American States, was speaking for almost all the countries in the hemisphere when he told the US Congress last year that "there is no evidence" and that no member country, including the US, had offered "any such proof" that Venezuela supported terrorist groups.

Nor do the other Latin American democracies see Venezuela as an obstacle to progress in the region. On the contrary, President Lula da Silva of Brazil, along with several other presidents in South America, has repeatedly defended Chávez and his role in the region. Just a few days after Obama denounced Venezuela, Lula was in Venezuela's western state of Zulia, where he emphasised his strategic partnership with Chávez and their common efforts at regional economic integration.

Obama's statement was no accident. Whoever fed him these lines very likely intended to send a message to the Venezuelan electorate before last Sunday's referendum that Venezuela won't have decent relations with the US so long as Chávez is their elected president. (Voters decided to remove term limits for elected officials, paving the way for Chávez to run again in 2012.)

There is definitely at least a faction of the Obama administration that wants to continue the Bush policies. James Steinberg, number two to Hillary Clinton in the state department, took a gratuitous swipe at Bolivia and Venezuela during his confirmation process, saying that the US should provide a "counterweight to governments like those currently in power in Venezuela and Bolivia which pursue policies which do not serve the interests of their people or the region."

Another sign of continuity is that Obama has not yet replaced Bush's top state department official for the western hemisphere, Thomas Shannon.

The US media plays the role of enabler in this situation. Thus the Associated Press ignores the attacks from Washington and portrays Chávez's response as nothing more than an electoral ploy on his part. In fact, Chávez had been uncharacteristically restrained. He did not respond to attacks throughout the long US presidential campaign, even when Hillary Clinton and Joe Biden called him a "dictator" or Obama described him as "despotic" - labels that no serious political scientist anywhere would accept for a democratically elected president of a country where the opposition dominates the media. He wrote it off as the influence of South Florida on US presidential elections.

But there are few if any presidents in the world that would take repeated verbal abuse from another government without responding. Obama's advisers know that no matter what this administration does to Venezuela, the press will portray Chávez as the aggressor. So it's an easy, if cynical, political calculation for them to poison relations from the outset. What they have not yet realised is that by doing so they are alienating the majority of the region.

There is still hope for change in US foreign policy toward Latin America, which has become thoroughly discredited on everything from the war on drugs to the Cuba embargo to trade policy. But as during the Bush years, we will need relentless pressure from the south. Last September the Union of South American Nations strongly backed Bolivia's government against opposition violence and destabilisation. This was very successful in countering Washington's tacit support for the more extremist elements of Bolivia's opposition. It showed the Bush administration that the region was not going to tolerate any attempts to legitimise an extra-legal opposition in Bolivia or to grant it special rights outside of the democratic political process.

Several presidents, including Lula, have called upon Obama to lift the embargo on Cuba, as they congratulated him on his victory. Lula also asked Obama to meet with Chávez. Hopefully these governments will continue to assert - repeatedly, publicly and with one voice - that Washington's problems with Cuba, Bolivia and Venezuela are Washington's problems, and not the result of anything that those governments have done. When the Obama team is convinced that a "divide and conquer" approach to the region will fail just as miserably for this administration as it did for the previous one, then we may see the beginnings of a new policy toward Latin America.

Mark Weisbrot is Co-Director of the Center for Economic and Policy Research (CEPR), in Washington, DC. His column is distributed to newspapers by McClatchy-Tribune Information Services.

reposted from:

http://www.commondreams.org/view/2009/02/19-13

Thursday, February 19, 2009

Prosecute the brutes!!!

In his latest piece for Salon, former constitutional and civil rights lawyer Glenn Greenwald makes a typically excellent case for vigorously pursuing the prosecution of all the woeful war criminals and torturing troglodytes in the Bush Administration (I doubt Obama will take his advice, but he certainly should):

Do we still pretend that we abide by treaties?

by Glenn Greenwald

On Friday in Salon, Joe Conason argued that there should be no criminal investigations of any kind for Bush officials "who authorized torture or other outrages in the 'war on terror'." Instead, Conason suggests that there be a presidential commission created that is "purely investigative," and Obama should "promis[e] a complete pardon to anyone who testifies fully, honestly and publicly." So, under this proposal, not only would we adopt an absolute bar against prosecuting war criminals and other Bush administration felons, we would go in the other direction and pardon them from any criminal liability of any kind.

I've already written volumes about why immunizing political officials from the consequences for their lawbreaking is both destructive and unjust -- principally: the obvious incentives which such immunity creates (and, for decades, has been creating) for high-level executive branch officials to break the law and, even worse, the grotesque two-tiered system of justice we've implemented in this country (i.e., the creation of an incomparably harsh prison state for ordinary Americans who commit even low-level offenses as contrasted with what Conason calls, approvingly, "the institutional reluctance in Washington to punish political offenders"). Rather than repeat those arguments, I want to focus on an issue that pro-immunity advocates such as Conason simply never address.

The U.S. really has bound itself to a treaty called the Convention Against Torture, signed by Ronald Reagan in 1988 and ratified by the U.S. Senate in 1994. When there are credible allegations that government officials have participated or been complicit in torture, that Convention really does compel all signatories -- in language as clear as can be devised -- to "submit the case to its competent authorities for the purpose of prosecution" (Art. 7(1)). And the treaty explicitly bars the standard excuses that America's political class is currently offering for refusing to investigate and prosecute: "No exceptional circumstances whatsoever, whether a state of war or a threat of war, internal political instability or any other public emergency, may be invoked as a justification of torture" and "an order from a superior officer or a public authority may not be invoked as a justification of torture" (Art. 2 (2-3)). By definition, then, the far less compelling excuses cited by Conason (a criminal probe would undermine bipartisanship and distract us from more important matters) are plainly barred as grounds for evading the Convention's obligations.

There is reasonable dispute about the scope of prosecutorial discretion permitted by the Convention, and there is also some lack of clarity about how many of these provisions were incorporated into domestic law when the Senate ratified the Convention with reservations. But what is absolutely clear beyond any doubt is that -- just as is true for any advance promises by the Obama DOJ not to investigate or prosecute -- issuing preemptive pardons to government torturers would be an unambiguous and blatant violation of our obligations under the Convention. There can't be any doubt about that. It just goes without saying that if the U.S. issued pardons or other forms of immunity to accused torturers (as the Military Commissions Act purported to do), that would be a clear violation of our obligation to "submit the [torture] case to [our] competent authorities for the purpose of prosecution." Those two acts -- the granting of immunity and submission for prosecution -- are opposites.

And yet those who advocate that we refrain from criminal investigations rarely even mention our obligations under the Convention. There isn't even a pretense of an effort to reconcile what they're advocating with the treaty obligations to which Ronald Reagan bound the U.S. in 1988. Do we now just explicitly consider ourselves immune from the treaties we signed? Does our political class now officially (rather than through its actions) consider treaties to be mere suggestions that we can violate at will without even pretending to have any justifications for doing so? Most of the time, our binding treaty obligations under the Convention -- as valid and binding as every other treaty -- don't even make it into the discussion about criminal investigations of Bush officials, let alone impose any limits on what we believe we can do.

What was all the Sturm und Drang about in 2003 over Bush's invasion of Iraq without U.N. approval, in violation of the U.N. charter? Wasn't it supposed to be a bad thing for the U.S. to violate its own treaties? What happened to that? Conason himself was actually one of the clearest and most emphatic voices presciently highlighting the deceit on which the pro-war case was based, stridently warning of "ruined alliances and damaged institutions." Why, then, is it acceptable now to ignore and violate our treaty obligations with regard to torture and other war crimes committed by high-level Bush officials? What's the argument for simply pretending that these obligations under the Convention don't exist?

* * * * *

On a related note, Conason, in the very first paragraph of Friday's article, plainly misstated the results of a new Gallup poll on the question of whether Bush officials should be prosecuted and/or investigated. I have no doubt it was unintentional, but his error highlights a very important point about how this debate has proceeded. Here's what Conason wrote in his first paragraph (emphasis added):

More than 60 percent of Americans believe that alleged abuses and atrocities ordered by the Bush administration should be investigated either by an independent commission or by federal prosecutors, according to a poll released yesterday by the Gallup Organization. A significant minority favors criminal sanctions against officials who authorized torture or other outrages in the "war on terror" -- yet a considerably larger minority of nearly 40 percent prefers that the Obama administration leave its wayward predecessors be.


That last assertion (the one I bolded) is simply untrue. As Jim White notes here, the Gallup poll asked about three different acts of Bush lawbreaking: (1) politicization of DOJ prosecutions, (2) warrantless eavesdropping on Americans, and (3) torture. For each crime, it asked which of three options respondents favored: (1) a criminal investigation by the DOJ; (2) a non-criminal, fact-finding investigation by an independent panel; or (3) neither. The full results are here.

For all three separate acts of alleged crimes, the option that receives the most support from Americans is criminal investigations (i.e., the exact opposite of what Conason wrote). And the percentage that favors that nothing be done is in every case less than the percentage that wants criminal investigations, and the "do-nothing" percentage never reaches 40% or close to it (the highest it gets is 34% -- roughly the same minority of pro-Bush dead-enders that continue to support most of what was done).

As White notes, the breakdowns are even more revealing. For all three areas of lawbreaking, majorities of Democrats (which, by the way, is now the majority party) favor criminal investigations. For each of the three areas, more independents favor criminal prosecutions than favor doing nothing, and large majorities of independents -- ranging from 59% to 71% -- want either a criminal investigation or an independent fact-finding investigation. A Washington Post poll from a couple weeks ago found very similar results: majorities of Americans (and large majorities of Democrats) favor investigations into whether Bush officials broke the law and, by a wide margin, oppose the issuance of pardons to Bush officials.

Imagine what those numbers would be in a world where virtually every establishment political pundit -- literally: whether Democratic or Republican, liberal or conservative -- weren't uniting together to oppose prosecutions for torture and war crimes. Even with that unified anti-prosecution stance from a trans-partisan rainbow of Beltway opinion-makers, criminal investigations remain the leading position among Americans generally and among majorities of Democrats specifically. Those are just facts.

As is always the case, the mere fact that majorities of Americans believe X does not mean that X is right or true. But pundits, journalists and politicians should stop claiming that they're speaking for most Americans when they argue that we should just "move on" -- or that the belief in investigations is the province of the leftist fringe -- because that claim is demonstrably false.

Recall when opposition to the Iraq War and a demand for a withdrawal timetable was routinely depicted by the Beltway class as a "liberal" or even Far Left position -- even though large majorities of Americans held exactly those views. Apparently, the Far Left encompassed more than 60% of the country. Or recall when Time's Managing Editor, Rick Stengel, went on national TV and claimed that Americans don't want Bush officials and Karl Rove investigated for the U.S. Attorney scandal even when polls showed that large majorities of Americans favored exactly those investigations (a false claim which, to this day, Stengel refuses to retract).

That is the same flagrant distortion of public opinion that one finds here in the debate over investigations. The Washington Post's David Ignatius claims that a desire for investigations of Bush crimes is confined to "liberal score-settlers." Lindsey Graham asserts that only the "hard Left" wants criminal investigations. Newsweek's John Barry is certain that the desire for investigations is only about "vengeance, pure and simple."

Apparently, huge numbers of Americans -- majorities, actually -- are now liberal, vengeance-seeking, score-settlers from the Hard Left. What we actually have is what one finds again and again: establishment journalists who will resort to outright distortions about American public opinion in order to render it irrelevant, by claiming that "most Americans" believe as they believe even where, as here, that claim is categorically false. It's hardly surprising (except to an insular Beltway maven) that Americans, who know that they will be subjected to one of the world's harshest and most merciless criminal justice systems if they break the law, don't want political elites exempted from the rule of law. Imagine that.

* * * * *

Finally, Newsweek's Michael Isikoff -- echoing a report from John Yoo's Berkeley colleague, Brad DeLong -- reports that an internal DOJ probe (initiated during the Bush administration) has preliminarily concluded that Bush DOJ lawyers who authorized torture (John Yoo, Jay Bybee, Steven Bradbury) violated their professional duties as lawyers by issuing legal conclusions that had no good faith basis, and that this behavior will be referred to their state bar associations for possible disciplinary action. Those conclusions so infuriated the allegedly honorable Michael Mukasey that he refused to accept the report until changes were made. Now it is up to Eric Holder to accept and then release that report.

The implications of this event can't be overstated. One of the primary excuses offered by Bush apologists and those who oppose investigations is that Bush DOJ lawyers authorized the torture and opined that it was legal. But a finding that those lawyers breached their ethical obligations would mean, by definition, that the opinions they issued were not legitimate legal opinions -- i.e., that they were not merely wrong in their conclusions, but so blatantly and self-evidently wrong that they were issued in bad faith (with the intent to justify what they knew the President wanted to do, rather than to offer their good faith views of what the law permitted).

The Convention Against Torture explicitly prohibits the domestic legalization of torture, and specifically states that it shall not be a defense that government officials authorized it. So whether or not these legal opinions were issued in good faith is irrelevant to our obligations under that treaty to investigate and prosecute. But a finding that these legal opinions were issued in bad faith -- with the deliberate intent to knowingly legalize what was plainly criminal behavior -- will gut the primary political excuse for treating Bush officials differently than common criminals.

UPDATE: Citing numerous leading international law authorities, Valtin has an excellent discussion of the obligations the U.S. has to criminally investigate Bush crimes, not only under the Convention Against Torture but also under the Geneva Conventions. If we don't consider ourselves bound by the treaties we sign, we should just say so and abrogate them. Those demanding criminal immunity for Bush officials are advocating that we can and should violate our treaty obligations; they really ought to be honest about it.

UPDATE II: On June 28, 2004, George Bush commemorated the U.N. Day to Support Torture Victims and vowed that the U.S. "will investigate and prosecute all acts of torture and undertake to prevent other cruel and unusual punishment in all territory under our jurisdiction." In doing so, he specifically cited the U.S.'s binding obligation under the Convention to do so (h/t leftydem):

To help fulfill this commitment, the United States has joined 135 other nations in ratifying the Convention Against Torture and Other Cruel, Inhuman or Degrading Treatment or Punishment. America stands against and will not tolerate torture. We will investigate and prosecute all acts of torture and undertake to prevent other cruel and unusual punishment in all territory under our jurisdiction. American personnel are required to comply with all U.S. laws, including the United States Constitution, Federal statutes, including statutes prohibiting torture, and our treaty obligations with respect to the treatment of all detainees. . . .

The United States also remains steadfastly committed to upholding the Geneva Conventions, which have been the bedrock of protection in armed conflict for more than 50 years. . . . [W]e will not compromise the rule of law or the values and principles that make us strong. Torture is wrong no matter where it occurs, and the United States will continue to lead the fight to eliminate it everywhere.

If George Bush, citing our obligations under the Convention Against Torture and the Geneva Conventions, can publicly vow that "we will investigate and prosecute all acts of torture," why can't Democratic politicians and liberal pundits simply cite the same treaty obligations and make the same commitment?

Originally posted on Monday Feb. 16, 2009

reposted from:

http://www.salon.com/opinion/greenwald/2009/02/16/treaties/index.html

Tuesday, February 17, 2009

The cruelly cold cult of celebrity

In his latest superbly memorable and terrifically thought-provoking article Chris Hedges delivers a brutally incisive dissection of the baleful narcissism that sits so repellently at the heart of the pursuit and veneration of modern fame. He writes that "Celebrity culture is about the denial of death" and "the illusion of immortality", which ultimately "plunges us into a moral void".

Here's the article in full:

Fame! I Wanna Live Forever: How Narcissism Conquered Reality

By Chris Hedges, Truthdig.

I visited the Hollywood Forever Cemetery in Los Angeles a few days ago. It is advertised as "the final resting place to more of Hollywood's founders and stars than anywhere else on Earth." The 60-acre cemetery holds the remains of 135 Hollywood luminaries, including Rudolph Valentino, Tyrone Power, Cecil B. DeMille, Douglas Fairbanks, Nelson Eddy, Peter Lorre, Mel Blanc and John Huston.

We all have gods, Martin Luther said, it is just a question of which ones. And in American society, our gods are often celebrities. Religious belief and practice are commonly transferred to the adoration of celebrities. Our celebrity culture builds reliquaries and shrines to celebrities the way Romans built them for divine emperors, ancestors and household gods. We are a de facto polytheistic society. We engage in shamanism. Relics of celebrities, like relics of the dead among ancestor cults in Africa, Asia or the medieval Catholic Church, are coveted as magical talismans.

Hollywood Forever is next to Paramount Studios. The massive white HOLLYWOOD letters on the hillside tower above the tombs and Italian Renaissance-inspired marble buildings that hold rows of crypts. Maps with the locations of stars' graves, along with a glossy booklet of brief star biographies, are handed out at the gate. Tourists are promised visits with dead stars, who are referred to as "residents." The cemetery, which has huge marble monuments to the wealthy and the powerful, is divided into sections with names like "Garden of Eternal Love" and "Garden of Legends." It has two massive marble mausoleums, including the Cathedral Mausoleum, with 6,000 crypts -- the largest mausoleum in the world when it was built in the 1930s. Most of the celebrities, however, have simple bronze plaques that seem to indicate a yearning for the anonymity denied to them in life.

The cemetery, established in 1899 and called Hollywood Memorial Park, fell into disrepair and neglect some eight or nine decades after it was opened. By the 1990s, some families, including relatives of the makeup artist Max Factor, paid to have their loved ones removed from the grounds. By April 1996, the property was bankrupt. The cemetery was months away from being condemned. It was bought by Tyler Cassidy and his brother Brent, who renamed it Hollywood Forever Cemetery and began a marketing campaign around its celebrity "residents." The brothers established the "Forever Network," in which the noncelebrity departed could, at least in death, be the stars of their own home movies. The cemetery Web site archives the video tributes. "Families, young and old, are starting their LifeStories now, and adding to them as the years pass," the cemetery's brochure states. "What this means -- having our images, voices, and videos available for future generations -- has deep importance, both sociologically and for fully celebrating life." At funerals these specially produced tributes, which often include highlights from home videos, are shown on screens next to the caskets of the deceased. The cemetery's business is booming.

It costs a lot to be buried near a celebrity. Hugh Hefner reportedly paid $85,000 to reserve the crypt next to Marilyn Monroe at Los Angeles' Westwood Village Memorial Park Cemetery. The "prestige service" offered by Hollywood Forever runs $5,400. Jay Boileau, the executive vice president of the cemetery, conceded that getting a crypt near Valentino costs even more, although he said he did not have the price list with him. "We have sold most of them," he said of the crypts near Valentino. "Visits to his crypt are unique. Every year we hold a memorial service for him on the day he passed away. He was the first true sex symbol. Ten thousand people came to his funeral. He was the first Brad Pitt. He was the first true superstar in film and the greatest screen lover."

In celebrity culture, the object is to get as close to the celebrity as possible. Those who can touch the celebrity or own a relic of the celebrity hope for a transference of celebrity power. They hope for magic. We seek tangible artifacts of celebrity power from autographs or pictures or objects once owned by the celebrity. Celebrity items from Princess Diana's old dresses to Swatch watches once owned by Andy Warhol (that originally sold for $40) are auctioned off for thousands of dollars. Pilgrims travel to celebrity shrines. Graceland receives 750,000 visitors a year. Hard Rock Cafe has built its business around this yearning for intimacy with the famous. It ships reliquaries of stars from one restaurant to another the way the medieval church shipped the bones and other remains of saints to its cathedrals. Charlie Chaplin's corpse, like that of Evita Peron, was stolen and held for ransom. John Wayne's family, fearing grave robbers, did not mark his grave until 20 years after his death. The headstones of James Dean, Dylan Thomas, Sylvia Plath, Buddy Holly and Jim Morrison have all been uprooted and carted away.

Buses wind their way through the Hollywood Hills so tourists can gawk at the walls that barricade the homes of the famous. The celebrity interview or profile, pioneered on television by Barbara Walters and now a ubiquitous part of the news and entertainment industry, gives us the illusion that we are intimately related to celebrities as well as the characters they portray. In celebrity culture, we seek to validate ourselves through these imaginary relationships with celebrities. Real life, our own life, is viewed next to the lives of celebrities as inadequate and inauthentic. Celebrities are portrayed as idealized forms of ourselves. It is we, in perverse irony, who are never fully actualized in a celebrity culture.

Soldiers and Marines speak of entering combat as if they are entering a movie, although if they try to engage in movie-style heroics they often are killed. The difference between the celebrity-inspired heroics and the reality of war, which takes less than a minute in a firefight to grasp, is jolting. Wounded Marines booed and hissed John Wayne when he visited them in a hospital in World War II. They had uncovered the manipulation and self-delusion of celebrity culture. They understood that mass culture is a form of social control, a way to influence behavior that is self-destructive.

Neal Gabler writes in "Life: The Movie: How Entertainment Conquered Reality" that the power of celebrity culture means we often seek to enact the movies that play inside our heads. We become celebrities, at least privately, to ourselves. Celebrity culture is so ubiquitous that it has established perverse interior personal scripts and modes of speech through which our relationship with the world is often constructed. Gabler argues that celebrity culture is not a convergence between consumer culture and religion but instead is a hostile takeover of religion by celebrity culture. Commodities and celebrity culture alone define what it means to belong to American society, how we recognize our place in society and how we determine our spiritual life. Celebrity culture is about the denial of death. It is about the illusion of immortality. The portal to Valhalla is through the celebrity.

Celebrity worship is dressed up in the language of the Christian right, the frenzy around political messiahs like Barack Obama or the devotional following of Oprah by millions of women. If Jesus and "The Purpose Driven Life" won't make us a celebrity, then Tony Robbins or reality television will. We are waiting for our cue to walk on stage and be admired and envied.

Personal style has become a compensation for our loss of democratic equality. Our choice of brands becomes our pathetic expression of individuality. Celebrity is the vehicle used by a corporate society to sell us these branded commodities, most of which we do not need. Celebrities humanize commercial commodities. They are the familiar and comforting faces of the corporate state. Advertisers use celebrities to promise us that through the purchase of a product we can attain celebrity power. Wear Nikes and become, in some way, Michael Jordan.

Celebrity culture plunges us into a moral void. The highest achievements in a celebrity culture are wealth, sexual conquest and fame. It does not matter how these are obtained. These values, as Sigmund Freud understood, are illusory. They are hollow. They are hallucinations. They leave us chasing vapors. They encourage a perverted form of narcissism. They urge us toward a life of self-absorption. They tell us that existence is to be centered on the practices and desires of the self rather than the common good.

The most moving memorial in the Hollywood Forever Cemetery is held in a small glass case containing the cremated remains of the actor David White and his son Jonathan White. David White played Larry Tate, the Machiavellian advertising executive, on the television show "Bewitched" and also had a long stage career. He was married to the actress Mary Welch, who died during a second childbirth in 1958. David was left to raise Jonathan. Next to the urns are pictures of the father and young boy. There is one of Jonathan in a graduation gown, the father's eyes directed upward toward his son's face. Jonathan died at 33, a victim of the 1988 bombing of Pan Am Flight 103 over Lockerbie, Scotland. His father was devastated. He entered into a long period of mourning and seclusion. He died of a heart attack shortly before the two-year anniversary of his son's death. The modest memorial is a simple and poignant veneration of the powerful bond between a father and a child. It defies the celebrity culture around it. It speaks to other values, to loss, to grief, to mortality and to the awful fragility of life. It is a reminder in a sea of kitsch of the beauty of love.

Celebrity culture encourages us to turn our love inward, to think of ourselves as potential celebrities who possess unique if unacknowledged gifts. It is the culture of narcissism. It is about the hyperinflation of the ordinary. The banal chatter of anyone, no matter how insipid, has in celebrity culture cosmic significance. This chatter fills the airwaves. Reality, however, exposes something very different. And the juxtaposition of the impossible illusions inspired by celebrity culture and our insignificant individual achievements leads to frustration, anger, insecurity and a fear of invalidation. It leads to an accelerated flight toward the celebrity culture, what Chris Rojek in his book "Celebrity" calls "the cult of distraction that valorizes the superficial, the gaudy, the domination of commodity culture."

This cult of distraction, as Rojek points out, masks the real disintegration of culture. It conceals the meaninglessness and emptiness of our own lives. It deflects the moral questions arising from mounting social injustice, growing inequalities, and costly imperial wars as well as economic and political corruption. Shamanism is not only the currency of celebrity culture; it is the currency of totalitarian culture. And as we sink into an economic and political morass, we are controlled, manipulated and distracted by the celluloid shadows on the wall of Plato's cave. The fantasy of celebrity culture is not designed simply to entertain. It is designed to keep us from fighting back, even, apparently, in death.

Chris Hedges, a Pulitzer prize-winning reporter, is a Senior Fellow at the Nation Institute. His latest book is Collateral Damage: America's War Against Iraqi Civilians.

Originally posted February 12, 2009.

reposted from:

http://www.alternet.org/mediaculture/126497/fame!_i_wanna_live_forever%3A_how_narcissism_conquered_reality/?page=entire

The US invasion and occupation of Iraq has wrought a land of such pathological perversions that even "good" news is bad

Iraq's Gravedigging Industry Is at 100% Full Employment

By Dahr Jamail, IPS News.

BAGHDAD, Feb 5 (IPS) -- Amidst the soaring unemployment in Iraq, the gravediggers have been busy. So busy that officials have no record of the number of graves dug; of the real death toll, that is.

"I've been working here four years," a gravedigger who gave his name as Ali told IPS at the largest cemetery in Baghdad, a sprawling expanse in the Abu Ghraib section of the capital city. "In 2006 and some of 2007, we buried 40-50 people daily. This went on for one-and-a-half years.

"Twenty-five percent of these were from violence, and another 70 percent were killed by the Mehdi Army (the militia of Shia cleric Muqtada Al-Sadr)." Only a few appeared to have died from natural causes.

"Most of the dead were never logged by anyone," Ali said, "because we didn't check death certificates, we just tried to get the bodies into the ground as quickly as possible."

An Iraqi Army checkpoint was set up outside the vast cemetery a year ago.

"We opened this checkpoint because people were burying the dead and no information was being given to anyone," a soldier, speaking on condition of anonymity because he was not authorised to speak to the media, told IPS.

"Most of this (lack of reporting the dead), we found, happened during 2006," the soldier added. "Anyone could be buried here, and nobody would know about it."

Not far, in the Al-Adhamiya area of Baghdad, what used to be a park is now a cemetery with more than 5,000 graves. According to the manager, most of the dead are never counted.

"Most of the bodies buried here are never reported in the media," Abu Ayad Nasir Walid, 45, manager of the cemetery told IPS. He has been the manager here since the park was converted into a cemetery amidst the bloodletting from sectarian violence in early 2006.

"I have the name here of the first martyr buried," Walid said, pointing to a tombstone. "Gaith Al-Samarai, buried on 21 May 2006, he was the sheikh of the Al-Hurria mosque."

Walid produced the cemetery logbook. "As of this hour, exactly, there are 5,500 bodies in this place. I log their names in my book, but we've never had anyone come from the government to ask how many people are here. Nobody in the media nor the Ministry of Health seems to be interested."

Such graveyards, and there are many, raise questions about the real death toll in Iraq.

The last serious study, by a group of doctors in the U.S. and Iraq, was published in the British peer-reviewed medical journal The Lancet on Oct. 11, 2006.

The study said about 655,000 Iraqis (2.5 percent of the population) had been killed as a direct result of the invasion and occupation. The research was carried out on the ground by doctors moving from house to house, questioning families, and examining death certificates.

Homes were surveyed in 47 separated clusters across Iraq. The Lancet says the study, carried out by the Johns Hopkins Bloomberg School of Public Health in Baltimore in the U.S., has been validated by four independent experts.

The worst of the violence followed the Feb. 22, 2006 bombing of the Al-Askari shrine in Samarra. The bombing of one of the most sacred Shia mosques in the world sparked sectarian violence that lasted months, with sometimes more than 300 killed in a day.

"During that time we buried 30-40 bodies daily," Sehel Abud Al-Latif, a gravedigger at the Al-Adhamiya cemetery told IPS. "Often we had to work through the night, otherwise the bodies would just remain outside."

Some estimates of the death toll have been considerably lower than that of The Lancet. The group Iraq Body Count (IBC), which describes itself as an "ongoing human security project," estimates the number to be 98,850 as of the time of this writing.

The group says on the methodology: "Deaths in the database are derived from a comprehensive survey of commercial media and NGO-based reports, along with official records that have been released into the public sphere."

IBC adds that figures are included from "incident-based accounts to figures from hospitals, morgues and other documentary data-gathering agencies."

The website adds, however, that "IBC's main sources are information gathering and publishing agencies, principally the commercial news media who provide web access to their reports." Also, the IBC only records violent deaths, and only those of civilians.

The unofficial cemeteries around Iraq hold their own additions to the numbers doing the rounds. And no one knows what these add up to.

Dahr Jamail is an independent journalist who reports from Iraq.

reposted from:

http://www.alternet.org/waroniraq/125484/iraq%27s_gravedigging_industry_is_at_100%25_full_employment/

Can economic collapse give rise to a new ethical paradigm?

Here's an excellent new article by Peter Singer in which he outlines his hope, and naturally also his reservations, about the possibility of forging a reassessment of ethical values as a consequence of the financial turmoil currently roiling the globe:

Published on Monday, February 16, 2009 by The Guardian/UK

Un Nouveau Capitalisme Est-il Possible?

by Peter Singer

Is the global financial crisis an opportunity to forge a new form of capitalism based on sound values? So Tony Blair and the French president Nicolas Sarkozy appear to think. At a symposium in Paris last month entitled New World, New Capitalism, Sarkozy described capitalism based on financial speculation as "an immoral system" that has "perverted the logic of capitalism". He argued that capitalism needs to find new moral values and to accept a stronger role for governments. Blair called for a new financial order based on "values other than the maximum short-term profit".

It is surprising how readily politicians of all parties - even strong ideological defenders of the unregulated market - accepted the idea that the state should bail out banks and insurance companies when they got into trouble. With the exception of a small number of ideologically committed defenders of free enterprise, few were willing to take the risks inherent in letting major banks collapse.

Who knows what the consequences would have been? Many feared mass unemployment, a tidal wave of bankruptcies, millions of families evicted from their homes, the social safety net strained to the breaking point, and perhaps even riots and a resurgence of the political extremism that brought Hitler to power in Germany during the depression of the 1930s.

The choice to save the banks from the financial consequences of their own errors indicates a shift in values away from belief in the wisdom of the market. Evidently, the market got some things - like the value of certain financial securities - horrendously wrong. But will the downturn also produce a deeper shift in the values of consumers?

It is no accident that the "New World, New Capitalism" symposium was held in France, where some critics have seen the global financial crisis as necessary and desirable precisely because it is producing this change in values. In the newspaper Le Figaro, a section on how to scale back one's expenses predicted a "revolution in values" and claimed that people will put family ahead of work. (Americans think the French, with their shorter working hours and longer summer vacations, already put family ahead of work.)

The French have always been less likely to go into debt - when they pay with plastic, they tend to use debit cards, drawing on funds they already have, rather than credit cards. Now they see the current crisis as a vindication of the value of not spending money that you don't have.

That means, in many cases, less luxury spending - something that is hard to reconcile with the image of France as the country of fashion, perfume, and champagne. But excess is out of style, and there are reports of cutbacks in luxury goods everywhere. Richemont, the Swiss luxury goods company that owns the Cartier and Montblanc brands, has said that it is facing "the toughest market conditions" since its formation 20 years ago. But does this mark an enduring change in values, or just a temporary reduction, forced upon consumers by investment losses and greater economic uncertainty?

In his inauguration speech, Barack Obama said: "The time has come to set aside childish things" and instead to choose the noble idea that "all are equal, all are free, and all deserve a chance to pursue their full measure of happiness." It would be an excellent thing if the global financial crisis restored a proper sense of what is important.

Could the crisis remind us that we buy luxury items more because of the status they bring than because of their intrinsic value? Could it help us to appreciate that many things are more central to our happiness than our ability to spend money on fashion, expensive watches, and fine dining? Could it even, as Obama suggests, make us more aware of the needs of those who are living in real poverty and are far worse off than we will ever be, financial crisis or no financial crisis?

The danger is that the potential for a real change in values will be co-opted, as has happened so often before, by those who see it as just another opportunity to make money. The designer Nathalie Rykiel is reportedly planning to show the new Sonia Rykiel collection in March not in the usual vast rented area, but in the smaller space of her own boutique. "It's a desire for intimacy, to go back to values," she told the International Herald Tribune. "We need to return to a smaller scale, one that touches people. We will be saying, 'Come to my house. Look at and feel the clothes.'"

Ah yes: in a world in which ten million children die every year from avoidable, poverty-related causes, and greenhouse gas emissions threaten to create hundreds of millions of climate refugees, we should be visiting Paris boutiques and feeling the clothes. If people were really concerned about defensible moral values, they wouldn't be buying designer clothes at all. But what are the chances of Nathalie Rykiel - or the affluent elites of France, or Italy, or the United States - adopting those values?

reposted from:

http://www.commondreams.org/view/2009/02/16-2

Monday, February 2, 2009

A few book reviews

For my first ever blog entry, something of a milestone you might say, I've decided to resurrect three book reviews I wrote a few years ago. These are what you might call oldies but, in my ever so humble opinion, also hopefully goodies. Here goes, as they say in downtown Tikrit...

Godard: A Portrait of the Artist at Seventy by Colin MacCabe

This dexterous and diligent biography of the greatest filmmaker of the twentieth century is to be treasured for its wide conceptualising scope, analytical depth and engaging tone. MacCabe is clearly a great admirer of Godard and makes this plain throughout a tome whose main textual body runs to more than 350 pages. Sometimes MacCabe is too starry-eyed and glides over some unflattering peccadilloes that, while brave of him to refer to in the first place, could have been delved into with more scalpel-like precision or, perhaps, even merciless excoriation. However, for MacCabe, an almost devotionally deep admirer of the filmmaker, probing these unflattering revelations further, regardless of their truth, would probably have sat far too uneasily with him.

Jean-Luc Godard has always been a kind of seer and in latter years, if not from his breathtaking outset, the curmudgeonly conscience of world cinema. Along with his Nouvelle Vague cohorts, always the loosest of collectives, he transformed the production and distribution of motion pictures and rode the large crest of a wave carrying sublimely superb international cinema. The sixties were heady and experimental times, or so at least the historical cliché assures us, but the reality, as embodied particularly in European cinema of the time, leaves little doubt as to the veracity of this now obdurately overused, though admittedly inescapable, claim. Things have turned sour since, but Godard presses on into his dotage and possibly even towards greater obscurity, although, in a heartening reversal of the trend towards mindless memory loss and vapid spectacle, Godard’s last two films have been internationally released and in England there has been a spate of new box sets of his older films and releases of work, such as La Chinoise, long previously unavailable.

To Godardophiles much of the story is known, but to have one book string the saga together in chronological order, with detailed information on his early life as well as turning a methodical eye to the peripheral figures who either influenced Godard or were instrumental in the realisation of his oeuvre, is to be blessed with a work of singular significance hitherto never even attempted in English. MacCabe has clearly done his homework and rarely delivers the sort of blandly cursory overview of films, many of them (the 70s work in particular) unavailable to anything resembling a wide public, that many critics display towards the perceived minor works of an artist’s legacy and, ironically, are often even more inclined to show towards those that are well known. He never descends to the level of press release shallowness and when he says next to nothing about a film, such as Helas Pour Moi (1993) or Forever Mozart (1996), one can only assume he hasn’t seen them, or, far less likely, found little of value in them.

The book provides a thorough picture of Godard as a literature-loving young man, nonchalant rebel, incessant re-inventor of himself and his medium (he consistently stresses the point that they are inseparable) and as prophetic poet who moves with isolated gravity through the increasingly exiguous wasteland of global cinema, or what is left of it in the wake of the cupidinous colossus that is Hollywood. Godard has always been a man on a mission, but ruefully reflects in the book that the mistake of the New Wave filmmakers was to think that what they “were doing was a beginning”. American cinema was then dominant, and in a very selective way the young tendentious Turks writing for Cahiers du Cinema championed the more exciting cinema originating from across the large Atlantic pond. Godard borrowed generic tropes for many of his early films and experimentally gloried in the playful self-reflective fusion of American B-movies and his more personalised analytical, polemical and densely textual proclivities. For the last quarter of a century he has dropped the joie de vivre of genre allusions and somewhat settled down, if that is ever an appropriate description of Godard, as the sagacious elder statesman of sadly marginalised and resolutely art house cinema, in a guise manifesting itself in intricate and often inscrutably fragmented essay-like discourses between the past and present, image and sound, narrative and discourse, literature and cinema, and, possibly more important than all the rest, between the director and society. The heady confluence of elements is still there as it was from the beginning, but the restless zest of younger days seems depleted, though never lost, in the tender reification of an art that Godard uniquely mastered in a way others could not even dream of, let alone emulate.

One of the most melancholic passages in a book filled with them comes in the final pages, when MacCabe reflects on his inability to place Godard in a Pantheon of immortal artists in a self-devouring capitalist world ever more certain of its morbid mortality. Cinema has lost far more than it has gained since the joyous heyday of the New Wave, except on the level of technology and baleful corporate profits, and Godard, despite his proudly contrarian presence, embodies that loss more than anyone else. The hope is also still there, however, and for a new generation who may still want to give “ninety percent to ten percent of the audience”, as Godard once proclaimed his unofficial intention, the Swiss-born filmmaker provides one of the most noble examples of elevating art, not cold commerce, to the highest evaluation of achievement. The world audience may be an American audience, but Godard keeps alive the barely perceived belief that “cinema is truth twenty-four times a second” and that “to make a movie all you need is a gun and a girl”.

MacCabe has done an expertly accomplished job of documenting the times and life of a man so remarkable that twenty books, each twice as long as this one, would hardly do him full justice. Godard: A Portrait of the Artist at Seventy is both a pellucid primer for the amateur Godard enthusiast and, almost unthinkably at this juncture, nearly half a century since Godard’s feature debut, a source of fresh critical perspectives on one of the most written-about filmmakers of all time. The world may not prevail, as it almost certainly won’t, but in the time that remains both this book, and the artist so assiduously and enthusiastically at its centre, should remain entrenched on the very highest rung of the ideal hierarchy of cinematic greats.

The Culture of Terrorism by Noam Chomsky

While this might not be the best book to read if you’ve never before met this astounding intellect in print, it still serves to succinctly elucidate the most salient hallmarks of Chomsky’s approach to world affairs and, more specifically, his country’s foreign policy. These hallmarks include an incisive dissection of the subservience of intellectuals to state power; the flagrant hypocrisy of the US government, in this case the Reagan administration, whose public pronouncements project an image of inviolable nobility while their actions tell quite a different story; and the concentration of private power in a few hands, which underpins and makes possible these disturbing aspects of American intellectual and political culture.

The book began life as a “postscript” to a number of foreign editions of Chomsky’s Turning the Tide, which dealt with many of the same points raised in this book, though The Culture of Terrorism deals with the Iran-Contra scandals at some length where the earlier text did not. Although the actual facts, detailed with often exhausting rigour, are well out of date, one is thoroughly exposed to the brazen dereliction of basic journalistic duty by those that Chomsky derisorily refers to throughout as representatives of the Free Press. They fall so effortlessly in line with state doctrine that the achievements, again noted by Chomsky, would make a totalitarian regime proud. That this happens in one of the freest countries in the world is nothing short of sickeningly scandalous. In case there are those who think Chomsky is a conspiracy nut or a devotee of the school of hyperbole, he provides ample evidence showing that even the so-called liberal press, namely the New York Times and the New Republic, are guilty of obscene apologetics for, and often advocacy of, aggressive state terror.

The Culture of Terrorism deals predominantly with the campaign of subversion and harsh repression conducted by the Contras in Nicaragua, who were armed, trained, and constantly supplied throughout this terrible period by the US government. There were flights over the countryside on an almost daily basis, and the examples of their weaponry cited in the book would put most armies in other third world countries to shame, let alone the guerrilla forces who were fighting in nearby El Salvador, a country Chomsky also sketches in much socio-political detail. In 1979 the Nicaraguans overthrew the brutal dictator Somoza, a member of a dynasty stretching back to the middle of the 1920s, whose reign ended with a “paroxysm of violence claiming the lives of 40,000-50,000 people”. This tiny Central American nation then brought to power the leftist Sandinista government, which immediately caused the big neighbour to the North considerable consternation. The Reagan administration proceeded to destabilise this government by employing the Contras, many of them previously members of Somoza’s abysmally vicious National Guard, to raid innocent villages, destroy houses, steal livestock, and even kill Americans who had come to aid this miserably poor country, which was improving dramatically under the Sandinistas. These leaps ahead in terms of health care, education and reduction of poverty were documented at the time by aid agencies such as Oxfam, which compared the situation in Nicaragua with that of Guatemala and El Salvador. The picture created in the US media was quite different, however: that charnel house Guatemala, along with El Salvador, where political violence, including rapes, mutilations, torture, and ‘disappearances’, was endemic, were described as “fledgling democracies”.

Conversely, Nicaragua under the Sandinistas was portrayed by the Free Press as a totalitarian state that was one of the tentacles of the Soviet Union. How interesting that by ordering an economic embargo of Nicaragua, and forcing allies to do the same, the US left the Sandinistas with little choice but to turn to Russia for help, which then provided a retrospectively convenient basis for the Reagan administration to scream from the rooftops that the Evil Empire was upon them. It is also very intriguing, as illuminated by copious quotations from leading journals and newspapers, that a country such as Guatemala, where it is estimated that around 150,000 people may have been killed during the Reagan era, and El Salvador, the site of 50,000 politically motivated murders during the same period, raised no impassioned denunciations of their odious socio-political conditions, or even an acknowledgement of these figures cited by human rights organizations and specialists of the region. Ignorance is indeed strength, as Chomsky notes in a very apposite evocation of Orwell, whom he often refers to throughout the book. Indeed, the noted linguist creates for the reader a truly terrifying Orwellian world, all the more horrifying because it actually exists and is not merely an acutely perspicacious exercise in allegory: a world where “democracy” implies regimes friendly to US business interests and “moderates” are people such as El Salvadoran president José Duarte, who just happens to preside over a regime that assassinates archbishops, union leaders, students, journalists of opposition newspapers, and just about anyone who dares to question the economically polarising policies of this staunch proponent of the US “development model”, another term Orwell would be proud of, as the development in question applies to rich folk while the poor become demonstrably poorer, as is still much the case today in our world of ever “freer” markets.

The picture, as usual with Chomsky, is bleak, though when you have this much factual knowledge at your command, and none of the necessary illusions required of the mendacious elites, it is a tall task to be sanguine about world affairs, particularly those directed by the biggest terrorist state. The problem with reading a book published almost two decades ago about events that were then much publicized is that much of the currency is unavoidably lost. At the very least the book provides an abundantly extensive historical overview of a time not all that different from our own, the primary deviation being the names of the victims and perpetrators, and at its most elevated altitudes of significant scholarship The Culture of Terrorism cogently demystifies the key characteristics, established by the voluminous historical and documentary record, of the most influential institutions in US society. This has always been Chomsky’s greatest gift, and this book amply, though not definitively, showcases his remarkable ability not only to render events in breathtakingly astounding detail, but also to relate them always to a wider context of previous incidents and current practices.

This is not a book for those individuals who still foster illusions that the United States is the most benevolent super power the world has ever known. For those willing to look beyond the purposely constrained bounds of the mainstream media, as well as the limits of their own often self-willed ignorance, the book provides ample insights into past practices and their very grave implications for future conduct by the globe’s sole remaining hegemonic force. Chomsky may be less a voice in the wilderness than he was when the book was published, but still not enough people are hearing his extremely vital message.

The God Delusion by Richard Dawkins

Since his first publication, The Selfish Gene (1976), Richard Dawkins has injected his works with subtle and often overt jabs at religion. His latest book blazes forth with blatant, yet it must also be said rather effortless, controversy to make the case that religion is not merely illogical and wrong, but supremely dangerous for the world at large.

The God Delusion is good on demolishing traditional arguments for the existence of God, but less effective in grappling with sophisticated theology. Dawkins displays mesmerising contempt for ‘faith heads’, even the moderate ones, and is trenchantly incisive in dissecting the speciousness and hypocrisy of all religious faith while cogently making it clear that our moral sense derives from our evolutionary heritage and not some supernatural deity revealed to us in a holy book. The author pulls no punches in exposing the horrors committed in the name of religion, while he occasionally allows himself to be swept up in the awe engendered by science’s grand mission to render all great mysteries wondrously comprehensible. This book will rattle many cozy cages, as well it should, considering that so much of what passes as ‘faith’ is simply lazy adherence to the religion of our parents or immediate cultural surroundings. Dawkins contends, not altogether successfully, that the ‘how’ questions science so assiduously sets out to answer will always trump the ‘why’ questions that, by their woolly metaphysical nature, will ever remain unanswerable.

The ideas presented are communicated in a lucid and consistently logical style that, for all the book’s faults, is never anything less than insightful and, if you’ll excuse the expression, devilishly entertaining.