Why both pain and gratitude drive us to pray

A few days back the Rev. Giles Fraser had a very good column in the Guardian, prompted by a radio presenter's tweet, about praying after tragedies like the terrorist attack on Westminster Bridge.

Fraser explains why he opened up his church – only a short distance from the scene of the attack – and invited strangers in to pray:

Prayer is not a way of telling God the things he already knows. Nor is it some act of collective lobbying, whereby the almighty is encouraged to see the world from your perspective if you screw up your face really hard and wish it so. Forget Christopher Robin at the end of the bed. Prayer is mostly about emptying your head waiting for stuff to become clear. There is no secret formula. And holding people in your prayers is not wishful thinking. It’s a sort of compassionate concentration, where someone is deliberately thought about in the presence of the widest imaginable perspective – like giving them a mental cradling.

But above all, prayer is often just a jolly good excuse to shut up for a while and think. The adrenaline that comes from shock does not make for clear thinking or considered judgment. Those who rush to outrage say the stupidest things.

Naturally, I agree entirely. The only thing I would add is that because Fraser is writing in the context of a dark incident, he doesn’t touch on a key aspect of praying: being thankful.

Being grateful is really good for you. The Harvard Mental Health Letter has reported that:

Two psychologists, Dr. Robert A. Emmons of the University of California, Davis, and Dr. Michael E. McCullough of the University of Miami, have done much of the research on gratitude. In one study, they asked all participants to write a few sentences each week, focusing on particular topics.

One group wrote about things they were grateful for that had occurred during the week. A second group wrote about daily irritations or things that had displeased them, and the third wrote about events that had affected them (with no emphasis on them being positive or negative). After 10 weeks, those who wrote about gratitude were more optimistic and felt better about their lives. Surprisingly, they also exercised more and had fewer visits to physicians than those who focused on sources of aggravation.

Another leading researcher in this field, Dr. Martin E. P. Seligman, a psychologist at the University of Pennsylvania, tested the impact of various positive psychology interventions on 411 people, each compared with a control assignment of writing about early memories. When their week’s assignment was to write and personally deliver a letter of gratitude to someone who had never been properly thanked for his or her kindness, participants immediately exhibited a huge increase in happiness scores. This impact was greater than that from any other intervention, with benefits lasting for a month.

The article goes on to suggest ways of making oneself more grateful: writing thank-you notes, keeping a gratitude journal, and, yes, praying. When pastors and youth workers teach children to pray, they often use the mnemonic 'teaspoon' (tsp): it helps you remember to say thank you, sorry and please. The Christian tradition, you see, calls not only for an active prayer life but for one with a strong focus on being thankful.

There’s been a great deal of research on the objective, scientifically demonstrable benefits of meditation. That’s led to its repackaging and propagation as mindfulness, a technique that’s now used both as a form of medicine and as an aid to personal development. I wonder if we might see a similar body of research emerge around prayer.

And God made Darwin



“If your interpretation of the Bible contradicts demonstrable facts, then it is your interpretation that needs to change, not the facts.”

I’m not a biologist but I’m very confident that evolution by means of natural selection is a real thing. That’s partly because people who are biologists are themselves very confident of this fact and have been for a very long time. It’s also because evolution is an ongoing process we can see happening, for example, when bacteria evolve resistance to antibiotics.

There is only an ongoing debate about this notion because some people take it as a challenge to their religious faith. This is a strong trend within Islam. But as I’m a Christian, I’m going to focus on my own tradition.

I was moved to write about this by a TED talk by Dr. April Maskiewicz, an academic who teaches biology at a Christian university.

She reports doing straw polls of her students that indicate that 90% of them reject evolution. Their motivation appears to be a belief that one must choose between God and natural selection: a belief in one proposition apparently negating the possibility of the other. As a student, Dr. Maskiewicz herself was apparently told this by both her priest and her biology lecturer. She spends the 17 minutes of her talk patiently and politely disassembling this notion.

Allow me to be blunter. When one reads in Genesis that ‘God created the world in 6 days’, it is not the ‘in 6 days’ part that is significant. What matters is that he brought all things into existence. If he had the power to do that, then he clearly also has the power to set in motion a process whereby some matter coheres into simple life forms which then evolve into a host of organisms, including humans.

This is not some out-there liberal reinterpretation of the Bible. It is something believed by a figure as unbending as the last Pope, who said:

We cannot say: creation or evolution, inasmuch as these two things respond to two different realities. The story of the dust of the earth and the breath of God, which we just heard, does not in fact explain how human persons come to be but rather what they are. It explains their inmost origin and casts light on the project that they are. And, vice versa, the theory of evolution seeks to understand and describe biological developments. But in so doing it cannot explain where the ‘project’ of human persons comes from, nor their inner origin, nor their particular nature. To that extent we are faced here with two complementary—rather than mutually exclusive—realities.

Now it is true that accepting evolution means we can no longer credibly argue that ‘the only way for complex organisms to have come about is for a creator to have made them’. But all evolution does is push that kind of reasoning back a stage. We still have to explain how a universe where evolution could occur came about, and indeed why it appears to have been ‘fine-tuned’ for that purpose.

Besides, we’re in trouble if we start choosing evidence that fits our conclusion rather than the reverse. If your interpretation of the Bible contradicts demonstrable facts, then it is your interpretation that needs to change, not the facts.

Now, I would not want to give the impression that Dr. Maskiewicz’s students are typical of Christians. Indeed, given that ‘theistic evolution’ is supported by the Catholic Church, most mainline Protestant denominations and a large number of Eastern Orthodox adherents, it is likely to be the majority position.

Nonetheless, ‘Creationism’ is still an idea that creates problems. Dr. Maskiewicz recounts how, having been told that she had to decide between her faith and the clear evidence for evolution, she opted for evolution, and only found her way back by discovering that she had been presented with a false dichotomy.

And what sensible person wouldn’t do what she did? If a belief system demands believing that black is white then it deserves to be rejected. It, therefore, seems highly likely that there are people who would be Christians had they not come to associate it with anti-scientific bunkum. Creationism is thus a barrier to faith that ought to be demolished.

I’ll leave you with one final thought. The reaction against evolution among some Christians doubtless owes something to the fact that a number of prominent New Atheists are evolutionary biologists. Richard Dawkins is of course the most obvious example. But might the direction of causality also run the other way? Would Dawkins and his ilk still have become implacable opponents of religion if they hadn’t faced believers claiming that an incidental detail of a metaphor in Genesis refuted something demonstrated time and again by paleontology, genetics and zoology?


Hat tip: Rachel Held Evans

Love, courage and cosmology: my review of Interstellar and the Theory of Everything

I was very ready not to like The Theory of Everything. For all the world it looked like a very polite biopic designed to impress Oscar voters and the Best Exotic/King’s Speech crowd. This turned out to be correct. However, it’s also a seriously impressive piece of work.

The appeal of a story about a genius battling a horrifying disease is evident. Nonetheless, there is a temptation, with films about interesting and multifaceted lives, for the film to take an interest in every facet. Witness, for example, the meandering and unfocused Iron Lady. The Theory, however, finds a clear way through Hawking’s life by focusing on his relationship with his first wife, Jane Wilde. Their shared affection and pain provide a safe base from which we can explore Hawking’s science, his disease and his unlikely fame.

Yet ultimately it is this base that makes the Theory so fascinating. Eddie Redmayne and Felicity Jones deliver remarkable (and Oscar-nominated) performances. Redmayne has to depict not only his character but also the disease that completely alters his physicality. Even when apparently completely immobilised, Redmayne can still convey precisely what Hawking is thinking and feeling. However, it’s Jones who’s most remarkable. She shows just how difficult it was for Wilde to be the great woman behind a great man, and makes her seem strong without being saintly. Between them they produce a refreshingly different kind of story about a relationship: it’s not merely about falling in love but about love in its totality, from the beginning of a relationship to its end and beyond.

This intensely domestic story is an apparent contrast to the massive space opera that is Interstellar. Ideas about black holes or space-time which in the Theory of Everything are fleetingly sketched on a blackboard or explained with peas and potatoes are in Interstellar depicted on a massive scale and in detail. Yet pretty much as soon as I came out of the Theory of Everything, I’d bracketed the two films together. This was partly a matter of the role cosmology plays in both. In fact, Interstellar’s science advisor and executive producer Kip Thorne appears as a character in the Theory. He’s a collaborator and rival of Hawking’s, and at one stage wins a bet with Hawking and receives a year’s subscription to Penthouse as a prize!

More important, however, is the thematic connection. Both films are fundamentally about the determination to survive and how love can help us to find it. The quote from Dylan Thomas that Interstellar uses extensively – “Do not go gentle into that good night, Old age should burn and rave at close of day; Rage, rage against the dying of the light” – feels like a sentiment Hawking would probably have endorsed.

For all their similarities, and the extent to which I enjoyed both films, I felt the Theory is the more successful film. Interstellar has plenty of fine performances but none as majestic as Redmayne’s and Jones’s. There are also points where Interstellar falls victim to the inability of a human brain that evolved in a Newtonian world to cope with the reality of relativity: at points where the drama requires us to be engaged, we are kept at a distance by the instinctive parts of our brain protesting: “this isn’t how the universe works!”

Yet of course this is indeed precisely how the universe works. The proposition that time moves more slowly for objects exposed to stronger gravity appears upon initial presentation like utter nonsense. Yet this effect is sufficiently real that the calculations underlying the GPS on your phone must include a correction for the faster passage of time experienced by satellites in orbit, which sit at a remove from the earth’s gravity.
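Since the post leans on this GPS fact, a back-of-envelope sketch may help. The constants below are standard physical values and the orbit radius is the nominal GPS figure; the calculation itself is my own illustration, not something taken from either film:

```python
import math

G_M = 3.986004418e14   # Earth's gravitational parameter GM, m^3/s^2
C = 299792458.0        # speed of light, m/s
R_EARTH = 6.371e6      # mean Earth radius, m
R_SAT = 2.656e7        # nominal GPS orbit radius (~26,560 km), m
DAY = 86400.0          # seconds per day

# Gravitational term: weaker gravity at altitude makes satellite clocks
# run FAST relative to clocks on the ground.
grav = G_M / C**2 * (1 / R_EARTH - 1 / R_SAT) * DAY

# Special-relativistic term: the satellite's orbital speed makes its
# clock run slow, partially offsetting the gravitational effect.
v = math.sqrt(G_M / R_SAT)            # circular-orbit speed, m/s
vel = -v**2 / (2 * C**2) * DAY

net_microseconds = (grav + vel) * 1e6
print(round(net_microseconds, 1))     # satellite clocks gain ~38 us/day
```

Without this roughly 38-microsecond daily correction, position errors would accumulate at kilometres per day, since light travels about 300 metres per microsecond.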

This is a point that both films illustrate magnificently: the human capacity to imagine that which we cannot experience. The substance of Thorne and Hawking’s bet was whether a mysterious source of X-rays known as Cygnus X-1 would turn out to be a black hole. At that point, not only was there uncertainty about whether Cygnus X-1 was a black hole; no one had yet been able to positively identify anything as a black hole. Yet by looking at the mathematics arising from Einstein’s work, Hawking and Thorne could be reasonably sure black holes did indeed exist. Thorne apparently gave some of these equations to the special effects team working on Interstellar, who used them to create the images of wormholes and black holes seen in the film. So through the power of both science and cinema we can look upon something that exists but which may never actually be seen by human eyes. Interstellar argues that it is this ability that allows us to invent paths out of apparently futile situations, while the Theory of Everything shows how it allows a man confined to a wheelchair to reach the farthest reaches of the cosmos.

Just how implausible are AIDS conspiracy theories?

Quite a few conspiracy theories surround HIV/AIDS. Most destructively, there is the notion, tragically adhered to by former South African president Thabo Mbeki, that AIDS is not actually caused by HIV. There are also significant numbers who believe that it was created deliberately:

According to a 2005 survey of African Americans living in the US, almost 50% of the respondents believed that HIV was manufactured in a lab. Furthermore, over 25% believed that this was done by the government. A significant number also believed that it was created in order to control the population of black people/homosexuals.

In general I’m quite interested in conspiracy theories (or in their debunking at least), but this latter idea was one I’d paid little attention to. The risks faced by the purported perpetrators and the degree of cruelty required of them seemed – even by the standards of conspiracy theories – outlandish. It’s also probably true that as a white graduate living in Berkshire I hear less about this theory than about, say, those around JFK or 9/11.

Therefore, it was not until I read this article on the origins of HIV at I Fucking Love Science that I came to appreciate quite how implausible it is:

Some of the earliest documented cases of HIV were in the late 1950s; it’s absurd to think that scientists would have had the knowledge or technology to create viruses back then. We only identified the structure of DNA in 1953. We’ve only just managed to create the first synthetic bacterial genome, let alone create a virus from scratch.

Creating a virus would require knowledge of genetic manipulation. We simply did not have the expertise to be able to achieve something like this at that time.

That puts this theory into its own league of detachment from reality. The CIA would at least theoretically have had the capacity to shoot JFK or hijack planes on 9/11, whereas even that claim cannot be made for the supposed nefarious creators of HIV.


P.S. In case you’re interested, the article winds up concluding that there “exists an overwhelming amount of evidence to suggest that HIV arose from cross-species transmission of closely related viruses that are found naturally in various primate hosts in Africa” and suggests this most likely occurred when these primates were hunted for bushmeat or kept as pets.

If you think you are left-brained or right-brained, what you really are is gullible


Internet quizzes are giving a new lease of life to a popular piece of junk science. Live Science explains why it’s still nonsense:

Popular culture would have you believe that logical, methodical and analytical people are left-brain dominant, while the creative and artistic types are right-brain dominant. Trouble is, science never really supported this notion.

Now, scientists at the University of Utah have debunked the myth with an analysis of more than 1,000 brains. They found no evidence that people preferentially use their left or right brain. All of the study participants — and no doubt the scientists — were using their entire brain equally, throughout the course of the experiment.

A paper describing this study appeared in August in the journal PLOS ONE.

The preference to use one brain region more than others for certain functions, which scientists call lateralization, is indeed real, said lead author Dr. Jeff Anderson, director of the fMRI Neurosurgical Mapping Service at the University of Utah. For example, speech emanates from the left side of the brain for most right-handed people. This does not imply, though, that great writers or speakers use their left side of the brain more than the right, or that one side is richer in neurons.

There is a misconception that everything to do with being analytical is confined to one side of the brain, and everything to do with being creative is confined to the opposite side, Anderson said. In fact, it is the connections among all brain regions that enable humans to engage in both creativity and analytical thinking.

“It is not the case that the left hemisphere is associated with logic or reasoning more than the right,” Anderson told LiveScience. “Also, creativity is no more processed in the right hemisphere than the left.”

Anderson’s team examined brain scans of participants ages 7 to 29 while they were resting. They looked at activity in 7,000 brain regions, and examined neural connections within and between these regions. Although they saw pockets of heavy neural traffic in certain key regions, on average, both sides of the brain were essentially equal in their neural networks and connectivity.

“We just don’t see patterns where the whole left-brain network is more connected, or the whole right-brain network is more connected in some people,” said Jared Nielsen, a graduate student and first author on the new study.

Edgar Allan Poe solved one of the great mysteries of physics


OK, ‘solved’ might be a rather strong word, but he seems to have guessed the answer before anyone else.

The puzzle he cracked is something called Olbers’ Paradox. I came across both this problem and Poe’s answer for the first time in Jim Al-Khalili’s book Paradox: The Nine Greatest Enigmas in Physics. Al-Khalili gives a wonderfully clear and concise explanation of the paradox in the video below, but in essence it asks how it can get dark at night if there are a virtually infinite number of stars all giving off light. The answer is that because the universe has a finite age, the light from the bulk of stars simply hasn’t had time to reach the earth.
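The logic of the paradox can be made concrete with a toy calculation. All the numbers below (stellar luminosity, star density, shell thickness) are rough values I have assumed purely for illustration, not figures from Al-Khalili’s book; the point is the structure of the argument: each successive shell of stars contributes the same flux at Earth, so only a finite horizon keeps the sky dark.

```python
import math

L_STAR = 3.8e26    # luminosity of a Sun-like star, W (assumed typical)
N_DENSITY = 1e-60  # stars per cubic metre (very rough cosmic average)
DR = 1e22          # thickness of each spherical shell of stars, m

def shell_flux(r):
    """Flux at Earth from the shell of stars at radius r (thickness DR)."""
    stars_in_shell = N_DENSITY * 4 * math.pi * r**2 * DR
    flux_per_star = L_STAR / (4 * math.pi * r**2)
    return stars_in_shell * flux_per_star  # the r**2 factors cancel

# Every shell contributes equally, so an "endless succession of stars"
# (Poe's phrase) would add up to an infinitely bright sky...
near, far = shell_flux(1e20), shell_flux(1e26)

# ...but a finite age means only shells inside the light-travel horizon
# are visible, leaving a finite (and tiny) total brightness.
HORIZON = 13.8e9 * 9.46e15   # ~13.8 billion light years, in metres
total_flux = (HORIZON / DR) * shell_flux(DR)
```

The sum is finite only because the number of visible shells is finite, which is exactly the escape route Poe proposes in the passage below.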

Poe seems to have worked this out in 1848 in an essay called Eureka: A Prose Poem:

No astronomical fallacy is more untenable, and none has been more pertinaciously adhered to, than that of the absolute illimitation of the Universe of Stars. The reasons for limitation, as I have already assigned them, a priori, seem to me unanswerable; but, not to speak of these, observation assures us that there is, in numerous directions around us, certainly, if not in all, a positive limit — or, at the very least, affords us no basis whatever for thinking otherwise. Were the succession of stars endless, then the background of the sky would present us an uniform luminosity, like that displayed by the Galaxy — since there could be absolutely no point, in all that background, at which would not exist a star. The only mode, therefore, in which, under such a state of affairs, we could comprehend the voids which our telescopes find in innumerable directions, would be by supposing the distance of the invisible background so immense that no ray from it has yet been able to reach us at all. That this may be so, who shall venture to deny? I maintain, simply, that we have not even the shadow of a reason for believing that it is so.

I don’t know what this means for the debate about the arts and sciences as two cultures. On the one hand, Poe’s contribution was inadequate: he did not present any real evidence for his supposition, and it would be a century before scientifically credible proofs were put forward. On the other hand, that a poet was the first to intuit the solution says much for the creative power of the arts.

How the Cat’s Purr Has Evolved to Manipulate Humans (feat. Simon’s Cat)


There comes a time in the life of every blogger looking to attract views when they resort to posts about cats. Well, for me, that time is nigh.

Cats have been kept by humans for so long that this has caused them to evolve to better exploit us. Take for example their meowing.

According to Karen McComb of the University of Sussex, UK, domestic cats hide a plaintive cry within their purrs that both irritates owners and appeals to their nurturing instincts.

The team recorded the purrs of 10 different cats when they were soliciting food, and when they were purring in a different context. Fifty people who were asked to rate the purrs on how pleasant and urgent they sounded consistently rated the “solicitation purrs” as more urgent and less pleasant. Cat owners were especially good at distinguishing between the two kinds of purring.


When the team examined the sound spectrum of the solicitation purrs they saw an unusual peak in the 220 to 520-hertz frequency range embedded in the much lower frequencies of the usual purr. Babies’ cries have a similar frequency range, 300 to 600 hertz, McComb says.

The louder this high-frequency element, the more urgent and less pleasant the purr was rated. Cats may be exploiting “innate tendencies in humans to respond to cry-like sounds in the context of nurturing offspring”, McComb says.

Of course, a particularly mischievous cartoon cat has weapons other than baby noises at his disposal:

Hat tip: Horizon

Sugar does not make children hyperactive

According to this article in the New Scientist at least:

A 1996 review of 12 blinded studies, where no one at the time knew which kids had received sugar and which a placebo, found no evidence to support this notion. This is true even for children with ADHD or whose parents consider them to be sensitive to sugar (Critical Reviews in Food Science and Nutrition, vol 36, p 31)

In fact, one of these studies concluded that the sugar effect is all in parents’ minds.

There is a correlation between physical strength and views on redistribution (in men)

As the Economist explains:

The two researchers came to this conclusion after looking at 486 Americans, 223 Argentinians and 793 Danes. They collected data on their volunteers’ strength by measuring the circumference of the flexed biceps of an individual’s dominant arm. (Previous work has shown that this is an accurate proxy for strength.) They then measured people’s status with questionnaires about their economic situation. And they determined a person’s support for redistribution by asking the degree to which he or she agreed with statements like: “The wealthy should give more money to those who are worse off”; and “It is not fair that people have to pay taxes to fund welfare programmes.” They also asked about participants’ political ideologies.

Dr Petersen and Dr Sznycer found that, regardless of country of origin or apparent ideology, strong men argued for their self interest: the poor for redistribution, the rich against it. No surprises there. Weaklings, however, were far less inclined to make the case that self-interest suggested they would. Among women, by contrast, strength had no correlation with opinion. Rich women wanted to stay rich; poor women to become so.

This is an example of a depressingly common feature: not only are we not terribly rational about our politics, but we’re also unaware of our own irrationality. I mean, when did you last hear someone explain their views on the welfare state with reference to their biceps?