November 19, 2008

Why We Love To Hate Spiders

An article in a recent issue of New Scientist about what is responsible for fear of spiders led me to slightly disagree with the explanations afforded by the researchers. Have a quick read:

"Movies starring the superhero Spiderman may rake in millions at the box office, but the humble spider inspires fear and loathing quite unlike that of other creepy-crawlies. A third of women and a fifth of men admit to being scared of spiders. And an obvious explanation is that we have evolved a dread of spiders because they can be poisonous. However, psychologist Georg Alpers at the University of Würzburg, Germany, and his team believe that if this theory is correct, we would be just as afraid of stinging insects such as bees and wasps.

"To find out if this was the case, Alpers's team asked 76 students to rate photos of spiders, wasps, bees, beetles, butterflies and moths on three counts: how much fear and disgust they inspired and how dangerous the students felt they were. It transpired that spiders triggered far greater fear and disgust than any of the other creatures and were believed to be more dangerous (Evolution and Human Behaviour, DOI: 10.1016/j.evolhumbehav.2008.08.005).

"Stuart Hine, an entomologist at London's Natural History Museum, thinks fear of spiders is probably a learned behaviour. You only have to see someone standing on a chair screaming 'Spider! Spider!' to pick up on that fear, he explains. 'It stems back to the days of plagues when people suspected anything that crawled out of the thatch as carrying disease.'"

Now I think it's fair enough to invoke explanations from evolutionary psychology ("poisonous") or from learned behaviour, and I respect that, but couldn't we consider the most obvious explanation: that spiders are just horrible-looking little bastards?

I'd post a picture of a spider to prove my point, but as a recovering arachnophobe I don't think it would be a good idea. After years of screaming and screeching upon sighting one of the little blighters, to say nothing of being paralysed with fear, I'm pleased to report that I overcame my fear (somewhat) after a school trip to London Zoo. It really is quite amazing what group cohesion in the form of "dares" can do, but suffice it to say that we all forced ourselves to go and look at frightening tarantulas and the like in the Insect House. The first thing I noticed when I saw them was how small they are in reality. Many images in books and other media tend to be close-ups and enlargements, which may account for sudden shock reactions. Seeing them in the flesh gives one the impression that it is the fear itself which is overblown in the face of their diminutive size. If tarantulas represent an extreme of spider fear, what should we make of the even smaller house spider?

Since that visit to the zoo, my own fear of spiders has diminished significantly, from hysterical reactions to mild observation. I still exhibit a modicum of fear when I see one, but depending on my mood I may squish it into oblivion or catch it and throw it outside.

But why do spiders provoke such extreme reactions? I downloaded and quickly read through the actual paper. Alpers et al. hypothesised that similar reactions should be exhibited towards bees and wasps, which could be true except that these insects seem to pose more of an annoyance than a threat. After all, how many apiphobes or spheksophobes do you know? I'd be surprised if you've even heard those terms. Reading through the introduction to the study, there is much to be said for the evolutionary perspective in terms of fight-or-flight responses, but too much is made of spiders' venomous nature, as well as the venomous qualities of other arthropods. Quote: "The disgust hypothesis postulates that emotional responses to spiders are culturally transmitted because these animals were historically associated with disease and infection from medieval times onward. However, it is unclear why mainly spiders, and not other 'creepy crawlies,' have been considered to be responsible for infections and disease."

Hello? Could it be because they look horrible? And could it be because they tend to move very quickly and their eight-legged appearance gives off an unnerving impression? I'm sure that much could be said for visual representation in connection with disgust hypotheses and I'm pretty sure that studies have been carried out along those lines, but you'll have to forgive me for being too lazy to dig them up right now. The researchers go on, this time suggesting that fear of spiders could be down to cultural transmission: "Other arthropods that are comparable in terms of venomousness, appearance, or behavior to spiders may elicit similar reactions, but cultural transmission may exert strong biases on verbal labeling. Individuals who report being afraid of spiders may stick with a cultural stereotype ('fear of spiders is common'), although their fears may be much less specific than commonly thought. A variety of arthropods may elicit fear or disgust (e.g., beetles), but 'fear of spiders' may merely be a culturally accepted verbal label for a wide spectrum of animal fears."

Hello? Did you ever consider that their horrific looks may account for the fear?

It's no wonder that the results of the experiments suggested that "spider fear is in fact spider specific". Venomousness aside, the study raises a more interesting evolutionary question: why did spiders attract the highest ratings of fear and disgust, and the highest perceived danger, compared with bees and wasps? One explanation provided by the authors relates to bees' honey-making capacity, honey having formed part of early man's diet. Frequent interactions with bees while obtaining honey, together with the very real possibility of being stung regularly, may have produced an adaptive response that lessened fear. In other words, surviving bee stings was worth the trouble for the reward of the honey. Spider bites, by contrast, are relatively rare, and interaction with spiders offers no evolutionary advantage, which leads to a general lack of information about them. This in turn contributes to informational fear acquisition that is culturally transmitted through generations, often taking the form of myths.

Another explanation relates to spiders' unpredictable and uncontrollable behaviour, as gleaned from their rapid or abrupt movements (aha, now we're finally getting there!). Earlier studies suggest that this could be down to the inability of humans to exert control or influence over the movements of animals, but many other animals and insects move as fast as (or faster than) spiders, so this is insufficient to explain spider-specific fear. After some further discussion, the authors recognise one of the limitations of their study: the participants were shown static pictures rather than animated or even 'live' images, so responses to spider/insect mobility couldn't be obtained and tested. Given that spiders tend to be detected extremely quickly in search tasks (though some say this is observable for other animals too), the elevated fear and disgust ratings in this study allow the researchers to conclude that spider fear is indeed 'special'. Very generously, they also concede that existing explanations for these responses (venomousness and so on) aren't sufficient or well-founded enough to properly explain them.

They could start by studying reactions to sudden spider appearances, and to what extent this is moderated by their looks!

Thankfully (and at long last!) the researchers do end up suggesting two specific ways to analyse the origins of animal (spider?) fear and disgust in more depth: first, through detailed cross-cultural studies, and second, by analysing the morphological and behavioural traits that trigger the fear and disgust responses.

Hallelujah! It was a long ride but they got there in the end! And I think when that kind of study is carried out and published, it will be a worthwhile read.
__________________________________________
Gerdes, A., Uhl, G., & Alpers, G. (2008). Spiders are special: Fear and disgust evoked by pictures of arthropods. Evolution and Human Behavior. DOI: 10.1016/j.evolhumbehav.2008.08.005

November 17, 2008

Why IDiotic Research is BAD for Science

This is a partial repost of a previous blog post; I wanted a separate entry outlining one of my postgraduate experiences of researching for a paper, and how such research can be scuppered when Creationist/ID papers start entering research databases. After I outline my general concerns, you'll get the tale of research woe itself. The following exchange took place in the context of a conversation with a believer:

Question: In response to your statement about religious scientists engaging in intellectual dishonesty and their position being unjustifiable, why should you or anyone object to what someone wishes to spend their time researching?

Answer: Because they basically waste everyone's time with what is essentially junk science. In fact it's not just scientists, but a whole load of other people like swamis and gurus who think that they can make scientific pronouncements and get away with it. I discussed this briefly at someone's blog a few months ago. This is a very good question you've asked. This summer [2008] it is likely that I'll be involved in a massive scientific project, which I hope will be along the lines of an investigation into the religious content of auditory hallucinations in non-psychotic populations. If I get ethical approval for the basic idea, there is a possibility of significant expansion, resulting in investigations beyond the original proposal and quite probably ending up as something seriously publishable. As such, it is my duty to do the background research (for the introduction of what will be my paper, as I explained above) by seeing what has already been done and how my study can fit in with any previous research. I came across this paper:

Norris, R.S. (2005). Examining the structure and role of emotion: Contributions of neurobiology to the study of embodied religious experience. Zygon, 40(1), 181-200.

A paper that I can only describe through gritted teeth as absolute shit. There was nothing whatsoever of any value in it, least of all to me. Norris basically draws some thoughts together along the lines of: "Look at how this explains this, and how that explains that. Isn't this grrrrrreat?" I was so disturbed by this paper that I checked out the Zygon journal. Strange name for a scientific journal, don't you think? But apparently it is also known as the Journal of Religion and Science, and here's what's written on their website:

"The journal Zygon provides a forum for exploring ways to unite what in modern times has been disconnected—values from knowledge, goodness from truth, religion from science. Traditional religions, which have transmitted wisdom about what is of essential value and ultimate meaning as a guide for human living, were expressed in terms of the best understandings of their times about human nature, society, and the world. Religious expression in our time, however, has not drawn similarly on modern science, which has superseded the ancient forms of understanding. As a result religions have lost credibility in the modern mind. Nevertheless some recent scientific studies of human evolution and development have indicated how long-standing religions have evolved well-winnowed wisdom, still essential for the best life. Zygon's hypothesis is that, when long-evolved religious wisdom is yoked with significant, recent scientific discoveries about the world and human nature, there results credible expression of basic meaning, values, and moral convictions that provides valid and effective guidance for enhancing human life."

Sounds pretty cool, huh? Like some of those weird ideas you hear about science and religion coming together in a synthesis. Personally I'd be suspicious of a publication that had people like Viggo Mortensen on its advisory board, but there you go. If I did some digging around, I'm sure I'd find these guys all to be a bunch of Creationists. Whoops, Intelligent Design advocates, sorry. These statements of purpose all sound very professional and above-board until you take a look at some of the articles. By virtue of my ATHENS login I was able to access the latest March 2008 issue, and here's an abstract from an article entitled 'The Centrality of Incarnation':

"What we urgently need at the beginning of the twenty-first century is a christological vision that can shape and inform a new and powerful way of helping humankind to interpret their place within the universe. A christological vision that is unintelligible and uninteresting can have a profoundly deleterious soteriological implication: the orbit of God's saving grace will not be wide enough to encompass the universal place of humankind. Arthur Peacocke's move is clear and to the point: Only when the foundations and universal scope of God's grace are fully established for all of creation, only then can the importance of God's specific work in Jesus the Christ be established."

This is science? Are you kidding me? Just to make sure I wasn't being unfair, I had a look at some other articles and, much to my chagrin, my theory that these guys are a bunch of religion-biased creationists was more or less confirmed: many of them praise the ideas of Arthur Peacocke. They are definitely religious, at any rate, and biased with it. Articles with titles like 'Jesus and Creativity' don't really catch my interest.

And here's a gem from an article entitled 'Is a complete biocognitive account of religion feasible?' Pretty relevant to my interests, I'd think? Unfortunately not:

"Concluding this critical review, I am convinced, along with other scholars in the field, that the cause of the cognitive science of religion would be better served if detached from the biological approach. Very often the evolutionary ideas are highly speculative, lack empirical evidence, and become misleading."

Not only is this utter nonsense, but is it surprising that it was written by someone from a university in the Vatican City? I am filled with shock even while writing this; words can't express how I feel about these morons clogging up valuable PubMed space with their hokum gobbledygook dressed up as "science". Needless to say, I ain't likely to be referring to Zygon articles in any academic piece I write. Unless I wish to poke fun, of course. Much has also been written about the Templeton Foundation and its apparent bias in annually awarding $1.6m to individuals for "trying various ways for discoveries and breakthroughs to expand human perceptions of divinity and to help in the acceleration of divine creativity". Other religious sects are jumping on this bandwagon too, putting out junk-science books such as those authored by Michael Cremo and Richard L. Thompson; have you read Cremo's 'Human Devolution'? Don't bother, you have better things to do.

So this is basically why I'd object to this type of "research", Member1, because every few years it's people like these who also waste everyone's time with court cases against various State Boards of Education in the US, contesting the theory of evolution and demanding that "Creationism" be taught alongside it in schools. They are outcasts with their junk science.

November 16, 2008

We're in the Top 50!

It hasn't been all that long since I started this blog, and what I've put up here so far is barely a tenth of what I've wanted to write. So it was surprising when I learned today that this blog is included in the Top 50 Psychology Blogs at Networked Blogs!

Even though we're currently at #33, and this site is an extension of a Facebook application, it still feels good. Thanks to all my regular readers! :)

November 11, 2008

War On Neuroscience? What War?

After the initial shock of reading the article about Creationism's declaration of war on neuroscience, I thought I had better give some of my own thoughts. It would be very easy (and also very lazy) to rant on, as many have already done, condemning them as "stupid" IDiots (ID = Intelligent Design; IDiots = advocates of ID), although I do feel that way sometimes. One of my favourite science writers, Steven Novella, has written an excellent two-part review of the article and also provides some of the background behind the controversy:

Reports of the Demise of Materialism Are Premature

Reports of the Demise of Materialism Are Premature - Part II

This whole affair annoys me deeply because, as a general researcher, I am bound to keep up with all the latest developments in the field so as to maintain my own standard of knowledge and be properly equipped to deal with issues I come across. To say nothing of any patients I may eventually treat! And now I am going to have to take greater care with what I read. Of course, due care and caution need to be taken with what we read anyway, such as checking whether experimental studies have used a fairly rigorous methodology and whether the (statistical) data really do support the conclusions, but now every time I read a paper that presents somewhat startling or surprising results I'm going to have a niggly little voice in the back of my head asking, "Did an IDiot write this?"

I've already had some disturbing run-ins with IDiotic papers (blogged here) and I still shudder at the memory. Aside from all that, though, there is the disturbing prospect of old notions of neuroscience being resurrected (for want of a better term!) to substantiate this new 'battle'. The very phrase 'non-material neuroscience' being thrown around suggests that they are on a mission to decry 'material' neuroscience, as if that were a bad thing. What any good neuroscientist knows through years of practice and research is that a duality between the two doesn't exist: the mind is the brain and vice versa. We see this empirically when a patient suffering from brain injury undergoes personality changes. The effect of any changes depends, of course, on the severity of the injury, and cases like these have been known about and treated for over a century and a half (as per the incredible case of Phineas Gage). In short, an injury (or deficiency) to an important part of the brain generally causes the patient to exhibit behaviour consistent with the injury or deficit at hand. The important point about such cases, often missed, is that they suggest that fundamental things such as 'thoughts' and 'personality', usually conceived in abstract terms, have a material origin.

This point is very unpalatable for those who tend towards a spiritual or otherwise New-Agey outlook on life, and who are given to beliefs or sentiments that favour a sense of being the controller of one's own destiny. One certainly can exhibit control over certain areas of one's life, but this isn't about which outlook, viewpoint or worldview is correct or superior. This is about simple facts. And these facts make it clear that a material viewpoint is the only real path one can take to understanding issues of neuronal importance. Any reasonable person who gives a moment's thought to the concept will be able to understand that all our experiences - sensory, emotional, somatic, metaphysical - are processed only through the brain. Thus, even at the outset, the idea of a "non-material neuroscience" as propounded by Schwartz, Beauregard, and those of their ilk is defeated.

But for me, this is one of those areas where science and philosophy merge to such an extent that it all becomes a big blur. What the ID movement is trying to do is resurrect "Cartesian dualism" which, put simply, is René Descartes' idea that mind and body are separate. Applied to neuroscience, this translates as the mind being a separate and different 'entity' from the brain tissue that hosts it. Descartes summed up his starting point in the famous saying "cogito ergo sum": "I think, therefore I am." According to Descartes, the mind and the body were composed of different types of substances, like oil and water. How could this be? We can see from our own experience that if we think about kicking someone up their bum and have our minds instruct our foot to do so, signals are sent to the leg that prepare and allow our foot to take aim and kick. Conversely, our bodies can also have an effect on our minds; a cut on the hand, for instance, may be painful enough to send distress signals to our brains and perhaps lead us into a state of panic. It seems that there is some ostensible connection between our bodies and minds.

Although Descartes insisted on their being separate entities and didn't adequately explain how these connections take place, Cartesian dualism, the theory that espouses these views, has come to explain them as a form of interactionism: the (separate) body and mind interact with each other in some way. In exactly what ways they do so hasn't been adequately explained either. There are other types of dualism, of course.

The explanations above may go some way towards exposing the shortcomings of dualist theory, and why monism, the conception of the mind and the brain as one entity, is a much better model to use in trying to understand neuroscientific issues and problems. This kind of view is apparent in many modern descriptions of mind: 'Minds are simply what brains do' (Minsky, 1986); 'Mind is designer language for the functions that the brain carries out' (Claxton, 1994); mind is 'the personalisation of the physical brain' (Greenfield, 2000). To quote Susan Blackmore:

"Such descriptions make it possible to talk about mental activities and mental abilities without supposing that there is a separate mind. This is probably how most psychologists and neuroscientists think of 'mind' today, but there is much less agreement when it comes to consciousness." - Consciousness: An Introduction, 2007 (p. 13).

And this is in fact one of the current problems in neuroscience: how consciousness works. The New Scientist article correctly identifies this as an area where the ID movement is very likely to strike. But before we discuss that, a short description of consciousness must suffice. It is very hard to describe a neural function that is, to say the least, responsible for our being alive. Is it even appropriate to describe consciousness as a 'live' phenomenon? What about those unfortunate individuals who exist in a vegetative state due to horrific injuries; aren't they technically "alive"? Or are they? Who can adequately describe consciousness, in all its fancies and frivolities, dreams and nightmares, naturals and supernaturals, illusions and vividness, in a way that would comprehensively define it? The answer is that no one can. Consciousness is simply too big and too difficult to describe, and there is no general definition that comes close to fully explaining it.

However, there are ways in which we can come close to understanding it, or how it works. The ability to categorise stimuli and react to them, to discriminate between things, the way different cognitive structures integrate to provide information, the reportability of mental states, the mechanics of focus and attention, the deliberate control of behaviour, the difference between sleep and wakefulness: all of these are generally separate issues that can be understood in themselves. They are what we call the 'easy' problems of consciousness, meaning that they are relatively tractable once sufficient research has been carried out. Where there are 'easy' problems, it follows that there is a 'hard' problem, and it is this hard problem that lies unsolved at the heart of the mystery of consciousness. The hard problem can be described as how physical processes in the brain give rise to subjective experience. Put another way: how can the functioning of neurons (objective processes) give rise to the subjective experiences that make us who we are - our loves, our joys, our sadnesses, our life experiences, our memories, our emotions, everything about us that makes us unique?

This is an issue that neuroscience cannot yet fully explain, although research is always ongoing. Some neuroscientists are sceptical and say that the hard problem will never be solved. Others think, as per the article, that new physical principles need to be postulated in order to guide research and solve it. Still others suggest that sufficient research into the easy problems will cause the hard problem to disappear automatically. Time (and research) will tell.

There will be those who, throwing their hands up in frustration (or thundering from their pulpits à la Jeffrey Schwartz), decide it's all a waste of time and go to the opposite extreme in their thirst for an explanation. As per Daniel Dennett (1991), 'accepting dualism is giving up.' And this is precisely what these people appear to have done in joining the ID movement. But before you start thinking about the influence of right-wing Christian fundamentalists, Steven Novella has shown how the current agents of non-material neuroscience have links to Buddhism and loose associations with Deepak Chopra, as well as a record of intellectual abuse of quantum mechanics. This makes things a little more difficult, because some neuroscientists are interested in Buddhist meditational methodologies (and employ them in their own lives) as a tool for better understanding the experiential quality of consciousness, and papers are sometimes published discussing what those Buddhist principles might contribute to research in the area. Novella also gives an excellent discussion of what constitutes the correct understanding of materialism, or naturalism, required for approaching scientific or neuroscientific issues, and of how IDeology diverges from and is generally incompatible with the basic precepts of science, such as the requirement that a hypothesis be falsifiable in principle. One example of this is how ID'ers suggest that "unexplained" issues in science can be explained once one accepts the notion of an intelligent top-down designer ('Godiddit!'), but how could this assumption be falsified? How is it possible even to prove that an intelligent designer exists? How, then, could ID ever be scientific, in spite of its claims to be so?

All in all, it appears that the IDeologues have learnt nothing from their abject failures in attacking evolution. As outlined in their mission statement, they seek nothing less than the destruction of materialism, so it is to be expected that they will simply up sticks and move somewhere else to kick up a fuss. If they follow strategies similar to those they used against evolution, we can expect more of the same: attacking all the "weak points" and filling the gaps with God. The disturbing thing is that they do this academically, wearing the same white lab coats that genuine scientists wear, so the public will be fooled into thinking that any controversy they stir up is a genuine one and that the "conflicting opinions" may have some substance to them. They will publish their "scientific" academic papers (mostly in their own journals) and leave them to confuse innocent wide-eyed newbies. This is all very disappointing, and underlines all the negatives of being driven by an ideology that conflicts with the facts. Who would ever attempt to square a circle? Yet this is what the IDiots are trying to do.

I do not think much of their declaration of war. What war? Going by history, IDiots hardly ever come up with any real evidence for their claims; they simply reinterpret older, 'classic' experiments to suit their ways of thinking. Since the 'weaknesses' of neuroscience are an open secret, the ID'ers will have the tough job of explaining away the 'hard problem' as well as having to show how Cartesian dualist principles are valid after all. I don't envy them, but I'm not expecting too much from them either. Simply saying 'Godiddit!' to everything isn't a scientifically valid answer, nor does it provide satisfactory explanations. It also turns out that David Chalmers, the philosopher who coined the term 'hard problem', has shown significant unease at how it has been hijacked by the ID'ers and has made some interesting points on his blog.

What worries me are the reactions of the public. As mentioned before, they are likely to be fooled into thinking that non-material neuroscience is just as valid a paradigm as 'material' neuroscience. We are likely to hear more 'spiritual' explanations of how various neural functions work from individuals such as the odious Deepak Chopra, and quite possibly the repellent 'Godiddit!' chorus from the Bible-quoting (or Dhammapada-quoting) peanut gallery. And of course the usual criticisms about evil crackpot scientists with their chemicals and their test tubes, and how damn myopic and narrow-minded they are to ignore the "spiritual realm" in their doomed endeavour to search for the meaning of everything. This isn't fantasy; this is history, which has the peculiar quality of repeating itself. The whole evolution debacle making it to several courts and education boards drew the indignation of many a scientist and judge, but the one good thing about this "war" on neuroscience is that it is unlikely to have a large effect on public education, as the subject is generally only taught at university level. Still, the idea of graduates' heads being filled with 'alternative' theories (when there are already plenty of 'orthodox' theories to digest) makes me shudder.

At the end of the day, what matters is that - war or no war - this shift is important to acknowledge and represents a challenge for the scientific establishment to face head-on. Plenty of people would disagree that there is anything to face, and perhaps they are right, but the final paragraph of the article was very telling: "What can scientists do? They have been criticised for not doing enough to teach the public about evolution. Maybe now they need a big pre-emptive push to engage people with the science of the brain - and help the public appreciate that the brain is no place to invoke the 'God of the gaps'."

And this is the reason why this blog exists. It represents my very small and humble contribution to public education.

October 27, 2008

Creationists Declare War Over The Brain

I received the latest copy of New Scientist magazine yesterday, and spotted this disturbing article in it, which I have decided to post here. It seems to me that after being more or less defeated by the likes of the late Stephen Jay Gould, Steve Jones, Richard Dawkins, PZ Myers and other prominent evolutionary biologists on the subject of evolution, the IDiots are now turning their attention to the neurosciences. I cannot fully explain how disturbing this is to me as an up-and-coming neuroscientist. I feel a tad infuriated about it, but I would be lying if I said I saw it coming. I had lulled myself into a false sense of security by thinking that evolution was the IDiot battleground, but perhaps it is the nature of grandiose 'conquerors' to move to pastures new. Perhaps I shall write up my own review of it soon, but for now here is the actual article. It can also be viewed online at the New Scientist site complete with links (all reproduced here):

"YOU cannot overestimate," thundered psychiatrist Jeffrey Schwartz, "how threatened the scientific establishment is by the fact that it now looks like the materialist paradigm is genuinely breaking down. You're gonna hear a lot in the next calendar year about... how Darwin's explanation of how human intelligence arose is the only scientific way of doing it... I'm asking us as a world community to go out there and tell the scientific establishment, enough is enough! Materialism needs to start fading away and non-materialist causation needs to be understood as part of natural reality."

His enthusiasm was met with much applause from the audience gathered at the UN's east Manhattan conference hall on 11 September for an international symposium called Beyond the Mind-Body Problem: New Paradigms in the Science of Consciousness. Earlier Mario Beauregard, a researcher in neuroscience at the University of Montreal, Canada, and co-author of The Spiritual Brain: A neuroscientist's case for the existence of the soul, told the audience that the "battle" between "maverick" scientists like himself and those who "believe the mind is what the brain does" is a "cultural war".

Schwartz and Beauregard are part of a growing "non-material neuroscience" movement. They are attempting to resurrect Cartesian dualism - the idea that brain and mind are two fundamentally different kinds of things, material and immaterial - in the hope that it will make room in science both for supernatural forces and for a soul. The two have signed the "Scientific dissent from Darwinism" petition, spearheaded by the Seattle-based Discovery Institute, headquarters of the intelligent design movement. ID argues that biological life is too complex to have arisen through evolution.

In August, the Discovery Institute ran its 2008 Insider's Briefing on Intelligent Design, at which Schwartz and Michael Egnor, a neurosurgeon at Stony Brook University in New York, were invited to speak. When two of the five main speakers at an ID meeting are neuroscientists, something is up. Could the next battleground in the ID movement's war on science be the brain?

Well, the movement certainly seems to hope that the study of consciousness will turn out to be "Darwinism's grave", as Denyse O'Leary, co-author with Beauregard of The Spiritual Brain, put it. According to proponents of ID, the "hard problem" of consciousness - how our subjective experiences arise from the objective world of neurons - is the Achilles heel not just of Darwinism but of scientific materialism. This fits with the Discovery Institute's mission as outlined in its "wedge document", which seeks "nothing less than the overthrow of materialism and its cultural legacies", to replace the scientific world view with a Christian one.

Now the institute is funding research into "non-material neuroscience". One recipient of its cash is Angus Menuge, a philosophy professor at Concordia University, Wisconsin, a Christian college, who testified in favour of teaching ID in state-funded high-schools at the 2005 "evolution hearings" in Kansas. Using a Discovery Institute grant, Menuge wrote Agents Under Fire, in which he argued that human cognitive capacities "require some non-natural explanation". In June, James Porter Moreland, a professor at the Talbot School of Theology near Los Angeles and a Discovery Institute fellow, fanned the flames with Consciousness and the Existence of God. "I've been doing a lot of thinking about consciousness," he writes, "and how it might contribute to evidence for the existence of God in light of metaphysical naturalism's failure to provide a helpful explanation." Non-materialist neuroscience provided him with this helpful explanation: since God "is" consciousness, "the theist has no need to explain how consciousness can come from materials bereft of it. Consciousness is there from the beginning."

To properly support dualism, however, non-materialist neuroscientists must show the mind is something other than just a material brain. To do so, they look to some of their favourite experiments, such as research by Schwartz in the 1990s on people suffering from obsessive-compulsive disorder. Schwartz used scanning technology to look at the neural patterns thought to be responsible for OCD. Then he had patients use "mindful attention" to actively change their thought processes, and this showed up in the brain scans: patients could alter their patterns of neural firing at will.

From such experiments, Schwartz and others argue that since the mind can change the brain, the mind must be something other than the brain, something non-material. In fact, these experiments are entirely consistent with mainstream neurology - the material brain is changing the material brain. But William Dembski, one of ID's founding fathers and a senior fellow at the Discovery Institute, praised Schwartz's work as providing "theoretical support for the irreducibility of mind to brain". Dembski's website shows that he is currently co-editing The End of Materialism with Schwartz and Beauregard. Meanwhile, Schwartz has been working with Henry Stapp, a physicist at the US Department of Energy's Lawrence Berkeley National Laboratory, who also spoke at the symposium. They have been developing non-standard interpretations of quantum mechanics to explain how the "non-material mind" affects the physical brain.

Clearly, while there is a genuine attempt to appropriate neuroscience, it will not influence US laws or education in the way that anti-evolution campaigns can because neuroscience is not taught as part of the core curriculum in state-funded schools. But as Andy Clark, professor of logic and metaphysics at the University of Edinburgh, UK, emphasises: "This is real and dangerous and coming our way."

He and others worry because scientists have yet to crack the great mystery of how consciousness could emerge from firing neurons. "Progress in science is slow on many fronts," says John Searle, a philosopher at the University of California, Berkeley. "We don't yet have a cure for cancer, but that doesn't mean cancer has spiritual causes." And for Patricia Churchland, a philosopher of neuroscience at the University of California, San Diego, "it is an argument from ignorance. The fact something isn't currently explained doesn't mean it will never be explained or that we need to completely change not only our neuroscience but our physics."

The attack on materialism proposes to do just that, but it all turns on definitions. "At one time it looked like all physical causation was push/pull Newtonianism," says Owen Flanagan, professor of philosophy and neurobiology at Duke University, North Carolina. "Now we have a new understanding of physics. What counts as material has changed. Some respectable philosophers think that we might have to posit sentience as a fundamental force of nature or use quantum gravity to understand consciousness. These stretch beyond the bounds of what we today call 'material', and we haven't discovered everything about nature yet. But what we do discover will be natural, not supernatural."

And as Clark observes: "This is an especially nasty mind-virus because it piggybacks on some otherwise reasonable thoughts and worries. Proponents make such potentially reasonable points as 'Oh look, we can change our brains just by changing our minds,' but then leap to the claim that mind must be distinct and not materially based. That doesn't follow at all. There's nothing odd about minds changing brains if mental states are brain states: that's just brains changing brains."

That is the voice of mainstream academia. Public perception, however, is a different story. If people can be swayed by ID, despite the vast amount of solid evidence for evolution, how hard will it be when the science appears fuzzier?

What can scientists do? They have been criticised for not doing enough to teach the public about evolution. Maybe now they need a big pre-emptive push to engage people with the science of the brain - and help the public appreciate that the brain is no place to invoke the "God of the gaps".

October 13, 2008

Why Are Some People Black?

As a follow-up of sorts to the last post on evolution, an excellent article by Steve Jones from the same book discusses why evolution has produced different skin complexions. It was written around 1996 or so.

----------------------------------------


Everyone knows - do they not? - that many people have black skin. What is more, black people are concentrated in certain places - most notably, in Africa - and, until the upheavals of the past few centuries, they were rare in Europe, Asia, and the Americas. Why should this be so? It seems a simple question. Surely, if we cannot give a simple answer, there is something wrong with our understanding of ourselves. In fact, there is no straightforward explanation of this striking fact about humankind. Its absence says a lot about the strengths and weaknesses of the theory of evolution and of what science can and cannot say about the past.

Any anatomy book gives one explanation of why people look different. Doctors love pompous words, particularly if they refer to other doctors who lived long ago. Black people have black skin, their textbooks say, because they have a distinctive Malpighian layer. This is a section of the skin named after the seventeenth-century Italian anatomist Malpighi. It contains lots of cells called melanocytes. Within them is a dark pigment called melanin. The more there is, the blacker the skin. Malpighi found that African skin had more melanin than did that of Europeans. The question was, it seemed, solved.

This is an example of what I sometimes think of as 'the Piccadilly explanation.' One of the main roads in London is called Piccadilly - an oddly un-English word. I have an amusing book that explains how London's streets got their names. What it says about Piccadilly sums up the weakness of explanations that depend, like the anatomists', only on describing a problem in more detail. The street is named, it says, after the tailors who once lived there and made high collars called piccadills. Well, fair enough; but surely that leaves the interesting question unanswered. Why call a collar a piccadill in the first place? It is not an obvious word for an everyday piece of clothing. My book is, alas, silent.

Malpighi's explanation may be good enough for doctors, but it will not satisfy any thinking person. It answers the question of how, but not the more interesting question of why, there is more melanin in African skin.

Because the parents, grandparents, and - presumably - distant ancestors of black people are black, and those of white people are white, the solution must lie in the past. And that is a difficulty for the scientific method. It is impossible to check directly just what was going on when the first blacks appeared on earth. Instead, we must rely on indirect evidence. There is one theory that is, if nothing else, simple and consistent. It has been arrived at again and again. It depends solely on belief; and if there is belief, the question of proof does not arise. Because of this, the theory lies outside science.

It is that each group was separately created by divine action. The Judeo-Christian version has it that Adam and Eve were created in the Garden of Eden. Later, there was a gigantic flood; only one couple, the Noahs, survived. They had children: Ham, Shem, and Japheth. Each gave rise to a distinct branch of the human race, Shem to the Semites, for example. The children of Ham had dark skins. From them sprang the peoples of Africa. That, to many people, is enough to answer the question posed in this essay.

The Noah story is just a bald statement about history. Some creation myths are closer to science. They try to explain why people look different. One African version is that God formed men from clay, breathing life into his creation after it had been baked. Only the Africans were fully cooked - they were black. Europeans were not quite finished and were an unsatisfactory muddy pink. The trouble with such ideas is that they cannot be disproved. I get lots of letters from people who believe passionately that life, in all its diversity, appeared on earth just a few thousand years ago as a direct result of God's intervention. There is no testimony that can persuade them otherwise. Prove that there were dinosaurs millions of years before humans, and they come up with rock 'footprints' showing, they say, that men and dinosaurs lived together as friends. So convinced are they of the truth that they insist that their views appear in school textbooks.

If all evidence, whatever it is, can only be interpreted as supporting one theory, then there is no point in arguing. In fact, if belief in the theory is strong enough, there is no point in looking for evidence in the first place. Certainty is what blocked science for centuries. Scientists are, if nothing else, uncertain. Their ideas must constantly be tested against new knowledge. If they fail the test, they are rejected.

No biologist now believes that humans were created through some miraculous act. All are convinced that they evolved from earlier forms of life. Although the proof of the fact of evolution is overwhelming, there is plenty of room for controversy about how it happened. Nowhere is this clearer than in the debate about skin colour.

Modern evolutionary biology began with the nineteenth-century English biologist Charles Darwin. He formed his ideas after studying geology. In his day, many people assumed that grand features such as mountain ranges or deep valleys could arise only through sudden catastrophes such as earthquakes or volcanic eruptions, which were unlikely to be seen by scientists as they were so rare. Darwin realised that, given enough time, even a small stream can, by gradually wearing away the rocks, carve a deep canyon. The present, he said, is the key to the past. By looking at what is going on in a landscape today, it is possible to infer the events of millions of years ago. In the same way, the study of living creatures can show what happened in evolution.

In The Origin of Species, published in 1859, Darwin suggested a mechanism whereby new forms of life could evolve. Descent with modification, as he called it, is a simple piece of machinery, with two main parts. One produces inherited diversity. This process is now known as mutation. In each generation, there is a small but noticeable chance of a mistake in copying genes as sperm or eggs are made. Sometimes we can see the results of mutations in skin colour; one person in several thousand is an albino, lacking all skin pigment. Albinos are found all over the world, including Africa. They descend from sperm or eggs that have suffered damage in the pigment genes. The second piece of the machine is a filter. It separates mutations which are good at coping with what the environment throws at them from those which are not. Most mutations - albinism, for example - are harmful. The people who carry mutant genes are less likely to survive and to have children than do those who do not. Such mutations quickly disappear. Sometimes, though, one turns up which is better at handling life's hardships than what went before. Perhaps the environment is changing, or perhaps the altered gene does its job better. Those who inherit it are more likely to survive; they have more children, and the gene becomes more common. By this simple mechanism, the population has evolved through natural selection. Evolution, thought Darwin, was a series of successful mistakes.
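(A quick aside from me, not Jones: Darwin's two-part machine is easy to watch in action with a toy simulation. The sketch below is a minimal Python illustration; the population size, mutation rate and fitness values are all invented for the purpose, and nothing in it comes from the essay.)

    import random

    # Toy model of 'descent with modification': mutation produces variation,
    # selection filters it. Alleles: 'A' (original) and 'a' (beneficial mutant).
    # All parameters are invented for illustration.
    POP_SIZE = 1000
    MUTATION_RATE = 0.001            # chance per birth of an A -> a copying mistake
    FITNESS = {"A": 1.00, "a": 1.05} # 'a' carriers survive slightly more often

    population = ["A"] * POP_SIZE
    for generation in range(301):
        offspring = []
        while len(offspring) < POP_SIZE:
            parent = random.choice(population)
            # Selection: keep the child with probability proportional to fitness.
            if random.random() < FITNESS[parent] / max(FITNESS.values()):
                child = parent
                # Mutation: the rare copying mistake.
                if child == "A" and random.random() < MUTATION_RATE:
                    child = "a"
                offspring.append(child)
        population = offspring
        if generation % 50 == 0:
            print(generation, population.count("a") / POP_SIZE)

(Run it a few times: the mutant appears, drifts around at low frequency, and selection repeatedly nudges it upward - Darwin's 'series of successful mistakes' in miniature.)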

If Darwin's machine worked for long enough, then new forms of life - new species - would appear. Given enough time, all life's diversity could emerge from simple ancestors. There was no need to conjure up ancient and unique events (such as a single incident of creation) which could neither be studied nor duplicated. Instead, the living world was itself evidence for the workings of evolution. What does Darwin's machine tell us about skin colour? As so often in biology, what we have is a series of intriguing clues, rather than a complete explanation.

There are several kinds of evidence about how things evolve. The best is from fossils: the preserved remnants of ancient times. These contain within themselves a statement of their age. The chemical composition of bones (or of the rocks into which they are transformed) shifts with time. The molecules decay at a known rate, and certain radioactive substances change from one form into another. This gives a clue as to when the original owner of the bones died. It may be possible to trace the history of a family of extinct creatures in the changes that occur as new fossils succeed old.
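(Another aside from me rather than Jones: the dating logic is simple arithmetic. A minimal sketch, using carbon-14's real half-life of about 5,730 years but an invented measured fraction:)

    import math

    # Radioactive decay dating: a substance changes from one form to another
    # at a known rate, so the fraction remaining gives the age.
    # N(t) = N0 * (1/2)**(t / half_life)  =>  t = half_life * log2(N0 / N)
    HALF_LIFE_C14 = 5730.0        # years (real figure for carbon-14)
    fraction_remaining = 0.25     # hypothetical measurement from a bone sample

    age = HALF_LIFE_C14 * math.log2(1.0 / fraction_remaining)
    print(f"Estimated age: {age:.0f} years")  # 0.25 left = two half-lives, ~11460 years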

The human fossil record is not good - much worse, for example, than that of horses. In spite of some enormous gaps, enough survives to make it clear that creatures looking not too different from ourselves first appeared around a hundred and fifty thousand years ago. Long before that, there were apelike animals which looked noticeably human but would not be accepted as belonging to our own species if they were alive today. No one has traced an uninterrupted connection between these extinct animals and ourselves. Nevertheless, the evidence for ancient creatures that changed into modern humans is overwhelming. As there are no fossilised human skins, fossils say nothing directly about skin colour. They do show that the first modern humans appeared in Africa. Modern Africans are black. Perhaps, then, black skin evolved before white. Those parts of the world in which people have light skins - northern Europe, for example - were not populated until about a hundred thousand years ago, so that white skin evolved quite quickly.

Darwin suggested another way of inferring what happened in the past: to compare creatures living today. If two species share a similar anatomy, they probably split from a common ancestor more recently than did another which has a different body plan. Sometimes it is possible to guess at the structure of an extinct creature by looking at its living descendants. This approach can be used not just for bones but for molecules such as DNA. Many biologists believe that DNA evolves at a regular rate; that in each generation, a small but predictable proportion of its subunits changes from one form into another. If this is true (and often it is), then counting the changes between two species reveals how closely they are related. What is more, if they share an ancestor that has been dated using fossils, it allows DNA to be used as a 'molecular clock,' timing the speed of evolution. The rate at which the clock ticks can then be used to work out when other species split by comparing their DNA, even if no fossils are available.
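(One more aside from me, not Jones: the molecular-clock arithmetic works as below. Every figure is an invented placeholder, purely to show the calibrate-then-date logic:)

    # Molecular clock sketch: calibrate a rate of DNA change from a species
    # pair whose split is dated by fossils, then date another pair.
    # All numbers are invented placeholders, not real data.
    calib_split_mya = 60.0    # fossil-dated split, millions of years ago
    calib_diff = 0.12         # fraction of DNA sites differing in that pair

    # Changes accumulate along BOTH lineages after a split.
    rate_per_lineage = calib_diff / (2 * calib_split_mya)  # per million years

    query_diff = 0.02         # differences in the pair we want to date
    query_split_mya = query_diff / (2 * rate_per_lineage)
    print(f"Estimated split: {query_split_mya:.1f} million years ago")  # 10.0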

Chimpanzees and gorillas seem, from their body plan, to be our relatives. Their genes suggest the same thing. In fact, each shares 98 percent of its DNA with ourselves, showing just how recently we separated. The clock suggests that the split was about six million years ago. Both chimp and gorilla have black skins. This, too, suggests that the first humans were black and that white skin evolved later. However, it does not explain why white skin evolved. The only hint from fossils and chimps is that the change took place when humans moved away from the tropics. We are, without doubt, basically tropical animals. It is much harder for men and women to deal with cold than with heat. Perhaps climate has something to do with skin colour. To check this idea, we must, like Darwin, look at living creatures. Why should black skin be favoured in hot and sunny places and white where it is cool and cloudy? It is easy to come up with theories, some of which sound pretty convincing. However, it is much harder to test them.

The most obvious idea is wrong. It is that black skin protects against heat. Anyone who sits on a black iron bench on a hot sunny day soon discovers that black objects heat up more than white ones do when exposed to the sun. This is because they absorb more solar energy. The sun rules the lives of many creatures. Lizards shuttle back and forth between sun and shade. In the California desert, if they stray more than six feet from shelter on a hot day, they die of heat stroke before they can get back. African savannahs are dead places at noon, when most animals are hiding in the shade because they cannot cope with the sun. In many creatures, populations from hot places are lighter - not darker - in colour to reduce the absorption of solar energy. People, too, find it hard to tolerate full sunshine - blacks more so than whites. Black skin does not protect those who bear it from the sun's heat. Instead, it makes the problem worse. However, with a bit of ingenuity, it is possible to bend the theory slightly to make it fit. Perhaps it pays to have black skin in the chill of the African dawn, when people begin to warm up after a night's sleep. In the blaze of noon, one can always find shelter under a tree.

The sun's rays are powerful things. They damage the skin. Melanin helps to combat this. One of the first signs of injury is an unhealthy tan. The skin is laying down an emergency layer of melanin pigment. Those with fair skin are at much greater risk from skin cancer than are those with dark. The disease reaches its peak in Queensland, in Australia, where fair-skinned people expose themselves to a powerful sun by lying on the beach. Surely, this is why black skin is common in sunny places - but, once again, a little thought shows that it probably is not. Malignant melanoma, the most dangerous skin cancer, may be a vicious disease, but it is an affliction of middle age. It kills its victims after they have passed on their skin-colour genes to their children. Natural selection is much more effective if it kills early in life. If children fail the survival test, then their genes perish with their carriers. The death of an old person is irrelevant, as their genes (for skin colour or anything else) have already been handed on to the next generation.

The skin is an organ in its own right, doing many surprising things. One is to synthesise vitamin D. Without this, children suffer from rickets: soft, flexible bones. We get most vitamins (essential chemicals needed in minute amounts) from food. Vitamin D is unusual. It can be made in the skin by the action of sunlight on a natural body chemical. To do this, the sun must get into the body. Black people in sunshine hence make much less vitamin D than do those with fair skins. Vitamin D is particularly important for children, which is why babies (African or European) are lighter in colour than are adults. Presumably, then, genes for relatively light skin were favoured during the spread from Africa into the cloud and rain of the north. That might explain why Europeans are white - but does it reveal why Africans are black? Too much vitamin D is dangerous (as some people who take vitamin pills discover to their cost). However, even the fairest skin cannot make enough to cause harm. The role of black skin is not to protect against excess vitamin D.

It may, though, be important in preserving other vitamins. The blood travels around the body every few minutes. On the way, it passes near the surface of the skin through fine blood vessels. There, it is exposed to the damaging effects of the sun. The rays destroy vitamins - so much so that a keen blond sunbather is in danger of vitamin deficiency. Even worse, the penetrating sunlight damages antibodies, the defensive proteins made by the immune system. In Africa, where infections are common and, sometimes, food is short, vitamin balance and the immune system are already under strain. The burden imposed by penetrating sunlight may be enough to tip the balance between health and disease. Dark skin pigmentation may be essential for survival. No one has yet shown directly whether this is true.

There are plenty of other theories as to why some people are black. For an African escaping from the sun under a tree, black skin is a perfect camouflage. Sexual preference might even have something to do with the evolution of skin colour. If, for one reason or another, people choose their partners on the basis of colour, then the most attractive genes will be passed on more effectively. A slight (and perhaps quite accidental) preference for dark skin in Africa and light in Europe would be enough to do the job. This kind of thing certainly goes on with peacocks - in which females prefer males with brightly patterned tails - but there is no evidence that it happens in humans. Accident might be important in another way, too. Probably only a few people escaped from Africa a hundred thousand years and more ago. If, by chance, some of them carried genes for relatively light skins, then part of the difference in appearance between Africans and their northern descendants results from a simple fluke. There is a village of North American Indians today where albinos are common. By chance, one of the small number of people who founded the community long ago carried the albino mutation, and it is still abundant there.

All this apparent confusion shows how difficult it is for science to reconstruct history. Science is supposed to be about testing, and perhaps disproving, hypotheses. As we have seen, there is no shortage of ideas about why people differ in skin colour. Perhaps none of the theories is correct, or perhaps one, two, or all of them are. Because whatever gave rise to the differences in skin colour in different parts of the world happened long ago, no one can check directly. But science does not always need direct experimental tests. A series of indirect clues may be almost as good. The hints that humans evolved from simpler predecessors and are related to other creatures alive today are so persuasive that it is impossible to ignore them. So far, we have too few facts and too many opinions to be certain of all the details of our own evolutionary past. However, the history of the study of evolution makes me confident that, some day, the series of hints outlined in this essay will suddenly turn into a convincing proof of just why some people are black and some white.

October 10, 2008

Three Facets of Evolution

I'd like to put up this excellent short piece by the late Stephen Jay Gould, the renowned palaeontologist and evolutionary biologist. It was originally published in How Things Are: A Science Toolkit for the Mind, edited by John Brockman and Katinka Matson (1996), and it outlines evolutionary theory and the bare basics of how to deal with some of the current controversies.

---------------------------------------


Three Facets of Evolution

§ What Evolution Is Not.

Of all the fundamental concepts in the life sciences, evolution is both the most important and the most widely misunderstood. Since we often grasp a subject best by recognising what it isn't, and what it cannot do, we should begin with some disclaimers, acknowledging for science what G. K. Chesterton considered so important for the humanities: 'Art is limitation; the essence of every picture is the frame.'

First, neither evolution, nor any science, can access the subject of ultimate origins and ethical meanings. (Science, as an enterprise, tries to discover and explain the phenomena and regularities of the empirical world, under the assumption that natural laws are uniform in space and time. This restriction places an endless world of fascination within the 'picture'; most subjects thus relegated to the 'frame' are unanswerable in any case.) Thus, evolution is not the study of life's ultimate origin in the universe or of life's intrinsic significance among nature's objects; these questions are philosophical (or theological) and do not fall within the purview of science. (I also suspect that they have no universally satisfactory answers, but this is another subject for another time.) This point is important because zealous fundamentalists, masquerading as 'scientific creationists,' claim that creation must be equated with evolution, and be given equal time in schools, because both are equally 'religious' in dealing with ultimate unknowns. In fact, evolution does not treat such subjects at all, and thus remains fully scientific.

Second, evolution has been saddled with a suite of concepts and meanings that represent long-standing Western social prejudices and psychological hopes, rather than any account of nature's factuality. Such 'baggage' may be unavoidable for any field so closely allied with such deep human concerns (see Part 3 of this statement), but this strong social overlay has prevented us from truly completing Darwin's revolution. Most pernicious and constraining among these prejudices is the concept of progress, the idea that evolution possesses a driving force or manifests an overarching trend towards increasing complexity, better biomechanical design, bigger brains, or some other parochial definition of progress centered upon a long-standing human desire to place ourselves atop nature's pile - and thereby assert a natural right to rule and exploit our planet.

Evolution, in Darwin's formulation, is adaptation to changing local environments, not universal 'progress.' A lineage of elephants that evolves a heavier coating of hair to become a woolly mammoth as the ice sheets advance does not become a superior elephant in any general sense, but just an elephant better adapted to local conditions of increasing cold. For every species that does become more complex as an adaptation to its own environment, look for parasites (often several species) living within its body - for parasites are usually greatly simplified in anatomy compared with their free-living ancestors, yet these parasites are as well adapted to the internal environment of their host as the host has evolved to match the needs of its external environment.


§ What Evolution Is.

In its minimalist, 'bare bones' formulation, evolution is a simple idea with a remarkable range of implications. The basic claim includes two linked statements that provide rationales for the two central disciplines of natural history: taxonomy (or the order of relationships among organisms), and palaeontology (or the history of life). Evolution means (1) that all organisms are related by ties of genealogy or descent from common ancestry along the branching patterns of life's tree, and (2) that lineages alter their form and diversity through time by a natural process of change - 'descent with modification' in Darwin's chosen phrase. This simple, yet profound, insight immediately answers the great biological question of the ages: What is the basis for the 'natural system' of relationships among organisms (cats closer to dogs than to lizards; all vertebrates closer to each other than any to an insect - a fact well appreciated, and regarded as both wonderful and mysterious, long before evolution provided the reason)? Previous explanations were unsatisfactory because they were either untestable (God's creative hand making each species by fiat, with taxonomic relationships representing the order of divine thought), or arcane and complex (species as natural places, like chemical elements in the periodic table, for the arrangement of organic matter). Evolution's explanation for the natural system is so stunningly simple: Relationship is genealogy; humans are like apes because we share such a recent common ancestor. The taxonomic order is a record of history.

But the basic fact of genealogy and change - descent with modification - is not enough to characterise evolution as a science. For science has two missions: (1) to record and discover the factual state of the empirical world, and (2) to devise and test explanations for why the world works as it does. Genealogy and change only represent the solution to this first goal - a description of the fact of evolution. We also need to know the mechanism by which evolutionary change occurs - the second goal of explaining the causes of descent with modification. Darwin proposed the most famous and best-documented mechanism for change in the principle that he named 'natural selection.'

The fact of evolution is as well documented as anything we know in science - as secure as our conviction that Earth revolves about the sun, and not vice versa. The mechanism of evolution remains a subject of exciting controversy - and science is most lively and fruitful when engaged in fundamental debates about the causes of well-documented facts. Darwin's natural selection has been affirmed, in studies both copious and elegant, as a powerful mechanism, particularly in evolving the adaptations of organisms to their local environments - what Darwin called 'that perfection of structure and coadaptation which most justly excites our admiration.' But the broad-scale history of life includes other phenomena that may require different kinds of causes as well (potentially random effects, for example, in another fundamental determinant of life's pattern - which groups live, and which die, in episodes of catastrophic extinction).


§ Why Should We Care?

The deepest, in-the-gut, answer to the question lies in the human psyche, and for reasons that I cannot begin to fathom. We are fascinated by physical ties of ancestry; we feel that we will understand ourselves better, know who we are in some fundamental sense, when we trace the sources of our descent. We haunt graveyards and parish records; we pore over family Bibles and search out elderly relatives, all to fill in the blanks on our family tree. Evolution is this same phenomenon on a much more inclusive scale - roots writ large. Evolution is the family tree of our races, species, and lineages - not just of our little, local surname. Evolution answers, insofar as science can address such questions at all, the troubling and fascinating issues of 'Who are we?' 'To which other creatures are we related, and how?' 'What is the history of our interdependency with the natural world?' 'Why are we here at all?'

Beyond this, I think that the importance of evolution in human thought is best captured in a famous statement by Sigmund Freud, who observed, with wry and telling irony, that all great scientific revolutions have but one feature in common: the casting of human arrogance off one pedestal after another of previous convictions about our ruling capacity in the universe. Freud mentions three such revolutions: the Copernican, for moving our home from center stage in a small universe to a tiny peripheral hunk of rock amid inconceivable vastness; the Darwinian, for 'relegating us to descent from an animal world'; and (in one of the least modest statements of intellectual history) his own, for discovering the unconscious and illustrating the nonrationality of the human mind. What can be more humbling, and therefore more liberating, than a transition from viewing ourselves as 'just a little lower than the angels,' the created rulers of nature, made in God's image to shape and subdue the earth - to the knowledge that we are not only natural products of a universal process of descent with modification (and thus kin to all other creatures), but also a small, late-blooming, and ultimately transient twig on the copiously arborescent tree of life, and not the foreordained summit of a ladder of progress. Shake complacent certainty, and kindle the fire of intellect.

October 5, 2008

Brain Basics!

An excellent article by Michael Craig Miller, M.D., that I found in Newsweek; it serves as a basic introduction to the false mind-brain dichotomy and other issues:


The brain is the mind is the brain. One hundred billion nerve cells, give or take, none of which individually has the capacity to feel or to reason, yet together generating consciousness. For about 400 years, following the ideas of French philosopher René Descartes, those who thought about its nature considered the mind related to the body, but separate from it. In this model—often called “dualism” or the mind-body problem—the mind was “immaterial,” not anchored in anything physical. Today neuroscientists are finding abundant evidence of an idea that even Freud played with more than 100 years ago, that separating mind from brain makes no sense. Nobel Prize-winning psychiatrist-neuroscientist Eric Kandel stated it directly in a watershed paper published in 1998: “All mental processes, even the most complex psychological processes, derive from operations of the brain.”

Neuroscientists consider it settled that the mind arises from the cooperation of billions of interconnected cells that, individually, are no smarter than amoebae. But it’s a shocking idea to some that the human mind could arise out of such an array of mindlessness. Many express amazement that emotions, pain, sexual feelings or religious belief could be a product of brain function. They are put off by the notion that such rich experiences could be reduced to mechanical or chemical bits. Or they worry that scientific explanations may seduce people into a kind of moral laziness that provides a ready excuse for any human failing: “My brain made me do it.” Our brains indeed do make us do it, but that is nonetheless consistent with meaningful lives and moral choices. Writing for the President’s Council on Bioethics earlier this year, philosopher Daniel Dennett made the point that building knowledge about the biology of mental life may improve our decision making, even our moral decision making. And it could enhance our chances of survival as a species, too.

Your heart, lungs, kidneys and digestive tract keep you alive. But your brain is where you live. The brain is responsible for most of what you care about—language, creativity, imagination, empathy and morality. And it is the repository of all that you feel. The endeavor to discover the biological basis for these complex human experiences has given rise to a relatively new discipline: cognitive neuroscience. It has recently exploded as a field, thanks, in part, to decades of advances in neuroimaging technology that enable us to see the brain at work. As Dr. Joel Yager, professor of psychiatry at the University of Colorado, has said, “We can now watch the mind boggle!”

Certainly, you won’t find an entry for “mind-boggling” in the index of a modern neuroscience textbook. You will also have a hard time finding the words “happiness” or “sadness,” “anger” or “love.” Neuroscientists do, however, have a rapidly growing appreciation of the emotional brain and are beginning to look closely at these subjective states, which were formerly the province of philosophers and poets. It is complex science that holds great promise for improving the quality of life. Fortunately, understanding basic principles does not require an advanced degree.

Fear Itself
Fear is a good place to start, because it is one of the emotions that cognitive neuroscientists understand well. It is an unpleasant feeling, but necessary to our survival; humans would not have lasted very long in the wilderness without it. Two deep brain structures called the amygdalae manage the important task of learning and remembering what you should be afraid of.

Each amygdala, a cluster of nerve cells named after its almond shape (from the Greek amugdale), sits under its corresponding temporal lobe on either side of the brain. Like a network hub, it coordinates information from several sources. It collects input from the environment, registers emotional significance and—when necessary—mobilizes a proper response. It gets information about the body’s response to the environment (for example, heart rate and blood pressure) from the hypothalamus. It communicates with the reasoning areas in the front of the brain. And it connects with the hippocampus, an important memory center.

The fear system is extraordinarily efficient. It is so efficient that you don’t need to consciously register what is happening for the brain to kick off a response. If a car swerves into your lane of traffic, you will feel the fear before you understand it. Signals travel between the amygdala and your crisis system before the visual part of your brain has a chance to “see.” Organisms with slower responses probably did not get the opportunity to pass their genetic material along. Fear is contagious because the amygdala helps people not only to recognize fear in the faces of others, but also to scan for it automatically. People or animals with damage to the amygdala lose these skills. Not only is the world more dangerous for them, the texture of life is ironed out; the world seems less compelling to them because their “excitement” anatomy is impaired.

Anger Management
Until recently, there was relatively little research showing how the brain processes anger. But that has begun to change. Recent studies indicate that anger may trigger activity in a part of the brain not named as poetically as the amygdala—the dorsal anterior cingulate cortex (abbreviated dACC). Like the amygdala, the dACC’s function makes sense, given its connections to areas of the brain involved in recognizing an offense (he just stole my iPod), registering a feeling (I’m angry) and acting on it (I’m going to …). It also links to the reasoning centers in the front part of the brain, as well as memory centers, which play a role in angry rumination or stewing after the fact.

Researchers, however, have been more focused on one of the consequences of anger—aggression—probably because it can be observed through behavior. It’s known, for example, that men are overtly more aggressive than women because of differences in male and female hormones. But the brains of men and women are also different, and some of those differences may affect aggression. In the front of the brain, the orbitofrontal cortex is recruited to help make decisions and temper emotional responses. It lights up when people are making judgments. Adrian Raine and colleagues at the University of Southern California note that, on average, men have a lower volume of gray matter (the bodies of nerve cells) in the orbitofrontal cortex than women. According to their analysis, this brain difference accounts for a healthy portion of the gender gap seen in the frequency of antisocial behavior.

Even a neuroscientist can see that murder and mayhem are undesirable. But a neuroscientist can also see why that trait might still be in the gene pool. The gene for sickle cell anemia survived because it provided protection against another disease, malaria. Similarly, aggression is often an advantage. Until recently in historical terms, a readiness to fight and the ability to kill was a way to consolidate control over resources for survival. Fortunately, diplomats have also evolved. Some of our ancestors who understood that aggression carried risks as well as advantages used their creative human brains to devise better solutions for resolving conflicts. Our predecessors also originated symbolic diversions for aggression, like sports and chess.

So Happy (and Sad) Together
The common emotions of sadness and happiness are a problem for researchers. Depression and mania are core areas of study for a neuroscientist. But everyday ups and downs are so broadly defined that researchers have a hard time figuring out what exactly to study. They note activity in virtually every part of the brain. Last year Drs. Peter J. Freed and J. John Mann, publishing in The American Journal of Psychiatry, reported on the literature of sadness and the brain. In 22 studies, brain scans were performed on nondepressed but sad volunteers. Sadness was mostly induced (subjects were shown sad pictures or films, asked to remember a sad event), although, in a couple of studies, subjects had recently experienced a loss. In the aggregate, sadness appeared to cause altered activity in more than 70 different brain regions. The amygdala and hippocampus both show up on this list, as do the front part of the brain (prefrontal cortex) and the anterior cingulate cortex. A structure called the insula (which means “island”) also appears here—it is a small region of cortex beneath the temporal lobes that registers body perceptions and taste.

The authors believe this complicated picture makes sense. The brain regions on their list process conflict, pain, social isolation, memory, reward, attention, body sensations, decision making and emotional displays, all of which can contribute to feeling sad. Sadness triggers also vary—for example, the memory of a personal loss; a friend stressing over a work conflict; seeing a desolate film.

In the brain, happiness is as widely distributed as sadness. In his book “This Is Your Brain on Music,” Dr. Daniel Levitin notes that music simultaneously enlists many parts of the brain. We listen and respond to sounds and rhythms (auditory, sensory and motor cortex, cerebellum). We interpret (sensory cortex) and reason (prefrontal cortex). Music pulls on memories for experience and emotion (amygdala and hippocampus). If the music is working for you, it is probably triggering the reward system (nucleus accumbens). And if you’re playing it, as Dr. Levitin does, you also get to throw satisfaction into the mix. This may or may not be a description of happiness, but it certainly coincides with the notion of flow, described by the author Dr. Mihály Csíkszentmihályi: concentrated attention and the absence of self-consciousness. A neuroscientist might say that a life that fully engages your brain in these ways is a life worth living.

Faith, Love and Understanding
The challenge to cognitive neuroscientists becomes greater as we go up the ladder to more-complicated emotional states. And the stakes become higher, too, because research into such highly valued and personal mental processes can be easily misunderstood.

Empathy is more than being nice. It is the ability to feel what another person feels, and in its most refined form it is the capacity to deeply understand another person’s point of view. The brain’s empathic powers actually begin with fear detection. Most of us are extraordinarily skilled face readers. We readily act on the emotions communicated to us through facial expression. And the grammar of facial expression, in some instances, is plain. We are masters at telling when a smile is insincere by the absence of wrinkles (called Duchenne lines) around the smiler’s eyes. In a spontaneous smile, the corners of the mouth curl up and muscles around the eyes contract. Duchenne lines are almost impossible to fake.

In the Marx Brothers movie “Duck Soup,” Groucho encounters his brother Chico in a doorway, dressed like him in a nightshirt and cap and a fake mustache. They perform a famous version of the mirror routine, Chico copying Groucho’s actions. The humor may derive, at least in part, from humans’ highly developed skill as copycats. When you observe someone eating ice cream or stubbing a toe, the brain regions that are activated in the eater and the stubber are also activated in you.

But empathy depends on more than an ability to mirror actions or sensations. It also requires what some cognitive neuroscientists call mentalizing, or a “theory of mind.” Simon Baron-Cohen, a leading researcher in the study of autism, has identified the inability to generate a theory of mind as a central deficit in that illness. He has coined the term “mindblindness” to designate that problem. The corollary, “mindsightedness,” requires healthy function in several areas of the brain. The processing and remembering of subtle language cues take place toward the ends of the temporal lobes. At the junction of the temporal and parietal lobes, the brain handles memory for events, moral judgment and biological motion (what we might call body language). And the prefrontal cortex handles many complex reasoning functions involved in feelings of empathy.

Not surprisingly, love also engages a whole lot of brain. Areas that are deeply involved include the insula, anterior cingulate, hippocampus and nucleus accumbens—in other words, parts of the brain that involve body and emotional perception, memory and reward. There is also an increase in neurotransmitter activity along circuits governing attachment and bonding, as well as reward (there’s that word again). And there’s scientific evidence that love really is blind; romantic love turns down or shuts off activity in the reasoning part of the brain and the amygdala. In the context of passion, the brain’s judgment and fear centers are on leave. Love also shuts down the centers necessary to mentalize or sustain a theory of mind. Lovers stop differentiating you from me.

Faith is also being studied. Earlier this year the Annals of Neurology published an article by Sam Harris and colleagues exploring what happens in the brain when people are in the act of either believing or disbelieving. In an accompanying editorial, Oliver Sacks and Joy Hirsch underscored the significance of what the researchers found. Belief and disbelief activated different regions of the brain. But in the brain, all belief reactions looked the same, whether the stimulus was relatively neutral: an equation like (2+6)+8=16, or emotionally charged: “A Personal God exists, just as the Bible describes.”

Because the researchers put a big religious idea next to a small math equation, some readers might think they intend to glibly dismiss it. But a discovery about brain function does not imply a value judgment. And understanding the reality of the natural world—how the brain works—shouldn’t muddle the big questions about human experience. It should help us answer them.

October 3, 2008

Does Religion Make You Nice?

HOT OFF THE PRESS! (3 Oct 2008)

The latest issue of the journal Science includes a review of religious prosociality.

This is an issue that has an indirect connection with my recent Master's thesis. What a pity this wasn't published a little earlier, as I could have incorporated some of this data into my thesis.

But never mind, I'll read this review and criticise it here soon.

In plain English (as per the abstract): This review examines the evidence for religious prosociality, in other words, how far religion goes in making you a "good guy". Do you help people and do favours for them because of your natural niceness, or because your religion tells you to? While surveys find a correlation between religiosity and helpful behaviour, experiments have found that this correlation appears mainly in contexts where the reputation of the 'helper' is enhanced. In other words, religiously motivated helpers want to do good and be seen doing good. Other experiments find that religious thoughts reduce the inclination to cheat and increase helpful behaviour towards random strangers.

I don't know what the rest of it means (re devotion and trust, morally concerned deities, etc) as I'll have to read it to find out. But this is a taste!

October 2, 2008

Let's Celebrate The Real Big Questions

Although it isn't directly relevant to neuroscience or neuropsychology, this excellent essay from Lawrence Krauss in the September (2008) edition of New Scientist discusses premises that I often encounter in my discussions with people:
LAST year I agreed to write a short essay for an advertisement featuring the question: "Does the universe have a purpose?" It was to appear in major media outlets, including The New York Times, The Economist and New Scientist. I was asked to express my views in my own words, so I wasn't worried that they would be distorted to support an ulterior agenda. I considered the ad a useful outlet for communicating how I believe science can inform this question.

I was naive. The ad, which was sponsored by the John Templeton Foundation - an organisation that aims to find links between science and religion - was the first instalment in a Big Ideas series, and has been followed up by essays on: "Does science make belief in God obsolete?" Next week the otherwise well-grounded Skeptics Society is to run a related conference, also sponsored by Templeton, called Origins. According to their promotional material, "the Big Questions... involve Origins", such as the origins of the universe, the laws of nature, time's arrow, life and consciousness. "Science is making significant headway into providing natural explanations for these ultimate questions, which leaves us with the biggest question of all: does science make belief in God obsolete?"

Unfortunately, despite the money being channelled into such meetings and ads, this is neither a very big question nor a very big idea. The issue may be of importance to some theologians and philosophers, but it is essentially irrelevant to scientists. In the academic departments where these origins are being investigated, the question is almost never raised.

Scientists may, if asked, express views on issues relating to purpose and religion, especially to counter ill-conceived notions that might mislead the public, but in our work we focus on scientific questions that can be addressed by the tools we have to explore the universe. Whether any form of modern religion is made obsolete by our progress is a tangential and almost trivial point. If new knowledge about the universe cannot be worked into these philosophies, they will become obsolete. Otherwise, they persist.

While the participants have changed, the so-called debate over the relation between science and religion has hardly progressed in 400 years. Today's arguments about intelligent design, for example, are little different from those of Thomas Aquinas and William Paley, though the realm in which the debate is taking place has been shifted from human scales to scales that are many, many times smaller or larger. Focusing on such stale and fruitless questions prevents the public from appreciating the truly interesting intellectual frontiers in science.

I recently moved to Arizona to lead a new programme on Origins at Arizona State University. Its purpose is to explore and celebrate emerging knowledge on origins: from that of the universe to humanity, consciousness to culture. We will be sponsoring, among other things, a big public event in Phoenix in April 2009, where speakers including Stephen Hawking, Richard Dawkins, Craig Venter, Brian Greene and Steven Pinker will focus on the real questions driving intellectual progress across science. Is there a multiverse? Are the laws of nature unique? What caused the big bang? How did life arise on Earth? How abundant and diverse is life in the universe? How did humans evolve consciousness? Can machines think? Can we genetically re-engineer humans?

These are the questions that reflect the remarkable upheavals and challenges that our understanding of nature has faced over the past century. Our efforts to answer them will form the basis of knowledge and action in the next.

September 12, 2008

A Short Break

Regular readers may have noticed the proliferation of humorous postings of late and a decline in "pure science" posts. I have to admit that I'm writing up a thesis at the moment and am understandably busy with it, so I hardly have time to write good analyses of current neuroscience or psychology issues. Hence the tendency to have a quick jab or poke fun as a means to relieve my stress as well as have a laugh.
Not to worry though, I will be back with something of a bang by the end of next week. (Or realistically, a shaky sputtering!)

Are We Dead Yet?

Those who exemplified the definition of hysteria and apocalyptic doom and gloom during the run-up to the Large Hadron Collider switch-on might like to breathe a sigh of relief while looking at this.

And while we're on the subject of the LHC (oh-so-briefly!), if you still have some nagging doubts about the potentiality of the LHC to cause a major disaster, this website may answer your questions:

September 10, 2008

Hadron Switch-on

Now that the Large Hadron Collider has been switched on, we can expect a flurry of astounding news and reports that will contribute to the advancement of knowledge and science. For a start, it promises to recreate the conditions at the birth of the universe, allowing scientists a better view of how the universe came into being. Billed as "the largest experiment in the world", where protons will collide at 99.99% of the speed of light, there isn't much remaining to say to people who continue to cling to beliefs about sky fairies and diablos as fundamental and important forces in the universe.
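
As an aside, that "speed of light" billing is easy to check from first principles. Here is a back-of-the-envelope sketch (mine, not CERN's), assuming the design beam energy of 7 TeV per proton and the standard proton rest energy:

    import math

    # How close do LHC protons get to the speed of light?
    # Assumptions: design energy of 7 TeV per proton (the 2008 switch-on
    # circulated beams at a lower injection energy) and a proton rest
    # energy of about 0.938 GeV.
    beam_energy_gev = 7000.0
    proton_rest_gev = 0.938

    gamma = beam_energy_gev / proton_rest_gev   # Lorentz factor, about 7500
    beta = math.sqrt(1.0 - 1.0 / gamma**2)      # speed as a fraction of c

    print(f"gamma = {gamma:.0f}")
    print(f"v/c   = {beta:.9f}")                # about 0.999999991

At full design energy the true figure is closer to 99.9999991% of the speed of light, so if anything the billing undersells it.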

Except this:


September 8, 2008

Recapitulation of Criticism against the Theory of Childhood

From Ari Rahikkala's LiveJournal and Pharyngula: An awesome parody of ludicrous creationist nonsense arguments being presented in several scientific fields. I nearly cracked a rib laughing at this!

The theory of childhood, also known as child origin, is a damnable, loathsome and indefensible lie. How can any thinking person suppose all humans used to be babies once? Just consider these arguments:

There is no development path from babies to adults, no transitional forms between these two species. Show me even one baby with the head of a grown man on his body. Can you? No? Not even a bearded toddler? No adults with unfused skullbones, outside unfortunate disorders? Not even a tiny little newborn girl suddenly sprouting a respectable bosom? You can't find them, because they don't exist. There isn't a single transitional form between children and adults, and you will never find one because the theory simply is an unscientific lie.

The development of children has been well-researched in our six-month study following a sample of one thousand children and adults of various ages. We have conclusively proven that while there are minor changes in features like height and body fat, and replacement of deciduous teeth with permanent teeth, incontrovertibly still every creature in the study that started out as a child had only slightly more adult features at the end of the observation period than at its beginning. Children and adults are separate kinds and there will never be sufficient changes to change one into the other. We reject any evidence from longer-term studies as we believe the laws of physics have changed within the last year.

To claim people come from children is demeaning and morally degrading. We have observed how children behave. If we acted like small children we'd all be demanding and impatient, and we'd be cheating, lying, and stealing from each other all the time. If the theory of childhood were true there would be no morality, and with no morality to build one on, no society. Childhood is a wicked lie used by charlatans to justify evils such as public schools.

There is no consensus on the theory of childhood in the scientific community. We should teach the controversy. Our children will be served well to learn that the prospect of them becoming adults is merely a theoretical idea. Many children come from families that do not subscribe to the theory of childhood, and they could be disturbed if the theory were taught as fact.