Thursday, May 15, 2008

Mind & Brain - Decision Theory, Memory, The Not-So-Modular Brain, Mirror Neurons, and So Much More

I've been stacking up open tabs for a couple of days now. Some excellent new articles and a handful of new books. By the way, if there are topics anyone out there wants to see, drop me a comment or an email and I'll try to find related articles. Enjoy!

From More Intelligent Life: MIND OVER MATTER, an article on decision theory.

The less you know, the more wisely you seem to choose. Helen Joyce rummages through the mental toolbox you use when the facts are scarce ...

From INTELLIGENT LIFE magazine, Spring 2008

"Set down all the Reasons, pro and con, in opposite Columns," wrote Benjamin Franklin in 1779 to his nephew, who was attempting to choose which of two women to propose to. "When you have considered them two or three days...observe what Reasons or Motives in each Column are equal in weight, one to one, one to two, two to three, or the like, and when you have struck out from both sides all the Equalities, you will see in which column remains the balance."

If you have been faced with a difficult decision--which house to buy; whether to accept a posting abroad--you may have done something similar yourself. And you may have had the following strange experience. You listed, you weighed, you calculated the answer--and, in a flash of insight, you realised it was the wrong one.

Did you omit some small but vital factor from one of your Columns? And how on earth did your subconscious get it right so fast?

The answer to these puzzles, says Gerd Gigerenzer, a German psychologist, lies in the way we make decisions, which is not how Franklin--or modern students of decision theory--think we should. Gigerenzer was one of the researchers whose studies of human cognition underpinned Malcolm Gladwell's 2005 best-seller, "Blink", which was about how snap decisions often seem to yield better results than careful analysis. In his new book, "Gut Feelings", Gigerenzer describes some of the quick-and-dirty decision-making tools our brains come fitted with--an "adaptive toolbox" of tricks that we skilfully, and usually unconsciously, pick for the task at hand.

Read more. A key quote: "Ignorance isn't random; it's systematic," says Gigerenzer. "If you know too much, it is harder to distinguish between what is important, and what is not." Dang, and I thought knowing everything was the key to life.
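
Franklin's procedure itself is simple enough to sketch in a few lines of code. This is just a toy illustration of his "moral algebra" as quoted above (the reasons and weights here are invented, and it's not anything from Gigerenzer's book):

```python
# A toy version of Franklin's "moral algebra": weight the reasons in each column,
# cancel equal weights from both sides, and see where the balance lies.
def franklin_balance(pros, cons):
    """pros and cons are lists of (reason, weight) pairs."""
    pro_total = sum(weight for _, weight in pros)
    con_total = sum(weight for _, weight in cons)
    # Striking out the "Equalities" from both columns amounts to comparing what remains.
    if pro_total == con_total:
        return "balanced", 0
    winner = "pro" if pro_total > con_total else "con"
    return winner, abs(pro_total - con_total)

pros = [("steady income", 3), ("shared interests", 2)]
cons = [("long commute", 2), ("leaving friends behind", 2)]
print(franklin_balance(pros, cons))  # ('pro', 1)
```

Gigerenzer's point, of course, is that this is exactly the kind of explicit weighing our gut feelings routinely skip, and often beat.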

* * * * *

Total Recall … Or At Least the Gist, from Miller-McCune

Our memory is like an ear of corn. At least, that's what Valerie Reyna was taught in graduate school.

Its Forrest Gumpish feel notwithstanding, the metaphor seemed scientifically sound. After all, researchers had already concluded there are two distinct types of memory: Verbatim, which allows us to recall what specifically happened at any given moment, and gist, which enables us to put the event in context and give it meaning.

"We were taught you extracted the gist from the verbatim memory," recalled Reyna, an experimental psychologist and former senior research adviser to the U.S. Department of Education. "It was like husking an ear of corn. You threw away the husk, which was the verbatim, and you kept the gist, which was the kernel of meaning."

There it was: Neat. Simple. Agrarian.

And also, as Reyna discovered over decades of subsequent research, wrong.

After conducting numerous studies with her partner, psychologist Charles Brainerd, Reyna concluded that verbatim and gist memory are separate, parallel systems. So separate, in fact, that "there is some evidence" they occupy different sections of the brain.

Reyna and Brainerd's hypothesis, which they call "fuzzy trace theory," explains how we can "remember" things that never really happened.

Read the rest.

One good question: Why did we develop two separate memory systems? And another: Did each correspond to some other brain system when it arose?
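
To make the idea concrete, here's a toy sketch of my own (not Reyna and Brainerd's actual model, and the word list is made up) showing how a gist store kept alongside, rather than extracted from, verbatim memory could yield a confident "yes" to something that was never presented:

```python
# Toy illustration of parallel verbatim and gist memory stores.
# Verbatim holds the items actually encountered; gist holds the theme
# encoded in parallel with them, not distilled out of them.
studied = ["bed", "rest", "awake", "dream", "snooze"]
verbatim = set(studied)
gist_themes = {"sleep"}

# Stand-in for semantic relatedness, hand-coded for this sketch.
SLEEP_WORDS = {"bed", "rest", "awake", "dream", "snooze", "sleep", "nap"}

def remembers(probe):
    if probe in verbatim:
        return "yes (verbatim)"
    if "sleep" in gist_themes and probe in SLEEP_WORDS:
        return "yes (gist only -- feels remembered, never happened)"
    return "no"

print(remembers("dream"))  # yes (verbatim)
print(remembers("sleep"))  # yes (gist only): a "memory" of an item never shown
print(remembers("anvil"))  # no
```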

* * * * *

From Michael Shermer (of Skeptic magazine) at Scientific American: The Brain Is Not Modular: What fMRI Really Tells Us

The atom is like a solar system, with electrons whirling around the nucleus like planets orbiting a star. No, actually, it isn’t. But as a first approximation to help us visualize something that is so invisible, that image works as a metaphor.

Science traffics in metaphors because our brains evolved to grasp intuitively a world far simpler than the counterintuitive world that science has only recently revealed. The functional activity of the brain, for example, is nearly as invisible to us as the atom, and so we employ metaphors. Over the centuries the brain has been compared to a hydraulic machine (18th century), a mechanical calculator (19th century) and an electronic computer (20th century). Today a popular metaphor is that the brain is like a Swiss Army knife, with specialized modules for vision, language, facial recognition, cheating detection, risk taking, spirituality and even God.

Modularity metaphors have been fueled by a new brain-scanning technology called functional magnetic resonance imaging (fMRI). We have all seen scans with highlighted (usually in red) areas where your brain “lights up” when thinking about X (money, sex, God, and so on). This new modularity metaphor is so seductive that I have employed it myself in several books on the evolution of religion (belief modules), morality (moral modules) and economics (money modules). There is a skeptical movement afoot to curtail abuses of the metaphor, however, and it is being driven by neuroscientists themselves. The November 11, 2007, edition of the New York Times, for example, published an opinion piece entitled “This Is Your Brain on Politics,” by neuroscientist Marco Iacoboni of the University of California, Los Angeles, and his colleagues. The writers presented the results of their brain scans on swing voters. “When we showed subjects the words ‘Democrat,’ ‘Republican’ and ‘independent,’ they exhibited high levels of activity in the part of the brain called the amygdala, indicating anxiety,” the authors note. “The two areas in the brain associated with anxiety and disgust—the amygdala and the insula—were especially active when men viewed ‘Republican.’ But all three labels also elicited some activity in the brain area associated with reward, the ventral striatum, as well as other regions related to desire and feeling connected.” So the word “Republican” elicits anxiety and disgust, except for when it triggers feelings of desire and connectedness. The rest of the conclusions are similarly obfuscating.

In a response befitting the self-correcting nature of science, Iacoboni’s U.C.L.A. colleague Russell Poldrack and 16 other neuroscientists from labs around the world published a response three days later in the Times, explaining: “As cognitive neuroscientists who use the same brain imaging technology, we know that it is not possible to definitively determine whether a person is anxious or feeling connected simply by looking at activity in a particular brain region. This is so because brain regions are typically engaged by many mental states, and thus a one-to-one mapping between a brain region and a mental state is not possible.” For example, the amygdala is activated by arousal and positive emotions as well, so the key to interpreting such scans is careful experimental design that allows comparison between brain states.

Read the rest.

I've never liked the "modular theory" of brain function. It's tempting to look at fMRI images and think that we can pinpoint certain brain functions (which is sometimes true), but often this is simply not the case.

The brain seems to be much more of a neural net, with various regions of the brain involved in even the simplest functions, such as vision (which is dauntingly complex).
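
Poldrack's point about one-to-many mapping is easy to see with a little Bayes. The numbers below are entirely made up (they're not from either op-ed), but they show why "the amygdala lit up, therefore anxiety" is a weak inference when the same region is engaged by many other states:

```python
# Toy reverse-inference calculation with hypothetical numbers.
# The amygdala responds to anxiety, but also to arousal, reward, etc.,
# so activation by itself says little about which state the subject is in.
p_anxiety = 0.2            # prior probability the subject is anxious
p_act_if_anxious = 0.8     # chance the amygdala is active given anxiety
p_act_if_not = 0.5         # chance it is active given some other state

p_activation = p_act_if_anxious * p_anxiety + p_act_if_not * (1 - p_anxiety)
p_anxiety_given_activation = p_act_if_anxious * p_anxiety / p_activation

print(round(p_anxiety_given_activation, 2))  # ~0.29, not far above the 0.2 prior
```

Which is exactly why the careful experimental designs Poldrack mentions compare brain states against each other rather than reading a mental state off a single activation.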

* * * * *

Also from Scientific American: Do Infants See Colors Differently?
How do we perceive a rainbow? And does everyone perceive a rainbow in the same way? These seemingly simple questions can reveal some interesting features of the human brain. For instance, is the “striped” appearance of the rainbow—the seven distinct bands of color that we see—a construct of our higher mental processes, or do the mechanics of human color vision determine it at a very early perceptual level? If your language does not have separate words for “blue” and “green” (and many languages, including Welsh, do not), do you perceive these shades as more similar than a speaker of English?

Searching for answers to these questions, in recent years many scientists have concluded that speakers of languages that label color in ways distinct from those used in English may see a different rainbow from that of English speakers. Recent studies have claimed that language processing is automatically involved in perceptual decisions about color in the brains of adults, even when hues are visible only briefly (100 milliseconds) or when decisions do not require participants to name colors verbally. Moreover, these effects are language-specific, so speakers of Russian or Korean show a different pattern of responses to color than speakers of English.

A recent study in PNAS by researchers at the University of Surrey challenges this view, however. It suggests an intriguing and novel account of color categorization in infants. In this study 18 English-speaking adults and 13 four-month-old infants were shown a colored target on a colored background. Adults were faster to initiate eye movements toward the target when the target and background colors came from different color categories (for example, blue target, green background) than when both came from the same color category (such as different shades of blue).

How Babies See Color

This discrimination advantage for different-category compared to same-category judgments is called Categorical Perception (CP). It is now clear that the effect in adults is language-driven. For instance, healthy, right-handed adults only show CP selectively when colors are presented to the right visual field. It is generally accepted that CP occurs because colors presented to the right visual field preferentially access language-processing areas located in the left hemisphere.

The authors of the new article agree with the current general consensus that CP in adults depends on privileged access to language areas in the left hemisphere. They also agree that the precise color terms that are represented in language are culturally transmitted during childhood and that there has been no “nativist,” or innate, pre-linguistic partitioning by the visual processing pathways into innate color categories in the left hemisphere. This idea fits with their data demonstrating that four-month-old infants showed no hint of CP when targets were presented in the right visual field. Because these infants have not yet acquired language, it is unsurprising that they do not show language-driven category effects in the left hemisphere.

So far, so predictable. What is striking, however, is that the same four-month-old infants did show a CP effect in the right hemisphere, exactly the reverse of the effect shown by adults. When a green target appeared on a green background in the left visual field (which has preferential access to the right hemisphere), infants were significantly slower to move their eyes toward the target than when a blue target appeared on the same green background. The authors claim that their results provide some evidence for pre-linguistic partitioning of color categories in four-month-old infants, but only from stimuli that preferentially access the right hemisphere. Such a result provides some empirical evidence for the existence of an innate pre-linguistic category boundary between blue and green.
Read the rest.

It's amazing to me sometimes how much of our perception of the world is language-based. But as this study suggests, we may have some innate grasp (perhaps too strong a word) of color distinctions. Nature vs. nurture rages on.

* * * * *

From New Scientist, How the brain detects the emotions of others
People who are good at interpreting facial expressions have "mirror neuron" systems that are more active, say researchers. The finding adds weight to the idea that these cells are crucial to helping us figure out how others are feeling.

Mirror neurons are brain cells that fire both when you do something and when you watch someone else do the same thing.

Because they allow us to mimic what others are doing, it is thought that these neurons may underlie our capacity for empathy and our ability to understand others' intentions and states of mind. People with autism, for instance, show reduced mirror neuron activity during social cognition tasks.

Now Peter Enticott at Monash University in Melbourne, Australia, and his colleagues have found evidence supporting this theory. They asked 20 healthy adults to look at pairs of images. In one task, they had to decide if paired images of faces were the same person. In another, they had to decide if both faces were showing the same emotion.

In a separate task, volunteers watched video clips of thumb movement, a hand grasping a pen and a hand writing, while the activity in the primary motor cortex of the brain, which contains mirror neurons, was recorded.

Read the rest.

Mirror neurons have been credited with all kinds of powers, but interpersonal interaction seems to be where they are most important and most active.

* * * * *

Now for something really geeky, from PhysOrg, First evidence of native dendritic cells in brain
In a finding that has the potential to change the way researchers think about the brain, scientists at Rockefeller University have found dendritic cells where they’ve never been seen before: among this organ’s neurons and connective cells.
The immunity-directing dendritic cell had previously been seen in the human nervous system only after brain injury or disease. But the new study, published next month in the Journal of Comparative Neurology, shows for the first time that the brain has its own, resident population of dendritic cells that may serve as a line of defense against pathogens that sneak past the blood-brain barrier.

The brain is packed with different types of microglia — cells that perform a variety of immune functions in the central nervous system. Until now, however, no one had identified dendritic cells among them. Research associate professor Karen Bulloch and her colleagues made their discovery using mice bred to have a fluorescent marker attached to a dendritic-cell specific protein.

The mice were originally developed by Rockefeller scientists to help them visualize dendritic cells in immune tissues, and Bulloch reasoned that they might also be useful for determining whether dendritic cells are present in the central nervous system. Bulloch, who’s spent a lot of time studying microglia, knew exactly where in the brain to look. But what she saw when she first peered through the microscope surprised her so much, she says, “I literally almost fell off my chair.”
Read the rest. Key quote: “The dendritic cells serve both as shepherds of newborn nerve cells and as gatekeepers, which police intruding molecules that may come into the brain through these pathways.”

* * * * *

Finally, some books that may be of interest.

~ A portrait of the brain - a chance to hear the author -- "In his recent book "A Portrait Of the Brain" neurologist Adam Zeman seeks to explore the brain all the way from its atoms to the soul. He does this Oliver Sacks-style, by discussing patients he has encountered, one of whom, for example, manifests psychological problems which are caused by a simple, yet devastating DNA abnormality, while another suffers physical problems (blackouts) which are actually psychological in their origin." Includes links to audio and Zeman's page.

~ The Boy Who Was Raised as a Dog: And Other Stories from a Child Psychiatrist's Notebook--What Traumatized Children Can Teach Us About Loss, Love, and Healing
by Bruce Perry with Maia Szalavitz
Basic, 2007
Review by Christian Perring
May 13th 2008 (Volume 12, Issue 20)

Bruce Perry is a compassionate, insightful and thoughtful child psychiatrist who works in Texas. This collection of clinical cases focuses on the effects of trauma and abuse on children. He explains why they react as they do to their experiences, and when he can, he finds ways to help them. His approach is distinctive because of its emphasis on neuroscience and the ways in which extreme experiences affect the growth of a child's brain.

Perry starts off with the case of 7-year-old Tina, who had suffered repeated sexual abuse and now believed that she should act sexually with all men to win their approval. Her early experiences also caused her terrible stress and affected her whole body, including her heart rate, her attention, her sleep, her fine motor control, and her language development. Perry finds that he has only partial success in treating such profound damage caused by abuse. Other cases include a three-year-old girl who witnessed the murder of her mother and was alone with her mother's body for an extended period of time, children from the Branch Davidian ranch, a boy who was raised in a cage like a dog, a murderer who had been abused as a child, and children who claimed that they had been abused by Satan worshippers. Through these cases, Perry sets out a great deal of scientific understanding of brain development and the importance of nurturing young children. He also points out some of the dangers of faddish theories about curing disorders in young children and of using treatments that haven't been shown to be effective. For example, he explains the enthusiasm for diagnosing Reactive Attachment Disorder and argues that the "Holding Therapy" proposed and used as a cure can in fact be coercive and abusive.

Child psychiatry gets into the news with increasing frequency these days, and it is clear to all that in order to understand a child's problems you have to look not just at the child, but also his or her family.

~ Why Truth Matters
by Ophelia Benson and Jeremy Stangroom
Continuum, 2007
Review by Ed Brandon
May 13th 2008 (Volume 12, Issue 20)

We are surrounded by lies, and "liars in public places," as Ezra Pound said many years ago. We are inundated with varieties of fashionable nonsense. People ascribe intellectual authority to ancient scriptures or to today's clerics or snake oil salesmen. It is possible not to be concerned with these corruptions of thought, but if you are dismayed by any of them you will find that you need the simple contrast of truth and falsehood, and the somewhat more complex contrast of better or worse supported claims, even to characterize them, let alone to begin to deal with them.

There are strong pressures not to confront culpable or negligent error. People seem to think they have a right not to be mocked or simply questioned when they believe the unbelievable. People seem to think that various beliefs are too disturbing to be discussed. And many of these people will take to the streets, or picket lectures, when their prejudices are threatened. Diplomacy seems to call for a language bereft of even the possibility of stating unpalatable facts. And if all this were not enough, many intellectuals subscribe to doctrines that make our ability to distinguish truth from falsehood, or better from worse supported explanations, seem impossible.

Ophelia Benson and Jeremy Stangroom, who run a website devoted to trashing fashionable nonsense (Butterfliesandwheels.com), provide, in Why Truth Matters, a survey of various errors, and attempt to show us why indeed truth matters, and how we can sensibly affirm it. Their hearts are, as far as I am concerned, in the right place; what worries me is that they have perhaps failed to go far enough to demonstrate the superiority of the truth as they see it. It is a common maxim in responsible criticism to take the strongest version of a position to be refuted. It is not clear to me that Benson and Stangroom (B & S henceforth) have always heeded that admonition. And when the issue in contention is broadly factual (such as with claims about an African role in Greek philosophy, or Irving's Holocaust denial), it is really beyond the scope of a general discussion such as this to set out the masses of evidence that should convince an impartial inquirer. B & S can really only invoke authority, and belittle the absurdities they reject.

~ ID: The Quest for Identity in the 21st Century by Susan Greenfield

In this short book neuroscientist Susan Greenfield, well known for her work communicating science to the general public, attempts to explain what it is about the human brain that allows it to become host to a mind, self-conscious, aware and able to reflect on its own existence and mortality.

While it is billed as “a stark warning” of the threats to individuality which arise in “the modern world” it is really an exploration of the latest thinking in neurophysiology and the physiological basis of thought, consciousness and identity.

Greenfield bases her book on an exploration of the characteristics of three different personality archetypes, “Somebody”, “Anybody” and “Nobody”, which she loosely identifies with individualism, collective fundamentalism and the blurred lack of self that results from a life lived in front of the screen. She later adds a fourth, the creative “Eureka” mindset, and ends the book with a set of policy recommendations for the education system designed to promote this creativity.

