Study reveals rats show regret, a cognitive behavior once thought to be uniquely human

EurekAlert!: New research from the Department of Neuroscience at the University of Minnesota reveals that rats show regret, a cognitive behavior once thought to be uniquely and fundamentally human.

Research findings were recently published in Nature Neuroscience.

To measure the cognitive behavior of regret, A. David Redish, Ph.D., a professor of neuroscience in the University of Minnesota Department of Neuroscience, and Adam Steiner, a graduate student in the Graduate Program in Neuroscience, who led the study, started from the definitions of regret that economists and psychologists have identified in the past.

“Regret is the recognition that you made a mistake, that if you had done something else, you would have been better off,” said Redish. “The difficult part of this study was separating regret from disappointment, which is when things aren’t as good as you would have hoped. The key to distinguishing between the two was letting the rats choose what to do.” [Continue reading...]

The boundaries delineating what is taken to be uniquely human are constantly being challenged by new scientific findings. But it’s worth asking why those boundaries were there in the first place.

Surely the scientific approach when investigating a cognitive state such as regret would be to start out without making any suppositions about what non-humans do or don’t experience.

The idea that there is something uniquely human about regret seems like a vestige of biblically inspired notions of human uniqueness.

That as humans we might be unaware of the regrets of rats says much less about what rats are capable of experiencing than it says about our capacity to imagine non-human experience.

Yet, at least rationally, it seems no great leap is required to assume that any creature that makes choices will also experience something resembling regret.

A cat learning to hunt surely feels something when it makes a premature strike, having yet to master the right balance between stalking and attacking its prey. That feeling is most likely some form of discomfort that spurs learning. The cat has no names for its feelings yet feels them nonetheless.

That animals lack some of the means through which humans convey their own feelings says much more about our powers of description than their capacity to feel.

Cynicism is toxic

Cynics fool themselves by thinking they can’t be fooled.

The cynic imagines he’s guarding himself against being duped. He’s not naive, he’s worldly wise, so he’s not about to get taken in — but this psychic insulation comes at a price.

The cynic is cautious and mistrustful. Worst of all, the cynic, by relying too much on his own counsel, saps the foundation of curiosity, which is the ability to be surprised.

While the ability to develop and sustain an open mind has obvious psychological value, researchers now say that it may also matter for the health of the brain: cynicism has been linked to a markedly higher risk of dementia.

One of the researchers in the new study suggests that the latest findings may offer insights into how to reduce the risk of dementia, yet that seems to imply that people might be less inclined to become cynical simply by knowing that it’s bad for their health. How are we to reduce the risk of becoming cynical in the first place?

One of the most disturbing findings of a recent Pew Research Center survey, Millennials in Adulthood, was this:

In response to a long-standing social science survey question, “Generally speaking, would you say that most people can be trusted or that you can’t be too careful in dealing with people,” just 19% of Millennials say most people can be trusted, compared with 31% of Gen Xers, 37% of Silents and 40% of Boomers.

While this trust deficit among Millennials no doubt has multiple causes, such as the socially fragmented nature of our digital world, I don’t believe that there has ever before been a generation so thoroughly trained in fear. Beneath cynicism lurks fear.

The fear may have calmed greatly since the days of post-9/11 hysteria, yet it has not gone away. It’s the background noise of American life. It might no longer be focused so strongly on terrorism, since there are plenty of other reasons to fear — some baseless, some over-stated, and some underestimated. But the aggregation of all these fears produces a pervasive mistrust of life.

ScienceDaily: People with high levels of cynical distrust may be more likely to develop dementia, according to a study published in the May 28, 2014, online issue of Neurology®, the medical journal of the American Academy of Neurology.

Cynical distrust, which is defined as the belief that others are mainly motivated by selfish concerns, has been associated with other health problems, such as heart disease. This is the first study to look at the relationship between cynicism and dementia.

“These results add to the evidence that people’s view on life and personality may have an impact on their health,” said study author Anna-Maija Tolppanen, PhD, of the University of Eastern Finland in Kuopio. “Understanding how a personality trait like cynicism affects risk for dementia might provide us with important insights on how to reduce risks for dementia.”

For the study, 1,449 people with an average age of 71 were given tests for dementia and a questionnaire to measure their level of cynicism. The questionnaire has been shown to be reliable, and people’s scores tend to remain stable over periods of several years. People are asked how much they agree with statements such as “I think most people would lie to get ahead,” “It is safer to trust nobody” and “Most people will use somewhat unfair reasons to gain profit or an advantage rather than lose it.” Based on their scores, participants were grouped in low, moderate and high levels of cynical distrust.

A total of 622 people completed two tests for dementia, with the last one an average of eight years after the study started. During that time, 46 people were diagnosed with dementia. Once researchers adjusted for other factors that could affect dementia risk, such as high blood pressure, high cholesterol and smoking, people with high levels of cynical distrust were three times more likely to develop dementia than people with low levels of cynicism. Of the 164 people with high levels of cynicism, 14 people developed dementia, compared to nine of the 212 people with low levels of cynicism.
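
For readers who want to check the arithmetic, here is a minimal sketch (in Python) of the crude, unadjusted comparison implied by the counts quoted above; the study’s threefold figure comes from a model adjusted for blood pressure, cholesterol, smoking and other factors, which this back-of-the-envelope calculation does not attempt to reproduce.

```python
# Crude (unadjusted) incidence of dementia by cynicism group, using only
# the counts reported in the study summary above.

high_cases, high_total = 14, 164   # high cynical distrust
low_cases, low_total = 9, 212      # low cynical distrust

high_incidence = high_cases / high_total   # ~8.5%
low_incidence = low_cases / low_total      # ~4.2%

crude_relative_risk = high_incidence / low_incidence   # ~2.0

print(f"High-cynicism incidence: {high_incidence:.1%}")
print(f"Low-cynicism incidence:  {low_incidence:.1%}")
print(f"Crude relative risk:     {crude_relative_risk:.2f}")
# The reported ~3x risk is larger because it comes from a model adjusted
# for blood pressure, cholesterol, smoking and other factors.
```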

The study also looked at whether people with high levels of cynicism were more likely to die sooner than people with low levels of cynicism. A total of 1,146 people were included in this part of the analysis, and 361 people died during the average of 10 years of follow-up. High cynicism was initially associated with earlier death, but after researchers accounted for factors such as socioeconomic status, behaviors such as smoking and health status, there was no longer any link between cynicism and earlier death.

How the brain creates personality: A new theory

Stephen M. Kosslyn and G. Wayne Miller write: It is possible to examine any object — including a brain — at different levels. Take the example of a building. If we want to know whether the house will have enough space for a family of five, we want to focus on the architectural level; if we want to know how easily it could catch fire, we want to focus on the materials level; and if we want to engineer a product for a brick manufacturer, we focus on molecular structure.

Similarly, if we want to know how the brain gives rise to thoughts, feelings, and behaviors, we want to focus on the bigger picture of how its structure allows it to store and process information — the architecture, as it were. To understand the brain at this level, we don’t have to know everything about the individual connections among brain cells or about any other biochemical process. We use a relatively high level of analysis, akin to architecture in buildings, to characterize relatively large parts of the brain.

To explain the Theory of Cognitive Modes, which specifies general ways of thinking that underlie how a person approaches the world and interacts with other people, we need to provide you with a lot of information. We want you to understand where this theory came from — that we didn’t just pull it out of a hat or make it up out of whole cloth. But there’s no need to lose the forest for the trees: there are only three key points that you will really need to keep in mind.

First, the top parts and the bottom parts of the brain have differ­ent functions. The top brain formulates and executes plans (which often involve deciding where to move objects or how to move the body in space), whereas the bottom brain classifies and interprets incoming information about the world. The two halves always work together; most important, the top brain uses information from the bottom brain to formulate its plans (and to reformulate them, as they unfold over time).

Second, according to the theory, people vary in the degree that they tend to rely on each of the two brain systems for functions that are optional (i.e., not dictated by the immediate situation): Some people tend to rely heavily on both brain systems, some rely heavily on the bottom brain system but not the top, some rely heavily on the top but not the bottom, and some don’t rely heavily on either system.

Third, these four scenarios define four basic cognitive modes — general ways of thinking that underlie how a person approaches the world and interacts with other people. According to the Theory of Cognitive Modes, each of us has a particular dominant cognitive mode, which affects how we respond to situations we encounter and how we relate to others. The possible modes are: Mover Mode, Perceiver Mode, Stimulator Mode, and Adaptor Mode. [Continue reading...]

How memory speaks

Jerome Groopman writes: I began writing these words on what appeared to be an unremarkable Sunday morning. Shortly before sunrise, the bedroom still dim, I awoke and quietly made my way to the kitchen, careful not to disturb my still-sleeping wife. The dark-roast coffee was retrieved from its place in the pantry, four scoops then placed in a filter. While the coffee was brewing, I picked up The New York Times at the door. Scanning the front page, my eyes rested on an article mentioning Svoboda, the far-right Ukrainian political party (svoboda means, I remembered, “freedom”).

I prepared an egg-white omelette and toasted two slices of multigrain bread. After a few sips of coffee, fragments of the night’s dream came to mind: I am rushing to take my final examination in college chemistry, but as I enter the amphitheater where the test is given, no one is there. Am I early? Or in the wrong room? The dream was not new to me. It often occurs before I embark on a project, whether it’s an experiment in the laboratory, a drug to be tested in the clinic, or an article to write on memory.

The start of that Sunday morning seems quite mundane. But when we reflect on the manifold manifestations of memory, the mundane becomes marvelous. Memory is operative not only in recalling the meaning of svoboda, knowing who was sleeping with me in bed, and registering my dream as recurrent, but also in rote tasks: navigating the still-dark bedroom, scooping the coffee, using a knife and fork to eat breakfast. Simple activities of life, hardly noticed, reveal memory as a map, clock, and mirror, vital to our sense of place, time, and person.

This role of memory in virtually every activity of our day is put in sharp focus when it is lost. Su Meck, in I Forgot to Remember, pieces together a fascinating tale of life after suffering head trauma as a young mother. A ceiling fan fell and struck her head:

You might wonder how it feels to wake up one morning and not know who you are. I don’t know. The accident didn’t just wipe out all my memories; it hindered me from making new ones for quite some time. I awoke each day to a house full of strangers…. And this wasn’t just a few days. It was weeks before I recognized my boys when they toddled into the room, months before I knew my own telephone number, years before I was able to find my way home from anywhere. I have no more memory of those first several years after the accident than my own kids have of their first years of life.

A computed tomography (CT) scan of Meck’s brain showed swelling over the right frontal area. But neurologists were at a loss to explain the genesis of her amnesia. Memory does not exist in a single site or region of the central nervous system. There are estimated to be 10 to 100 billion neurons in the human brain, each neuron making about one thousand connections to other neurons at the junctions termed synapses. Learning, and then storing what we learn through life, involve intricate changes in the nature and number of these trillions of neuronal connections. But memory is made not only via alterations at the synaptic level. It also involves regional remodeling of parts of our cortex. Our brain is constantly changing in its elaborate circuitry and, to some degree, configuration. [Continue reading...]
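
A quick back-of-the-envelope calculation shows how the figures Groopman cites add up to “trillions” of connections; the numbers below are simply the estimates quoted in the passage, not independent measurements.

```python
# Rough synapse count implied by the estimates quoted above:
# 10-100 billion neurons, each making ~1,000 connections.

neurons_low, neurons_high = 10e9, 100e9
connections_per_neuron = 1_000

synapses_low = neurons_low * connections_per_neuron    # 1e13, i.e. 10 trillion
synapses_high = neurons_high * connections_per_neuron  # 1e14, i.e. 100 trillion

print(f"Estimated synapses: {synapses_low:.0e} to {synapses_high:.0e}")
```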

How music hijacks our perception of time

Jonathan Berger writes: One evening, some 40 years ago, I got lost in time. I was at a performance of Schubert’s String Quintet in C major. During the second movement I had the unnerving feeling that time was literally grinding to a halt. The sensation was powerful, visceral, overwhelming. It was a life-changing moment, or, as it felt at the time, a life-changing eon.

It has been my goal ever since to compose music that usurps the perceived flow of time and commandeers the sense of how time passes. Although I’ve learned to manipulate subjective time, I still stand in awe of Schubert’s unparalleled power. Nearly two centuries ago, the composer anticipated the neurological underpinnings of time perception that science has underscored in the past few decades.

The human brain, we have learned, adjusts and recalibrates temporal perception. Our ability to encode and decode sequential information, to integrate and segregate simultaneous signals, is fundamental to human survival. It allows us to find our place in, and navigate, our physical world. But music also demonstrates that time perception is inherently subjective—and an integral part of our lives. “For the time element in music is single,” wrote Thomas Mann in his novel, The Magic Mountain. “Into a section of mortal time music pours itself, thereby inexpressibly enhancing and ennobling what it fills.” [Continue reading...]

Searching for the elephant’s genius inside the largest brain on land

Ferris Jabr writes: Many years ago, while wandering through Amboseli National Park in Kenya, an elephant matriarch named Echo came upon the bones of her former companion Emily. Echo and her family slowed down and began to inspect the remains. They stroked Emily’s skull with their trunks, investigating every crevice; they touched her skeleton gingerly with their padded hind feet; they carried around her tusks. Elephants consistently react this way to other dead elephants, but do not show much interest in deceased rhinos, buffalo or other species. Sometimes elephants will even cover their dead with soil and leaves.

What is going through an elephant’s mind in these moments? We cannot explain their behavior as an instinctual and immediate reaction to a dying or recently perished compatriot. Rather, they seem to understand—even years and years after a friend or relative’s death—that an irreversible change has taken place, that, here on the ground, is an elephant who used to be alive, but no longer is. In other words, elephants grieve.

Such grief is but one of many indications that elephants are exceptionally intelligent, social and empathic creatures. After decades of observing wild elephants—and a series of carefully controlled experiments in the last eight years—scientists now agree that elephants form lifelong kinships, talk to one another with a large vocabulary of rumbles and trumpets and make group decisions; elephants play, mimic their parents and cooperate to solve problems; they use tools, console one another when distressed, and probably have a sense of self (See: The Science Is In: Elephants Are Even Smarter Than We Realized).

All this intellect must emerge, in one way or another, from the elephant brain—the largest of any land animal, three times as big as the human brain with individual neurons that seem to be three to five times the size of human brain cells. [Continue reading...]

Too much to remember?

Benedict Carey writes: People of a certain age (and we know who we are) don’t spend much leisure time reviewing the research into cognitive performance and aging. The story is grim, for one thing: Memory’s speed and accuracy begin to slip around age 25 and keep on slipping.

The story is familiar, too, for anyone who is over 50 and, having finally learned to live fully in the moment, discovers it’s a senior moment. The finding that the brain slows with age is one of the strongest in all of psychology.

Over the years, some scientists have questioned this dotage curve. But these challenges have had an ornery-old-person slant: that the tests were biased toward the young, for example. Or that older people have learned not to care about clearly trivial things, like memory tests. Or that an older mind must organize information differently from one attached to some 22-year-old who records his every Ultimate Frisbee move on Instagram.

Now comes a new kind of challenge to the evidence of a cognitive decline, from a decidedly digital quarter: data mining, based on theories of information processing. In a paper published in Topics in Cognitive Science, a team of linguistic researchers from the University of Tübingen in Germany used advanced learning models to search enormous databases of words and phrases.

Since educated older people generally know more words than younger people, simply by virtue of having been around longer, the experiment simulates what an older brain has to do to retrieve a word. And when the researchers incorporated that difference into the models, the aging “deficits” largely disappeared.
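
To illustrate the intuition (though not the Tübingen team’s actual learning models), here is a toy sketch: if the time needed to single out one word grows with the number of alternatives a speaker knows, then a larger vocabulary alone produces slower responses, with no decline in the underlying machinery. The vocabulary sizes and timing constants below are hypothetical.

```python
import math

def retrieval_time_ms(vocabulary_size, base_ms=300.0, per_bit_ms=60.0):
    """Toy model: decision time grows with the information needed to single
    out one word among `vocabulary_size` alternatives (a Hick's-law-style
    cost). The constants are arbitrary and purely illustrative."""
    return base_ms + per_bit_ms * math.log2(vocabulary_size)

young_vocab = 20_000   # hypothetical vocabulary of a younger adult
older_vocab = 60_000   # hypothetical, larger vocabulary of an older adult

print(f"Younger (20k words): ~{retrieval_time_ms(young_vocab):.0f} ms")
print(f"Older   (60k words): ~{retrieval_time_ms(older_vocab):.0f} ms")
# The difference (~95 ms here) comes entirely from vocabulary size,
# not from any slowdown in the retrieval process itself.
```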

“What shocked me, to be honest, is that for the first half of the time we were doing this project, I totally bought into the idea of age-related cognitive decline in healthy adults,” the lead author, Michael Ramscar, said by email. But the simulations, he added, “fit so well to human data that it slowly forced me to entertain this idea that I didn’t need to invoke decline at all.” [Continue reading...]

Why we find it difficult to face the future

Alisa Opar writes: The British philosopher Derek Parfit espoused a severely reductionist view of personal identity in his seminal book, Reasons and Persons: It does not exist, at least not in the way we usually consider it. We humans, Parfit argued, are not a consistent identity moving through time, but a chain of successive selves, each tangentially linked to, and yet distinct from, the previous and subsequent ones. The boy who begins to smoke despite knowing that he may suffer from the habit decades later should not be judged harshly: “This boy does not identify with his future self,” Parfit wrote. “His attitude towards this future self is in some ways like his attitude to other people.”

Parfit’s view was controversial even among philosophers. But psychologists are beginning to understand that it may accurately describe our attitudes towards our own decision-making: It turns out that we see our future selves as strangers. Though we will inevitably share their fates, the people we will become in a decade, quarter century, or more, are unknown to us. This impedes our ability to make good choices on their—which of course is our own—behalf. That bright, shiny New Year’s resolution? If you feel perfectly justified in breaking it, it may be because it feels like it was a promise someone else made.

“It’s kind of a weird notion,” says Hal Hershfield, an assistant professor at New York University’s Stern School of Business. “On a psychological and emotional level we really consider that future self as if it’s another person.”

Using fMRI, Hershfield and colleagues studied brain activity changes when people imagine their future and consider their present. They homed in on two areas of the brain called the medial prefrontal cortex and the rostral anterior cingulate cortex, which are more active when a subject thinks about himself than when he thinks of someone else. They found these same areas were more strongly activated when subjects thought of themselves today, than of themselves in the future. Their future self “felt” like somebody else. In fact, their neural activity when they described themselves in a decade was similar to that when they described Matt Damon or Natalie Portman. [Continue reading...]

Understanding the psychology shaping negotiations with Iran

“The only way for interaction with Iran is dialogue on an equal footing, confidence-building and mutual respect as well as reducing antagonism and aggression,” Iranian President Hassan Rouhani said in a speech after taking the oath of office last August.

“If you want the right response, don’t speak with Iran in the language of sanctions, speak in the language of respect.”

In the following article, Nicholas Wright and Karim Sadjadpour describe how an understanding of neuroscience — or lack of it — may determine the outcome of negotiations with Iran.

The whole piece is worth reading, but keep this in mind: every single insight that gets attributed to neuroscience has been clearly established without the need to conduct a single brain scan. Indeed, everything that is here being attributed to the “exquisite neural machinery” of the brain can be understood by studying the workings of the human mind and how thought shapes behavior.

It is important to draw a sharp distinction between the examination of the mind and observing the workings of the brain because the latter is totally dependent on the output of intermediary electronic scanning devices, whereas minds can study themselves and each other directly and through shared language.

One of the insidious effects of neuroscience is that it promotes a view that understanding the ways brains work has greater intrinsic value than understanding how minds work. What the negotiations with Iran demonstrate, however, is that the exact opposite is true.

To the extent that through the development of trust, negotiations are able to advance, this will have nothing to do with anyone’s confidence about what is happening inside anyone’s brain. On the contrary, it will depend on a meeting of minds and mutual understanding. No one will need to understand what is happening in their own or anyone else’s insula cortex, but what will most likely make or break the talks will be whether the Iranians believe they are being treated fairly. The determination of fairness does not depend on the presence or absence of a particular configuration of neural activity but rather on an assessment of reality.

Treat us as equals, Iran’s president said — and that was almost 15 years ago!

Nicholas Wright and Karim Sadjadpour write: “Imagine being told that you cannot do what everyone else is doing,” appealed Iranian Foreign Minister Javad Zarif in a somber YouTube message defending the country’s nuclear program in November. “Would you back down? Would you relent? Or would you stand your ground?”

While only 14 nations, including Iran, enrich uranium (e.g. “what everyone else is doing”), Zarif’s message raises a question at the heart of ongoing talks to implement a final nuclear settlement with Tehran: Why has the Iranian government subjected its population to the most onerous sanctions regime in contemporary history in order to do this? Indeed, it’s estimated that Iran’s antiquated nuclear program needs one year to enrich as much uranium as Europe’s top facility produces in five hours.

To many, the answer is obvious: Iran is seeking a nuclear weapons capability (which it has arguably already attained), if not nuclear weapons. Yet the numerous frameworks used to explain Iranian motivations—including geopolitics, ideology, nationalism, domestic politics, and threat perception—lead analysts to different conclusions. Does Iran want nuclear weapons to dominate the Middle East, or does it simply want the option to defend itself from hostile opponents both near and far? While there’s no single explanation for Tehran’s actions, if there is a common thread that connects these frameworks and may help illuminate Iranian thinking, it is the brain.

Although neuroscience can’t be divorced from culture, history, and geography, there is no Orientalism of the brain: The fundamental biology of social motivations is the same in Tokyo, Tehran, and Tennessee. It anticipates, for instance, how the mind’s natural instinct to reject perceived unfairness can impede similarly innate desires for accommodation, and how fairness can lead to tragedy. It tells us that genuinely conciliatory gestures are more likely and natural than many believe, and how to make our own conciliatory gestures more effective.

Distilled to their essence, nations are led by and comprised of humans, and the success of social animals like humans rests on our ability to control the balance between cooperation and self-interest. The following four lessons from neuroscience may help us understand the obstacles that were surmounted to reach an interim nuclear deal with Iran, and the enormous challenges that still must be overcome in order to reach a comprehensive agreement. [Continue reading...]

How we feel at home

Moheb Costandi writes: Home is more than a place on a map. It evokes a particular set of feelings, and a sense of safety and belonging. Location, memories, and emotions are intertwined within those walls. Over the past few decades, this sentiment has gained solid scientific grounding. And earlier this year, researchers identified some of the cells that help encode our multifaceted homes in the human brain.

In the early 1970s, neuroscientist John O’Keefe of University College London and his colleagues began to uncover the brain mechanisms responsible for navigating space. They monitored the electrical activity of neurons within a part of the rat brain called the hippocampus. As the animals moved around an enclosure with electrodes implanted in their hippocampus, specific neurons fired in response to particular locations. These neurons, which came to be known as place cells, each had a unique “place field” where it fired: For example, neuron A might be active when the rat was in the far right corner, near the edge of the enclosure, while neuron B fired when the rat was in the opposite corner.

Since then, further experiments have shown that the hippocampus contains at least two other types of brain cells involved in navigation. Grid cells fire periodically as an animal traverses a space, and head direction cells fire when the animal faces a certain direction. Together, place cells, grid cells, and head direction cells form the brain’s GPS, mapping the space around an animal and its location within it.
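
To make the “brain’s GPS” idea concrete, here is a minimal, idealised sketch of a place cell as it is often modelled in computational work: a neuron whose firing rate peaks at the centre of its place field and falls off with distance. This is a textbook simplification for illustration, not the recording analysis used in the studies described here, and the field centres and rates are invented.

```python
# Minimal, idealised model of a hippocampal place cell: firing rate is a
# Gaussian bump centred on the cell's preferred location ("place field").

import math

def place_cell_rate(position, field_centre, peak_rate_hz=20.0, field_width=0.15):
    """Firing rate (Hz) of a simulated place cell for a 2-D position.
    Positions are in metres within the enclosure; field_width sets how
    quickly firing falls off away from the field centre."""
    dx = position[0] - field_centre[0]
    dy = position[1] - field_centre[1]
    distance_sq = dx * dx + dy * dy
    return peak_rate_hz * math.exp(-distance_sq / (2 * field_width ** 2))

# Cell A prefers the far-right corner, cell B the opposite corner,
# echoing the example in the passage above.
cell_a_centre = (0.9, 0.9)
cell_b_centre = (0.1, 0.1)

for spot in [(0.9, 0.9), (0.5, 0.5), (0.1, 0.1)]:
    a = place_cell_rate(spot, cell_a_centre)
    b = place_cell_rate(spot, cell_b_centre)
    print(f"rat at {spot}: cell A fires {a:5.1f} Hz, cell B fires {b:5.1f} Hz")
```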

Neuroscientists assumed that these three types of cells in the hippocampus are how we humans, too, navigate our surroundings. But solid evidence of these cell types came only recently, when a research team implanted electrodes into the brains of epilepsy patients being evaluated before surgery. They measured the activity of neurons in the hippocampus while the patients navigated a computer-generated environment, and found that some of the cells fired at regular intervals, as grid cells in rodents did. The authors of the study, published last August, conclude that the mechanisms of spatial navigation in mice and humans are likely the same. [Continue reading...]

A theory of how networks become conscious

Wired: It’s a question that’s perplexed philosophers for centuries and scientists for decades: Where does consciousness come from? We know it exists, at least in ourselves. But how it arises from chemistry and electricity in our brains is an unsolved mystery.

Neuroscientist Christof Koch, chief scientific officer at the Allen Institute for Brain Science, thinks he might know the answer. According to Koch, consciousness arises within any sufficiently complex, information-processing system. All animals, from humans on down to earthworms, are conscious; even the internet could be. That’s just the way the universe works.

“The electric charge of an electron doesn’t arise out of more elemental properties. It simply has a charge,” says Koch. “Likewise, I argue that we live in a universe of space, time, mass, energy, and consciousness arising out of complex systems.”

What Koch proposes is a scientifically refined version of an ancient philosophical doctrine called panpsychism — and, coming from someone else, it might sound more like spirituality than science. But Koch has devoted the last three decades to studying the neurological basis of consciousness. His work at the Allen Institute now puts him at the forefront of the BRAIN Initiative, the massive new effort to understand how brains work, which will begin next year.

Koch’s insights have been detailed in dozens of scientific articles and a series of books, including last year’s Consciousness: Confessions of a Romantic Reductionist. WIRED talked to Koch about his understanding of this age-old question. [Continue reading...]

Sleep cleanses the brain

The Washington Post reports: While we are asleep, our bodies may be resting, but our brains are busy taking out the trash.

A new study has found that the cleanup system in the brain, responsible for flushing out toxic waste products that cells produce with daily use, goes into overdrive in mice that are asleep. The cells even shrink in size to make for easier cleaning of the spaces around them.

Scientists say this nightly self-clean by the brain provides a compelling biological reason for the restorative power of sleep.

“Sleep puts the brain in another state where we clean out all the byproducts of activity during the daytime,” said study author and University of Rochester neurosurgeon Maiken Nedergaard. Those byproducts include beta-amyloid protein, clumps of which form plaques found in the brains of Alzheimer’s patients.

Staying up all night could prevent the brain from getting rid of these toxins as efficiently, and explain why sleep deprivation has such strong and immediate consequences. Too little sleep causes mental fog, crankiness, and increased risks of migraine and seizure. Rats deprived of all sleep die within weeks.

Although as essential and universal to the animal kingdom as air and water, sleep is a riddle that has baffled scientists and philosophers for centuries. Drifting off into a reduced consciousness seems evolutionarily foolish, particularly for those creatures in danger of getting eaten or attacked. [Continue reading...]

Henry Gustave Molaison — the man who forgot everything

Steven Shapin writes: In the movie “Groundhog Day,” the TV weatherman Phil Connors finds himself living the same day again and again. This has its advantages, as he has hundreds of chances to get things right. He can learn to speak French, to sculpt ice, to play jazz piano, and to become the kind of person with whom his beautiful colleague Rita might fall in love. But it’s a torment, too. An awful solitude flows from the fact that he’s the only one in Punxsutawney, Pennsylvania, who knows that something has gone terribly wrong with time. Nobody else seems to have any memory of all the previous iterations of the day. What is a new day for Rita is another of the same for Phil. Their realities are different—what passes between them in Phil’s world leaves no trace in hers—as are their senses of selfhood: Phil knows Rita as she cannot know him, because he knows her day after day after day, while she knows him only today. Time, reality, and identity are each curated by memory, but Phil’s and Rita’s memories work differently. From Phil’s point of view, she, and everyone else in Punxsutawney, is suffering from amnesia.

Amnesia comes in distinct varieties. In “retrograde amnesia,” a movie staple, victims are unable to retrieve some or all of their past knowledge — Who am I? Why does this woman say that she’s my wife? — but they can accumulate memories for everything that they experience after the onset of the condition. In the less cinematically attractive “anterograde amnesia,” memory of the past is more or less intact, but those who suffer from it can’t lay down new memories; every person encountered every day is met for the first time. In extremely unfortunate cases, retrograde and anterograde amnesia can occur in the same individual, who is then said to suffer from “transient global amnesia,” a condition that is, thankfully, temporary. Amnesias vary in their duration, scope, and originating events: brain injury, stroke, tumors, epilepsy, electroconvulsive therapy, and psychological trauma are common causes, while drug and alcohol use, malnutrition, and chemotherapy may play a part.

There isn’t a lot that modern medicine can do for amnesiacs. If cerebral bleeding or clots are involved, these may be treated, and occupational and cognitive therapy can help in some cases. Usually, either the condition goes away or amnesiacs learn to live with it as best they can — unless the notion of learning is itself compromised, along with what it means to have a life. Then, a few select amnesiacs disappear from systems of medical treatment and reappear as star players in neuroscience and cognitive psychology.

No star ever shone more brightly in these areas than Henry Gustave Molaison, a patient who, for more than half a century, until his death, in 2008, was known only as H.M., and who is now the subject of a book, “Permanent Present Tense” (Basic), by Suzanne Corkin, the neuroscientist most intimately involved in his case. [Continue reading...]

Can we get our heads around consciousness?

Michael Hanlon writes: The question of how the brain produces the feeling of subjective experience, the so-called ‘hard problem’, is a conundrum so intractable that one scientist I know refuses even to discuss it at the dinner table. Another, the British psychologist Stuart Sutherland, declared in 1989 that ‘nothing worth reading has been written on it’. For long periods, it is as if science gives up on the subject in disgust. But the hard problem is back in the news, and a growing number of scientists believe that they have consciousness, if not licked, then at least in their sights.

A triple barrage of neuroscientific, computational and evolutionary artillery promises to reduce the hard problem to a pile of rubble. Today’s consciousness jockeys talk of p‑zombies and Global Workspace Theory, mirror neurones, ego tunnels, and attention schemata. They bow before that deus ex machina of brain science, the functional magnetic resonance imaging (fMRI) machine. Their work is frequently very impressive and it explains a lot. All the same, it is reasonable to doubt whether it can ever hope to land a blow on the hard problem.

For example, fMRI scanners have shown how people’s brains ‘light up’ when they read certain words or see certain pictures. Scientists in California and elsewhere have used clever algorithms to interpret these brain patterns and recover information about the original stimulus — even to the point of being able to reconstruct pictures that the test subject was looking at. This ‘electronic telepathy’ has been hailed as the ultimate death of privacy (which it might be) and as a window on the conscious mind (which it is not).
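
As a rough illustration of what such decoding involves (and not the California groups’ actual pipeline, which relies on far richer voxel-wise encoding models and Bayesian reconstruction), here is a toy sketch: synthetic “voxel” patterns for two picture categories and a nearest-centroid decoder that guesses which category produced a new pattern. All names and numbers are invented.

```python
# Toy "brain decoding" sketch on synthetic data: two stimulus categories
# evoke different voxel patterns; a nearest-centroid rule decodes new trials.

import numpy as np

rng = np.random.default_rng(42)
n_voxels = 50

# Hypothetical "signature" patterns evoked by two kinds of pictures.
face_signature = rng.normal(0.0, 1.0, n_voxels)
house_signature = rng.normal(0.0, 1.0, n_voxels)

def simulate_trial(signature, noise=1.0):
    """One noisy response pattern for a stimulus with the given signature."""
    return signature + rng.normal(0.0, noise, n_voxels)

# "Training" data: average several noisy trials per category.
face_centroid = np.mean([simulate_trial(face_signature) for _ in range(20)], axis=0)
house_centroid = np.mean([simulate_trial(house_signature) for _ in range(20)], axis=0)

def decode(pattern):
    """Guess which stimulus produced `pattern` by nearest centroid."""
    d_face = np.linalg.norm(pattern - face_centroid)
    d_house = np.linalg.norm(pattern - house_centroid)
    return "face" if d_face < d_house else "house"

# Test on new trials the decoder has never seen.
correct = sum(decode(simulate_trial(face_signature)) == "face" for _ in range(100))
correct += sum(decode(simulate_trial(house_signature)) == "house" for _ in range(100))
print(f"decoded {correct}/200 new trials correctly")
```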

The problem is that, even if we know what someone is thinking about, or what they are likely to do, we still don’t know what it’s like to be that person. Hemodynamic changes in your prefrontal cortex might tell me that you are looking at a painting of sunflowers, but then, if I thwacked your shin with a hammer, your screams would tell me you were in pain. Neither lets me know what pain or sunflowers feel like for you, or how those feelings come about. In fact, they don’t even tell us whether you really have feelings at all. One can imagine a creature behaving exactly like a human — walking, talking, running away from danger, mating and telling jokes — with absolutely no internal mental life. Such a creature would be, in the philosophical jargon, a zombie. (Zombies, in their various incarnations, feature a great deal in consciousness arguments.)

Why might an animal need to have experiences (‘qualia’, as they are called by some) rather than merely responses? In this magazine, the American psychologist David Barash summarised some of the current theories. One possibility, he says, is that consciousness evolved to let us overcome the ‘tyranny of pain’. Primitive organisms might be slaves to their immediate wants, but humans have the capacity to reflect on the significance of their sensations, and therefore to make their decisions with a degree of circumspection. This is all very well, except that there is presumably no pain in the non-conscious world to start with, so it is hard to see how the need to avoid it could have propelled consciousness into existence.

Despite such obstacles, the idea is taking root that consciousness isn’t really mysterious at all; complicated, yes, and far from fully understood, but in the end just another biological process that, with a bit more prodding and poking, will soon go the way of DNA, evolution, the circulation of blood, and the biochemistry of photosynthesis. [Continue reading...]

How consciousness works

Michael Graziano writes: Scientific talks can get a little dry, so I try to mix it up. I take out my giant hairy orangutan puppet, do some ventriloquism and quickly become entangled in an argument. I’ll be explaining my theory about how the brain — a biological machine — generates consciousness. Kevin, the orangutan, starts heckling me. ‘Yeah, well, I don’t have a brain. But I’m still conscious. What does that do to your theory?’

Kevin is the perfect introduction. Intellectually, nobody is fooled: we all know that there’s nothing inside. But everyone in the audience experiences an illusion of sentience emanating from his hairy head. The effect is automatic: being social animals, we project awareness onto the puppet. Indeed, part of the fun of ventriloquism is experiencing the illusion while knowing, on an intellectual level, that it isn’t real.

Many thinkers have approached consciousness from a first-person vantage point, the kind of philosophical perspective according to which other people’s minds seem essentially unknowable. And yet, as Kevin shows, we spend a lot of mental energy attributing consciousness to other things. We can’t help it, and the fact that we can’t help it ought to tell us something about what consciousness is and what it might be used for. If we evolved to recognise it in others – and to mistakenly attribute it to puppets, characters in stories, and cartoons on a screen — then, despite appearances, it really can’t be sealed up within the privacy of our own heads.

Lately, the problem of consciousness has begun to catch on in neuroscience. How does a brain generate consciousness? In the computer age, it is not hard to imagine how a computing machine might construct, store and spit out the information that ‘I am alive, I am a person, I have memories, the wind is cold, the grass is green,’ and so on. But how does a brain become aware of those propositions? The philosopher David Chalmers has claimed that the first question, how a brain computes information about itself and the surrounding world, is the ‘easy’ problem of consciousness. The second question, how a brain becomes aware of all that computed stuff, is the ‘hard’ problem.

I believe that the easy and the hard problems have gotten switched around. The sheer scale and complexity of the brain’s vast computations makes the easy problem monumentally hard to figure out. How the brain attributes the property of awareness to itself is, by contrast, much easier. If nothing else, it would appear to be a more limited set of computations. In my laboratory at Princeton University, we are working on a specific theory of awareness and its basis in the brain. Our theory explains both the apparent awareness that we can attribute to Kevin and the direct, first-person perspective that we have on our own experience. And the easiest way to introduce it is to travel about half a billion years back in time. [Continue reading...]

Surge of brain activity may explain near-death experience, study says

The Washington Post reports: It’s called a near-death experience, but the emphasis is on “near.” The heart stops, you feel yourself float up and out of your body. You glide toward the entrance of a tunnel, and a searing bright light envelops your field of vision.

It could be the afterlife, as many people who have come close to dying have asserted. But a new study says it might well be a show created by the brain, which is still very much alive. When the heart stops, neurons in the brain appear to communicate at an even higher level than normal, perhaps setting off the last picture show, packed with special effects.

“A lot of people believed that what they saw was heaven,” said lead researcher and neurologist Jimo Borjigin. “Science hadn’t given them a convincing alternative.”

Scientists from the University of Michigan recorded electroencephalogram (EEG) signals in nine anesthetized rats after inducing cardiac arrest. Within the first 30 seconds after the heart had stopped, all the mammals displayed a surge of highly synchronized brain activity that had features associated with consciousness and visual activation. The burst of electrical patterns even exceeded levels seen during a normal, awake state.

In other words, they may have been having the rodent version of a near-death experience. [Continue reading...]
