In order for our minds to go beyond syntax to semantics, we need feelings


Stephen T Asma writes: After you spend time with wild animals in the primal ecosystem where our big brains first grew, you have to chuckle a bit at the reigning view of the mind as a computer. Most cognitive scientists, from the logician Alan Turing to the psychologist James Lloyd McClelland, have been narrowly focused on linguistic thought, ignoring the whole embodied organism. They see the mind as a binary system of Boolean algebra: 1 or 0, ‘on’ or ‘off’. This has been methodologically useful, and certainly productive for the artificial intelligence we use in our digital technology, but it merely mimics the biological mind. Computer ‘intelligence’ might be impressive, but it is an impersonation of biological intelligence. The ‘wet’ biological mind is embodied in the squishy, organic machinery of our emotional systems — where action-patterns are triggered when chemical cascades cross volumetric tipping points.

Neuroscience has begun to correct the computational model by showing how our rational, linguistic mind depends on the ancient limbic brain, where emotions hold sway and social skills dominate. In fact, the cognitive mind works only when emotions preferentially tilt our deliberations. The neuroscientist Antonio Damasio worked with patients who had damage in the communication system between the cognitive and emotional brain. The subjects could compute all the informational aspects of a decision in detail, but they couldn’t actually commit to anything. Without clear limbic values (that is, feelings), Damasio’s patients couldn’t decide their own social calendars, prioritise jobs at work, or even make decisions in their own best interest. Our rational mind is truly embodied, and without this emotional embodiment we have no preferences. In order for our minds to go beyond syntax to semantics, we need feelings. And our ancestral minds were rich in feelings before they were adept in computations.

Our neocortex mushroomed to its current size less than one million years ago. That’s a very recent development when we remember that the human clade or group broke off from the great apes in Africa 7 million years ago. That future-looking, tool-wielding, symbol-juggling cortex grew on top of the limbic system. Older still is the reptile brain — the storehouse of innate motivational instincts such as pain-avoidance, exploration, hunger, lust, aggression and so on. Walking around (very carefully) on the Serengeti is like visiting the nursery of our own mind. [Continue reading…]


The microbes that make us who we are


Most people, however strongly they might hold to what they regard as a scientific view of life — that we are biological organisms, products of evolution, not destined for a supernatural afterlife — nevertheless most likely have a sense of identity that does not easily accommodate the idea that our thoughts and feelings are influenced by bacteria. Indeed, such an idea might sound delusional.

Yet this is what is increasingly clearly understood: that the body is not the abode of an elusive self; that human experience cannot be reduced to the aggregation of cascades of action potentials producing a neural symphony; and that this seemingly unitary being is in fact a community in which what we are and what lives inside our body cannot be separated.

Science magazine reports: The 22 men took the same pill for four weeks. When interviewed, they said they felt less daily stress and their memories were sharper. The brain benefits were subtle, but the results, reported at last year’s annual meeting of the Society for Neuroscience, got attention. That’s because the pills were not a precise chemical formula synthesized by the pharmaceutical industry.

The capsules were brimming with bacteria.

In the ultimate PR turnaround, once-dreaded bacteria are being welcomed as health heroes. People gobble them up in probiotic yogurts, swallow pills packed with billions of bugs and recoil from hand sanitizers. Helping us nurture the microbial gardens in and on our bodies has become big business, judging by grocery store shelves.

These bacteria are possibly working at more than just keeping our bodies healthy: They may be changing our minds. Recent studies have begun turning up tantalizing hints about how the bacteria living in the gut can alter the way the brain works. These findings raise a question with profound implications for mental health: Can we soothe our brains by cultivating our bacteria?

By tinkering with the gut’s bacterial residents, scientists have changed the behavior of lab animals and small numbers of people. Microbial meddling has turned anxious mice bold and shy mice social. Rats inoculated with bacteria from depressed people develop signs of depression themselves. And small studies of people suggest that eating specific kinds of bacteria may change brain activity and ease anxiety. Because gut bacteria can make the very chemicals that brain cells use to communicate, the idea makes a certain amount of sense.

Though preliminary, such results suggest that the right bacteria in your gut could brighten mood and perhaps even combat pernicious mental disorders including anxiety and depression. The wrong microbes, however, might lead in a darker direction.

This perspective might sound a little too much like our minds are being controlled by our bacterial overlords. But consider this: Microbes have been with us since even before we were humans. Human and bacterial cells evolved together, like a pair of entwined trees, growing and adapting into a (mostly) harmonious ecosystem.

Our microbes (known collectively as the microbiome) are “so innate in who we are,” says gastroenterologist Kirsten Tillisch of UCLA. It’s easy to imagine that “they’re controlling us, or we’re controlling them.” But it’s becoming increasingly clear that no one is in charge. Instead, “it’s a conversation that our bodies are having with our microbiome,” Tillisch says. [Continue reading…]


Your brain’s music circuit has been discovered


Daniel A Gross writes: Before Josh McDermott was a neuroscientist, he was a club DJ in Boston and Minneapolis. He saw first-hand how music could unite people in sound, rhythm, and emotion. “One of the reasons it was so fun to DJ is that, by playing different pieces of music, you can transform the vibe in a roomful of people,” he says.

With his club days behind him, McDermott now ventures into the effects of sound and music in his lab at the Massachusetts Institute of Technology, where he is an assistant professor in the Department of Brain and Cognitive Sciences. In 2015, he and a post-doctoral colleague, Sam Norman-Haignere, and Nancy Kanwisher, a professor of cognitive neuroscience at MIT, made news by locating a neural pathway activated by music and music alone. McDermott and his colleagues played a total of 165 commonly heard natural sounds to ten subjects willing to be rolled into an fMRI machine to listen to the piped-in sounds. The sounds included a man speaking, a songbird, a car horn, a flushing toilet, and a dog barking. None sparked the same population of neurons as music.

Their discovery that certain neurons have “music selectivity” stirs questions about the role of music in human life. Why do our brains contain music-selective neurons? Could some evolutionary purpose have led to neurons devoted to music? McDermott says the study can’t answer such questions. But he is excited by the fact that it shows music has a unique biological effect. “We presume those neurons are doing something in relation to the analysis of music that allows you to extract structure, following melodies or rhythms, or maybe extract emotion,” he says. [Continue reading…]


New clues on how the brain tracks time


Emily Singer writes: Our brains have an extraordinary ability to monitor time. A driver can judge just how much time is left to run a yellow light; a dancer can keep a beat down to the millisecond. But exactly how the brain tracks time is still a mystery. Researchers have defined the brain areas involved in movement, memory, color vision and other functions, but not the ones that monitor time. Indeed, our neural timekeeper has proved so elusive that most scientists assume this mechanism is distributed throughout the brain, with different regions using different monitors to keep track of time according to their needs.

Over the last few years, a handful of researchers have compiled growing evidence that the same cells that monitor an individual’s location in space also mark the passage of time. This suggests that two brain regions — the hippocampus and the entorhinal cortex, both famous for their role in memory and navigation — can also act as a sort of timer.

In research published in November, Howard Eichenbaum, a neuroscientist at Boston University, and collaborators showed that cells in rats that form the brain’s internal GPS system, known as grid cells, are more malleable than had been anticipated. Typically these cells act like a dead-reckoning system, with certain neurons firing when an animal is in a specific place. (The researchers who discovered this shared the Nobel Prize in 2014.) Eichenbaum found that when an animal is kept in place — such as when it runs on a treadmill — the cells keep track of both distance and time. The work suggests that the brain’s sense of space and time are intertwined. [Continue reading…]


Stories about our deepest values activate brain region once thought to be its autopilot

University of Southern California: Everyone has at least a few non-negotiable values. These are the things that, no matter what the circumstance, you’d never compromise for any reason – such as “I’d never hurt a child,” or “I’m against the death penalty.”

Real-time brain scans show that when people read stories that deal with these core, protected values, the “default mode network” in their brains activates.

This network was once thought of as just the brain’s autopilot, since it has been shown to be active when you’re not engaged by anything in the outside world – but studies like this one suggest that it’s actually working to find meaning in the narratives.

“The brain is devoting a huge amount of energy to whatever that network is doing. We need to understand why,” said Jonas Kaplan of the USC Dornsife Brain and Creativity Institute. Kaplan was the lead author of the study, which was published on Jan. 7 in the journal Cerebral Cortex.

Kaplan thinks that it’s not just that the brain is presented with a moral quandary, but rather that the quandary is presented in a narrative format.

“Stories help us to organize information in a unique way,” he said. [Read more…]


Babies have more acute powers of perception than adults

Science News reports: Six-month-old babies can spot subtle differences between two monkey faces as easy as pie. But 9-month-olds — and adults — are blind to the differences. In a 2002 study of facial recognition, scientists pitted 30 6-month-old babies against 30 9-month-olds and 11 adults. First, the groups got familiar with a series of monkey and human faces that flashed on a screen. Then new faces showed up, interspersed with already familiar faces. The idea was that the babies would spend more time looking at new faces than ones they had already seen.

When viewing human faces, all of the observers, babies and adults alike, did indeed spend more time looking at the new people, showing that they could easily pick out familiar human faces. But when it came to recognizing monkey faces, the youngsters blew the competition out of the water. Six-month-old babies recognized familiar monkey faces and stared at the newcomers longer. But both adults and 9-month-old babies were flummoxed, and looked at the new and familiar monkey faces for about the same amount of time.
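The looking-time logic behind this paradigm can be sketched as a simple preference score: the fraction of total looking time spent on the novel face, where values reliably above 0.5 indicate discrimination. This is only an illustrative formula — the `novelty_preference` function and the millisecond figures below are hypothetical, not taken from the study:

```python
def novelty_preference(novel_ms, familiar_ms):
    """Fraction of total looking time spent on the novel face.

    Scores reliably above 0.5 suggest the viewer can tell the
    familiar face from the new one; scores near 0.5 suggest the
    two faces look the same to them. (Illustrative scoring only;
    real studies use more careful statistics.)
    """
    total = novel_ms + familiar_ms
    if total == 0:
        raise ValueError("no looking time recorded")
    return novel_ms / total

# Hypothetical numbers: a 6-month-old discriminating monkey faces ...
print(novelty_preference(novel_ms=3200, familiar_ms=1800))  # 0.64
# ... versus an adult at chance on the same faces.
print(novelty_preference(novel_ms=2500, familiar_ms=2500))  # 0.5
```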

Superior visual skills don’t apply to just faces, either. Three- to 4-month-old babies can see differences in lighting that are undetectable to adults. This ephemeral superskill evaporates just months later, scientists reported in December in Current Biology. To test babies’ visual acuity, researchers led by Jiale Yang of Chuo University in Tokyo first generated a series of 3-D pictures of snails. The shiny snails were made to look as though light was hitting them from different places. Like adults, 5- to 6-month-old babies couldn’t spot the lighting differences. But younger babies could, the team found. [Continue reading…]


Old and new: How the brain evokes a sense of familiarity

Science News reports: It’s happened to all of us at one time or another: You’re walking through a crowd, and suddenly a face seems incredibly familiar — so much so that you do a double-take. Who is that? How do you know them? You have no idea, but something about their face nags at you. You know you’ve seen it before.

The reason you know that face is in part because of your perirhinal cortex. This is an area of the brain that helps us to determine familiarity, or whether we have seen an object before. A new study of brain cells in this area finds that firing these neurons at one frequency makes the brain treat novel images as old hat. But firing these same neurons at another frequency can make the old new again.

“Novelty and familiarity are both really important,” says study coauthor Rebecca Burwell, a neuroscientist at Brown University in Providence, R.I. “They are important for learning and memory and decision making.” Finding a cache of food and knowing it is new could be useful for an animal’s future. So is recognizing a familiar place where the pickings were good in the past.

But knowing that something is familiar is not quite the same thing as knowing what that thing is. “You’re in a crowd and you see a familiar face, and there’s a feeling,” Burwell explains. “You can’t identify them, you don’t know where you met them, but there’s a sense of familiarity.” It’s different from recalling where you met the person, or even who the person is. This is a sense at the base of memory. And while scientists knew the perirhinal cortex was involved in this sense of familiarity, how that feeling of new or old was coded in the brain wasn’t fully understood. [Continue reading…]


When you develop the habit of distraction, it becomes harder and harder to think deeply

Huffington Post spoke to Nicholas Carr who five years ago wrote in his book, The Shallows: How The Internet Is Changing Our Brains, about the way technology seemed to be eroding our ability to concentrate:

Are you optimistic about any of the ways we currently seem to be adapting [to the constant flow of information through digital devices]?

No. It’s the ease with which we adapt that makes me most nervous. It doesn’t take long for someone to get used to glancing at their smartphone 200 times a day. We’re creatures of habit mentally and physically.

When you develop that habit of distraction, it becomes harder and harder to back away and engage our minds in deeper modes of thinking.

Is there anything we can do to keep our mental faculties intact, or is it pretty much hopeless at this point?

Well, you can use the technology less and set aside your phone and spend a good part of your day trying to maintain your focus and not be interrupted. The good thing about that — because of the plasticity of our brains — is that if you change your habits, your brain is happy to go along with whatever you do.

What makes me more pessimistic is that we’re kind of building our personalities and our entire societies around this new set of norms and expectations that says you need to be constantly connected. As long as we continue going down that path it’s going to be ever harder for us to buck the status quo. [Continue reading…]


The neurological complexity of a mouse’s brain


Oliver Sacks, casting light on the interconnectedness of life

Michiko Kakutani writes: It’s no coincidence that so many of the qualities that made Oliver Sacks such a brilliant writer are the same qualities that made him an ideal doctor: keen powers of observation and a devotion to detail, deep reservoirs of sympathy, and an intuitive understanding of the fathomless mysteries of the human brain and the intricate connections between the body and the mind.

Dr. Sacks, who died on Sunday at 82, was a polymath and an ardent humanist, and whether he was writing about his patients, or his love of chemistry or the power of music, he leapfrogged among disciplines, shedding light on the strange and wonderful interconnectedness of life — the connections between science and art, physiology and psychology, the beauty and economy of the natural world and the magic of the human imagination.

In his writings, as he once said of his mentor, the great Soviet neuropsychologist and author A. R. Luria, “science became poetry.” [Continue reading…]


Is mirror-touch synesthesia a superpower or a curse?

Erika Hayasaki writes: No one, it seemed, knew what the patient clutching the stuffed blue bunny was feeling. At 33, he looked like a bewildered boy, staring at the doctors who crowded into his room in Massachusetts General Hospital. Lumpy oyster-sized growths shrouded his face, the result of a genetic condition that causes benign tumors to develop on the skin, in the brain, and on organs, hindering the patient’s ability to walk, talk, and feel normally. He looked like he was grimacing in pain, but his mother explained that her son, Josh, did not have a clear threshold for pain or other sensations. If Josh felt any discomfort at all, he was nearly incapable of expressing it.

“Any numbness?” asked Joel Salinas, a soft-spoken doctor in the Harvard Neurology Residency Program, a red-tipped reflex hammer in his doctor’s coat pocket. “Like it feels funny?” Josh did not answer. Salinas pulled up a blanket, revealing Josh’s atrophied legs. He thumped Josh’s left leg with the reflex hammer. Again, Josh barely reacted. But Salinas felt something: The thump against Josh’s left knee registered on Salinas’s own left knee as a tingly tap. Not just a thought of what the thump might feel like, but a distinct physical sensation.

That’s because Salinas himself has a rare medical condition, one that stands in marked contrast to his patients’: While Josh appeared unresponsive even to his own sensations, Salinas is peculiarly attuned to the sensations of others. If he sees someone slapped across the cheek, Salinas feels a hint of the slap against his own cheek. A pinch on a stranger’s right arm might become a tickle on his own. “If a person is touched, I feel it, and then I recognize that it’s touch,” Salinas says.

The condition is called mirror-touch synesthesia, and it has aroused significant interest among neuroscientists in recent years because it appears to be an extreme form of a basic human trait. In all of us, mirror neurons in the premotor cortex and other areas of the brain activate when we watch someone else’s behaviors and actions. Our brains map the regions of the body where we see someone else caressed, jabbed, or whacked, and they mimic just a shade of that feeling on the same spots on our own bodies. For mirror-touch synesthetes like Salinas, that mental simulacrum is so strong that it crosses a threshold into near-tactile sensation, sometimes indistinguishable from one’s own. Neuroscientists regard the condition as a state of “heightened empathic ability.” [Continue reading…]


On the value of not knowing everything

James McWilliams writes: In January 2010, while driving from Chicago to Minneapolis, Sam McNerney played an audiobook and had an epiphany. The book was Jonah Lehrer’s How We Decide, and the epiphany was that consciousness could reside in the brain. The quest for an empirical understanding of consciousness has long preoccupied neurobiologists. But McNerney was no neurobiologist. He was a twenty-year-old philosophy major at Hamilton College. The standard course work — ancient, modern, and contemporary philosophy — enthralled him. But after this drive, after he listened to Lehrer, something changed. “I had to rethink everything I knew about everything,” McNerney said.

Lehrer’s publisher later withdrew How We Decide for inaccuracies. But McNerney was mentally galvanized for good reason. He had stumbled upon what philosophers call the “Hard Problem” — the quest to understand the enigma of the gap between mind and body. Intellectually speaking, what McNerney experienced was like diving for a penny in a pool and coming up with a gold nugget.

The philosopher Thomas Nagel drew popular attention to the Hard Problem four decades ago in an influential essay titled “What Is It Like to Be a Bat?” Frustrated with the “recent wave of reductionist euphoria,” Nagel challenged the reductive conception of mind — the idea that consciousness resides as a physical reality in the brain — by highlighting the radical subjectivity of experience. His main premise was that “an organism has conscious mental states if and only if there is something that it is like to be that organism.”

If that idea seems elusive, consider it this way: A bat has consciousness only if there is something that it is like for that bat to be a bat. Sam has consciousness only if there is something it is like for Sam to be Sam. You have consciousness only if there is something that it is like for you to be you (and you know that there is). And here’s the key to all this: Whatever that “like” happens to be, according to Nagel, it necessarily defies empirical verification. You can’t put your finger on it. It resists physical accountability.

McNerney returned to Hamilton intellectually turbocharged. This was an idea worth pondering. “It took hold of me,” he said. “It chose me — I know you hear that a lot, but that’s how it felt.” He arranged to do research in cognitive science as an independent study project with Russell Marcus, a trusted professor. Marcus let him loose to write what McNerney calls “a seventy-page hodgepodge of psychological research and philosophy and everything in between.” Marcus remembered the project more charitably, as “a huge, ambitious, wide-ranging, smart, and engaging paper.” Once McNerney settled into his research, Marcus added, “it was like he had gone into a phone booth and come out as a super-student.”

When he graduated in 2011, McNerney was proud. “I pulled it off,” he said about earning a degree in philosophy. Not that he had any hard answers to any big problems, much less the Hard Problem. Not that he had a job. All he knew was that he “wanted to become the best writer and thinker I could be.”

So, as one does, he moved to New York City.

McNerney is the kind of young scholar adored by the humanities. He’s inquisitive, open-minded, thrilled by the world of ideas, and touched with a tinge of old-school transcendentalism. What Emerson said of Thoreau — “he declined to give up his large ambition of knowledge and action for any narrow craft or profession” — is certainly true of McNerney. [Continue reading…]


The neuroscience of a sense of place

Rick Paulas writes: Comedian Eddie Pepitone once said — and I’m paraphrasing here — that there are no great neighborhoods in Los Angeles, only great blocks. The stretch of Echo Park on Sunset Boulevard between Glendale and Logan is one. The establishments on that short stretch include an upscale wine bar, a hipster concert venue, a vegan restaurant, a deep dish pizza place, cheap thrift stores, not-so-cheap “vintage” stores selling roughly the same stuff, a check-cashing joint, a few fast food chains, and even a supermarket for time travelers.

While it’s not the most diverse cross-section you’ll find in the city, the block can be used as a social barometer when brought up in conversations. Mention the stretch, and whatever landmark the other person’s familiar with tells the tale of the socioeconomic sphere they inhabit; the landmark that puts a gleam of recognition in the other person’s eye says everything about their story.

Blocks and neighborhoods aren’t concrete concepts that mean the same thing to everyone, unlike, say, “apple” or “sky.” Points of reference shift depending on the person using them, so blocks and neighborhoods are more like alternate realities laid atop one another, like plastic sheets on an overhead projector. There’s even a phrase for the study of this murky concept: mental maps. They can help us understand why some neighborhoods thrive, others die, and how changes are made.

The theory of mental (or cognitive) maps was first developed in 1960 by Massachusetts Institute of Technology professor Kevin Lynch in his book The Image of the City. Rather than relying on how cartographers saw a city, Lynch asked residents to draw a map, from memory, depicting how their city was arranged. He found that five elements compose a person’s understanding of where they are: landmarks, paths, edges, districts, and nodes. Landmarks are reference points, paths connect them, edges mark boundaries, and the other elements define larger areas that contain some combination of each of those designations.

Neuroscience backs up Lynch’s findings. In 1971, John O’Keefe discovered “place cells” in the hippocampus, neurons that activate when an animal enters a particular place in an environment. The neurons calculate a current location based on what the animal can see, as well as through “dead reckoning” — that is, accounting based on subconscious calculations using previous positions in the recent past and how quickly it traveled over a stretch of time. In 2005, husband-and-wife team Edvard and May-Britt Moser discovered “grid cells,” neurons that fire in a grid-like pattern to measure distances and direction. O’Keefe and the Mosers shared the Nobel Prize in 2014 for their discoveries. [Continue reading…]
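Dead reckoning — updating an estimated position from nothing but previous position, heading, and speed over time — can be sketched in a few lines. This is a toy geometric illustration of the idea, not a model of what place or grid cells actually compute:

```python
import math

def dead_reckon(start, moves):
    """Integrate a sequence of (heading_deg, speed, dt) self-motion
    steps into a position estimate, with no external landmarks:
    each step displaces the estimate by speed * dt in the heading
    direction. (A toy sketch of dead reckoning, nothing more.)
    """
    x, y = start
    for heading_deg, speed, dt in moves:
        theta = math.radians(heading_deg)
        x += speed * math.cos(theta) * dt
        y += speed * math.sin(theta) * dt
    return x, y

# Walk east for 2 s at 1 m/s, then north for 3 s at 1 m/s:
print(dead_reckon((0.0, 0.0), [(0, 1.0, 2.0), (90, 1.0, 3.0)]))  # ≈ (2.0, 3.0)
```

Because the estimate is built only from self-motion, small errors in each step compound over time, which is one reason animals also anchor the estimate to what they can see.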


What rats in a maze can teach us about our sense of direction

By Francis Carpenter, UCL and Caswell Barry, UCL

London’s taxi drivers have to pass an exam in which they are asked to name the shortest route between any two places within six miles of Charing Cross – an area with more than 60,000 roads. We know from brain scans that learning “the knowledge” – as the drivers call it – increases the size of their hippocampi, the part of the brain crucial to spatial memory.

Now, new research suggests that bigger hippocampi may not be the only neurological benefit of driving a black cab. While the average person likely has many separate mental maps for different areas of London, the hours cabbies spend navigating may result in the joining of these maps into a single, global map.

[Read more…]


The rhythm of consciousness

Gregory Hickok writes: In 1890, the American psychologist William James famously likened our conscious experience to the flow of a stream. “A ‘river’ or a ‘stream’ are the metaphors by which it is most naturally described,” he wrote. “In talking of it hereafter, let us call it the stream of thought, of consciousness, or of subjective life.”

While there is no disputing the aptness of this metaphor in capturing our subjective experience of the world, recent research has shown that the “stream” of consciousness is, in fact, an illusion. We actually perceive the world in rhythmic pulses rather than as a continuous flow.

Some of the first hints of this new understanding came as early as the 1920s, when physiologists discovered brain waves: rhythmic electrical currents measurable on the surface of the scalp by means of electroencephalography. Subsequent research cataloged a spectrum of such rhythms (alpha waves, delta waves and so on) that correlated with various mental states, such as calm alertness and deep sleep.

Researchers also found that the properties of these rhythms varied with perceptual or cognitive events. The phase and amplitude of your brain waves, for example, might change if you saw or heard something, or if you increased your concentration on something, or if you shifted your attention.

But those early discoveries themselves did not change scientific thinking about the stream-like nature of conscious perception. Instead, brain waves were largely viewed as a tool for indexing mental experience, much like the waves that a ship generates in the water can be used to index the ship’s size and motion (e.g., the bigger the waves, the bigger the ship).

Recently, however, scientists have flipped this thinking on its head. We are exploring the possibility that brain rhythms are not merely a reflection of mental activity but a cause of it, helping shape perception, movement, memory and even consciousness itself. [Continue reading…]


What ants can teach us about the operation of the human brain

Carrie Arnold writes: Deborah Gordon spent the morning of August 27 watching a group of harvester ants foraging for seeds outside the dusty town of Rodeo, N.M. Long before the first rays of sun hit the desert floor, a group of patroller ants was already on the move. Their task was to find out whether the area near the nest was free from flash floods, high winds, and predators. If they didn’t return to the nest, departing foragers would know it wasn’t safe to go search for food.

When the patrollers returned and the first foragers did leave, they scattered in all directions, hunting for the fat-laden, energy-rich seeds on which the colony depends. Other foragers waited in the entrance of the nest for the first wave to return. If lots of food were nearby, foragers would return and depart quickly, creating a massive chain reaction. If food was scarce, however, the second group of foragers might not leave the nest at all.

“It’s a brilliant system. The ants can take advantage of sudden windfalls of food but they don’t waste energy and resources if there’s nothing there,” said Gordon, who is an ecologist at Stanford University.

The behavior of each individual in the group is set by the rate at which it meets other ants and a set of basic rules. Its behavior alters that of its neighbors, which in turn affects the original ant, in a classic example of feedback. The result is astonishing, complex behavior. “Individually, an ant is dumb,” Gordon says. She gazes off into the distance and inhales sharply. “But the colony? That’s where the intelligence is.”

About 110 miles from Gordon’s offices in Palo Alto, Calif., Mark Goldman studies a different kind of complex, emergent behavior. Goldman is a neuroscientist at the University of California, Davis. For most of his life, he was never particularly interested in ants. But when he traveled to Stanford in 2012 to plan some experiments with a colleague who had recently attended one of Gordon’s talks, something clicked.

“As I watched films of these ant colonies, it looked like what was happening at the synapse of neurons. Both of these systems accumulate evidence about their inputs—returning ants or incoming voltage pulses—to make their decisions about whether to generate an output—an outgoing forager or a packet of neurotransmitter,” Goldman said. On his next trip to Stanford, he extended his stay. An unusual research collaboration had begun to coalesce: Ants would be used to study the brain, and the brain, to study ants. [Continue reading…]
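Goldman’s analogy — both systems accumulate evidence about their inputs and emit an output once enough has arrived — is essentially a leaky accumulate-to-threshold scheme. A minimal sketch of that scheme follows; the `gain`, `leak`, and `threshold` values are illustrative, not fitted to either ant colonies or synapses:

```python
def accumulate_to_threshold(events, gain=1.0, leak=0.1, threshold=5.0):
    """Leaky accumulator: each incoming event (a returning forager, or
    an incoming voltage pulse) adds `gain` to an evidence level; the
    level decays by a fraction `leak` each time step; an output (an
    outgoing forager, or a packet of neurotransmitter) is emitted
    whenever the evidence crosses `threshold`. Illustrative only.
    """
    evidence, outputs = 0.0, 0
    for arrivals in events:            # events[t] = arrivals at step t
        evidence += gain * arrivals
        if evidence >= threshold:      # enough evidence: emit an output
            outputs += 1
            evidence -= threshold
        evidence *= (1.0 - leak)       # leak toward zero between steps
    return outputs

# A burst of arrivals triggers an output; a trickle just leaks away:
print(accumulate_to_threshold([3, 3, 3, 0, 0]))   # → 1
print(accumulate_to_threshold([1, 0, 1, 0, 1]))   # → 0
```

The leak is what makes the system ignore sparse, unrewarding input: evidence from a trickle of arrivals decays before it can ever reach threshold, while a burst crosses it quickly.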


A deficit in patience produces the illusion of a shortage of time

Chelsea Wald writes: Not long ago I diagnosed myself with the recently identified condition of sidewalk rage. It’s most pronounced when it comes to a certain friend who is a slow walker. Last month, as we sashayed our way to dinner, I found myself biting my tongue, thinking, I have to stop going places with her if I ever want to … get there!

You too can measure yourself on the “Pedestrian Aggressiveness Syndrome Scale,” a tool developed by University of Hawaii psychologist Leon James. While walking in a crowd, do you find yourself “acting in a hostile manner (staring, presenting a mean face, moving closer or faster than expected)” and “enjoying thoughts of violence?”

Slowness rage is not confined to the sidewalk, of course. Slow drivers, slow Internet, slow grocery lines — they all drive us crazy. Even the opening of this article may be going on a little too long for you. So I’ll get to the point. Slow things drive us crazy because the fast pace of society has warped our sense of timing. Things that our great-great-grandparents would have found miraculously efficient now drive us around the bend. Patience is a virtue that’s been vanquished in the Twitter age.

Once upon a time, cognitive scientists tell us, patience and impatience had an evolutionary purpose. They constituted a yin-and-yang balance, a finely tuned internal timer that told us when we had waited too long for something and should move on. When that timer went buzz, it was time to stop foraging at an unproductive patch or abandon a failing hunt.

“Why are we impatient? It’s a heritage from our evolution,” says Marc Wittmann, a psychologist at the Institute for Frontier Areas of Psychology and Mental Health in Freiburg, Germany. Impatience made sure we didn’t die from spending too long on a single unrewarding activity. It gave us the impulse to act.

But that good thing is gone. The fast pace of society has thrown our internal timer out of balance. It creates expectations that can’t be rewarded fast enough — or rewarded at all. When things move more slowly than we expect, our internal timer even plays tricks on us, stretching out the wait, summoning anger out of proportion to the delay. [Continue reading…]


Your gut tells your mind, more than you may imagine

Charles Schmidt writes: The notion that the state of our gut governs our state of mind dates back more than 100 years. Many 19th- and early 20th-century scientists believed that accumulating wastes in the colon triggered a state of “auto-intoxication,” whereby poisons emanating from the gut produced infections that were in turn linked with depression, anxiety and psychosis. Patients were treated with colonic purges and even bowel surgeries until these practices were dismissed as quackery.

The ongoing exploration of the human microbiome promises to bring the link between the gut and the brain into clearer focus. Scientists are increasingly convinced that the vast assemblage of microfauna in our intestines may have a major impact on our state of mind. The gut-brain axis seems to be bidirectional — the brain acts on gastrointestinal and immune functions that help to shape the gut’s microbial makeup, and gut microbes make neuroactive compounds, including neurotransmitters and metabolites that also act on the brain. These interactions could occur in various ways: microbial compounds communicate via the vagus nerve, which connects the brain and the digestive tract, and microbially derived metabolites interact with the immune system, which maintains its own communication with the brain. Sven Pettersson, a microbiologist at the Karolinska Institute in Stockholm, has recently shown that gut microbes help to control leakage through both the intestinal lining and the blood-brain barrier, which ordinarily protects the brain from potentially harmful agents.

Microbes may have their own evolutionary reasons for communicating with the brain. They need us to be social, says John Cryan, a neuroscientist at University College Cork in Ireland, so that they can spread through the human population. Cryan’s research shows that when bred in sterile conditions, germ-free mice lacking intestinal microbes also lack the ability to recognize other mice with whom they interact. In other studies, disruptions of the microbiome induced behavior in mice that mimics human anxiety, depression and even autism. In some cases, scientists restored more normal behavior by treating their test subjects with certain strains of benign bacteria. Nearly all the data so far are limited to mice, but Cryan believes the findings provide fertile ground for developing analogous compounds, which he calls psychobiotics, for humans. “That dietary treatments could be used as either adjunct or sole therapy for mood disorders is not beyond the realm of possibility,” he says. [Continue reading…]