Category Archives: Consciousness

In order for our minds to go beyond syntax to semantics, we need feelings

Stephen T Asma writes: After you spend time with wild animals in the primal ecosystem where our big brains first grew, you have to chuckle a bit at the reigning view of the mind as a computer. Most cognitive scientists, from the logician Alan Turing to the psychologist James Lloyd McClelland, have been narrowly focused on linguistic thought, ignoring the whole embodied organism. They see the mind as a binary system of Boolean algebra: 1 or 0, ‘on’ or ‘off’. This has been methodologically useful, and certainly productive for the artificial intelligence we use in our digital technology, but it merely mimics the biological mind. Computer ‘intelligence’ might be impressive, but it is an impersonation of biological intelligence. The ‘wet’ biological mind is embodied in the squishy, organic machinery of our emotional systems — where action-patterns are triggered when chemical cascades cross volumetric tipping points.

Neuroscience has begun to correct the computational model by showing how our rational, linguistic mind depends on the ancient limbic brain, where emotions hold sway and social skills dominate. In fact, the cognitive mind works only when emotions preferentially tilt our deliberations. The neuroscientist Antonio Damasio worked with patients who had damage in the communication system between the cognitive and emotional brain. The subjects could compute all the informational aspects of a decision in detail, but they couldn’t actually commit to anything. Without clear limbic values (that is, feelings), Damasio’s patients couldn’t decide their own social calendars, prioritise jobs at work, or even make decisions in their own best interest. Our rational mind is truly embodied, and without this emotional embodiment we have no preferences. In order for our minds to go beyond syntax to semantics, we need feelings. And our ancestral minds were rich in feelings before they were adept in computations.

Our neo-cortex mushroomed to its current size less than one million years ago. That’s a very recent development when we remember that the human clade or group broke off from the great apes in Africa 7 million years ago. That future-looking, tool-wielding, symbol-juggling cortex grew on top of the limbic system. Older still is the reptile brain — the storehouse of innate motivational instincts such as pain-avoidance, exploration, hunger, lust, aggression and so on. Walking around (very carefully) on the Serengeti is like visiting the nursery of our own mind. [Continue reading…]

Crawick Multiverse: A former opencast coal mine transformed into a cosmic landscape

Philip Ball writes: When work began in 2012, the excavations unearthed thousands of boulders half-buried in the ground. [Charles] Jencks used them to create a panorama of standing stones and sculpted tumuli, organised to frame the horizon and the Sun’s movements.

“One theory of pre-history is that stone circles frame the far hills and key points, and while I wanted to capture today’s cosmology, not yesterday’s, I was aware of this long landscape tradition,” Jencks says.

The landscape also explores the idea that our Universe is just one of many.

Over the last decade or so, the argument for a plurality of universes has moved from fringe speculation to seriously entertained possibility. One leading multiverse theory supposes that other universes are continually being spawned in an ongoing process of “eternal inflation” – the same process that caused our own Universe’s Big Bang 13.7 billion years ago.

These are the theories explored on this Scottish hillside. [Continue reading…]

Stories about our deepest values activate brain region once thought to be its autopilot

University of Southern California: Everyone has at least a few non-negotiable values. These are the things that, no matter what the circumstance, you’d never compromise for any reason – such as “I’d never hurt a child,” or “I’m against the death penalty.”

Real-time brain scans show that when people read stories that deal with these core, protected values, the “default mode network” in their brains activates.

This network was once thought of as just the brain’s autopilot, since it has been shown to be active when you’re not engaged by anything in the outside world – but studies like this one suggest that it’s actually working to find meaning in the narratives.

“The brain is devoting a huge amount of energy to whatever that network is doing. We need to understand why,” said Jonas Kaplan of the USC Dornsife Brain and Creativity Institute. Kaplan was the lead author of the study, which was published on Jan. 7 in the journal Cerebral Cortex.

Kaplan thinks that it’s not just that the brain is presented with a moral quandary, but rather that the quandary is presented in a narrative format.

“Stories help us to organize information in a unique way,” he said. [Continue reading…]

Old and new: How the brain evokes a sense of familiarity

Science News reports: It’s happened to all of us at one time or another: You’re walking through a crowd, and suddenly a face seems incredibly familiar — so much so that you do a double-take. Who is that? How do you know them? You have no idea, but something about their face nags at you. You know you’ve seen it before.

The reason you know that face is in part because of your perirhinal cortex. This is an area of the brain that helps us to determine familiarity, or whether we have seen an object before. A new study of brain cells in this area finds that firing these neurons at one frequency makes the brain treat novel images as old hat. But firing these same neurons at another frequency can make the old new again.

“Novelty and familiarity are both really important,” says study coauthor Rebecca Burwell, a neuroscientist at Brown University in Providence, R.I. “They are important for learning and memory and decision making.” Finding a cache of food and knowing it is new could be useful for an animal’s future. So is recognizing a familiar place where the pickings were good in the past.

But knowing that something is familiar is not quite the same thing as knowing what that thing is. “You’re in a crowd and you see a familiar face, and there’s a feeling,” Burwell explains. “You can’t identify them, you don’t know where you met them, but there’s a sense of familiarity.” It’s different from recalling where you met the person, or even who the person is. This is a sense at the base of memory. And while scientists knew the perirhinal cortex was involved in this sense of familiarity, how that feeling of new or old was coded in the brain wasn’t fully understood. [Continue reading…]

Our moral identity makes us who we are

Nina Strohminger writes: The morning after her accident, a woman I’ll call Kate awoke in a daze. She looked at the man next to her in bed. He resembled her husband, with the same coppery beard and freckles dusted across his shoulders. But this man was definitely not her husband.

Panicked, she packed a small bag and headed to her psychiatrist’s office. On the bus, there was a man she had been encountering with increasing frequency over the past several weeks. The man was clever; he was a spy. He always appeared in a different form: one day as a little girl in a sundress, another time as a bike courier who smirked at her knowingly. She explained these bizarre developments to her doctor, who was quickly becoming one of the last voices in this world she could trust. But as he spoke, her stomach sank with a dreaded realisation: this man, too, was an impostor.

Kate has Capgras syndrome, the unshakeable belief that someone – often a loved one, sometimes oneself – has been replaced with an exact replica. She also has Fregoli syndrome, the delusion that the same person is taking on a variety of shapes, like an actor donning an expert disguise. Capgras and Fregoli delusions offer hints about an extraordinary cognitive mechanism active in the healthy mind, a mechanism so exquisitely tuned that we are hardly ever aware of it. This mechanism ascribes to each person a unique identity, and then meticulously tracks and updates it. This mechanism is crucial to virtually every human interaction, from navigating a party to navigating a marriage. Without it, we quickly fall apart. [Continue reading…]

The dangerous idea that life is a story

Galen Strawson writes: ‘Each of us constructs and lives a “narrative”,’ wrote the British neurologist Oliver Sacks, ‘this narrative is us’. Likewise the American cognitive psychologist Jerome Bruner: ‘Self is a perpetually rewritten story.’ And: ‘In the end, we become the autobiographical narratives by which we “tell about” our lives.’ Or a fellow American psychologist, Dan P McAdams: ‘We are all storytellers, and we are the stories we tell.’ And here’s the American moral philosopher J David Velleman: ‘We invent ourselves… but we really are the characters we invent.’ And, for good measure, another American philosopher, Daniel Dennett: ‘we are all virtuoso novelists, who find ourselves engaged in all sorts of behaviour… and we always put the best “faces” on it we can. We try to make all of our material cohere into a single good story. And that story is our autobiography. The chief fictional character at the centre of that autobiography is one’s self.’

So say the narrativists. We story ourselves and we are our stories. There’s a remarkably robust consensus about this claim, not only in the humanities but also in psychotherapy. It’s standardly linked with the idea that self-narration is a good thing, necessary for a full human life.

I think it’s false – false that everyone stories themselves, and false that it’s always a good thing. These are not universal human truths – even when we confine our attention to human beings who count as psychologically normal, as I will here. They’re not universal human truths even if they’re true of some people, or even many, or most. The narrativists are, at best, generalising from their own case, in an all-too-human way. At best: I doubt that what they say is an accurate description even of themselves. [Continue reading…]

Oliver Sacks, casting light on the interconnectedness of life

Michiko Kakutani writes: It’s no coincidence that so many of the qualities that made Oliver Sacks such a brilliant writer are the same qualities that made him an ideal doctor: keen powers of observation and a devotion to detail, deep reservoirs of sympathy, and an intuitive understanding of the fathomless mysteries of the human brain and the intricate connections between the body and the mind.

Dr. Sacks, who died on Sunday at 82, was a polymath and an ardent humanist, and whether he was writing about his patients, or his love of chemistry or the power of music, he leapfrogged among disciplines, shedding light on the strange and wonderful interconnectedness of life — the connections between science and art, physiology and psychology, the beauty and economy of the natural world and the magic of the human imagination.

In his writings, as he once said of his mentor, the great Soviet neuropsychologist and author A. R. Luria, “science became poetry.” [Continue reading…]

How language can shape a sense of place

Claire Cameron writes: English speakers and others are highly egocentric when it comes to orienting themselves in the world. Objects and people exist to the left, right, in front, and to the back of you. You move forward and backward in relation to the direction you are facing. For an aboriginal tribe in north Queensland, Australia, called the Guugu Yimithirr, such a “me me me” approach to spatial information makes no sense. Instead, they use cardinal directions to express spatial information (pdf). So rather than “Can you move to my left?” they would say “Can you move to the west?”

Linguist Guy Deutscher says that Guugu Yimithirr speakers have a kind of “internal compass” that is imprinted from an extremely young age. In the same way that English-speaking infants learn to use different tenses when they speak, so do Guugu Yimithirr children learn to orient themselves along compass lines, not relative to themselves. In fact, says Deutscher, if a Guugu Yimithirr speaker wants to direct your attention to the direction behind him, he “points through himself, as if he were thin air and his own existence were irrelevant.” Whether that translates into less egocentric worldviews is a matter for further study and debate.

Other studies have shown that speakers of languages that use cardinal directions to express locations have fantastic spatial memory and navigation skills — perhaps because their experience of an event is so well-defined by the directions it took place in. [Continue reading…]

Is mirror-touch synesthesia a superpower or a curse?

Erika Hayasaki writes: No one, it seemed, knew what the patient clutching the stuffed blue bunny was feeling. At 33, he looked like a bewildered boy, staring at the doctors who crowded into his room in Massachusetts General Hospital. Lumpy oyster-sized growths shrouded his face, the result of a genetic condition that causes benign tumors to develop on the skin, in the brain, and on organs, hindering the patient’s ability to walk, talk, and feel normally. He looked like he was grimacing in pain, but his mother explained that her son, Josh, did not have a clear threshold for pain or other sensations. If Josh felt any discomfort at all, he was nearly incapable of expressing it.

“Any numbness?” asked Joel Salinas, a soft-spoken doctor in the Harvard Neurology Residency Program, a red-tipped reflex hammer in his doctor’s coat pocket. “Like it feels funny?” Josh did not answer. Salinas pulled up a blanket, revealing Josh’s atrophied legs. He thumped Josh’s left leg with the reflex hammer. Again, Josh barely reacted. But Salinas felt something: The thump against Josh’s left knee registered on Salinas’s own left knee as a tingly tap. Not just a thought of what the thump might feel like, but a distinct physical sensation.

That’s because Salinas himself has a rare medical condition, one that stands in marked contrast to his patients’: While Josh appeared unresponsive even to his own sensations, Salinas is peculiarly attuned to the sensations of others. If he sees someone slapped across the cheek, Salinas feels a hint of the slap against his own cheek. A pinch on a stranger’s right arm might become a tickle on his own. “If a person is touched, I feel it, and then I recognize that it’s touch,” Salinas says.

The condition is called mirror-touch synesthesia, and it has aroused significant interest among neuroscientists in recent years because it appears to be an extreme form of a basic human trait. In all of us, mirror neurons in the premotor cortex and other areas of the brain activate when we watch someone else’s behaviors and actions. Our brains map the regions of the body where we see someone else caressed, jabbed, or whacked, and they mimic just a shade of that feeling on the same spots on our own bodies. For mirror-touch synesthetes like Salinas, that mental simulacrum is so strong that it crosses a threshold into near-tactile sensation, sometimes indistinguishable from one’s own. Neuroscientists regard the condition as a state of “heightened empathic ability.” [Continue reading…]

Is consciousness an engineering problem?

Michael Graziano writes: The brain is a machine: a device that processes information. That’s according to the last 100 years of neuroscience. And yet, somehow, it also has a subjective experience of at least some of that information. Whether we’re talking about the thoughts and memories swirling around on the inside, or awareness of the stuff entering through the senses, somehow the brain experiences its own data. It has consciousness. How can that be?

That question has been called the ‘hard problem’ of consciousness, where ‘hard’ is a euphemism for ‘impossible’. For decades, it was a disreputable topic among scientists: if you can’t study it or understand it or engineer it, then it isn’t science. On that view, neuroscientists should stick to the mechanics of how information is processed in the brain, not the spooky feeling that comes along with the information. And yet, one can’t deny that the phenomenon exists. What exactly is this consciousness stuff?

Here’s a more pointed way to pose the question: can we build it? Artificial intelligence is growing more intelligent every year, but we’ve never given our machines consciousness. People once thought that if you made a computer complicated enough it would just sort of ‘wake up’ on its own. But that hasn’t panned out (so far as anyone knows). Apparently, the vital spark has to be deliberately designed into the machine. And so the race is on to figure out what exactly consciousness is and how to build it.

I’ve made my own entry into that race, a framework for understanding consciousness called the Attention Schema theory. The theory suggests that consciousness is no bizarre byproduct – it’s a tool for regulating information in the brain. And it’s not as mysterious as most people think. As ambitious as it sounds, I believe we’re close to understanding consciousness well enough to build it.

In this article I’ll conduct a thought experiment. Let’s see if we can construct an artificial brain, piece by hypothetical piece, and make it conscious. The task could be slow and each step might seem incremental, but with a systematic approach we could find a path that engineers can follow. [Continue reading…]

On the value of not knowing everything

James McWilliams writes: In January 2010, while driving from Chicago to Minneapolis, Sam McNerney played an audiobook and had an epiphany. The book was Jonah Lehrer’s How We Decide, and the epiphany was that consciousness could reside in the brain. The quest for an empirical understanding of consciousness has long preoccupied neurobiologists. But McNerney was no neurobiologist. He was a twenty-year-old philosophy major at Hamilton College. The standard course work — ancient, modern, and contemporary philosophy — enthralled him. But after this drive, after he listened to Lehrer, something changed. “I had to rethink everything I knew about everything,” McNerney said.

Lehrer’s publisher later withdrew How We Decide for inaccuracies. But McNerney was mentally galvanized for good reason. He had stumbled upon what philosophers call the “Hard Problem” — the quest to understand the enigma of the gap between mind and body. Intellectually speaking, what McNerney experienced was like diving for a penny in a pool and coming up with a gold nugget.

The philosopher Thomas Nagel drew popular attention to the Hard Problem four decades ago in an influential essay titled “What Is It Like to Be a Bat?” Frustrated with the “recent wave of reductionist euphoria,” Nagel challenged the reductive conception of mind — the idea that consciousness resides as a physical reality in the brain — by highlighting the radical subjectivity of experience. His main premise was that “an organism has conscious mental states if and only if there is something that it is like to be that organism.”

If that idea seems elusive, consider it this way: A bat has consciousness only if there is something that it is like for that bat to be a bat. Sam has consciousness only if there is something it is like for Sam to be Sam. You have consciousness only if there is something that it is like for you to be you (and you know that there is). And here’s the key to all this: Whatever that “like” happens to be, according to Nagel, it necessarily defies empirical verification. You can’t put your finger on it. It resists physical accountability.

McNerney returned to Hamilton intellectually turbocharged. This was an idea worth pondering. “It took hold of me,” he said. “It chose me — I know you hear that a lot, but that’s how it felt.” He arranged to do research in cognitive science as an independent study project with Russell Marcus, a trusted professor. Marcus let him loose to write what McNerney calls “a seventy-page hodgepodge of psychological research and philosophy and everything in between.” Marcus remembered the project more charitably, as “a huge, ambitious, wide-ranging, smart, and engaging paper.” Once McNerney settled into his research, Marcus added, “it was like he had gone into a phone booth and come out as a super-student.”

When he graduated in 2011, McNerney was proud. “I pulled it off,” he said about earning a degree in philosophy. Not that he had any hard answers to any big problems, much less the Hard Problem. Not that he had a job. All he knew was that he “wanted to become the best writer and thinker I could be.”

So, as one does, he moved to New York City.

McNerney is the kind of young scholar adored by the humanities. He’s inquisitive, open-minded, thrilled by the world of ideas, and touched with a tinge of old-school transcendentalism. What Emerson said of Thoreau — “he declined to give up his large ambition of knowledge and action for any narrow craft or profession” — is certainly true of McNerney. [Continue reading…]

On not being there: The data-driven body at work and at play

Rebecca Lemov writes: The protagonist of William Gibson’s 2014 science-fiction novel The Peripheral, Flynne Fisher, works remotely in a way that lends a new and fuller sense to that phrase. The novel features a double future: One set of characters inhabits the near future, ten to fifteen years from the present, while another lives seventy years on, after a breakdown of the climate and multiple other systems that has apocalyptically altered human and technological conditions around the world.

In that “further future,” only 20 percent of the Earth’s human population has survived. Each of these fortunate few is well off and able to live a life transformed by healing nanobots, somaticized e-mail (which delivers messages and calls to the roof of the user’s mouth), quantum computing, and clean energy. For their amusement and profit, certain “hobbyists” in this future have the Borgesian option of cultivating an alternative path in history — it’s called “opening up a stub” — and mining it for information as well as labor.

Flynne, the remote worker, lives on one of those paths. A young woman from the American Southeast, possibly Appalachia or the Ozarks, she favors cutoff jeans and resides in a trailer, eking out a living as a for-hire sub playing video games for wealthy aficionados. Recruited by a mysterious entity that is beta-testing drones that are doing “security” in a murky skyscraper in an unnamed city, she thinks at first that she has been taken on to play a kind of video game in simulated reality. As it turns out, she has been employed to work in the future as an “information flow” — low-wage work, though the pay translates to a very high level of remuneration in the place and time in which she lives.

What is of particular interest is the fate of Flynne’s body. Before she goes to work she must tend to its basic needs (nutrition and elimination), because during her shift it will effectively be “vacant.” Lying on a bed with a special data-transmitting helmet attached to her head, she will be elsewhere, inhabiting an ambulatory robot carapace — a “peripheral” — built out of bio-flesh that can receive her consciousness.

Bodies in this data-driven economic backwater of a future world economy are abandoned for long stretches of time — disposable, cheapened, eerily vacant in the temporary absence of “someone at the helm.” Meanwhile, fleets of built bodies, grown from human DNA, await habitation.

Alex Rivera explores similar territory in his Mexican sci-fi film The Sleep Dealer (2008), set in a future world after a wall erected on the US–Mexican border has successfully blocked migrants from entering the United States. Digital networks allow people to connect to strangers all over the world, fostering fantasies of physical and emotional connection. At the same time, low-income would-be migrant workers in Tijuana and elsewhere can opt to do remote work by controlling robots building a skyscraper in a faraway city, locking their bodies into devices that transmit their labor to the site. In tank-like warehouses, lined up in rows of stalls, they “jack in” by connecting data-transmitting cables to nodes implanted in their arms and backs. Their bodies are in Mexico, but their work is in New York or San Francisco, and while they are plugged in and wearing their remote-viewing spectacles, their limbs move like the appendages of ghostly underwater creatures. Their life force drained by the taxing labor, these “sleep dealers” end up as human discards.

What is surprising about these sci-fi conceits, from “transitioning” in The Peripheral to “jacking in” in The Sleep Dealer, is how familiar they seem, or at least how closely they reflect certain aspects of contemporary reality. Almost daily, we encounter people who are there but not there, flickering in and out of what we think of as presence. A growing body of research explores the question of how users interact with their gadgets and media outlets, and how in turn these interactions transform social relationships. The defining feature of this heavily mediated reality is our presence “elsewhere,” a removal of at least part of our conscious awareness from wherever our bodies happen to be. [Continue reading…]

Each of us is, genetically, more microbial than human

The New York Times reports: Since 2007, when scientists announced plans for a Human Microbiome Project to catalog the micro-organisms living in our body, the profound appreciation for the influence of such organisms has grown rapidly with each passing year. Bacteria in the gut produce vitamins and break down our food; their presence or absence has been linked to obesity, inflammatory bowel disease and the toxic side effects of prescription drugs. Biologists now believe that much of what makes us human depends on microbial activity. The two million unique bacterial genes found in each human microbiome can make the 23,000 genes in our cells seem paltry, almost negligible, by comparison. “It has enormous implications for the sense of self,” Tom Insel, the director of the National Institute of Mental Health, told me. “We are, at least from the standpoint of DNA, more microbial than human. That’s a phenomenal insight and one that we have to take seriously when we think about human development.”

Given the extent to which bacteria are now understood to influence human physiology, it is hardly surprising that scientists have turned their attention to how bacteria might affect the brain. Micro-organisms in our gut secrete a profound number of chemicals, and researchers like [Mark] Lyte have found that among those chemicals are the same substances used by our neurons to communicate and regulate mood, like dopamine, serotonin and gamma-aminobutyric acid (GABA). [Continue reading…]

How technology is damaging our brains

The New York Times reports: When one of the most important e-mail messages of his life landed in his in-box a few years ago, Kord Campbell overlooked it.

Not just for a day or two, but 12 days. He finally saw it while sifting through old messages: a big company wanted to buy his Internet start-up.

“I stood up from my desk and said, ‘Oh my God, oh my God, oh my God,’ ” Mr. Campbell said. “It’s kind of hard to miss an e-mail like that, but I did.”

The message had slipped by him amid an electronic flood: two computer screens alive with e-mail, instant messages, online chats, a Web browser and the computer code he was writing.

While he managed to salvage the $1.3 million deal after apologizing to his suitor, Mr. Campbell continues to struggle with the effects of the deluge of data. Even after he unplugs, he craves the stimulation he gets from his electronic gadgets. He forgets things like dinner plans, and he has trouble focusing on his family.

His wife, Brenda, complains, “It seems like he can no longer be fully in the moment.”

This is your brain on computers.

Scientists say juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information.

These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive. In its absence, people feel bored.

The resulting distractions can have deadly consequences, as when cellphone-wielding drivers and train engineers cause wrecks. And for millions of people like Mr. Campbell, these urges can inflict nicks and cuts on creativity and deep thought, interrupting work and family life.

While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress.

And scientists are discovering that even after the multitasking ends, fractured thinking and lack of focus persist. In other words, this is also your brain off computers. [Continue reading…]

The rhythm of consciousness

Gregory Hickok writes: In 1890, the American psychologist William James famously likened our conscious experience to the flow of a stream. “A ‘river’ or a ‘stream’ are the metaphors by which it is most naturally described,” he wrote. “In talking of it hereafter, let’s call it the stream of thought, consciousness, or subjective life.”

While there is no disputing the aptness of this metaphor in capturing our subjective experience of the world, recent research has shown that the “stream” of consciousness is, in fact, an illusion. We actually perceive the world in rhythmic pulses rather than as a continuous flow.

Some of the first hints of this new understanding came as early as the 1920s, when physiologists discovered brain waves: rhythmic electrical currents measurable on the surface of the scalp by means of electroencephalography. Subsequent research cataloged a spectrum of such rhythms (alpha waves, delta waves and so on) that correlated with various mental states, such as calm alertness and deep sleep.

Researchers also found that the properties of these rhythms varied with perceptual or cognitive events. The phase and amplitude of your brain waves, for example, might change if you saw or heard something, or if you increased your concentration on something, or if you shifted your attention.

But those early discoveries themselves did not change scientific thinking about the stream-like nature of conscious perception. Instead, brain waves were largely viewed as a tool for indexing mental experience, much like the waves that a ship generates in the water can be used to index the ship’s size and motion (e.g., the bigger the waves, the bigger the ship).

Recently, however, scientists have flipped this thinking on its head. We are exploring the possibility that brain rhythms are not merely a reflection of mental activity but a cause of it, helping shape perception, movement, memory and even consciousness itself. [Continue reading…]

A Norwegian campaign to legitimize the use of psychedelics

The New York Times reports: In a country so wary of drug abuse that it limits the sale of aspirin, Pal-Orjan Johansen, a Norwegian researcher, is pushing what would seem a doomed cause: the rehabilitation of LSD.

It matters little to him that the psychedelic drug has been banned here and around the world for more than 40 years. Mr. Johansen pitches his effort not as a throwback to the hippie hedonism of the 1960s, but as a battle for human rights and good health.

In fact, he also wants to manufacture MDMA and psilocybin, the active ingredients in two other prohibited substances, Ecstasy and so-called magic mushrooms.

All of that might seem quixotic at best, if only Mr. Johansen and EmmaSofia, the psychedelics advocacy group he founded with his American-born wife and fellow scientist, Teri Krebs, had not already won some unlikely supporters, including a retired Norwegian Supreme Court judge who serves as their legal adviser.

The group, whose name derives from street slang for MDMA and the Greek word for wisdom, stands in the vanguard of a global movement now pushing to revise drug policies set in the 1970s. That it has gained traction in a country so committed to controlling drug use shows how much old orthodoxies have crumbled. [Continue reading…]


The former death squad fighters who are now training as yoga teachers

Lila MacLellan writes: In New York or L.A., it’s pretty common to learn that a yoga teacher used to be a dancer, an actor, or even a Wall Street banker. In Bogota and Medellin, the same is true. Except that here, the teacher may also be an ex-member of a Colombian death squad.

Since 2010, a local organization called Dunna: Alternativas Creativas Para la Paz (Dunna: Creative Alternatives for Peace) has been gradually introducing the basic poses to two groups for whom yoga has been a foreign concept: the poor, mostly rural victims of Colombia’s brutal, half-century conflict, and the guerrilla fighters who once terrorized them.

Hundreds of ex-militants have already taken the offered yoga courses. A dozen now plan to teach yoga to others.

To stay calm, yoga-teacher-in-training Edifrando Valderrama Holguin turns off the television whenever he sees news broadcasts about young people being recruited into terror groups like ISIL. Valderrama was 12 when he was recruited into the Revolutionary Armed Forces of Colombia (FARC). He was given a gun, basic training and a heavy dose of leftist ideology. “In the mountains, if I saw someone who was not part of our group, I had to kill him,” he says. “If I had questioned the ideology of the FARC, they would have called me an infiltrator and killed me.”

Now 28, Valderrama lives in the city of Medellin. He works afternoon shifts for a supplier to one of Colombia’s major meat companies, and practices yoga at home in the mornings. Until the program stopped last year, he attended Dunna’s yoga classes, rolling out his mat with former members of both the FARC and Autodefensas Unidas de Colombia (AUC) — a paramilitary army he once fought against. Although initially surprised that he could feel so much peace lying in corpse pose, Valderrama now hopes to become a yoga teacher, so that he can introduce the healing asanas to ex-militants in Colombia, or even overseas.

Samuel Urueña Lievano, 46, was raped and then recruited into the rival AUC by a relative when he was 15. “They used my anger and hatred to get me to join. I have so much remorse for the things I did during that period,” he tells me as he begins to cry.

Urueña, now a law student in Bogota, takes medications to manage his anxiety and still has nightmares. He calls yoga his closest friend. Practicing the poses every day for two hours has made it possible for him to handle occasional feelings of panic, impatience and frustration, he says. “It has helped me identify who I am. It has given me myself back.” [Continue reading…]


How music takes possession of our perception of time

Jonathan Berger writes: One evening, some 40 years ago, I got lost in time. I was at a performance of Schubert’s String Quintet in C major. During the second movement I had the unnerving feeling that time was literally grinding to a halt. The sensation was powerful, visceral, overwhelming. It was a life-changing moment, or, as it felt at the time, a life-changing eon.

It has been my goal ever since to compose music that usurps the perceived flow of time and commandeers the sense of how time passes. Although I’ve learned to manipulate subjective time, I still stand in awe of Schubert’s unparalleled power. Nearly two centuries ago, the composer anticipated the neurological underpinnings of time perception that science has underscored in the past few decades.

The human brain, we have learned, adjusts and recalibrates temporal perception. Our ability to encode and decode sequential information, to integrate and segregate simultaneous signals, is fundamental to human survival. It allows us to find our place in, and navigate, our physical world. But music also demonstrates that time perception is inherently subjective — and an integral part of our lives. “For the time element in music is single,” wrote Thomas Mann in his novel, The Magic Mountain. “Into a section of mortal time music pours itself, thereby inexpressibly enhancing and ennobling what it fills.” [Continue reading…]
