Category Archives: Attention to the Unseen

Jawbone discovery sheds light on early human ancestor

ASU News: The earliest evidence of our human genus – Homo – was found in Ethiopia by a team of Arizona State University scientists and students during field research in 2013.

The fossil, the left side of a lower jaw with five teeth, has been dated to 2.8 million years ago, which predates the previously known fossils of the Homo lineage by approximately 400,000 years.

The discovery is reported in the March 4 online edition of the journal Science.

For decades, scientists who study the origins of modern-day humans have been searching for fossils documenting the earliest phases of the Homo lineage.

Researchers have found fossils that are 3 million years old and older. The most famous example of those human ancestors is the skeleton of Lucy, found in northeastern Africa in 1974 by ASU researcher Donald Johanson. Lucy and her relatives, though they walked on two feet, were smaller-brained and more apelike than later members of the human family tree.

Scientists have also found fossils that are 2.3 million years old and younger. These ancestors are in the genus Homo and are closer to modern-day humans.

But very little had been found in between – that 700,000-year gap had turned up few fossils with which to determine the evolution from Lucy to the genus Homo. Because of that gap, there has been little agreement on the time of origin of the Homo lineage.

With this find, that mysterious time period has gotten a little clearer. [Continue reading…]

The Los Angeles Times adds: The significance of this discovery, according to some researchers, is that it firmly fixes the origins of Homo in East Africa and fits the hypothesis that climate change drove key developments in a variety of mammals, including our early forebears.

When Lucy roamed Ethiopia roughly 3.2 million years ago, the region enjoyed long rainy seasons that supported the growth of many trees and a wide variety of vegetation, according to researchers.

By the time of Homo’s first established appearance in the Horn of Africa, however, things had become much drier and the landscape had transformed into a vast, treeless expanse of grasslands with a few rivers and lakes — a scene very similar to today’s Serengeti plains or Kalahari.

It was an unforgiving climate when it came to survival.

But the hallmark of the genus that includes Homo sapiens is resourcefulness. Larger brains, the ability to fashion stone tools, and teeth suited to chewing a variety of foods would have given our early ancestors the flexibility to live in an inflexible environment, researchers say. [Continue reading…]

How music takes possession of our perception of time

Jonathan Berger writes: One evening, some 40 years ago, I got lost in time. I was at a performance of Schubert’s String Quintet in C major. During the second movement I had the unnerving feeling that time was literally grinding to a halt. The sensation was powerful, visceral, overwhelming. It was a life-changing moment, or, as it felt at the time, a life-changing eon.

It has been my goal ever since to compose music that usurps the perceived flow of time and commandeers the sense of how time passes. Although I’ve learned to manipulate subjective time, I still stand in awe of Schubert’s unparalleled power. Nearly two centuries ago, the composer anticipated the neurological underpinnings of time perception that science has underscored in the past few decades.

The human brain, we have learned, adjusts and recalibrates temporal perception. Our ability to encode and decode sequential information, to integrate and segregate simultaneous signals, is fundamental to human survival. It allows us to find our place in, and navigate, our physical world. But music also demonstrates that time perception is inherently subjective — and an integral part of our lives. “For the time element in music is single,” wrote Thomas Mann in his novel, The Magic Mountain. “Into a section of mortal time music pours itself, thereby inexpressibly enhancing and ennobling what it fills.” [Continue reading…]

Your gut tells your mind more than you may imagine

Charles Schmidt writes: The notion that the state of our gut governs our state of mind dates back more than 100 years. Many 19th- and early 20th-century scientists believed that accumulating wastes in the colon triggered a state of “auto-intoxication,” whereby poisons emanating from the gut produced infections that were in turn linked with depression, anxiety and psychosis. Patients were treated with colonic purges and even bowel surgeries until these practices were dismissed as quackery.

The ongoing exploration of the human microbiome promises to bring the link between the gut and the brain into clearer focus. Scientists are increasingly convinced that the vast assemblage of microfauna in our intestines may have a major impact on our state of mind. The gut-brain axis seems to be bidirectional — the brain acts on gastrointestinal and immune functions that help to shape the gut’s microbial makeup, and gut microbes make neuroactive compounds, including neurotransmitters and metabolites that also act on the brain. These interactions could occur in various ways: microbial compounds communicate via the vagus nerve, which connects the brain and the digestive tract, and microbially derived metabolites interact with the immune system, which maintains its own communication with the brain. Sven Pettersson, a microbiologist at the Karolinska Institute in Stockholm, has recently shown that gut microbes help to control leakage through both the intestinal lining and the blood-brain barrier, which ordinarily protects the brain from potentially harmful agents.

Microbes may have their own evolutionary reasons for communicating with the brain. They need us to be social, says John Cryan, a neuroscientist at University College Cork in Ireland, so that they can spread through the human population. Cryan’s research shows that when bred in sterile conditions, germ-free mice lacking in intestinal microbes also lack an ability to recognize other mice with whom they interact. In other studies, disruptions of the microbiome induced behavior in mice that mimics human anxiety, depression and even autism. In some cases, scientists restored more normal behavior by treating their test subjects with certain strains of benign bacteria. Nearly all the data so far are limited to mice, but Cryan believes the findings provide fertile ground for developing analogous compounds, which he calls psychobiotics, for humans. “That dietary treatments could be used as either adjunct or sole therapy for mood disorders is not beyond the realm of possibility,” he says. [Continue reading…]

Neurological conductors that keep the brain in time and tune

Harvard Gazette: Like musical sounds, different states of mind are defined by distinct, characteristic waveforms, recognizable frequencies and rhythms in the brain’s electrical field. When the brain is alert and performing complex computations, the cerebral cortex — the wrinkled outer surface of the brain — thrums with cortical oscillations in the gamma frequency band. In some neurological disorders like schizophrenia, however, these waves are out of tune and the rhythm is out of sync.
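In signal terms, a gamma band oscillation is simply a rhythm in roughly the 30–80 Hz range of a recorded electrical field. As an illustration (an editorial sketch, not part of the Gazette report), the Python snippet below synthesizes one second of a noisy signal containing a 40 Hz component and recovers its dominant gamma-band frequency; the sampling rate, component frequency, and noise level are all assumptions chosen for the example.

```python
import numpy as np

# Editorial sketch: what a "gamma band oscillation" looks like in signal
# terms. We synthesize one second of a noisy field potential with a 40 Hz
# component (squarely inside the gamma band, roughly 30-80 Hz) and recover
# its dominant frequency with an FFT. All parameters are illustrative.

fs = 1000                       # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)   # one second of samples
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 40 * t) + 0.5 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Restrict attention to the gamma band and report the peak frequency.
gamma = (freqs >= 30) & (freqs <= 80)
peak = freqs[gamma][np.argmax(spectrum[gamma])]
print(f"dominant gamma-band frequency: {peak:.0f} Hz")  # prints ~40 Hz
```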

New research led by Harvard Medical School (HMS) scientists at the VA Boston Healthcare System (VABHS) has identified a specific class of neurons — basal forebrain GABA parvalbumin neurons, or PV neurons — that act as neurological conductors, triggering the cortex to hum rhythmically and in tune. (GABA is gamma-aminobutyric acid, a major neurotransmitter in the brain.)

The results appear this week in the journal Proceedings of the National Academy of Sciences.

“This is a move toward a unified theory of consciousness control,” said co-senior author Robert McCarley, HMS professor of psychiatry and head of the Department of Psychiatry at VA Boston Healthcare. “We’ve known that the basal forebrain is important in turning consciousness on and off in sleep and wake, but now we’ve found that these specific cells also play a key role in triggering the synchronized rhythms that characterize conscious thought, perception, and problem-solving.” [Continue reading…]

The genetic code is less like a blueprint than a first draft

Nessa Carey writes: When President Obama delivered a speech at MIT in 2009, he used a common science metaphor: “We have always been about innovation,” he said. “We have always been about discovery. That’s in our DNA.” Deoxyribonucleic acid, the chemical into which our genes are encoded, has become the metaphor of choice for a whole constellation of ideas about essence and identity. A certain mystique surrounds it. As Evelyn Fox Keller argues in her book The Century of the Gene, the genome is, in the popular imagination at least, the secret of life, the holy grail. It is a master builder, the ultimate computer program, and a modern-day echo of the soul, all wrapped up in one. This fantasy does not sit easily, however, with geneticists who have grown more aware over the last several decades that the relationship between genes and biological traits is much less than certain.

The popular understanding of DNA as a blueprint for organisms, with a one-to-one correspondence between genes and traits (called phenotypes), is the legacy of the early history of genetics. The term “gene” was coined in 1909 to refer to abstract units of inheritance, predating the discovery of DNA’s structure by more than four decades. Biologists came to think of genes like beads on a string that lined up neatly into chromosomes, with each gene determining a single phenotype. But, while some genes do correspond to traits in a straightforward way, as in eye color or blood group, most phenotypes are far more complex, set in motion by many different genes as well as by the environment in which the organism lives.

It turns out that the genetic code is less like a blueprint and more like a movie script, subject to revision and reinterpretation by a director. This process is called epigenetic modification (“epi” meaning “above” or “in addition to”). Just as a script can be altered with crossed-out words, sentences or scenes, epigenetic editing allows entire sections of DNA to be activated or de-activated. Genes can be as finely tuned as actors responding to stage directions to shout, whisper, or cackle. [Continue reading…]

How to rewild our language of landscape

Robert Macfarlane writes: Eight years ago, in the coastal township of Shawbost on the Outer Hebridean island of Lewis, I was given an extraordinary document. It was entitled “Some Lewis Moorland Terms: A Peat Glossary”, and it listed Gaelic words and phrases for aspects of the tawny moorland that fills Lewis’s interior. Reading the glossary, I was amazed by the compressive elegance of its lexis, and its capacity for fine discrimination: a caochan, for instance, is “a slender moor-stream obscured by vegetation such that it is virtually hidden from sight”, while a feadan is “a small stream running from a moorland loch”, and a fèith is “a fine vein-like watercourse running through peat, often dry in the summer”. Other terms were striking for their visual poetry: rionnach maoim means “the shadows cast on the moorland by clouds moving across the sky on a bright and windy day”; èit refers to “the practice of placing quartz stones in streams so that they sparkle in moonlight and thereby attract salmon to them in the late summer and autumn”, and teine biorach is “the flame or will-o’-the-wisp that runs on top of heather when the moor burns during the summer”.

The “Peat Glossary” set my head a-whirr with wonder-words. It ran to several pages and more than 120 terms – and as that modest “Some” in its title acknowledged, it was incomplete. “There’s so much language to be added to it,” one of its compilers, Anne Campbell, told me. “It represents only three villages’ worth of words. I have a friend from South Uist who said her grandmother would add dozens to it. Every village in the upper islands would have its different phrases to contribute.” I thought of Norman MacCaig’s great Hebridean poem “By the Graveyard, Luskentyre”, where he imagines creating a dictionary out of the language of Donnie, a lobster fisherman from the Isle of Harris. It would be an impossible book, MacCaig concluded:

A volume thick as the height of the Clisham,
A volume big as the whole of Harris,
A volume beyond the wit of scholars.

The same summer I was on Lewis, a new edition of the Oxford Junior Dictionary was published. A sharp-eyed reader noticed that there had been a culling of words concerning nature. Under pressure, Oxford University Press revealed a list of the entries it no longer felt to be relevant to a modern-day childhood. The deletions included acorn, adder, ash, beech, bluebell, buttercup, catkin, conker, cowslip, cygnet, dandelion, fern, hazel, heather, heron, ivy, kingfisher, lark, mistletoe, nectar, newt, otter, pasture and willow. The words taking their places in the new edition included attachment, block-graph, blog, broadband, bullet-point, celebrity, chatroom, committee, cut-and-paste, MP3 player and voice-mail. As I had been entranced by the language preserved in the prose‑poem of the “Peat Glossary”, so I was dismayed by the language that had fallen (been pushed) from the dictionary. For blackberry, read Blackberry.

I have long been fascinated by the relations of language and landscape – by the power of strong style and single words to shape our senses of place. And it has become a habit, while travelling in Britain and Ireland, to note down place words as I encounter them: terms for particular aspects of terrain, elements, light and creaturely life, or resonant place names. I’ve scribbled these words in the backs of notebooks, or jotted them down on scraps of paper. Usually, I’ve gleaned them singly from conversations, maps or books. Now and then I’ve hit buried treasure in the form of vernacular word-lists or remarkable people – troves that have held gleaming handfuls of coinages, like the Lewisian “Peat Glossary”.

Not long after returning from Lewis, and spurred on by the Oxford deletions, I resolved to put my word-collecting on a more active footing, and to build up my own glossaries of place words. It seemed to me then that although we have fabulous compendia of flora, fauna and insects (Richard Mabey’s Flora Britannica and Mark Cocker’s Birds Britannica chief among them), we lack a Terra Britannica, as it were: a gathering of terms for the land and its weathers – terms used by crofters, fishermen, farmers, sailors, scientists, miners, climbers, soldiers, shepherds, poets, walkers and unrecorded others for whom particularised ways of describing place have been vital to everyday practice and perception. It seemed, too, that it might be worth assembling some of this terrifically fine-grained vocabulary – and releasing it back into imaginative circulation, as a way to rewild our language. I wanted to answer Norman MacCaig’s entreaty in his Luskentyre poem: “Scholars, I plead with you, / Where are your dictionaries of the wind … ?” [Continue reading…]

Darwin learned more about evolution from plants than from Galapagos finches

Henry Nicholls writes: When the HMS Beagle dropped anchor on San Cristobal, the easternmost island in the Galapagos archipelago, in September 1835, the ship’s naturalist Charles Darwin eagerly went ashore to gather samples of the insects, birds, reptiles, and plants living there. At first, he didn’t think much of the arid landscape, which appeared to be “covered by stunted, sun-burnt brushwood…as leafless as our trees during winter.” But this did not put him off. By the time the Beagle left these islands some five weeks later, he had amassed a spectacular collection of Galapagos plants.

It is fortunate that he took such trouble. Most popular narratives of Darwin and the Galapagos concentrate on the far more celebrated finches or the giant tortoises. Yet when he finally published On the Origin of Species almost 25 years later, Darwin made no mention of these creatures. In his discussion of the Galapagos, he dwelt almost exclusively on the islands’ plants.

By the early 19th century, there was increasing interest in what we now refer to as biogeography, the study of the distribution of species around the globe. Many people still imagined that God had been involved in the creation of species, putting fully formed versions down on Earth that continued to reproduce themselves, dispersing from a divine “center of creation” to occupy their current habitats. To explain how the plants and animals reached far-flung places such as the isolated Galapagos, several naturalists imagined that there had to have been land bridges, long-since subsided, that had once connected them to a continent. But in the wake of the Beagle voyage, the collection of Galapagos plants suggested an alternate scenario.

Even if there had once been a land bridge to the islands, it could not account for the fact that half of the plant species Darwin collected were unique to the Galapagos, and that most of them were particular to just one island. “I never dreamed that islands, about fifty or sixty miles apart, and most of them in sight of each other, formed of precisely the same rocks, placed under a quite similar climate, rising to a nearly equal height, would have been differently tenanted,” wrote Darwin in his Journal of Researches. His observations could be best explained if species were not fixed in nature but somehow changed as the seeds traveled to different locations. [Continue reading…]

DNA contains no information

Regan Penaluna writes: When we talk about genes, we often use expressions inherited from a few influential geneticists and evolutionary biologists, including Francis Crick, James Watson, and Richard Dawkins. These expressions depict DNA as a kind of code telling bodies how to form. We speak about genes similarly to how we speak about language, as symbolic and imbued with meaning. There is “gene-editing,” and there are “translation tables” for decoding sequences of nucleic acid. When DNA replicates, it is said to “transcribe” itself. We speak about a message — such as, build a tiger! or construct a female! — being communicated between microscopic materials. But this view of DNA has come with a price, argue some thinkers. It is philosophically misguided, they say, and has even led to scientific blunders. Scratch the surface of this idea, and below you’ll find a key contradiction.
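The “translation tables” mentioned above are concrete enough to sketch. The snippet below is an editorial illustration, not anything from Penaluna’s piece: it renders a small fragment of the standard genetic code as a Python dictionary and reads an invented DNA sequence three bases at a time, which shows why the code metaphor is so tempting, the very intuition the philosophers quoted below push back on.

```python
# Editorial sketch of the "translation table" metaphor: a fragment of the
# standard genetic code, mapping nucleotide triplets (codons) to amino
# acids. Only 7 of the 64 codons are included; the sequence is invented.

CODON_TABLE = {
    "ATG": "Met",  # methionine, also the start codon
    "TTT": "Phe", "GGC": "Gly", "GAA": "Glu",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna):
    """Read a DNA sequence three bases at a time, looking up each codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino = CODON_TABLE.get(dna[i:i + 3], "?")
        if amino == "STOP":    # stop codons terminate the protein
            break
        protein.append(amino)
    return protein

print(translate("ATGTTTGGCGAATAA"))  # ['Met', 'Phe', 'Gly', 'Glu']
```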

Since the earliest days of molecular biology, scientists have described genetic material as unlike all other biological material, because it supposedly carries something that more workaday molecules don’t: information. In a 1958 paper, Crick presented his ideas on the importance of proteins for inheritance, and said that they were composed of energy, matter, and information. Watson called DNA the “repository” of information.

Less than a decade later, George Williams, an influential evolutionary biologist, elaborated on this idea. He described genes as having a special status distinct from DNA: they are the message that the DNA delivers. In a later work, he likened genes to ideas contained in books. A book can be destroyed, but the story inside is not identical to the physical book. “The same information can be recorded by a variety of patterns in many different kinds of material. A message is always coded in some medium, but the medium is really not the message.” In his book The Blind Watchmaker, Dawkins gives perhaps the most forthright description of this view: “airborne willow seeds… are, literally, spreading instructions for making themselves… It is raining instructions out there; it’s raining programs; it’s raining tree-growing, fluff-spreading, algorithms. That is not a metaphor, it is the plain truth. It couldn’t be any plainer if it were raining floppy discs.”

But do genes truly contain information in the same sense as words, books, or floppy discs? It depends on what we mean by information. If it’s the meaning represented by the words, books, or floppy discs, then no. Many philosophers agree that this kind of semantic information requires communication: an agent to create the message and another to interpret it. “Genes don’t carry semantic information, though. They weren’t made as part of an act of communication. So genes don’t literally represent anything, as people sometimes say,” explains Peter Godfrey-Smith, a professor of philosophy at CUNY. [Continue reading…]

Oliver Sacks on his approaching death

A few weeks ago, Oliver Sacks learned that he has terminal cancer: Over the last few days, I have been able to see my life as from a great altitude, as a sort of landscape, and with a deepening sense of the connection of all its parts. This does not mean I am finished with life.

On the contrary, I feel intensely alive, and I want and hope in the time that remains to deepen my friendships, to say farewell to those I love, to write more, to travel if I have the strength, to achieve new levels of understanding and insight.

This will involve audacity, clarity and plain speaking; trying to straighten my accounts with the world. But there will be time, too, for some fun (and even some silliness, as well).

I feel a sudden clear focus and perspective. There is no time for anything inessential. I must focus on myself, my work and my friends. I shall no longer look at “NewsHour” every night. I shall no longer pay any attention to politics or arguments about global warming.

This is not indifference but detachment — I still care deeply about the Middle East, about global warming, about growing inequality, but these are no longer my business; they belong to the future. I rejoice when I meet gifted young people — even the one who biopsied and diagnosed my metastases. I feel the future is in good hands. [Continue reading…]

The quantum mechanics of fate

George Musser writes: “The objective world simply is, it does not happen,” wrote mathematician and physicist Hermann Weyl in 1949. From his point of view, the universe is laid out in time as surely as it is laid out in space. Time does not pass, and the past and future are as real as the present. If your common sense rebels against this idea, it is probably for a single reason: the arrow of causality. Events in the past cause events in the present which cause events in the future. If time really is like space, then shouldn’t events from the future influence the present and past, too?

They actually might. Physicists as renowned as John Wheeler, Richard Feynman, Dennis Sciama, and Yakir Aharonov have speculated that causality is a two-headed arrow and the future might influence the past. Today, the leading advocate of this position is Huw Price, a University of Cambridge philosopher who specializes in the physics of time. “The answer to the question, ‘Could the world be such that we do have a limited amount of control over the past,’ ” Price says, “is yes.” What’s more, Price and others argue that the evidence for such control has been staring at us for more than half a century.

That evidence, they say, is something called entanglement, a signature feature of quantum mechanics. The word “entanglement” has the same connotations as a romantic entanglement: a special, and potentially troublesome, relationship. Entangled particles start off in close proximity when they are produced in the laboratory. Then, when they are separated, they behave like a pair of magic dice. You can “roll” one in Las Vegas (or make a measurement on it), your friend can roll the other in Atlantic City, N.J., and each die will land on a random side. But whatever those two sides are, they will have a consistent relationship to each other: They could be identical, for example, or always differ by one. If you ever saw this happen, you might assume the dice were loaded or fixed before they were rolled. But no crooked dice could behave this way. After all, the Atlantic City die changes its behavior depending on what is going on with the Las Vegas die and vice versa, even if you roll them at the same moment.
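The simple correlation described here, taken on its own, can in fact be mimicked by ordinary shared randomness fixed when the pair is created, which is exactly what “loaded dice” would be. The Python sketch below (an editorial illustration with invented names) does just that; the closing comment notes where real entanglement outruns any such recipe.

```python
import random

# Editorial sketch: "magic dice" whose outcomes always differ by one can
# be faked with a recipe agreed at the source -- classical shared
# randomness, the analogue of loaded dice.

def make_pair():
    """Fix both outcomes when the pair is created at the source."""
    a = random.randint(1, 6)
    b = a % 6 + 1          # always exactly one more, wrapping 6 -> 1
    return a, b

for _ in range(5):
    vegas, atlantic_city = make_pair()
    print(vegas, atlantic_city)   # e.g. 3 4, 6 1, 2 3, ...

# The catch, and the reason no crooked dice could mimic real entangled
# particles: in experiments each side also chooses *how* to measure, and
# the correlations observed across those free choices violate Bell
# inequalities. No recipe fixed at the source, like the one above, can
# reproduce that.
```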

The standard interpretation of entanglement is that there is some kind of instant communication happening between the two particles. Any communication between them would have to travel the intervening distance instantaneously—that is, infinitely fast. That is plainly faster than light, a speed of communication prohibited by the theory of relativity. According to Einstein, nothing at all should be able to do that, leading him to think that some new physics must be operating, beyond the scope of quantum mechanics itself. [Continue reading…]

Too many worlds

Philip Ball writes: In July 2011, participants at a conference on the placid shore of Lake Traunsee in Austria were polled on what they thought the meeting was about. You might imagine that this question would have been settled in advance, but since the broad theme was quantum theory, perhaps a degree of uncertainty was to be expected. The title of the conference was ‘Quantum Physics and the Nature of Reality’. The poll, completed by 33 of the participating physicists, mathematicians and philosophers, posed a range of unresolved questions about the relationship between those two things, one of which was: ‘What is your favourite interpretation of quantum mechanics?’

The word ‘favourite’ speaks volumes. Isn’t science supposed to be decided by experiment and observation, free from personal preferences? But experiments in quantum physics have been obstinately silent on what it means. All we can do is develop hunches, intuitions and, yes, cherished ideas. Of these, the survey offered no fewer than 11 to choose from (as well as ‘other’ and ‘none’).

The most popular (supported by 42 per cent of the very small sample) was basically the view put forward by Niels Bohr, Werner Heisenberg and their colleagues in the early days of quantum theory. Today it is known as the Copenhagen Interpretation. More on that below. You might not recognise most of the other alternatives, such as Quantum Bayesianism, Relational Quantum Mechanics, and Objective Collapse (which is not, as you might suppose, just saying ‘what the hell’). Maybe you haven’t heard of the Copenhagen Interpretation either. But in third place (18 per cent) was the Many Worlds Interpretation (MWI), and I suspect you do know something about that, since the MWI is the one with all the glamour and publicity. It tells us that we have multiple selves, living other lives in other universes, quite possibly doing all the things that we dream of but will never achieve (or never dare). Who could resist such an idea?

Yet resist we should. We should resist not just because MWI is unlikely to be true, or even because, since no one knows how to test it, the idea is perhaps not truly scientific at all. Those are valid criticisms, but the main reason we should hold out is that it is incoherent, both philosophically and logically. There could be no better contender for Wolfgang Pauli’s famous put-down: it is not even wrong. [Continue reading…]

Does the unstructured web need structure?

Alex Wright writes: The Earth may not be flat, but the web certainly is.

“There is no ‘top’ to the World-Wide Web,” declared a 1992 foundational document from the World Wide Web Consortium — meaning that there is no central server or organizational authority to determine what does or does not get published. It is, like Borges’ famous Library of Babel, theoretically infinite, stitched together with hyperlinks rather than top-down, Dewey Decimal-style categories. It is also famously open—built atop a set of publicly available industry standards.
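The structural contrast is easy to make concrete. In the Python sketch below (an editorial illustration with invented page names), a Dewey-style catalogue is a tree hanging from a single root, while the web is a directed graph of hyperlinks in which cycles are allowed and no node is privileged, so the question “what is the top?” has no answer.

```python
# Editorial sketch: a catalogue is a tree; the web is a directed graph.
# All names below are invented for illustration.

dewey_tree = {
    "root": ["000 General", "500 Science"],
    "000 General": ["020 Library science"],
    "500 Science": ["530 Physics", "570 Biology"],
}  # every entry hangs beneath one root: there is always a "top"

web_graph = {
    "home.example": ["blog.example", "docs.example"],
    "blog.example": ["home.example", "news.example"],  # cycles are fine
    "news.example": ["blog.example"],
    "docs.example": [],                                 # dead end
}  # no root node: any page may link to any other, or to nothing

def reachable(graph, start):
    """Follow links outward from a start page; a graph with cycles forces
    us to remember visited pages, something a tree never requires."""
    seen, frontier = {start}, [start]
    while frontier:
        page = frontier.pop()
        for link in graph.get(page, []):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return seen

print(reachable(web_graph, "news.example"))
# all four pages are reachable from news.example (set order may vary)
```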

While these features have connected untold millions and created new forms of social organization, they also come at a cost. Material seems to vanish almost as quickly as it is created, disappearing amid broken links or into the constant flow of the social media “stream.” It can be hard to distinguish fact from falsehood. Corporations have stepped into this confusion, organizing our browsing and data in decidedly closed, non-transparent ways. Did it really have to turn out this way?

The web has played such a powerful role in shaping our world that it can sometimes seem like a fait accompli — the inevitable result of progress and enlightened thinking. A deeper look into the historical record, though, reveals a different story: The web in its current state was by no means inevitable. Not only were there competing visions for how a global knowledge network might work, divided along cultural and philosophical lines, but some of those discarded hypotheses are coming back into focus as researchers start to envision the possibilities of a more structured, less volatile web. [Continue reading…]

Charles Darwin, natural novelist

Adam Gopnik writes: Darwin’s Delay is by now nearly as famous as Hamlet’s, and involves a similar cast of characters: a family ghost, an unhappy lover, and a lot of men digging up old bones. Although it ends with vindication and fame, rather than with slaughter and self-knowledge, it was resolved by language, too — by inner soliloquy forcing itself out into the world, except that in this case the inner voice had the certainties and the outer one the hesitations.

The delay set in between Darwin’s first intimations of his Great Idea, the idea of evolution by natural selection, in the eighteen-thirties (he was already toying with it during his famous voyage on the H.M.S. Beagle), and the publication of “On the Origin of Species,” in 1859. By legend, the two events were in the long run one: Darwin saw the adapted beaks of his many finches, brooded on what they meant, came up with a theory, sought evidence for it, and was prodded into print at last by an unwelcome letter from an obscure naturalist named Alfred Russel Wallace, who had managed to arrive at the same idea.

It seems to have been more complicated than that. One reason Darwin spent so long getting ready to write his masterpiece without getting it written was that he knew what it would mean for faith and life, and, as Janet Browne’s now standard biography makes plain, he was frightened about being attacked by the powerful and the bigoted. Darwin was not a brave man — had the Inquisition been in place in Britain, he never would have published — but he wasn’t a humble man or a cautious thinker, either. He sensed that his account would end any intellectually credible idea of divine creation, and he wanted to break belief without harming the believer, particularly his wife, Emma, whom he loved devotedly and with whom he had shared, before he sat down to write, a private tragedy that seemed tolerable to her only through faith. The problem he faced was also a rhetorical one: how to say something that had never been said before in a way that made it sound like something everybody had always known — how to make an idea potentially scary and subversive sound as sane and straightforward as he believed it to be.

He did it, and doing it was, in some part, a triumph of style. Darwin is the one indisputably great scientist whose scientific work is still read by amateurs. [Continue reading…]

Crows understand analogies

Scientific American reports: People are fascinated by the intelligence of animals. In fact, cave paintings dating back some 40,000 years suggest that we have long harbored keen interest in animal behavior and cognition. Part of that interest may have been practical: animals can be dangerous, they can be sources of food and clothing, and they can serve as sentries or mousers.

But, another part of that fascination is purely theoretical. Because animals resemble us in form, perhaps they also resemble us in thought. For many philosophers — including René Descartes and John Locke — granting intelligence to animals was a bridge too far. They especially deemed abstract reasoning to be uniquely human and to perfectly distinguish people from “brutes.” Why? Because animals do not speak, the reasoning went, they must have no thoughts.

Nevertheless, undeterred by such pessimistic pronouncements, informed by Darwin’s theory of evolution, and guided by the maxim that “actions speak more loudly than words,” researchers today are fashioning powerful behavioral tests that provide nonverbal ways for animals to disclose their intelligence to us. Although animals may not use words, their behavior may serve as a suitable substitute; its study may allow us to jettison the stale convention that thought without language is impossible. [Continue reading…]

Facing death: What Fyodor Dostoevsky wrote upon being granted a stay of execution

Lapham’s Quarterly: Brother, my precious friend! All is settled! I am sentenced to four years’ hard labor in the fortress (of Orenburg, I believe), and after that to serve as a private. Today, the twenty-second of December, we were taken to the Semyonov drill ground. There the sentence of death was read to all of us, we were told to kiss the cross, our swords were broken over our heads, and our last toilet was made (white shirts). Then three were tied to the pillar for execution. I was the sixth. Three at a time were called out; consequently, I was in the second batch and no more than a minute was left me to live.

I remembered you, brother, and all yours; during the last minute you, you alone, were in my mind, only then I realized how I love you, dear brother mine! I also managed to embrace Pleshcheyev and Durov, who stood close to me, and to say goodbye to them. Finally the retreat was sounded and those tied to the pillar were led back, and it was announced to us that His Imperial Majesty granted us our lives. Then followed the present sentences. Palm alone has been pardoned, and returns with his old rank to the army.

I was just told, dear brother, that today or tomorrow we are to be sent off. I asked to see you. But I was told that this was impossible; I may only write you this letter: make haste and give me a reply as soon as you can.

I am afraid that you may somehow have got to know of our death sentence. From the windows of the prison van, when we were taken to the Semyonov drill ground, I saw a multitude of people; perhaps the news reached you, and you suffered for me. Now you will be easier on my account.

Brother! I have not become downhearted or low-spirited. Life is everywhere life, life in ourselves, not in what is outside us. There will be people near me, and to be a man among people and remain a man forever, not to be downhearted nor to fall in whatever misfortunes may befall me — this is life; this is the task of life. I have realized this. This idea has entered into my flesh and into my blood. [Continue reading…]

Music permeates our brain

Jonathan Berger writes: Neurological research has shown that vivid musical hallucinations are more than metaphorical. They don’t just feel real, they are, from a cognitive perspective, entirely real. In the absence of sound waves, brain activation is strikingly similar to that triggered by external sounds. Why should that be?

Music, repetitive and patterned by nature, provides structure within which we find anchors, context, and a basis for organizing time. In the prehistory of civilization, humans likely found comfort in the audible patterns and structures that accompanied their circadian rhythms — from the coo of a mourning dove to the nocturnal chirps of crickets. With the evolution of music, a more malleable framework for segmenting and structuring time developed. Humans generated predictable and replicable temporal patterns by drumming, vocalizing, blowing, and plucking. This metered, temporal framework provides an internal world in which we construct predictions about the future — what will happen next, and when it will happen.

This process spotlights the brain itself. The composer Karlheinz Stockhausen hyphenated the term for his craft to underscore the literal meaning of “com-pose” — to put together elements, from com (“with” or “together”) and pose (“put” or “place”). When we imagine music, we literally compose — sometimes recognizable tunes, other times novel combinations of patterns and musical ideas. Toddlers sing themselves to sleep with vocalizations of musical snippets they are conjuring up in their imagination. Typically, these “spontaneous melodies,” as they are referred to by child psychologists, comprise fragments of salient features of multiple songs that the baby is piecing together. In short, we do not merely retrieve music that we store in memory. Rather, a supremely complex web of associations can be stirred and generated as we compose music in our minds.

Today, amid widely disseminated music, we are barraged by a cacophony of disparate musical patterns — more often than not uninvited and unwanted — and likely spend more time than ever obsessing over imagined musical fragments. The brain is a composer whose music orchestrates our lives. And right now the brain is working overtime. [Continue reading…]

Meet Walter Pitts, the homeless genius who revolutionized artificial intelligence

Amanda Gefter writes: Walter Pitts was used to being bullied. He’d been born into a tough family in Prohibition-era Detroit, where his father, a boiler-maker, had no trouble raising his fists to get his way. The neighborhood boys weren’t much better. One afternoon in 1935, they chased him through the streets until he ducked into the local library to hide. The library was familiar ground, where he had taught himself Greek, Latin, logic, and mathematics—better than home, where his father insisted he drop out of school and go to work. Outside, the world was messy. Inside, it all made sense.

Not wanting to risk another run-in that night, Pitts stayed hidden until the library closed for the evening. Alone, he wandered through the stacks of books until he came across Principia Mathematica, a three-volume tome written by Bertrand Russell and Alfred North Whitehead between 1910 and 1913, which attempted to reduce all of mathematics to pure logic. Pitts sat down and began to read. For three days he remained in the library until he had read each volume cover to cover — nearly 2,000 pages in all — and had identified several mistakes. Deciding that Bertrand Russell himself needed to know about these, the boy drafted a letter to Russell detailing the errors. Not only did Russell write back, he was so impressed that he invited Pitts to study with him as a graduate student at Cambridge University in England. Pitts couldn’t oblige him, though — he was only 12 years old. But three years later, when he heard that Russell would be visiting the University of Chicago, the 15-year-old ran away from home and headed for Illinois. He never saw his family again. [Continue reading…]
