Michiko Kakutani writes: It’s no coincidence that so many of the qualities that made Oliver Sacks such a brilliant writer are the same qualities that made him an ideal doctor: keen powers of observation and a devotion to detail, deep reservoirs of sympathy, and an intuitive understanding of the fathomless mysteries of the human brain and the intricate connections between the body and the mind.
Dr. Sacks, who died on Sunday at 82, was a polymath and an ardent humanist, and whether he was writing about his patients, or his love of chemistry or the power of music, he leapfrogged among disciplines, shedding light on the strange and wonderful interconnectedness of life — the connections between science and art, physiology and psychology, the beauty and economy of the natural world and the magic of the human imagination.
In his writings, as he once said of his mentor, the great Soviet neuropsychologist and author A. R. Luria, “science became poetry.” [Continue reading…]
Roc Morin writes: One of the first words that Koko used to describe herself was Queen. The gorilla was only a few years old when she first made the gesture — sweeping a paw diagonally across her chest as if tracing a royal sash.
“It was a sign we almost never used!” Koko’s head caretaker, Francine Patterson, laughed. “Koko understands that she’s special because of all the attention she’s had from professors, and caregivers, and the media.”
The cause of the primate’s celebrity is her extraordinary aptitude for language. Over the past 43 years, since Patterson began teaching Koko at the age of 1, the gorilla has learned more than 1,000 words of modified American Sign Language—a vocabulary comparable to that of a 3-year-old human child. While there have been many attempts to teach human languages to animals, none have been more successful than Patterson’s achievement with Koko.
If Koko is a queen, then her kingdom is a sprawling research facility in the mountains outside Santa Cruz, California. It was there, under a canopy of stately redwoods, that I met research assistant Lisa Holliday.
“You came on a good day,” Holliday smiled. “Koko’s in a good mood. She was playing the spoon game all morning! That’s when she takes the spoon and runs off with it so you can’t give her another bite. She’s an active girl. She’s always got her dolls, and in the afternoon, her kittens — or as we call them, her kids.”
It was a winding stroll up a sun-spangled trail toward the cabin where Patterson was busy preparing a lunch of diced apples and nuts for Koko. The gorilla’s two kitten playmates romped in a crate by her feet. We would go deliver the meal together shortly, but first I had some questions for the 68-year-old researcher. I wanted to understand more about her famous charge and the rest of our closest living relatives. [Continue reading…]
Nature reports: Don’t trust everything you read in the psychology literature. In fact, two thirds of it should probably be distrusted.
In the biggest project of its kind, Brian Nosek, a social psychologist and head of the Center for Open Science in Charlottesville, Virginia, and 269 co-authors repeated work reported in 98 original papers from three psychology journals, to see if they independently came up with the same results.
The studies they took on ranged from whether expressing insecurities perpetuates them to differences in how children and adults respond to fear stimuli, to effective ways to teach arithmetic. [Continue reading…]
Only about 5% of the universe consists of ordinary matter such as protons and electrons, with the rest being filled with mysterious substances known as dark matter and dark energy. So far, scientists have failed to detect these elusive materials, despite spending decades searching for them. But now, two new studies may turn things around: they have significantly narrowed down the search.
Dark matter was first proposed more than 70 years ago to explain why the force of gravity in galaxy clusters is so much stronger than expected. If the clusters contained only the stars and gas we observe, their gravity should be much weaker, leading scientists to assume there is some sort of matter hidden there that we can’t see. Such dark matter would provide additional mass to these large structures, increasing their gravitational pull. The main contender for the substance is a type of hypothetical particle known as a “weakly interacting massive particle” (WIMP).
To probe the nature of dark matter, physicists look for evidence of its interactions beyond gravity. If the WIMP hypothesis is correct, dark matter particles could be detected through their scattering off atomic nuclei or electrons on Earth. In such “direct” detection experiments, a WIMP collision would cause these charged particles to recoil, producing light that we can observe.
Jason Coppola reports: It’s a crisis point in history for Native American languages. Without a concerted effort to revitalize them, many will soon go extinct, succumbing to the generations-long effort to destroy them.
“You could reasonably say every single Native American language, including the large ones, are endangered,” said linguist K. David Harrison, a National Geographic fellow teaching at Swarthmore College. “There’s no room for complacency whatsoever.”
The Maori people of New Zealand are one of many groups that have struggled against the violent effects of colonization on their languages. In 1840, the Maori came under the rule of the British Crown as more and more European settlers arrived and more land was needed to accommodate them. Land conflicts eventually broke out into all-out war, ending with huge tracts of Maori land being confiscated by the government. Displacement, poverty and racism became commonplace. Their struggle now reflects that of other Indigenous peoples and nations across the globe fighting to preserve their knowledge, culture and traditional way of life. [Continue reading…]
New Scientist: In an early Star Trek episode, the Enterprise is boarded by human-like aliens, with lives lived so fast that the crew can’t see them. For their part, the aliens see Captain Kirk and his crew as near-static beings whose every action seems to take an age to complete.
Now think about how we view plants. With their slow-lane responsiveness, they could be ticking the boxes for behavioural brightness but they seem too slow, and too different, to register as intelligent.
This is the core of Brilliant Green by Stefano Mancuso and Alessandra Viola and Plant Sensing and Communication by Richard Karban. Plants are smart, they say, but to notice we have to overcome our ingrained cultural biases. As Karban writes: “Ask a child about the differences between plants and animals… They’ll say, ‘Plants can’t move’ or ‘Plants don’t do anything’.”
And, as both books point out, it is but a short intellectual step to allying apparent immobility with a form of mechanistic half-life of simple growth and response – a flatlined existence devoid of subtlety, strategy and learning.
Islam doesn’t consider plants alive at all, Mancuso and Viola remind us. It has a rich tradition of plant and flower illustration, alongside a ban on the physical depiction of living things. And until recently, Western medicine used “vegetative state” to describe people considered to have lost the ability to think or be aware.
Clearly, we will never play chess with a rose, nor ask the orchid on our windowsill for advice. But that is the point: humans are guilty of serious parochialism, of defining intelligence in terms of a nervous system and muscle-based speed that enables things to be done fast, say all three authors.
Plants and animals face similar challenges: to find resources and mates, and avoid predators, pathogens and abiotic stresses. In response, says Karban, “plants communicate, signaling within [themselves], eavesdropping on neighboring individuals, and exchanging information with other organisms”. They have adaptive responses that, if they happened at speeds humans understand, would reveal them to be “brilliant at solving problems related to their existence”. [Continue reading…]
In a review of Notes On The Death Of Culture, Anne Haverty writes: We may not be living in the worst of times, although a case might very well be made for it, but anyone with a thought in their head would be entitled to say that we’re living in the stupidest. Mario Vargas Llosa, the Nobel Prize-winning novelist, certainly believes we are. In this series of coruscating and passionate essays on the state of culture he argues that we have, en masse, capitulated to idiocy. And it is leading us to melancholy and despair.
This is a book of mourning. What Vargas Llosa writes is a lament for how things used to be and how they are now in all aspects of life from the political to the spiritual. Like TS Eliot in his essay Notes Towards the Definition of Culture, written in 1948, he takes the concept of culture in the general sense as a shared sensibility, a way of life.
Eliot too saw culture decaying around him and foresaw a time in which there would be no culture. This time, Vargas Llosa argues, is ours. Eliot has since been under attack for what his critics often describe as his elitist attitudes – as well as much else – and Vargas Llosa will probably also be tarred with the same brush for his pains.
But we must be grateful to him for describing in a relatively orderly manner the chaos of hypocrisy and emptiness into which our globalised culture has plunged and to which we seem to have little option but to subscribe.
It’s not easy, however, to be orderly on such an all-encompassing and sensitive subject as the way we live now. On some aspects, such as the art business, Vargas Llosa practically foams at the mouth. The art world is “rotten to the core”, a world in which artists cynically contrive “cheap stunts”. Stars like Damien Hirst are purveyors of “con-tricks”, and their “boring, farcical and bleak” productions are aided by “half-witted critics”.
We have abandoned the former minority culture, which was truth-seeking, profound, quiet and subtle, in favour of mainstream or mass entertainment, which has to be accessible – and how brave if foolhardy of anyone these days to cast aspersions on accessibility – as well as sensation-loving and frivolous.
Value-free, this kind of culture is essentially valueless. [Continue reading…]
Emily Singer writes: Genes, like people, have families — lineages that stretch back through time, all the way to a founding member. That ancestor multiplied and spread, morphing a bit with each new iteration.
For most of the last 40 years, scientists thought that this was the primary way new genes were born — they simply arose from copies of existing genes. The old version went on doing its job, and the new copy became free to evolve novel functions.
Certain genes, however, seem to defy that origin story. They have no known relatives, and they bear no resemblance to any other gene. They’re the molecular equivalent of a mysterious beast discovered in the depths of a remote rainforest, a biological enigma seemingly unrelated to anything else on earth.
The mystery of where these orphan genes came from has puzzled scientists for decades. But in the past few years, a once-heretical explanation has quickly gained momentum — that many of these orphans arose out of so-called junk DNA, or non-coding DNA, the mysterious stretches of DNA between genes. “Genetic function somehow springs into existence,” said David Begun, a biologist at the University of California, Davis. [Continue reading…]
Frank Pasquale writes: In a recent podcast series called Instaserfs, a former Uber driver named Mansour gave a chilling description of the new, computer-mediated workplace. First, the company tried to persuade him to take a predatory loan to buy a new car. Apparently a number cruncher deemed him at high risk of defaulting. Second, Uber would never respond in person to him – it just sent text messages and emails. This style of supervision was a series of take-it-or-leave-it ultimatums – a digital boss coded in advance.
Then the company suddenly took a larger cut of revenues from him and other drivers. And finally, what seemed most outrageous to Mansour: his job could be terminated without notice if a few passengers gave him one-star reviews, since that could drag his average below 4.7. According to him, Uber has no real appeals process or other due process in place for a rating system that can instantly put a driver out of work – it simply crunches the numbers.
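A rough back-of-the-envelope check (with invented numbers — the piece gives neither Mansour’s ride count nor his starting average) shows how little it takes to cross that 4.7 threshold:

```python
# Hypothetical illustration only: how many consecutive 1-star ratings
# would pull a driver's running average below the reported 4.7 cutoff.
# The ride count and starting average are assumptions, not Uber's data.

def one_stars_to_deactivation(n_rides: int, avg: float, cutoff: float = 4.7) -> int:
    """Smallest number of consecutive 1-star ratings that drags a
    running average of `avg` over `n_rides` rides below `cutoff`."""
    k = 0
    total = avg * n_rides
    # Each 1-star review adds 1 to the total and 1 to the count.
    while (total + k) / (n_rides + k) >= cutoff:
        k += 1
    return k

print(one_stars_to_deactivation(100, 4.8))  # prints 3
```

Under these assumptions, a driver with a 4.8 average over 100 rides is just three one-star ratings away from deactivation — “a few passengers” indeed.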
Mansour’s story compresses long-standing trends in credit and employment – and it’s by no means unique. Online retailers live in fear of a ‘Google Death Penalty’ – a sudden, mysterious drop in search-engine rankings if they do something judged fraudulent by Google’s spam detection algorithms. Job applicants at Walmart in the US and other large companies take mysterious ‘personality tests’, which process their responses in undisclosed ways. And white-collar workers face CV-sorting software that may understate, or entirely ignore, their qualifications. One algorithmic CV analyser found all 29,000 people who applied for a ‘reasonably standard engineering position’ unqualified.
The infancy of the internet is over. As online spaces mature, Facebook, Google, Apple, Amazon, and other powerful corporations are setting the rules that govern competition among journalists, writers, coders, and e-commerce firms. Uber and Postmates and other platforms are adding a code layer to occupations like driving and service work. Cyberspace is no longer an escape from the ‘real world’. It is now a force governing it via algorithms: recipe-like sets of instructions to solve problems. From Google search to OkCupid matchmaking, software orders and weights hundreds of variables into clean, simple interfaces, taking us from query to solution. Complex mathematics governs such answers, but it is hidden from view, thanks either to secrecy imposed by law, or to complexity outsiders cannot unravel. [Continue reading…]
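The “ordering and weighting” the passage describes can be sketched as a toy linear scoring function. The features and weights below are invented for illustration; this reproduces no actual platform’s algorithm, only the general shape of one:

```python
# Toy ranking sketch: many input variables collapsed into one score
# behind a simple interface. Real systems use far more variables,
# learned weights, and keep both secret.

WEIGHTS = {"relevance": 0.6, "freshness": 0.25, "popularity": 0.15}

def score(item: dict) -> float:
    """Weighted sum of an item's feature values."""
    return sum(WEIGHTS[f] * item[f] for f in WEIGHTS)

items = [
    {"name": "A", "relevance": 0.9, "freshness": 0.2, "popularity": 0.5},
    {"name": "B", "relevance": 0.7, "freshness": 0.9, "popularity": 0.8},
]
ranked = sorted(items, key=score, reverse=True)
print([i["name"] for i in ranked])  # prints ['B', 'A']
```

The interface shows only the final ordering; the weights — the part that decides winners and losers — stay out of sight, which is the asymmetry the essay is pointing at.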
Ezekiel Kweku writes: I was driving north up the coast of California, back to my home in the Bay Area. It was 12 days after Sandra Bland was pulled over and arrested by a police officer in Waller County after failing to signal a lane change. Nine days after she was found dead in her jail cell, a plastic bag wrapped around her neck. It was five days after a police officer pulled over Samuel DuBose for having his front license plate in the glove compartment. Five days after he was shot point blank in the head, safety belt fastened, his hands up. As I drove, I idly brainstormed a new protocol to follow if I were stopped by the police.
If stopped by the police, I thought to myself, I would set my phone to record audio and put it on the passenger seat. I would send a tweet that I was being stopped and had every intention of complying with the police officer. I would turn on Periscope and livestream the stop, crowdsourcing witnesses. I would text my family and tell them that I was not feeling angry or suicidal, that I was looking forward to seeing them soon. There would not be time to do all of these things, but maybe if I prepared in advance I could pull off one or two of them. What all of these plans had in common were that none of them were meant to secure my safety, but rather to ensure that my death looked suspicious enough to question.
I was figuring out how to enter evidence into the inquiry of my own death. [Continue reading…]
Dan Kahan writes: It’s well established that there is no meaningful correlation between what a person says he or she “believes” about evolution and having the rudimentary understanding of natural selection, random mutation, and genetic variance necessary to pass a high school biology exam (Bishop & Anderson 1990; Shtulman 2006).
There is a correlation between “belief” in evolution and possession of the kinds of substantive knowledge and reasoning skills essential to science comprehension generally.
But what the correlation is depends on religiosity: a relatively nonreligious person is more likely to say he or she “believes in” evolution, but a relatively religious person less likely to do so, as their science comprehension capacity goes up (Kahan 2015).
That’s what “belief in” evolution of the sort measured in a survey item signifies: who one is, not what one knows.
Americans don’t disagree about evolution because they have different understandings of or commitments to science. They disagree because they subscribe to competing cultural worldviews that invest positions on evolution with identity-expressive significance. [Continue reading…]
Claire Cameron writes: English speakers and others are highly egocentric when it comes to orienting themselves in the world. Objects and people exist to the left, right, in front, and to the back of you. You move forward and backward in relation to the direction you are facing. For an aboriginal tribe in north Queensland, Australia, called the Guugu Yimithirr, such a “me me me” approach to spatial information makes no sense. Instead, they use cardinal directions to express spatial information (pdf). So rather than “Can you move to my left?” they would say “Can you move to the west?”
Linguist Guy Deutscher says that Guugu Yimithirr speakers have a kind of “internal compass” that is imprinted from an extremely young age. In the same way that English-speaking infants learn to use different tenses when they speak, so do Guugu Yimithirr children learn to orient themselves along compass lines, not relative to themselves. In fact, says Deutscher, if a Guugu Yimithirr speaker wants to direct your attention to the direction behind him, he “points through himself, as if he were thin air and his own existence were irrelevant.” Whether that translates into less egocentric worldviews is a matter for further study and debate.
Other studies have shown that speakers of languages that use cardinal directions to express locations have fantastic spatial memory and navigation skills — perhaps because their experience of an event is so well-defined by the directions it took place in. [Continue reading…]
Nature reports: Researchers want to understand how the cephalopods, a class of free-floating molluscs, produced a creature that is clever enough to navigate highly complex mazes and open jars filled with tasty crabs.
Surprisingly, the octopus genome turned out to be almost as large as a human’s and to contain a greater number of protein-coding genes — some 33,000, compared with fewer than 25,000 in Homo sapiens.
This excess results mostly from the expansion of a few specific gene families, [neurobiologist Clifton] Ragsdale says. One of the most remarkable gene groups is the protocadherins, which regulate the development of neurons and the short-range interactions between them. The octopus has 168 of these genes — more than twice as many as mammals. This resonates with the creature’s unusually large brain and the organ’s even-stranger anatomy. Of the octopus’s half a billion neurons — six times the number in a mouse — two-thirds spill out from its head through its arms, without the involvement of long-range fibres such as those in vertebrate spinal cords. The independent computing power of the arms, which can execute cognitive tasks even when dismembered, has made octopuses an object of study for neurobiologists such as Hochner and for roboticists who are collaborating on the development of soft, flexible robots.
A gene family that is involved in development, the zinc-finger transcription factors, is also highly expanded in octopuses. At around 1,800 genes, it is the second-largest gene family to be discovered in an animal, after the elephant’s 2,000 olfactory-receptor genes. [Continue reading…]
When Morgan Spurlock famously spent a month eating large portions of McDonald’s food for the purposes of his documentary Super Size Me, he gained weight, damaged his liver and claimed to have suffered addictive withdrawal symptoms. This was popularly attributed to the toxic mix of carbs and fat plus the added chemicals and preservatives in junk foods. But could there be another explanation?
We may have forgotten others who really don’t enjoy fast food: the poor creatures that live in the dark in our guts — the hundred trillion microbes that outnumber our human cells ten to one, digest our food, provide many vitamins and nutrients, and keep us healthy. Until recently we viewed them as harmful, but harmful species (like salmonella) are a tiny minority; most are essential for us.
Studies have shown that when lab mice are fed an intensive high-fat diet, their microbes change dramatically and for the worse. This can be partly prevented by using probiotics; but there are obvious differences between us and lab mice, and between their microbes and our natural ones.
Discover a society with no absolutes, populated by the ultimate empiricists — people happy without God
Daniel Everett summarizes the lesson for linguistics from his research of the Pirahã people and their language:
The lesson is that language is not something mysterious that lies outside the bounds of natural selection, or that simply popped into being through some mutated gene, but that language is a human invention to solve a human problem. Other creatures can’t use it for the same reason they can’t use a shovel: it was invented by humans, for humans, and its success is judged by humans.
Julie Beck writes: In Paul Murray’s novel Skippy Dies, there’s a point where the main character, Howard, has an existential crisis. “‘It’s just not how I expected my life would be,’” he says.
“‘What did you expect?’” a friend responds.
“Howard ponders this. ‘I suppose—this sounds stupid, but I suppose I thought there’d be more of a narrative arc.’”
But it’s not stupid at all. Though perhaps the facts of someone’s life, presented end to end, wouldn’t much resemble a narrative to the outside observer, the way people choose to tell the stories of their lives, to others and — crucially — to themselves, almost always does have a narrative arc. In telling the story of how you became who you are, and of who you’re on your way to becoming, the story itself becomes a part of who you are.
“Life stories do not simply reflect personality. They are personality, or more accurately, they are important parts of personality, along with other parts, like dispositional traits, goals, and values,” writes Dan McAdams, a professor of psychology at Northwestern University, along with Erika Manczak, in a chapter for the APA Handbook of Personality and Social Psychology. [Continue reading…]
Shannon Hall writes: It begins with the smallest anomaly. The first exoplanets were the slightest shifts in a star’s light. The Higgs boson was just a bump in the noise. And the Big Bang sprang from a few rapidly moving galaxies that should have been staying put. Great scientific discoveries are born from puny signals that prompt attention.
And now, another tantalizing result is gathering steam, stirring the curiosity of physicists worldwide. It’s a bump in the data gathered by the Large Hadron Collider (LHC), the world’s most powerful particle accelerator. If the bump matures into a clearer peak during the LHC’s second run, it could indicate the existence of a new, unexpected particle that’s 2,000 times heavier than the proton. Ultimately, it could provoke a major update to our understanding of physics.
Or it could simply be a statistical fluke, doomed to disappear over time. But the bump currently has a significance level of three sigma, meaning that this little guy just might be here to stay. The rule of thumb in physics is that a one-sigma result could easily be due to random fluctuations, like the fair coin that flipped tails twice. A three-sigma result counts as an observation, worth discussing and publishing. But for physicists to proclaim a discovery, a finding that rewrites textbooks, a result has to be at the five-sigma level. At that point, the chance of the signal arising randomly is only about one in 3.5 million.
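The sigma thresholds in the passage correspond to tail probabilities of a normal distribution. A minimal calculation, using the one-tailed convention that particle physicists typically quote:

```python
import math

def one_sided_p(sigma: float) -> float:
    """Probability that pure noise fluctuates at least `sigma`
    standard deviations above the mean (one-tailed Gaussian)."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

for s in (1, 3, 5):
    print(f"{s} sigma: p = {one_sided_p(s):.2e}")
```

This gives roughly 0.16 at one sigma (easily noise), about 0.0013 at three sigma (interesting), and about 2.9 × 10⁻⁷ — one in 3.5 million — at five sigma, the discovery threshold.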
There’s no knowing if the LHC researchers’ new finding is real until they gather more data. And even bigger would-be discoveries — those with five-sigma results and better — have led physicists astray before, raising hopes for new insights into the Universe before being disproved by other data. When pushing the very limits of what we can possibly measure, false positives are always a danger. Here are five examples where seemingly solid findings came undone. [Continue reading…]
Barbara J. King writes: The idea that our oceans teem with cultural animals — and have for millions of years — is the central conclusion of a new book by two whale scientists. And it’s a convincing one.
Whales and dolphins, as they forage for food and interact with each other in their social units, may learn specific ways of doing things from their mothers or their pod-mates.
Certain killer whales (orcas), for example, learn to hunt communally with such precision that they cause waves to wash seals — of only certain species, because other seals are rejected as prey — off their ice floes and into the sea. And the complex patterned songs of humpback whales evolve so quickly over time and space that only learning can explain it.
“The song being sung at any location can change dramatically into an entirely new form, with new units, new phrases, and new themes within less than a year,” write authors Hal Whitehead and Luke Rendell in their book The Cultural Lives of Whales and Dolphins. “A revolution, rather than an evolution.”
The two scientists, who have been studying sperm whales for a collective half century, offer this working definition of culture: Behavior that is shared by some identifiable group such as a family, community or population, and that is acquired by learning from others.
In order for culture to be ruled in as the primary explanation for some behavior, then, genetics and features of the habitat in which the marine mammals live should be ruled out. [Continue reading…]