Colliding black holes tell new story of stars

Natalie Wolchover writes: At a talk last month in Santa Barbara, California, addressing some of the world’s leading astrophysicists, Selma de Mink cut to the chase. “How did they form?” she began.

“They,” as everybody knew, were the two massive black holes that, more than 1 billion years ago and in a remote corner of the cosmos, spiraled together and merged, making waves in the fabric of space and time. These “gravitational waves” rippled outward and, on Sept. 14, 2015, swept past Earth, strumming the ultrasensitive detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO). LIGO’s discovery, announced in February, triumphantly vindicated Albert Einstein’s 1916 prediction that gravitational waves exist. By tuning in to these tiny tremors in space-time and revealing for the first time the invisible activity of black holes — objects so dense that not even light can escape their gravitational pull — LIGO promised to open a new window on the universe, akin, some said, to when Galileo first pointed a telescope at the sky.

Already, the new gravitational-wave data has shaken up the field of astrophysics. In response, three dozen experts spent two weeks in August sorting through the implications at the Kavli Institute for Theoretical Physics (KITP) in Santa Barbara. [Continue reading…]

Evidence rebuts Chomsky’s theory of language learning

Paul Ibbotson and Michael Tomasello write: The idea that we have brains hardwired with a mental template for learning grammar — famously espoused by Noam Chomsky of the Massachusetts Institute of Technology — has dominated linguistics for almost half a century. Recently, though, cognitive scientists and linguists have abandoned Chomsky’s “universal grammar” theory in droves because of new research examining many different languages — and the way young children learn to understand and speak the tongues of their communities. That work fails to support Chomsky’s assertions.

The research suggests a radically different view, in which learning of a child’s first language does not rely on an innate grammar module. Instead, the new research shows that young children use various types of thinking that may not be specific to language at all — such as the ability to classify the world into categories (people or objects, for instance) and to understand the relations among things. These capabilities, coupled with a unique human ability to grasp what others intend to communicate, allow language to happen. The new findings indicate that if researchers truly want to understand how children, and others, learn languages, they need to look outside of Chomsky’s theory for guidance.

This conclusion is important because the study of language plays a central role in diverse disciplines — from poetry to artificial intelligence to linguistics itself; misguided methods lead to questionable results. Further, language is used by humans in ways no animal can match; if you understand what language is, you comprehend a little bit more about human nature. [Continue reading…]

Beware the bad big wolf: why you need to put your adjectives in the right order

By Simon Horobin, University of Oxford

Unlikely as it sounds, the topic of adjective use has gone “viral”. The furore centres on the claim, taken from Mark Forsyth’s book The Elements of Eloquence, that adjectives appearing before a noun must appear in the following strict sequence: opinion, size, age, shape, colour, origin, material, purpose, Noun. Even the slightest attempt to disrupt this sequence, according to Forsyth, will result in the speaker sounding like a maniac. To illustrate this point, Forsyth offers the following example: “a lovely little old rectangular green French silver whittling knife”.
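
Forsyth’s sequence amounts, in effect, to a sort order over adjective categories. As a toy illustration (and only that: the category tags below are hand-assigned for this example, and nothing here comes from Forsyth’s book beyond the ordering itself), a few lines of C can reassemble his shuffled example:

```c
#include <stdio.h>
#include <stdlib.h>

/* Forsyth's proposed prenominal order: lower-ranked categories come first. */
enum category { OPINION, SIZE, AGE, SHAPE, COLOUR, ORIGIN, MATERIAL, PURPOSE };

struct adjective {
    const char *word;
    enum category cat;   /* hand-tagged; a real system would need a lexicon */
};

static int by_category(const void *a, const void *b)
{
    const struct adjective *x = a, *y = b;
    return (int)x->cat - (int)y->cat;
}

int main(void)
{
    /* Forsyth's example, deliberately shuffled. */
    struct adjective adjs[] = {
        { "green",       COLOUR   },
        { "lovely",      OPINION  },
        { "whittling",   PURPOSE  },
        { "old",         AGE      },
        { "French",      ORIGIN   },
        { "little",      SIZE     },
        { "silver",      MATERIAL },
        { "rectangular", SHAPE    },
    };
    size_t n = sizeof adjs / sizeof adjs[0];

    qsort(adjs, n, sizeof adjs[0], by_category);

    for (size_t i = 0; i < n; i++)
        printf("%s ", adjs[i].word);
    printf("knife\n");  /* lovely little old rectangular green French silver whittling knife */
    return 0;
}
```

The mechanics are trivial, of course; the interesting linguistic question, which no sorting routine can answer, is why native speakers obey this ordering without ever being taught it.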

But is the “rule” worthy of an internet storm – or is it more of a ripple in a teacup? Well, the example is certainly a rather unlikely sentence – and not simply because whittling knives are not in much demand these days, or because of the question of whether a knife can be both green and silver. It is unlikely mainly because it is unusual to have such a long string of attributive adjectives (ones that appear before the noun they describe).

More usually, speakers of English break up the sequence by placing some of the adjectives in predicative position – after the noun. Not all adjectives, however, can be placed in either position. I can refer to “that man who is asleep” but it would sound odd to refer to him as “that asleep man”; we can talk about the “Eastern counties” but not the “counties that are Eastern”. Indeed, our distribution of adjectives both before and after the noun reveals another constraint on adjective use in English – a preference for no more than three before a noun. An “old brown dog” sounds fine, a “little old brown dog” sounds acceptable, but a “mischievous little old brown dog” sounds plain wrong.

[Continue reading…]

Torturing animals injures humanity

[Photo: tufted capuchin monkey]

John P. Gluck writes: Five years ago, the National Institutes of Health all but ended biomedical and behavioral research on chimpanzees, concluding that, as the closest human relative, they deserved “special consideration and respect.”

But chimpanzees were far from the only nonhuman primates used in research then, or now. About 70,000 other primates are still living their lives as research subjects in labs across the United States.

On Wednesday, the N.I.H. will hold a workshop on “continued responsible research” with these animals. This sounds like a positive development. But as someone who spent decades working almost daily with macaque monkeys in primate research laboratories, I know firsthand that “responsible” research is not enough. What we really need to examine is the very moral ground of animal research itself.

Like many researchers, I once believed that intermittent scientific gains justified methods that almost always did harm. As a graduate student in the late 1960s, I came to see that my natural recoil from intentionally harming animals was a hindrance to how I understood scientific progress. I told myself that we were being responsible by providing good nutrition, safe cages, skilled and caring caretakers and veterinarians for the animals — and, crucially, that what we stood to learn outweighed any momentary or prolonged anguish these animals might experience. The potential for a medical breakthrough, the excitement of research and discovering whether my hypotheses were correct — and let’s not leave out smoldering ambition — made my transition to a more “rigorous” stance easier than I could have imagined.

One of my areas of study focused on the effects of early social deprivation on the intellectual abilities of rhesus monkeys. We kept young, intelligent monkeys separated from their families and others of their kind for many months in soundproof cages that remained lit 24 hours a day, then measured how their potential for complex social and intellectual lives unraveled. All the while, I comforted myself with the idea that these monkeys were my research partners, and that by creating developmental disorders in monkeys born in a lab, we could better understand these disorders in humans.

But it was impossible to fully quell my repugnance at all that I continued to witness and to inflict. At the same time, in the classroom, I began to face questions from students who had become increasingly concerned about the predicament of lab animals. [Continue reading…]

Forget software — now hackers are exploiting physics

Andy Greenberg reports: Practically every word we use to describe a computer is a metaphor. “File,” “window,” even “memory” all stand in for collections of ones and zeros that are themselves representations of an impossibly complex maze of wires, transistors and the electrons moving through them. But when hackers go beyond those abstractions of computer systems and attack their actual underlying physics, the metaphors break.

Over the last year and a half, security researchers have been doing exactly that: honing hacking techniques that break through the metaphor to the actual machine, exploiting the unexpected behavior not of operating systems or applications, but of computing hardware itself — in some cases targeting the actual electricity that makes up bits of data in computer memory. And at the Usenix security conference earlier this month, two teams of researchers presented attacks they developed that bring that new kind of hack closer to becoming a practical threat.

Both of those new attacks use a technique Google researchers first demonstrated in March 2015, called “Rowhammer.” The trick works by running a program on the target computer that repeatedly accesses (“hammers”) a certain row of memory cells in its DRAM, until a rare glitch occurs: electric charge leaks from the hammered row into an adjacent row. The leaked charge then causes a certain bit in that adjacent row of the computer’s memory to flip from one to zero or vice versa. If that flipped bit happens to land in the right place, it can give an attacker access to a privileged level of the computer’s operating system.
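
At the level of code, the hammering itself is just a tight loop of memory reads and cache flushes. Below is a minimal sketch of that access pattern in C; it is an illustration of the technique, not a working exploit. It assumes an x86 processor (for the clflush cache-flush instruction exposed by SSE2), and it leaves out the genuinely hard part of a real attack: choosing two addresses that map to different rows of the same DRAM bank, which depends on the memory controller’s undocumented address mapping.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <emmintrin.h>   /* _mm_clflush (SSE2) */

/* Repeatedly activate two DRAM rows. Flushing both addresses from the
 * CPU cache after every read forces the next read to go all the way to
 * DRAM, so each loop iteration really does re-open the rows. After
 * enough activations, charge can leak into a physically adjacent
 * "victim" row and flip one of its bits. */
static void hammer(volatile uint8_t *row_a, volatile uint8_t *row_b,
                   long iterations)
{
    for (long i = 0; i < iterations; i++) {
        (void)*row_a;                      /* activate row A */
        (void)*row_b;                      /* activate row B */
        _mm_clflush((const void *)row_a);  /* evict so the next read misses cache */
        _mm_clflush((const void *)row_b);
    }
}

int main(void)
{
    /* Toy stand-in for the attacker's address selection: a real attack
     * must pick two addresses in different rows of the same DRAM bank. */
    uint8_t *buf = malloc(1 << 20);
    if (!buf)
        return 1;
    hammer(buf, buf + (1 << 19), 10 * 1000 * 1000);
    puts("done hammering (this sketch does not scan for bit flips)");
    free(buf);
    return 0;
}
```

Without the flushes, every read after the first would be served from the CPU cache and never touch DRAM at all, which is why the cache eviction is as essential to the attack as the reads themselves.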

It’s messy. And mind-bending. And it works. [Continue reading…]

Forget ideology, liberal democracy’s newest threats come from technology and bioscience

John Naughton writes: The BBC Reith Lectures in 1967 were given by Edmund Leach, a Cambridge social anthropologist. “Men have become like gods,” Leach began. “Isn’t it about time that we understood our divinity? Science offers us total mastery over our environment and over our destiny, yet instead of rejoicing we feel deeply afraid.”

That was nearly half a century ago, and yet Leach’s opening lines could easily apply to today. He was speaking before the internet had been built and long before the human genome had been decoded, and so his claim about men becoming “like gods” seems relatively modest compared with the capabilities that molecular biology and computing have subsequently bestowed upon us. Our science-based culture is the most powerful in history, and it is ceaselessly researching, exploring, developing and growing. But in recent times it seems to have also become plagued with existential angst as the implications of human ingenuity begin to be (dimly) glimpsed.

The title that Leach chose for his Reith Lecture – A Runaway World – captures our zeitgeist too. At any rate, we are also increasingly fretful about a world that seems to be running out of control, largely (but not solely) because of information technology and what the life sciences are making possible. But we seek consolation in the thought that “it was always thus”: people felt alarmed about steam in George Eliot’s time and got worked up about electricity, the telegraph and the telephone as they arrived on the scene. The reassuring implication is that we weathered those technological storms, and so we will weather this one too. Humankind will muddle through.

But in the last five years or so even that cautious, pragmatic optimism has begun to erode. There are several reasons for this loss of confidence. One is the sheer vertiginous pace of technological change. Another is that the new forces loose in our society – particularly information technology and the life sciences – are potentially more far-reaching in their implications than steam or electricity ever were. And, thirdly, we have begun to see startling advances in these fields that have forced us to recalibrate our expectations. [Continue reading…]

It is not what you believe, but what you do that matters

Steven Nadler writes: In July 1656, the 23-year-old Bento de Spinoza was excommunicated from the Portuguese-Jewish congregation of Amsterdam. It was the harshest punishment of herem (ban) ever issued by that community. The extant document, a lengthy and vitriolic diatribe, refers to the young man’s ‘abominable heresies’ and ‘monstrous deeds’. The leaders of the community, having consulted with the rabbis and using Spinoza’s Hebrew name, proclaim that they hereby ‘expel, excommunicate, curse, and damn Baruch de Spinoza’. He is to be ‘cast out from all the tribes of Israel’ and his name is to be ‘blotted out from under heaven’.

Over the centuries, there have been periodic calls for the herem against Spinoza to be lifted. Even David Ben-Gurion, when he was prime minister of Israel, issued a public plea for ‘amending the injustice’ done to Spinoza by the Amsterdam Portuguese community. It was not until early 2012, however, that the Amsterdam congregation, at the insistence of one of its members, formally took up the question of whether it was time to rehabilitate Spinoza and welcome him back into the congregation that had expelled him with such prejudice. There was, though, one thing that they needed to know: should we still regard Spinoza as a heretic?

Unfortunately, the herem document fails to mention specifically what Spinoza’s offences were – at the time he had not yet written anything – and so there is a mystery surrounding this seminal event in the future philosopher’s life. And yet, for anyone who is familiar with Spinoza’s mature philosophical ideas, which he began putting in writing a few years after the excommunication, there really is no such mystery. By the standards of early modern rabbinic Judaism – and especially among the Sephardic Jews of Amsterdam, many of whom were descendants of converso refugees from the Iberian Inquisitions and who were still struggling to build a proper Jewish community on the banks of the Amstel River – Spinoza was a heretic, and a dangerous one at that.

What is remarkable is how popular this heretic remains nearly three and a half centuries after his death, and not just among scholars. Spinoza’s contemporaries, René Descartes and Gottfried Leibniz, made enormously important and influential contributions to the rise of modern philosophy and science, but you won’t find many committed Cartesians or Leibnizians around today. The Spinozists, however, walk among us. They are non-academic devotees who form Spinoza societies and study groups, who gather to read him in public libraries and in synagogues and Jewish community centres. Hundreds of people, of various political and religious persuasions, will turn out for a day of lectures on Spinoza, whether or not they have ever read him. There have been novels, poems, sculptures, paintings, even plays and operas devoted to Spinoza. This is all a very good thing.

It is also a very curious thing. Why should a 17th-century Portuguese-Jewish philosopher whose dense and opaque writings are notoriously difficult to understand incite such passionate devotion, even obsession, among a lay audience in the 21st century? Part of the answer is the drama and mystery at the centre of his life: why exactly was Spinoza so harshly punished by the community that raised and nurtured him? Just as significant, I suspect, is that everyone loves an iconoclast – especially a radical and fearless one who suffered persecution in his lifetime for ideas and values that are still so important to us today. Spinoza is a model of intellectual courage. Like a prophet, he took on the powers-that-be with an unflinching honesty that revealed ugly truths about his fellow citizens and their society. [Continue reading…]
