Boston public schools map switch aims to amend 500 years of distortion

The Guardian reports: When Boston public schools introduced a new standard map of the world this week, some young students felt their jaws drop. In an instant, their view of the world had changed.

The USA was small. Europe too had suddenly shrunk. Africa and South America appeared narrower but also much larger than usual. And what had happened to Alaska?

In an age of “fake news” and “alternative facts”, city authorities are confident their new map offers something closer to the geographical truth than traditional school maps do, and hope it can serve as an example to schools across the nation and even the world.

For almost 500 years, the Mercator projection has been the norm for maps of the world, ubiquitous in atlases, pinned on peeling school walls.

Gerardus Mercator, a renowned Flemish cartographer, devised his map in 1569, principally to aid navigation along colonial trade routes by drawing straight lines across the oceans. An exaggeration of the whole northern hemisphere, his depiction made North America and Europe bigger than South America and Africa. He also placed western Europe in the middle of his map.

Mercator’s distortions affect continents as well as nations. For example, South America is made to look about the same size as Europe, when in fact it is almost twice as large, and Greenland looks roughly the size of Africa when Africa is actually about 14 times larger. Alaska looks bigger than Mexico and Germany is in the middle of the picture, not to the north – because Mercator moved the equator.
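
The sizes go wrong in a mathematically precise way: Mercator stretches both east-west and north-south distances by a factor of sec(latitude), so areas are drawn sec²(latitude) times too large relative to the equator. A minimal Python sketch of the projection and its area inflation (our illustration, not from the article):

```python
import math

def mercator_y(lat_deg):
    """Mercator northing on a unit sphere: y = ln(tan(pi/4 + phi/2))."""
    phi = math.radians(lat_deg)
    return math.log(math.tan(math.pi / 4 + phi / 2))

def area_inflation(lat_deg):
    """Mercator stretches both axes by sec(phi), so areas are drawn
    sec^2(phi) times too large relative to the equator."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

for place, lat in [("equator", 0.0), ("central Greenland", 72.0)]:
    print(f"{place:>17}: y = {mercator_y(lat):5.2f}, "
          f"areas drawn {area_inflation(lat):4.1f}x too large")
```

At Greenland’s latitude (roughly 72°N) the map inflates areas about tenfold, while equatorial Africa is drawn nearly true to scale – which is how a landmass about 14 times smaller can appear to rival Africa on a classroom wall.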

Three days ago, Boston’s public schools began phasing in the lesser-known Peters projection, which cuts the US, Britain and the rest of Europe down to size. Teachers put contrasting maps of the world side by side and let the students study them. [Continue reading…]

‘I didn’t know you could have emotions out loud’

Laura Collins-Hughes writes: Stephan Wolfert was drunk when he hopped off an Amtrak train somewhere in Montana, toting a rucksack of clothes and a cooler stocked with ice, peanut butter, bread and Miller High Life — bottles, not cans. It was 1991, he was 24, and he had recently seen his best friend fatally wounded in a military training exercise.

His mind in need of a salve, he went to a play: “Richard III,” the story of a king who was also a soldier. In Shakespeare’s words, he heard an echo of his own experience, and though he had been raised to believe that being a tough guy was the only way to be a man, something cracked open inside him.

“I was sobbing,” Mr. Wolfert, now 50 and an actor, said recently over coffee in Chelsea. “I didn’t know you could have emotions out loud.”

That road-to-Damascus moment — not coming to Jesus, but coming to Shakespeare — is part of the story that Mr. Wolfert tells in his solo show, “Cry Havoc!,” which starts performances Wednesday, March 15, at the New Ohio Theater. Taking its title from Mark Antony’s speech over the slain Caesar in “Julius Caesar,” it intercuts Mr. Wolfert’s own memories with text borrowed from Shakespeare. Decoupling those lines from their plays, Mr. Wolfert uses them to explore strength and duty, bravery and trauma, examining what it is to be in the military and what it is to carry that experience back into civilian life. [Continue reading…]

When Earth became a ‘mote of dust’

Shannon Stirone writes: We glimpsed Earth’s curvature in 1946, via a repurposed German V-2 rocket that flew 65 miles above the surface. Year by year, we climbed a little higher, engineering a means to comprehend the magnitude of our home.

In 1968, Apollo 8 lunar module pilot William Anders captured the iconic Earthrise photo. We contemplated the beauty of our home.

But on Valentine’s Day 27 years ago, Voyager 1, from 4 billion miles away, took one final picture before switching off its camera forever. In the image, Earth, Carl Sagan said, was merely “a mote of dust suspended in a sunbeam.” So we pondered the insignificance of our home. The image inspired Sagan to write his book “Pale Blue Dot,” and it continues to cripple human grandiosity. [Continue reading…]

Was the first song a lullaby?

Tom Jacobs writes: Why do humans play, and listen to, music? The question has long baffled evolutionary theorists. Some suggest it had its origins in courtship rituals, while others contend it had (and has) a unique ability to bond people together to work toward a common goal.

Now, a couple of Harvard University researchers have proposed a new concept: They argue that the earliest music — and perhaps the prototype for everything from Bach to rap — may just have been the songs mothers sing to their infants.

Maybe the first musical genre wasn’t the love song, but rather the lullaby.

“The evolution of music must be a complex, multi-step process, with different features developing for different reasons,” says Samuel Mehr, who co-authored the paper with psychologist Max Krasnow. “Our theory raises the possibility that infant-directed song is the starting point for all that.”

Mothers vocalize to their babies “across many, if not all, cultures,” the researchers note in the journal Evolution and Human Behavior. Its ubiquity suggests this activity plays a positive role in the parent-child relationship, presumably soothing infants by reassuring them that someone is there and paying attention to them. [Continue reading…]

Newfound 3.77-billion-year-old fossils could be earliest evidence of life on Earth

The Washington Post reports: Tiny, tubular structures uncovered in ancient Canadian rocks could be remnants of some of the earliest life on Earth, scientists say.

The straw-shaped “microfossils,” narrower than the width of a human hair and invisible to the naked eye, are believed to come from ancient microbes, according to a new study in the journal Nature. Scientists debate the age of the specimens, but the authors’ youngest estimate — 3.77 billion years — would make these fossils the oldest ever found.

Claims of ancient fossils are always contentious. Rocks as old as the ones in the new study rarely survive the weathering, erosion, subduction and deformation of our geologically active Earth. Any signs of life in the rocks that do survive are difficult to distinguish, let alone prove. Other researchers in the field expressed skepticism about whether the structures were really fossils, and whether the rocks that contain them are as old as the study authors say.

But the scientists behind the new finding believe their analysis should hold up to scrutiny. In addition to structures that look like fossil microbes, the rocks contain a cocktail of chemical compounds they say is almost certainly the result of biological processes. [Continue reading…]

People can’t think straight

Elizabeth Kolbert writes: In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.

Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.

As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine — they’d been obtained from the Los Angeles County coroner’s office — the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.

In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well — significantly better than the average student — even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student — a conclusion that was equally unfounded.

“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.” [Continue reading…]

Bees learn to play golf and show off how clever they really are

New Scientist reports: It’s a hole in one! Bumblebees have learned to push a ball into a hole to get a reward, stretching what was thought possible for small-brained creatures.

Plenty of previous studies have shown that bees are no bumbling fools, but these have generally involved activities that are somewhat similar to their natural foraging behaviour.

For example, bees were able to learn to pull a string to reach an artificial flower containing sugar solution. Bees sometimes have to pull parts of flowers to access nectar, so this isn’t too alien to them.

So while these tasks might seem complex, they don’t really show a deeper level of learning, says Olli Loukola at Queen Mary University of London, an author of that study.

Loukola and his team decided the next challenge was whether bees could learn to move an object that was not attached to the reward.

They built a circular platform with a small hole in the centre filled with sugar solution, into which bees had to move a ball to get a reward. A researcher showed them how to do this by using a plastic bee on a stick to push the ball.

The researchers then took three groups of other bees and trained them in different ways. One group observed a previously trained bee solving the task; another was shown the ball moving into the hole, pulled by a hidden magnet; and a third group was given no demonstration, but was shown the ball already in the hole containing the reward.

The bees then did the task themselves. Those that had watched other bees do it were most successful and took less time than those in the other groups to solve the task. Bees given the magnetic demonstration were also more successful than those not given one. [Continue reading…]

Low-status chimps revealed as trendsetters

Science News reports: Chimps with little social status influence their comrades’ behavior to a surprising extent, a new study suggests.

In groups of captive chimps, a method for snagging food from a box spread among many individuals who saw a low-ranking female peer demonstrate the technique, say primatologist Stuart Watson of the University of St. Andrews in Fife, Scotland, and colleagues. But in other groups where an alpha male introduced the same box-opening technique, relatively few chimps copied the behavior, the researchers report online February 7 in the American Journal of Primatology.

“I suspect that even wild chimpanzees are motivated to copy obviously rewarding behaviors of low-ranking individuals, but the limited spread of rewarding behaviors demonstrated by alpha males was quite surprising,” Watson says. Previous research has found that chimps in captivity more often copy rewarding behaviors of dominant versus lower-ranking group mates. The researchers don’t understand why in this case the high-ranking individuals weren’t copied as much. [Continue reading…]

That the world is not solid but made up of tiny particles is a very ancient insight

Carlo Rovelli writes: According to tradition, in the year 450 BCE, a man embarked on a 400-mile sea voyage from Miletus in Anatolia to Abdera in Thrace, fleeing a prosperous Greek city that was suddenly caught up in political turmoil. It was to be a crucial journey for the history of knowledge. The traveller’s name was Leucippus; little is known about his life, but his intellectual spirit proved indelible. He wrote the book The Great Cosmology, in which he advanced new ideas about the transient and permanent aspects of the world. On his arrival in Abdera, Leucippus founded a scientific and philosophical school, to which he soon affiliated a young disciple, Democritus, who cast a long shadow over the thought of all subsequent times.

Together, these two thinkers built the majestic cathedral of ancient atomism. Leucippus was the teacher. Democritus, the great pupil who wrote dozens of works on every field of knowledge, was deeply venerated in antiquity, which was familiar with these works. ‘The most subtle of the Ancients,’ Seneca called him. ‘Who is there whom we can compare with him for the greatness, not merely of his genius, but also of his spirit?’ asked Cicero.

What Leucippus and Democritus had understood was that the world can be comprehended using reason. They had become convinced that the variety of natural phenomena must be attributable to something simple, and had tried to understand what this something might be. They had conceived of a kind of elementary substance from which everything was made. Anaximenes of Miletus had imagined this substance could compress and rarefy, thus transforming from one to another of the elements from which the world is constituted. It was a first germ of physics, rough and elementary, but in the right direction. An idea was needed, a great idea, a grand vision, to grasp the hidden order of the world. Leucippus and Democritus came up with this idea.

The idea of Democritus’s system is extremely simple: the entire universe is made up of a boundless space in which innumerable atoms run. Space is without limits; it has neither an above nor a below; it is without a centre or a boundary. Atoms have no qualities at all, apart from their shape. They have no weight, no colour, no taste. ‘Sweetness is opinion, bitterness is opinion; heat, cold and colour are opinion: in reality only atoms, and vacuum,’ said Democritus. Atoms are indivisible; they are the elementary grains of reality, which cannot be further subdivided, and everything is made of them. They move freely in space, colliding with one another; they hook on to and push and pull one another. Similar atoms attract one another and join.

This is the weave of the world. This is reality. Everything else is nothing but a by-product – random and accidental – of this movement, and this combining of atoms. The infinite variety of the substances of which the world is made derives solely from this combining of atoms. [Continue reading…]

The neurology of reaching a destination

Moheb Costandi writes: How do humans and other animals find their way from A to B? This apparently simple question has no easy answer. But after decades of extensive research, a picture of how the brain encodes space and enables us to navigate through it is beginning to emerge. Earlier, neuroscientists had found that the mammalian brain contains at least three different cell types, which cooperate to encode neural representations of an animal’s location and movements.

But that picture has just grown far more complex. New research now points to the existence of two more types of brain cells involved in spatial navigation — and suggests previously unrecognized neural mechanisms underlying the way mammals make their way about the world.

Earlier work, performed in freely moving rodents, revealed that neurons called place cells fire when an animal is in a specific location. Another type — grid cells — activates periodically as an animal moves around. Finally, head direction cells fire when a mouse or rat moves in a particular direction. Together, these cells, which are located in and around a deep brain structure called the hippocampus, appear to encode an animal’s current location within its environment by tracking the distance and direction of its movements.
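
That last idea — computing location by accumulating the distance and direction of one’s own movements — is what computational neuroscientists call path integration, or dead reckoning. A minimal Python sketch of the computation (our illustration, not code from the research):

```python
import math

def path_integrate(steps):
    """Dead reckoning: accumulate (distance, heading) movements into an
    estimate of current position — the computation that place, grid and
    head-direction cells together are thought to support."""
    x = y = 0.0
    for distance, heading_deg in steps:  # heading from east, counterclockwise
        h = math.radians(heading_deg)
        x += distance * math.cos(h)
        y += distance * math.sin(h)
    return x, y

# Walk 3 m north, then 4 m east: the estimate lands 5 m from the start.
print(path_integrate([(3, 90), (4, 0)]))  # -> roughly (4.0, 3.0)
```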

This process is fine for simply moving around, but it does not explain exactly how a traveler gets to a specific destination. The question of how the brain encodes the endpoint of a journey has remained unanswered. To investigate this, Ayelet Sarel of the Weizmann Institute of Science in Israel and her colleagues trained three Egyptian fruit bats to fly in complicated paths and then land at a specific location where they could eat and rest. The researchers recorded the activity of a total of 309 hippocampal neurons with a wireless electrode array. About a third of these neurons exhibited the characteristics of place cells, each of them firing only when the bat was in a specific area of the large flight room. But the researchers also identified 58 cells that fired only when the bats were flying directly toward the landing site. [Continue reading…]

How diversity makes us smarter

Katherine W. Phillips writes: The first thing to acknowledge about diversity is that it can be difficult. In the U.S., where the dialogue of inclusion is relatively advanced, even the mention of the word “diversity” can lead to anxiety and conflict. Supreme Court justices disagree on the virtues of diversity and the means for achieving it. Corporations spend billions of dollars to attract and manage diversity both internally and externally, yet they still face discrimination lawsuits, and the leadership ranks of the business world remain predominantly white and male.

It is reasonable to ask what good diversity does us. Diversity of expertise confers benefits that are obvious — you would not think of building a new car without engineers, designers and quality-control experts — but what about social diversity? What good comes from diversity of race, ethnicity, gender and sexual orientation? Research has shown that social diversity in a group can cause discomfort, rougher interactions, a lack of trust, greater perceived interpersonal conflict, lower communication, less cohesion, more concern about disrespect, and other problems. So what is the upside?

The fact is that if you want to build teams or organizations capable of innovating, you need diversity. Diversity enhances creativity. It encourages the search for novel information and perspectives, leading to better decision making and problem solving. Diversity can improve the bottom line of companies and lead to unfettered discoveries and breakthrough innovations. Even simply being exposed to diversity can change the way you think. This is not just wishful thinking: it is the conclusion I draw from decades of research from organizational scientists, psychologists, sociologists, economists and demographers. [Continue reading…]

Giant atoms could help unveil ‘dark matter’ and other cosmic secrets

By Diego A. Quiñones, University of Leeds

The universe is an astonishingly secretive place. Mysterious substances known as dark matter and dark energy account for some 95% of it. Despite huge efforts to find out what they are, we simply don’t know.

We know dark matter exists because of the gravitational pull of galaxy clusters – the matter we can see in a cluster just isn’t enough to hold it together by gravity. So there must be some extra material there, made up of unknown particles that simply aren’t visible to us. Several candidate particles have already been proposed.

Scientists are trying to work out what these unknown particles are by looking at how they affect the ordinary matter we see around us. But so far this has proven difficult, so we know dark matter interacts only weakly with normal matter at best. Now my colleague Benjamin Varcoe and I have come up with a new way to probe dark matter that may just prove successful: by using atoms that have been stretched to be 4,000 times larger than usual.
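
The excerpt doesn’t say how an atom gets stretched, but “giant” atoms of this kind are typically Rydberg atoms: the outermost electron is excited to a high principal quantum number n, and the atom’s radius grows roughly as n² times the Bohr radius. A back-of-envelope sketch under that hydrogen-like assumption (ours, not the authors’):

```python
import math

BOHR_RADIUS_M = 5.29e-11  # radius of ground-state hydrogen, about 0.05 nm

def rydberg_radius(n):
    """Hydrogen-like scaling: orbital radius ~ a0 * n^2."""
    return BOHR_RADIUS_M * n ** 2

target = 4000                     # "4,000 times larger than usual"
n = math.ceil(math.sqrt(target))  # n^2 >= 4000 gives n = 64
print(f"n = {n}: radius ~ {rydberg_radius(n) * 1e6:.2f} micrometres, "
      f"{n ** 2}x the ground state")
```

So an excitation to n in the low sixties already yields an atom roughly a fifth of a micrometre across — enormous by atomic standards, and correspondingly sensitive to faint external disturbances.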

[Read more…]

Spontaneity is at the heart of science

Henry Cowles writes: There is a theory in psychology called the theory theory. It’s a theory about theories. While this might sound obvious, the theory theory leads to counterintuitive conclusions. A quarter-century ago, psychologists began to point out important links between the development of scientific theories and how everyday thinking, including children’s thinking, works. According to theory theorists, a child learns by constructing a theory of the world and testing it against experience. In this sense, children are little scientists – they hypothesise on the basis of observations, test their hypotheses experimentally, and then revise their views in light of the evidence they gather.

According to Alison Gopnik, a theory theorist at the University of California, Berkeley, the analogy works both ways. It’s not just that ‘children are little scientists’, she wrote in her paper ‘The Scientist as Child’ (1996), ‘but that scientists are big children.’ Depending on where you look, you can see the scientific method in a child, or spot the inner child in a scientist. Either way, the theory theory makes it easy to see connections between elementary learning and scientific theorising.

This should be pretty surprising. After all, scientists go through a lot of training in order to think the way they do. Their results are exact; their methods exacting. Most of us share the sense that scientific thinking is difficult, even for scientists. This perceived difficulty has bolstered (at least until recently) the collective respect for scientific expertise on which the support of cutting-edge research depends. It’s also what gives the theory theory its powerful punch. If science is so hard, how can children – and, some theory theorists argue, even infants – think like scientists in any meaningful sense? Indeed, in the age of what Erik M. Conway and Naomi Oreskes call “the merchants of doubt” (not to say in the age of Trump), isn’t it dangerous to suggest that science is a matter of child’s play?

To gain purchase on this question, let’s take a step back. Claims that children are scientists rest on a certain idea about what science is. For theory theorists – and for many of the rest of us – science is about producing theories. How we do that is often represented as a short list of steps, such as ‘observe’, ‘hypothesise’, and ‘test’, steps that have been emblazoned on posters and recited in debates for the past century. But where did this idea that science is a set of steps – a method – come from? As it turns out, we don’t need to go back to Isaac Newton or the Scientific Revolution to find the history of ‘the scientific method’ in this sense. The image of science that most of us hold, even most scientists, comes from a surprising place: modern child psychology. The scientific method as we know it today comes from psychological studies of children only a century ago. [Continue reading…]

Roads have sliced the world into 600,000 pieces

Nathaniel Scharping writes: Ever since our ancestors cut rough paths through the wilderness, humanity has been laying down trails. From footpaths to highways, a global network of roads binds communities and facilitates the exchange of goods and ideas. But there is a flip side to this creeping tangle of pathways: The roads that bring us closer also serve to divide ecosystems into smaller parcels, turning vast expanses into a jigsaw of human mobility.

In a study published in Science, an international team of researchers attempted to quantify the extent to which roads have sliced up the globe. They used data from OpenStreetMap, a crowd-sourced mapping project, to chart how much land is covered by roads. For the purposes of their project, they defined a roadway as everything within a kilometer of the physical road itself (studies have shown measurable impacts on the environment extending out at least that far).
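
That bookkeeping can be mimicked in miniature: buffer each road by a kilometre, subtract the buffered zone from the land, then count and measure the road-free parcels left behind. A toy sketch using the shapely geometry library (our illustration, not the study’s actual pipeline):

```python
# Toy version of the buffering idea; coordinates are in metres.
from shapely.geometry import LineString, box
from shapely.ops import unary_union

land = box(0, 0, 10_000, 10_000)  # a 10 km x 10 km square of land
roads = [
    LineString([(0, 3_000), (10_000, 3_000)]),  # an east-west road
    LineString([(6_000, 0), (6_000, 10_000)]),  # a north-south road
]

road_zone = unary_union([r.buffer(1_000) for r in roads])  # 1 km buffer
road_free = land.difference(road_zone)

print(f"land within 1 km of a road: "
      f"{road_zone.intersection(land).area / land.area:.0%}")
# difference() returns one polygon or a multi-part geometry of parcels
parcels = list(getattr(road_free, "geoms", [road_free]))
print(f"road-free parcels: {len(parcels)}")  # the two roads leave 4
```

Scaled up to OpenStreetMap’s global road network, the same buffer-and-difference operations produce the global statistics described below.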

They estimated that roughly 20 percent of land is occupied by roads, not including Greenland and Antarctica. Although that leaves 80 percent as open space, this land is far from whole. Transected by highways and streets, the road-free areas are cut up into some 600,000 individual parcels. Half of these are less than a square mile, while only 7 percent span more than 60 square miles. The true impact of roads seems to be the gradual tessellation of once-cohesive landscapes. [Continue reading…]

How a guy from a Montana trailer park overturned 150 years of biology

Ed Yong writes: In 1995, if you had told Toby Spribille that he’d eventually overthrow a scientific idea that’s been the stuff of textbooks for 150 years, he would have laughed at you. Back then, his life seemed constrained to a very different path. He was raised in a Montana trailer park, and home-schooled by what he now describes as a “fundamentalist cult.” At a young age, he fell in love with science, but had no way of feeding that love. He longed to break away from his roots and get a proper education.

At 19, he got a job at a local forestry service. Within a few years, he had earned enough to leave home. His meager savings and non-existent grades meant that no American university would take him, so Spribille looked to Europe.

Thanks to his family background, he could speak German, and he had heard that many universities there charged no tuition fees. His missing qualifications were still a problem, but one that the University of Göttingen decided to overlook. “They said that under exceptional circumstances, they could enroll a few people every year without transcripts,” says Spribille. “That was the bottleneck of my life.”

Throughout his undergraduate and postgraduate work, Spribille became an expert on the organisms that had grabbed his attention during his time in the Montana forests — lichens.

You’ve seen lichens before, but unlike Spribille, you may have ignored them. They grow on logs, cling to bark, smother stones. At first glance, they look messy and undeserving of attention. On closer inspection, they are astonishingly beautiful. They can look like flecks of peeling paint, or coralline branches, or dustings of powder, or lettuce-like fronds, or wriggling worms, or cups that a pixie might drink from. They’re also extremely tough. They grow in the most inhospitable parts of the planet, where no plant or animal can survive.

Lichens have an important place in biology. In the 1860s, scientists thought that they were plants. But in 1868, a Swiss botanist named Simon Schwendener revealed that they’re composite organisms, consisting of fungi that live in partnership with microscopic algae. This “dual hypothesis” was met with indignation: it went against the impetus to put living things in clear and discrete buckets. The backlash only collapsed when Schwendener and others, with good microscopes and careful hands, managed to tease the two partners apart.

Schwendener wrongly thought that the fungus had “enslaved” the alga, but others showed that the two cooperate. The alga uses sunlight to make nutrients for the fungus, while the fungus provides minerals, water, and shelter. This kind of mutually beneficial relationship was unheard of, and required a new word. Two Germans, Albert Frank and Anton de Bary, provided the perfect one — symbiosis, from the Greek for ‘together’ and ‘living’. [Continue reading…]

Humans have been altering Earth for millennia, but only now are we wise to what we’re doing

David Grinspoon writes: As a planetary astrobiologist, I am focused on the major transitions in planetary evolution and the evolving relationship between planets and life. The scientific community is converging on the idea that we have entered a new epoch of Earth history, one in which the net activity of humans has become an agent of global change as powerful as the great forces of nature that shape continents and propel the evolution of species. This concept has garnered a lot of attention, and justly so. Thinking about the new epoch – often called the Anthropocene, or the age of humanity – challenges us to look at ourselves in the mirror of deep time, measured not in centuries or even in millennia, but over millions and billions of years. And yet much of the recent discussion and debate over the Anthropocene still does not come to terms with its full meaning and importance.

Various markers have been proposed for the starting date of the Anthropocene, such as the rise in CO2, isotopes from nuclear tests, the ‘Columbian exchange’ of species between hemispheres when Europeans colonised the Americas, or more ancient human modifications of the landscape or climate. The question in play here is: when did our world gain a quality that is uniquely human? Many species have had a major influence on the globe, but they don’t each get their own planetary transition in the geologic timescale. When did humans begin changing things in a way that no other species has ever changed Earth before? Making massive changes in landscapes is not unique to us. Beavers do plenty of that, for example, when they build dams, alter streams, cut down forests and create new meadows. Even changing global climate and initiating mass extinction is not a human first. Photosynthetic bacteria did that some 2.5 billion years ago.

What distinguishes humans from other world-changing organisms must be related to our great cleverness and adaptability; the power that comes from communicating, planning and working in social groups; transmitting knowledge from one generation to the next; and applying these skills toward altering our surroundings and expanding our habitable domains. However, people have been engaged in these activities for tens of thousands of years, and have produced many different environmental modifications proposed as markers of the Anthropocene’s beginning. Therefore, those definitions strike me as incomplete. Until now, the people causing the disturbances had no way of recognising or even conceiving of a global change. Yes, humans have been altering our planet for millennia, but there is something going on now that was not happening when we started doing all that world-changing. [Continue reading…]
