David Krakauer writes: On Dec. 2, 1942, just over three years into World War II, President Roosevelt was sent the following enigmatic cable: “The Italian navigator has landed in the new world.” The accomplishments of Christopher Columbus had long since ceased to be newsworthy. The progress of the Italian physicist, Enrico Fermi, navigator across the territories of Lilliputian matter — the abode of the microcosm of the atom — was another thing entirely. Fermi’s New World, discovered beneath a Midwestern football field in Chicago, was the province of newly synthesized radioactive elements. And Fermi’s landing marked the first sustained and controlled nuclear chain reaction, a prerequisite for the construction of an atomic bomb.
This physical chain reaction was one link in a series of scientific and cultural chain reactions initiated by the Hungarian physicist Leó Szilárd. The first came in 1933, when Szilárd proposed the idea of a neutron chain reaction. Another came in 1939, when Szilárd and Einstein sent the now famous “Szilárd-Einstein” letter to Franklin D. Roosevelt informing him of the destructive potential of atomic chain reactions: “This new phenomenon would also lead to the construction of bombs, and it is conceivable — though much less certain — that extremely powerful bombs of a new type may thus be constructed.”
This scientific information in turn generated political and policy chain reactions: Roosevelt created the Advisory Committee on Uranium which led in yearly increments to the National Defense Research Committee, the Office of Scientific Research and Development, and finally, the Manhattan Project.
Life itself is a chain reaction. Consider a cell that divides into two cells and then four and then eight great-granddaughter cells. Infectious diseases are chain reactions. Consider a contagious virus that infects one host that infects two or more susceptible hosts, in turn infecting further hosts. News is a chain reaction. Consider a report spread from one individual to another, who in turn spreads the message to their friends and then on to the friends of friends.
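The doubling common to these examples is easy to make concrete. A minimal sketch in Python (the generation count and offspring number are illustrative assumptions, not figures from the text):

```python
# Each generation, every individual produces a fixed number of "offspring":
# dividing cells, newly infected hosts, or freshly informed friends.
def chain_reaction(n_generations, offspring=2):
    """Return population counts per generation, starting from one individual."""
    counts = [1]
    for _ in range(n_generations):
        counts.append(counts[-1] * offspring)
    return counts

print(chain_reaction(3))  # [1, 2, 4, 8]: one cell, then daughters, granddaughters, great-granddaughters
```

With fewer than one offspring per individual on average, the numbers shrink toward zero instead of exploding, which is the premature termination described below: stop the virus, or stall the idea.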
These numerous connections that fasten together events are like expertly arranged dominoes of matter, life, and culture. As the modernist designer Charles Eames would have it, “Eventually everything connects — people, ideas, objects. The quality of the connections is the key to quality per se.”
Dominoes, atoms, life, infection, and news — all yield domino effects that require a sensitive combination of distances between pieces, physics of contact, and timing. When any one of these ingredients is off-kilter, the propagating cascade is likely to come to a halt. Premature termination is exactly what we might want to happen to a deadly infection, but it is the last thing we want to happen to an idea. [Continue reading…]
Is the Earth now spinning through the “Age of Humans”? More than a few scientists think so. They’ve suggested, in fact, that we modify the name of the current geological epoch (the Holocene, which began roughly 12,000 years ago) to the “Anthropocene.” It’s a term first put into wide circulation by Nobel Prize-winning atmospheric chemist Paul Crutzen in an article published in Nature in 2002. And it’s stirring up a good deal of debate, not only among geologists.
The idea is that we needed a new planetary marker to account for the scale of human changes to the Earth: extensive land transformation, mass extinctions, control of the nitrogen cycle, large-scale water diversion, and especially change of the atmosphere through the emission of greenhouse gases. Although naming geological epochs isn’t usually a controversial act, the Anthropocene proposal is radical because it means that what had been an environmental fixture against which people acted, the geological record, is now just another expression of the human presence.
It seems to be a particularly bitter pill to swallow for nature preservationists, heirs to the American tradition led by writers, scientists and activists such as John Muir, Aldo Leopold, David Brower, Rachel Carson and Edward Abbey. That’s because some have argued the traditional focus on the goal of wilderness protection rests on a view of “pristine” nature that is simply no longer viable on a planet hurtling toward nine billion human inhabitants.
Given this situation, we felt the time was ripe to explore the impact of the Anthropocene on the idea and practice of nature preservation. Our plan was to create a salon, a kind of literary summit. But we wanted to cut to the chase: What does it mean to “save American nature” in the age of humans?
We invited a distinguished group of environmental writers – scientists, philosophers, historians, journalists, agency administrators and activists – to give it their best shot. The essays appear in the new collection, After Preservation: Saving American Nature in the Age of Humans.
Zeynep Tufekci writes: The machine hums along, quietly scanning the slides, generating Pap smear diagnostics, just the way a college-educated, well-compensated lab technician might.
A robot with emotion-detection software interviews visitors to the United States at the border. In field tests, this eerily named “embodied avatar kiosk” does much better than humans in catching those with invalid documentation. Emotional-processing software has gotten so good that ad companies are looking into “mood-targeted” advertising, and the government of Dubai wants to use it to scan all its closed-circuit TV feeds.
Yes, the machines are getting smarter, and they’re coming for more and more jobs.
Not just low-wage jobs, either.
Today, machines can process regular spoken language and not only recognize human faces, but also read their expressions. They can classify personality types, and have started being able to carry out conversations with appropriate emotional tenor.
Machines are getting better than humans at figuring out who to hire, who’s in a mood to pay a little more for that sweater, and who needs a coupon to nudge them toward a sale. In applications around the world, software is being used to predict whether people are lying, how they feel and whom they’ll vote for.
To crack these cognitive and emotional puzzles, computers needed not only sophisticated, efficient algorithms, but also vast amounts of human-generated data, which can now be easily harvested from our digitized world. The results are dazzling. Most of what we think of as expertise, knowledge and intuition is being deconstructed and recreated as an algorithmic competency, fueled by big data.
But computers do not just replace humans in the workplace. They shift the balance of power even more in favor of employers. Our normal response to technological innovation that threatens jobs is to encourage workers to acquire more skills, or to trust that the nuances of the human mind or human attention will always be superior in crucial ways. But when machines of this capacity enter the equation, employers have even more leverage, and our standard response is not sufficient for the looming crisis. [Continue reading…]
John Upton writes: When a San Francisco panel began mulling rules about building public projects near changing shorelines, its self-described science translator, David Behar, figured he would just turn to the U.N.’s most recent climate assessment for guidance on future sea levels.
The U.N. assessment, it turned out, wasn’t enough. Nor could Behar, leader of the city utility department’s climate program, get what he needed from a 2012 National Research Council report dealing with West Coast sea level rise projections. A National Climate Assessment paper dealing with sea level rise didn’t seem to have what he needed, either. Even after reviewing two California government reports dealing with sea level rise, Behar says he had to telephone climate scientists and review a journal paper summarizing the views of 90 experts before he felt confident that he understood science’s latest projections for hazards posed by the onslaught of rising seas.
“You sometimes have to interview the authors of these reports to actually understand what they’re saying,” Behar said. “On the surface,” the assessments and reports that Behar turned to “all look like they’re saying different things,” he said. “But when you dive deeper — with the help of the authors, in most cases — they don’t disagree with one another very much.”
Governments around the world, from Madison, Wis., and New York City to the Obama Administration and the European Union, have begun striving in recent years to adapt to the growing threats posed by climate change. But the burst of adaptation planning threatens to be hobbled by cultural and linguistic divides between those who practice science and those who prepare policy. [Continue reading…]
Thomas Lin writes: As a boy in Shanghai, China, Yitang Zhang believed he would someday solve a great problem in mathematics. In 1964, at around the age of nine, he found a proof of the Pythagorean theorem, which describes the relationship between the lengths of the sides of any right triangle. He was 10 when he first learned about two famous number theory problems, Fermat’s last theorem and the Goldbach conjecture. While he was not yet aware of the centuries-old twin primes conjecture, he was already taken with prime numbers, often described as indivisible “atoms” that make up all other natural numbers.
But soon after, the anti-intellectual Cultural Revolution shuttered schools and sent him and his mother to the countryside to work in the fields. Because of his father’s troubles with the Communist Party, Zhang was also unable to attend high school. For 10 years, he worked as a laborer, reading books on math, history and other subjects when he could.
Not long after the revolution ended, Zhang, then 23, enrolled at Peking University and became one of China’s top math students. After completing his master’s at the age of 29, he was recruited by T. T. Moh to pursue a doctorate at Purdue University in Lafayette, Ind. But, promising though he was, after defending his dissertation in 1991 he could not find academic work as a mathematician.
In George Csicsery’s new documentary film Counting From Infinity, Zhang discusses his difficulties at Purdue and in the years that followed. He says his doctoral adviser never wrote recommendation letters for him. (Moh has written that Zhang did not ask for any.) Zhang admits that his shy, quiet demeanor didn’t help in building relationships or making himself known to the wider math community. During this initial job-hunting period, Zhang sometimes lived in his car, according to his friend Jacob Chi, music director of the Pueblo Symphony in Colorado. In 1992, Zhang began working at another friend’s Subway sandwich restaurant. For about seven years he worked odd jobs for various friends.
In 1999, at 44, Zhang caught a break. [Continue reading…]
Ray Jayawardhana writes: Joni Mitchell beat Carl Sagan to the punch. She sang “we are stardust, billion-year-old carbon” in her 1970 song “Woodstock.” That was three years before Mr. Sagan wrote about humans’ being made of “star-stuff” in his book “The Cosmic Connection” — a point he would later convey to a far larger audience in his 1980 television series, “Cosmos.”
By now, “stardust” and “star-stuff” have nearly turned cliché. But that does not make the reality behind those words any less profound or magical: The iron in our blood, the calcium in our bones and the oxygen we breathe are the physical remains — ashes, if you will — of stars that lived and died long ago.
That discovery is relatively recent. Four astrophysicists developed the idea in a landmark paper published in 1957. They argued that almost all the elements in the periodic table were cooked up over time through nuclear reactions inside stars — rather than in the first instants of the Big Bang, as previously thought. The stuff of life, in other words, arose in places and times somewhat more accessible to our telescopic investigations.
Since most of us spend our lives confined to a narrow strip near Earth’s surface, we tend to think of the cosmos as a lofty, empyrean realm far beyond our reach and relevance. We forget that only a thin sliver of atmosphere separates us from the rest of the universe. [Continue reading…]
Climate Central: The world must move quickly to make electric vehicles more climate-friendly, or it may not be able to meet its climate goals.
That’s the conclusion of a University of Toronto paper published in the March edition of Nature Climate Change, which argues that countries need to reduce the carbon intensity of their electric power supply in order to make electric transportation systems and other infrastructure an effective strategy for combating climate change.
Think about it this way: Every Nissan Leaf might run on electric power, but how that electricity was generated determines what greenhouse gas emissions the car is responsible for. If the car is charged on solar or geothermal power, the carbon footprint may be minuscule. If it’s charged on electricity generated using coal, it might prove as bad as or worse for the climate than burning gasoline. (Climate Central created a road map for climate-friendly cars in 2013 showing where driving electric vehicles is most climate friendly in the U.S.)
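That comparison can be put in rough numbers. A back-of-the-envelope sketch (the consumption and carbon-intensity figures below are illustrative assumptions, not values from the paper):

```python
# Assumed figures for illustration only.
EV_KWH_PER_KM = 0.18            # energy an EV draws from the grid per km
GASOLINE_G_CO2_PER_KM = 180.0   # emissions of a typical gasoline car per km

def ev_g_co2_per_km(grid_g_co2_per_kwh):
    """Effective grams of CO2 per km for an EV charged on a given grid."""
    return EV_KWH_PER_KM * grid_g_co2_per_kwh

# Grid carbon intensities (gCO2 per kWh) for three hypothetical power mixes.
for source, intensity in [("solar", 50), ("natural gas", 450), ("coal", 1000)]:
    grams = ev_g_co2_per_km(intensity)
    verdict = "beats" if grams < GASOLINE_G_CO2_PER_KM else "loses to"
    print(f"{source:12s} {grams:6.1f} gCO2/km -> {verdict} gasoline")
```

On these assumed numbers, a coal-heavy grid pushes the EV to roughly the gasoline car’s footprint, which is exactly the paper’s point about the carbon intensity of the power supply.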
The University of Toronto paper establishes an emissions threshold to help governments and consumers better understand whether it helps the climate to push for electric cars and the electrification of other modes of transportation based on the carbon intensity of the electricity those vehicles use. [Continue reading…]
Alice Bell writes: An investigation by Greenpeace and the Climate Investigations Centre reported in the Guardian and New York Times this weekend showed that Willie Soon — an apparently ‘scientific’ voice for climate scepticism — had accepted more than $1.2 million from the fossil-fuel industry over the past 14 years.
As Suzanne Goldenberg’s report stresses, although those seeking to delay action to curb carbon emissions were keen to cite and fund Soon’s Harvard-Smithsonian credentials, he did not enjoy the same sort of recognition from the scientific community. He did not receive grants from Nasa or the National Science Foundation, for example — the sorts of institutions that funded his colleagues at the Center for Astrophysics. Moreover, it appears that Soon violated ethical guidelines of the journals that published his work by not disclosing such funding. It seems to be a story of someone working outside the usual codes of modern science.
But Soon is not a singular aberration in the story of science’s relationship with the fossil fuel industry. It goes deeper than that.
Science and engineering are suffused with oil, gas and, yes, even coal. We must look this squarely in the eye if we’re going to tackle climate change.
The fossil fuel industry is sometimes labelled anti-science, but that’s far from the truth. It loves science — or at least particular bits of science. Indeed, it needs science. The fossil fuel industry needs the science and engineering community to train staff, to gather information and to help develop new techniques. Science and engineering also provide the industry with cultural credibility and can open up powerful political spaces within which to lobby. [Continue reading…]
Philip Ball writes: In July 2011, participants at a conference on the placid shore of Lake Traunsee in Austria were polled on what they thought the meeting was about. You might imagine that this question would have been settled in advance, but since the broad theme was quantum theory, perhaps a degree of uncertainty was to be expected. The title of the conference was ‘Quantum Physics and the Nature of Reality’. The poll, completed by 33 of the participating physicists, mathematicians and philosophers, posed a range of unresolved questions about the relationship between those two things, one of which was: ‘What is your favourite interpretation of quantum mechanics?’
The word ‘favourite’ speaks volumes. Isn’t science supposed to be decided by experiment and observation, free from personal preferences? But experiments in quantum physics have been obstinately silent on what it means. All we can do is develop hunches, intuitions and, yes, cherished ideas. Of these, the survey offered no fewer than 11 to choose from (as well as ‘other’ and ‘none’).
The most popular (supported by 42 per cent of the very small sample) was basically the view put forward by Niels Bohr, Werner Heisenberg and their colleagues in the early days of quantum theory. Today it is known as the Copenhagen Interpretation. More on that below. You might not recognise most of the other alternatives, such as Quantum Bayesianism, Relational Quantum Mechanics, and Objective Collapse (which is not, as you might suppose, just saying ‘what the hell’). Maybe you haven’t heard of the Copenhagen Interpretation either. But in third place (18 per cent) was the Many Worlds Interpretation (MWI), and I suspect you do know something about that, since the MWI is the one with all the glamour and publicity. It tells us that we have multiple selves, living other lives in other universes, quite possibly doing all the things that we dream of but will never achieve (or never dare). Who could resist such an idea?
Yet resist we should. We should resist not just because MWI is unlikely to be true, or even because, since no one knows how to test it, the idea is perhaps not truly scientific at all. Those are valid criticisms, but the main reason we should hold out is that it is incoherent, both philosophically and logically. There could be no better contender for Wolfgang Pauli’s famous put-down: it is not even wrong. [Continue reading…]
New Scientist reports: The Old Ones were already ancient when the Earth was born. Five small planets orbit an 11.2 billion-year-old star, making them about 80 per cent as old as the universe itself. That means our galaxy started building rocky planets earlier than we thought.
“Now that we know that these planets can be twice as old as Earth, this opens the possibility for the existence of ancient life in the galaxy,” says Tiago Campante at the University of Birmingham in the UK.
NASA’s Kepler space telescope spotted the planets around an orange dwarf star called Kepler 444, which is 117 light years away and about 25 per cent smaller than the sun.
Orange dwarfs are considered good candidates for hosting alien life because they can stay stable for up to 30 billion years, the time it takes such a star to consume all its hydrogen, compared with the sun’s 10 billion years. For context, the universe is currently 13.8 billion years old.
Since, as far as we know, life begins by chance, older planets would have had more time to allow life to get going and evolve. But it was unclear whether planets around such an old star could be rocky – life would have a harder time on gassy planets without a solid surface. [Continue reading…]
The Economist: In 1752 Camillo Paderni, an artist who had been put in charge of the growing pile of antiquities being dug up at Herculaneum, a seaside town near Naples, wrote to a certain Dr Mead, who then wrote to the Royal Society in London reporting that “there were found many volumes of papyrus but turned to a sort of charcoal, and so brittle, that being touched, it fell to ashes. Yet by His Majesty’s orders he made many trials to open them, but all to no purpose; excepting some scraps containing some words.”
The excavation at Herculaneum — which, like nearby Pompeii, was buried in AD 79 under ash from Mount Vesuvius — had uncovered a literary time capsule. What came to be called the Villa of the Papyri contained a library of perhaps 2,000 books, the only such collection known to have been preserved from antiquity.
Actually reading these scrolls has, however, proved both tricky and destructive — until now. A paper just published in Nature Communications by Vito Mocella of the Institute for Microelectronics and Microsystems, in Naples, describes a way to decipher them without unrolling them.
Bryan Appleyard writes: The greatest story of our time may also be the greatest mistake. This is the story of our universe from the Big Bang to now with its bizarre, Dickensian cast of characters – black holes, tiny vibrating strings, the warped space-time continuum, trillions of companion universes and particles that wink in and out of existence.
It is the story told by a long list of officially accredited geniuses from Isaac Newton to Stephen Hawking. It is also the story that is retold daily in popular science fiction from Star Trek to the latest Hollywood sci-fi blockbuster Interstellar. Thanks to the movies, the physicist standing in front of a vast blackboard covered in equations became our age’s symbol of genius. The universe is weird, the TV shows and films tell us, and almost anything can happen.
But it is a story that many now believe is pointless, wrong and riddled with wishful thinking and superstition.
“Stephen Hawking,” says philosopher Roberto Mangabeira Unger, “is not part of the solution, he is part of the problem.”
The equations on the blackboard may be the problem. Mathematics, the language of science, may have misled the scientists.
“The idea,” says physicist Lee Smolin, “that the truth about nature can be wrestled from pure thought through mathematics is overdone… The idea that mathematics is prophetic and that mathematical structure and beauty are a clue to how nature ultimately works is just wrong.”
And in an explosive essay published last week in the science journal Nature, astrophysicists George Ellis and Joe Silk say that the wild claims of theoretical physicists are threatening the authority of science itself.
“This battle for the heart and soul of physics,” they write, “is opening up at a time when scientific results — in topics from climate change to the theory of evolution — are being questioned by some politicians and religious fundamentalists. Potential damage to public confidence in science and to the nature of fundamental physics needs to be contained by deeper dialogue between scientists and philosophers…. The imprimatur of science should be awarded only to a theory that is testable. Only then can we defend science from attack.”
Unger and Smolin have also just gone into print with a monumental book – The Singular Universe and the Reality of Time – which systematically takes apart contemporary physics and exposes much of it as, in Unger’s words, “an inferno of allegorical fabrication.” The book says it is time to return to real science which is tested against nature rather than constructed out of mathematics. Physics should no longer be seen as the ultimate science, underwriting all others. The true queen of the sciences should be history – the biography of the cosmos. [Continue reading…]
Before any physicists stop by to question whether I really understand what a wormhole is, I will without hesitation make it clear: I have no idea. It just seems like a suitable metaphor — better, say, than rabbit hole.
NBC News: In celebration of its upcoming 25th anniversary in April, the Hubble Space Telescope has returned to the site of what may be its most famous image, the wispy columns of the Eagle Nebula, and produced a stunning new picture. “The Pillars of Creation,” located 6,500 light-years away in the Eagle Nebula (catalogued as M16), were photographed in visible and near-infrared light with the Hubble’s upgraded equipment, and the result is as astonishing now as the original was in 1995. Hubble went online in 1990.
Michael Hanlon writes: We live in a golden age of technological, medical, scientific and social progress. Look at our computers! Look at our phones! Twenty years ago, the internet was a creaky machine for geeks. Now we can’t imagine life without it. We are on the verge of medical breakthroughs that would have seemed like magic only half a century ago: cloned organs, stem-cell therapies to repair our very DNA. Even now, life expectancy in some rich countries is improving by five hours a day. A day! Surely immortality, or something very like it, is just around the corner.
The notion that our 21st-century world is one of accelerating advances is so dominant that it seems churlish to challenge it. Almost every week we read about ‘new hopes’ for cancer sufferers, developments in the lab that might lead to new cures, talk of a new era of space tourism and super-jets that can fly round the world in a few hours. Yet a moment’s thought tells us that this vision of unparalleled innovation can’t be right, that many of these breathless reports of progress are in fact mere hype, speculation – even fantasy.
Yet there once was an age when speculation matched reality. It spluttered to a halt more than 40 years ago. Most of what has happened since has been merely incremental improvements upon what came before. That true age of innovation – I’ll call it the Golden Quarter – ran from approximately 1945 to 1971. Just about everything that defines the modern world either came about, or had its seeds sown, during this time. The Pill. Electronics. Computers and the birth of the internet. Nuclear power. Television. Antibiotics. Space travel. Civil rights.
There is more. Feminism. Teenagers. The Green Revolution in agriculture. Decolonisation. Popular music. Mass aviation. The birth of the gay rights movement. Cheap, reliable and safe automobiles. High-speed trains. We put a man on the Moon, sent a probe to Mars, beat smallpox and discovered the double-spiral key of life. The Golden Quarter was a unique period of less than a single human generation, a time when innovation appeared to be running on a mix of dragster fuel and dilithium crystals.
Today, progress is defined almost entirely by consumer-driven, often banal improvements in information technology. [Continue reading…]
The New York Times reports: They have been called the Dead Sea Scrolls of physics. Since 1986, the Princeton University Press and the Hebrew University of Jerusalem, to whom Albert Einstein bequeathed his copyright, have been engaged in a mammoth effort to study some 80,000 documents he left behind.
Starting on Friday, when Digital Einstein is introduced, anyone with an Internet connection will be able to share in the letters, papers, postcards, notebooks and diaries that Einstein left scattered in Princeton and in other archives, attics and shoeboxes around the world when he died in 1955.
The Einstein Papers Project, currently edited by Diana Kormos-Buchwald, a professor of physics and the history of science at the California Institute of Technology, has already published 13 volumes in print out of a projected 30. [Continue reading…]
In the minds of many humans, empathy is the signature of humanity, and yet if this empathy extends further and includes non-humans, we may be suspected of indulging in anthropomorphism — a sentimental projection of our own feelings into places where similar feelings supposedly cannot exist.
But the concept of anthropomorphism is itself a strange idea since it seems to invalidate what should be one of the most basic assumptions we can reasonably make about living creatures: that without the capacity to suffer, nothing would survive.
Just as the deadening of sensation makes people more susceptible to injury, an inability to feel pain would impede any creature’s ability to avoid harm.
The seemingly suicidal draw of the moth to a flame is the exception rather than the rule. Moreover the insect is driven by a mistake, not a death wish. It is drawn towards the light, not the heat, oblivious that the two are one.
If humans indulge in projections about the feelings of others — human and non-human — perhaps we more commonly engage in negative projections: choosing to assume that feelings are absent where it would cause us discomfort to be attuned to their presence.
Our inclination is to avoid feeling too much and thus we construct neat enclosures for our concerns.
These enclosures shut out the feelings of strangers and then by extension seal away boundless life from which we have become even more estranged.
Heather Swan writes: It was a warm day in early spring when I had my first long conversation with the entomologist and science studies scholar Sainath Suryanarayanan. We met over a couple of hives I had recently inherited. One was thriving. Piles of dead bees filled the other. Parts of the comb were covered with mould and oozing something that looked like molasses.
Having recently attended a class for hobby beekeepers with Marla Spivak, an entomologist at the University of Minnesota, I was aware of the many different diseases to which bees are susceptible. American foulbrood, which was a mean one, concerned me most. Beekeepers recommended burning all of your equipment if you discovered it in your hives. Some of these bees were alive, but obviously in low spirits, and I didn’t want to destroy them unnecessarily. I called Sainath because I thought he could help me with the diagnosis.
Beekeeping, these days, is riddled with risks. New viruses, habitat loss, pesticides and mites all contribute to creating a deadly labyrinth through which nearly every bee must travel. Additionally, in 2004, mysterious bee disappearances began to plague thousands of beekeepers. Seemingly healthy bees started abandoning their homes. This strange disappearing act became known as colony collapse disorder (CCD).
Since then, the world has seen the decline of many other pollinating species, too. Because honeybees and other pollinators are responsible for pollinating at least one-third of all the food we eat, this is a serious problem globally. Diagnosing bee problems is not simple, but some answers are emerging. A ubiquitous class of pesticides called neonicotinoids has been implicated in pollinator decline, which has fuelled conversations among beekeepers, scientists, policy-makers and growers. A beekeeper facing a failing hive now has to consider not only the health of the hive itself, but also the health of the landscape around the hive. Dead bees lead beekeepers down a path of many questions. And some beekeepers have lost so many hives, they feel like giving up.
When we met at my troubled hives, Sainath brought his own hive tool and veil. He had already been down a path of many questions about bee deaths, one that started in his youth with a fascination for observing insects. When he was 14, he began his ‘Amateur Entomologist’s Record’, where he kept taxonomic notes on such things as wing textures, body shapes, colour patterns and behaviours. But the young scientist’s approach occasionally slipped to include his exuberance, describing one moment as ‘a stupefying experience!’ All this led him to study biology and chemistry in college, then to work on the behavioural ecology of paper wasps during his doctoral studies, and eventually to Minnesota to help Spivak investigate the role of pesticides in CCD.
Sainath had spent several years doing lab and field experiments with wasps and bees, but ultimately wanted to shift from traditional practices in entomology to research that included human/insect relationships. It was Sainath who made me wonder about the role of emotion in science – both in the scientists themselves and in the subjects of their experiments. I had always thought of emotion as something excised from science, but this was impossible for some scientists. What was the role of empathy in experimentation? How do we, with our human limitations, understand something as radically different from us as the honeybee? Did bees have feelings, too? If so, what did that mean for the scientist? For the science? [Continue reading…]
The New York Times reports: Two groups of scientists, working independently, have created artificial intelligence software capable of recognizing and describing the content of photographs and videos with far greater accuracy than ever before, sometimes even mimicking human levels of understanding.
Until now, so-called computer vision has largely been limited to recognizing individual objects. The new software, described on Monday by researchers at Google and at Stanford University, teaches itself to identify entire scenes: a group of young men playing Frisbee, for example, or a herd of elephants marching on a grassy plain.
The software then writes a caption in English describing the picture. Compared with human observations, the researchers found, the computer-written descriptions are surprisingly accurate.
The advances may make it possible to better catalog and search for the billions of images and hours of video available online, which are often poorly described and archived. At the moment, search engines like Google rely largely on written language accompanying an image or video to ascertain what it contains. [Continue reading…]