Category Archives: Science/Technology
Why the singularity is greatly exaggerated
In 1968, Marvin Minsky said, “Within a generation we will have intelligent computers like HAL in the film, 2001.” What made him and other early AI proponents think machines would think like humans?
Even before Moore’s law there was the idea that computers are going to get faster and their clumsy behavior is going to get a thousand times better. It’s what Ray Kurzweil now claims. He says, “OK, we’re moving up this curve in terms of the number of neurons, number of processing units, so by this projection we’re going to be at super-human levels of intelligence.” But that’s deceptive. It’s a fallacy. Just adding more speed or neurons or processing units doesn’t mean you end up with a smarter or more capable system. What you need are new algorithms, new ways of understanding a problem. In the area of creativity, it’s not at all clear that a faster computer is going to get you there. You’re just going to come up with more bad, bland, boring things. That ability to distinguish, to filter out what’s interesting, that’s still elusive.
Today’s computers, though, can generate an awful lot of connections in split seconds.
But generating is fairly easy and testing pretty hard. In Robert Altman’s movie, The Player, they try to combine two movies to make a better one. You can imagine a computer that just takes all movie titles and tries every combination of pairs, like Reservoir Dogs meets Casablanca. I could write that program right now on my laptop and just let it run. It would instantly generate all possible combinations of movies and there will be some good ones. But recognizing them, that’s the hard part.
That’s the part you need humans for.
Right, the Tim Robbins movie exec character says, “I listen to stories and decide if they’ll make good movies or not.” The great majority of combinations won’t work, but every once in a while there’s one that is both new and interesting. In early AI it seemed like the testing was going to be easy. But we haven’t been able to figure out the filtering.
Can’t you write a creativity algorithm?
If you want to do variations on a theme, like Thomas Kinkade, sure. Take our movie machine. Let’s say there have been 10,000 movies — that’s 10,000 squared, or 100 million combinations of pairs of movies. We can build a classifier that would look at lots of pairs of successful movies and do some kind of inference on it so that it could learn what would be successful again. But it would be looking for patterns that already exist. It wouldn’t be able to find that new thing that was totally out of left field. That’s what I think of as creativity — somebody comes up with something really new and clever. [Continue reading…]
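The pairing program described in this interview really is only a few lines of code. Below is a minimal sketch in Python; the titles and the "looks_promising" rule are placeholders of mine, standing in for the filtering step that, as the interview says, we still cannot automate.

```python
# Toy version of the "movie pairing" generator described above.
# The titles and the scoring rule are illustrative placeholders;
# enumerating pairs is trivial, judging them is not.
from itertools import combinations

titles = ["Reservoir Dogs", "Casablanca", "The Player", "Out of Africa", "Pretty Woman"]

def generate_pitches(titles):
    """Yield every unordered pairing of two titles as a pitch."""
    for a, b in combinations(titles, 2):
        yield f"{a} meets {b}"

def looks_promising(pitch):
    """Stub for the hard part: deciding whether a pitch is any good.
    A real filter would need taste and judgment; this rule is arbitrary."""
    return "Casablanca" in pitch

for pitch in generate_pitches(titles):
    print(pitch, "->", "keep" if looks_promising(pitch) else "discard")
```

With 10,000 titles the same loop yields roughly 50 million unordered pairings (100 million if order matters) in a matter of minutes, and nothing in it knows which one is worth making, which is exactly the point.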
We are ignoring the new machine age at our peril
John Naughton writes: As a species, we don’t seem to be very good at dealing with nonlinearity. We cope moderately well with situations and environments that are changing gradually. But sudden, major discontinuities – what some people call “tipping points” – leave us spooked. That’s why we are so perversely relaxed about climate change, for example: things are changing slowly, imperceptibly almost, but so far there hasn’t been the kind of sharp, catastrophic change that would lead us seriously to recalibrate our behaviour and attitudes.
So it is with information technology. We know – indeed, it has become a cliche – that computing power has been doubling at least every two years since records of these things began. We know that the amount of data now generated by our digital existence is expanding annually at an astonishing rate. We know that our capacity to store digital information has been increasing exponentially. And so on. What we apparently have not sussed, however, is that these various strands of technological progress are not unconnected. Quite the contrary, and therein lies our problem.
The thinker who has done most to explain the consequences of connectedness is a Belfast man named W Brian Arthur, an economist who was the youngest person ever to occupy an endowed chair at Stanford University and who in later years has been associated with the Santa Fe Institute, one of the world’s leading interdisciplinary research institutes. In 2009, he published a remarkable book, The Nature of Technology, in which he formulated a coherent theory of what technology is, how it evolves and how it spurs innovation and industry. Technology, he argued, “builds itself organically from itself” in ways that resemble chemistry or even organic life. And implicit in Arthur’s conception of technology is the idea that innovation is not linear, but what mathematicians call “combinatorial”, ie one driven by a whole bunch of things. And the significant point about combinatorial innovation is that it brings about radical discontinuities that nobody could have anticipated. [Continue reading…]
The complexity of science
Leonard Mlodinow writes: The other week I was working in my garage office when my 14-year-old daughter, Olivia, came in to tell me about Charles Darwin. Did I know that he discovered the theory of evolution after studying finches on the Galápagos Islands? I was steeped in what felt like the 37th draft of my new book, which is on the development of scientific ideas, and she was proud to contribute this tidbit of history that she had just learned in class.
Sadly, like many stories of scientific discovery, that commonly recounted tale, repeated in her biology textbook, is not true.
The popular history of science is full of such falsehoods. In the case of evolution, Darwin was a much better geologist than ornithologist, at least in his early years. And while he did notice differences among the birds (and tortoises) on the different islands, he didn’t think them important enough to make a careful analysis. His ideas on evolution did not come from the mythical Galápagos epiphany, but evolved through many years of hard work, long after he had returned from the voyage. (To get an idea of the effort involved in developing his theory, consider this: One byproduct of his research was a 684-page monograph on barnacles.)
The myth of the finches obscures the qualities that were really responsible for Darwin’s success: the grit to formulate his theory and gather evidence for it; the creativity to seek signs of evolution in existing animals, rather than, as others did, in the fossil record; and the open-mindedness to drop his belief in creationism when the evidence against it piled up.
The mythical stories we tell about our heroes are always more romantic and often more palatable than the truth. But in science, at least, they are destructive, in that they promote false conceptions of the evolution of scientific thought. [Continue reading…]
Chain reactions spreading ideas through science and culture
David Krakauer writes: On Dec. 2, 1942, just over three years into World War II, President Roosevelt was sent the following enigmatic cable: “The Italian navigator has landed in the new world.” The accomplishments of Christopher Columbus had long since ceased to be newsworthy. The progress of the Italian physicist, Enrico Fermi, navigator across the territories of Lilliputian matter — the abode of the microcosm of the atom — was another thing entirely. Fermi’s New World, discovered beneath a Midwestern football field in Chicago, was the province of newly synthesized radioactive elements. And Fermi’s landing marked the earliest sustained and controlled nuclear chain reaction required for the construction of an atomic bomb.
This physical chain reaction was one link in a series of scientific and cultural chain reactions initiated by the Hungarian physicist, Leó Szilárd. The first was in 1933, when Szilárd proposed the idea of a neutron chain reaction. Another was in 1939, when Szilárd and Einstein sent the now famous “Szilárd-Einstein” letter to Franklin D. Roosevelt informing him of the destructive potential of atomic chain reactions: “This new phenomenon would also lead to the construction of bombs, and it is conceivable — though much less certain — that extremely powerful bombs of a new type may thus be constructed.”
This scientific information in turn generated political and policy chain reactions: Roosevelt created the Advisory Committee on Uranium which led in yearly increments to the National Defense Research Committee, the Office of Scientific Research and Development, and finally, the Manhattan Project.
Life itself is a chain reaction. Consider a cell that divides into two cells and then four and then eight great-granddaughter cells. Infectious diseases are chain reactions. Consider a contagious virus that infects one host that infects two or more susceptible hosts, in turn infecting further hosts. News is a chain reaction. Consider a report spread from one individual to another, who in turn spreads the message to their friends and then on to the friends of friends.
These numerous connections that fasten together events are like expertly arranged dominoes of matter, life, and culture. As the modernist designer Charles Eames would have it, “Eventually everything connects — people, ideas, objects. The quality of the connections is the key to quality per se.”
Dominoes, atoms, life, infection, and news — all yield domino effects that require a sensitive combination of distances between pieces, physics of contact, and timing. When any one of these ingredients is off-kilter, the propagating cascade is likely to come to a halt. Premature termination is exactly what we might want to happen to a deadly infection, but it is the last thing we want to happen to an idea. [Continue reading…]
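The shared structure in these examples, each active piece setting off some number of successors, is easy to caricature in code. The sketch below is a generic branching process with invented parameters, not a model of any particular system in the essay; it simply shows how a cascade fizzles or sustains itself depending on whether the average number of successors falls below or above one.

```python
import random

def run_cascade(neighbours, p_trigger, generations=20, seed=1):
    """Toy branching process: each active piece can set off `neighbours`
    others, each with probability `p_trigger`. Returns counts per generation."""
    rng = random.Random(seed)
    active, history = 1, [1]
    for _ in range(generations):
        active = sum(1 for _ in range(active * neighbours)
                     if rng.random() < p_trigger)
        history.append(active)
        if active == 0:
            break
    return history

# Mean number of successors = neighbours * p_trigger.
print(run_cascade(neighbours=2, p_trigger=0.4))   # mean 0.8: sub-critical, tends to fizzle
print(run_cascade(neighbours=2, p_trigger=0.75))  # mean 1.5: super-critical, tends to keep spreading
```

Dominoes spaced too far apart, a virus that transmits too rarely, a story nobody retells: in this toy, each amounts to pulling the mean below one.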
What does it mean to preserve nature in the Age of Humans?
By Ben A Minteer, Arizona State University and Stephen Pyne, Arizona State University
Is the Earth now spinning through the “Age of Humans”? More than a few scientists think so. They’ve suggested, in fact, that we modify the name of the current geological epoch (the Holocene, which began roughly 12,000 years ago) to the “Anthropocene.” It’s a term first put into wide circulation by Nobel Prize-winning atmospheric chemist Paul Crutzen in an article published in Nature in 2002. And it’s stirring up a good deal of debate, not only among geologists.
The idea is that we needed a new planetary marker to account for the scale of human changes to the Earth: extensive land transformation, mass extinctions, control of the nitrogen cycle, large-scale water diversion, and especially change of the atmosphere through the emission of greenhouse gases. Although naming geological epochs isn’t usually a controversial act, the Anthropocene proposal is radical because it means that what had been an environmental fixture against which people acted, the geological record, is now just another expression of the human presence.
It seems to be a particularly bitter pill to swallow for nature preservationists, heirs to the American tradition led by writers, scientists and activists such as John Muir, Aldo Leopold, David Brower, Rachel Carson and Edward Abbey. That’s because some have argued the traditional focus on the goal of wilderness protection rests on a view of “pristine” nature that is simply no longer viable on a planet hurtling toward nine billion human inhabitants.
Given this situation, we felt the time was ripe to explore the impact of the Anthropocene on the idea and practice of nature preservation. Our plan was to create a salon, a kind of literary summit. But we wanted to cut to the chase: What does it mean to “save American nature” in the age of humans?
We invited a distinguished group of environmental writers – scientists, philosophers, historians, journalists, agency administrators and activists – to give it their best shot. The essays appear in the new collection, After Preservation: Saving American Nature in the Age of Humans.
Why your employer would like to replace you with a machine
Zeynep Tufekci writes: The machine hums along, quietly scanning the slides, generating Pap smear diagnostics, just the way a college-educated, well-compensated lab technician might.
A robot with emotion-detection software interviews visitors to the United States at the border. In field tests, this eerily named “embodied avatar kiosk” does much better than humans in catching those with invalid documentation. Emotional-processing software has gotten so good that ad companies are looking into “mood-targeted” advertising, and the government of Dubai wants to use it to scan all its closed-circuit TV feeds.
Yes, the machines are getting smarter, and they’re coming for more and more jobs.
Not just low-wage jobs, either.
Today, machines can process regular spoken language and not only recognize human faces, but also read their expressions. They can classify personality types, and have started being able to carry out conversations with appropriate emotional tenor.
Machines are getting better than humans at figuring out who to hire, who’s in a mood to pay a little more for that sweater, and who needs a coupon to nudge them toward a sale. In applications around the world, software is being used to predict whether people are lying, how they feel and whom they’ll vote for.
To crack these cognitive and emotional puzzles, computers needed not only sophisticated, efficient algorithms, but also vast amounts of human-generated data, which can now be easily harvested from our digitized world. The results are dazzling. Most of what we think of as expertise, knowledge and intuition is being deconstructed and recreated as an algorithmic competency, fueled by big data.
But computers do not just replace humans in the workplace. They shift the balance of power even more in favor of employers. Our normal response to technological innovation that threatens jobs is to encourage workers to acquire more skills, or to trust that the nuances of the human mind or human attention will always be superior in crucial ways. But when machines of this capacity enter the equation, employers have even more leverage, and our standard response is not sufficient for the looming crisis. [Continue reading…]
Climate scientists need to produce more ‘actionable science’
John Upton writes: When a San Francisco panel began mulling rules about building public projects near changing shorelines, its self-described science translator, David Behar, figured he would just turn to the U.N.’s most recent climate assessment for guidance on future sea levels.
He couldn’t.
Nor could Behar, leader of the city utility department’s climate program, get what he needed from a 2012 National Research Council report dealing with West Coast sea level rise projections. A National Climate Assessment paper dealing with sea level rise didn’t seem to have what he needed, either. Even after reviewing two California government reports dealing with sea level rise, Behar says he had to telephone climate scientists and review a journal paper summarizing the views of 90 experts before he felt confident that he understood science’s latest projections for hazards posed by the onslaught of rising seas.
“You sometimes have to interview the authors of these reports to actually understand what they’re saying,” Behar said. “On the surface,” the assessments and reports that Behar turned to “all look like they’re saying different things,” he said. “But when you dive deeper — with the help of the authors, in most cases — they don’t disagree with one another very much.”
Governments around the world, from Madison, Wis., and New York City to the Obama Administration and the European Union, have begun striving in recent years to adapt to the growing threats posed by climate change. But the burst of adaptation planning threatens to be hobbled by cultural and linguistic divides between those who practice science and those who prepare policy. [Continue reading…]
How Yitang Zhang rose from obscurity and a disadvantaged youth to mathematical celebrity
Thomas Lin writes: As a boy in Shanghai, China, Yitang Zhang believed he would someday solve a great problem in mathematics. In 1964, at around the age of nine, he found a proof of the Pythagorean theorem, which describes the relationship between the lengths of the sides of any right triangle. He was 10 when he first learned about two famous number theory problems, Fermat’s last theorem and the Goldbach conjecture. While he was not yet aware of the centuries-old twin primes conjecture, he was already taken with prime numbers, often described as indivisible “atoms” that make up all other natural numbers.
But soon after, the anti-intellectual Cultural Revolution shuttered schools and sent him and his mother to the countryside to work in the fields. Because of his father’s troubles with the Communist Party, Zhang was also unable to attend high school. For 10 years, he worked as a laborer, reading books on math, history and other subjects when he could.
Not long after the revolution ended, Zhang, then 23, enrolled at Peking University and became one of China’s top math students. After completing his master’s at the age of 29, he was recruited by T. T. Moh to pursue a doctorate at Purdue University in West Lafayette, Ind. But, promising though he was, after defending his dissertation in 1991 he could not find academic work as a mathematician.
In George Csicsery’s new documentary film Counting From Infinity, Zhang discusses his difficulties at Purdue and in the years that followed. He says his doctoral adviser never wrote recommendation letters for him. (Moh has written that Zhang did not ask for any.) Zhang admits that his shy, quiet demeanor didn’t help in building relationships or making himself known to the wider math community. During this initial job-hunting period, Zhang sometimes lived in his car, according to his friend Jacob Chi, music director of the Pueblo Symphony in Colorado. In 1992, Zhang began working at another friend’s Subway sandwich restaurant. For about seven years he worked odd jobs for various friends.
In 1999, at 44, Zhang caught a break. [Continue reading…]
Stardust
Ray Jayawardhana writes: Joni Mitchell beat Carl Sagan to the punch. She sang “we are stardust, billion-year-old carbon” in her 1970 song “Woodstock.” That was three years before Mr. Sagan wrote about humans’ being made of “star-stuff” in his book “The Cosmic Connection” — a point he would later convey to a far larger audience in his 1980 television series, “Cosmos.”
By now, “stardust” and “star-stuff” have nearly turned cliché. But that does not make the reality behind those words any less profound or magical: The iron in our blood, the calcium in our bones and the oxygen we breathe are the physical remains — ashes, if you will — of stars that lived and died long ago.
That discovery is relatively recent. Four astrophysicists developed the idea in a landmark paper published in 1957. They argued that almost all the elements in the periodic table were cooked up over time through nuclear reactions inside stars — rather than in the first instants of the Big Bang, as previously thought. The stuff of life, in other words, arose in places and times somewhat more accessible to our telescopic investigations.
Since most of us spend our lives confined to a narrow strip near Earth’s surface, we tend to think of the cosmos as a lofty, empyrean realm far beyond our reach and relevance. We forget that only a thin sliver of atmosphere separates us from the rest of the universe. [Continue reading…]
Electric cars that run on coal
Climate Central: The world must move quickly to make electric vehicles more climate-friendly, or it may not be able to meet its climate goals.
That’s the conclusion of a University of Toronto paper published in the March edition of Nature Climate Change, which argues that countries need to reduce the carbon intensity of their electric power supply in order to make electric transportation systems and other infrastructure an effective strategy for combating climate change.
Think about it this way: Every Nissan Leaf might run on electric power, but how that electricity was generated determines what greenhouse gas emissions the car is responsible for. If the car is charged on solar or geothermal power, the carbon footprint may be minuscule. If it’s charged on electricity generated using coal, it might prove as bad for the climate as burning gasoline, or worse. (Climate Central created a road map for climate-friendly cars in 2013 showing where driving electric vehicles is most climate friendly in the U.S.)
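To make the comparison concrete, here is a back-of-the-envelope version of that reasoning in Python. Every figure in it (vehicle efficiency, grid carbon intensities, gasoline emissions) is a rough assumption of mine for illustration, not a number from the Toronto study.

```python
# Back-of-the-envelope illustration of why the grid mix decides an
# electric car's climate impact. All figures are rough assumptions for
# illustration only (and ignore upstream fuel production and vehicle
# manufacturing); none come from the University of Toronto paper.

EV_KWH_PER_MILE = 0.30                  # assumed electric-car consumption
GASOLINE_CAR_G_PER_MILE = 400           # assumed conventional gasoline car (~22 mpg)
HYBRID_G_PER_MILE = 180                 # assumed efficient hybrid (~50 mpg)

grid_intensity_g_per_kwh = {            # assumed grams CO2 per kWh delivered
    "coal-heavy grid": 1000,
    "natural-gas grid": 450,
    "solar/geothermal-heavy grid": 50,
}

for grid, intensity in grid_intensity_g_per_kwh.items():
    ev_g_per_mile = EV_KWH_PER_MILE * intensity
    print(f"{grid}: EV ~{ev_g_per_mile:.0f} g CO2/mile "
          f"(conventional car ~{GASOLINE_CAR_G_PER_MILE}, hybrid ~{HYBRID_G_PER_MILE})")
```

On these rough numbers, an electric car on a coal-heavy grid lands between an efficient hybrid and a conventional gasoline car, while the same car on a largely renewable grid emits an order of magnitude less, which is the threshold logic the paper formalizes.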
The University of Toronto paper establishes an emissions threshold to help governments and consumers better understand whether it helps the climate to push for electric cars and the electrification of other modes of transportation based on the carbon intensity of the electricity those vehicles use. [Continue reading…]
Science’s embarrassing fossil fuel problem
Alice Bell writes: An investigation by Greenpeace and the Climate Investigations Centre, reported in the Guardian and New York Times this weekend, showed that Willie Soon — an apparently ‘scientific’ voice for climate scepticism — had accepted more than $1.2 million from the fossil-fuel industry over the past 14 years.
As Suzanne Goldenberg’s report stresses, although those seeking to delay action to curb carbon emissions were keen to cite and fund Soon’s Harvard-Smithsonian credentials, he did not enjoy the same sort of recognition from the scientific community. He did not receive grants from Nasa or the National Science Foundation, for example — the sorts of institutions that funded his colleagues at the Center for Astrophysics. Moreover, it appears that Soon violated ethical guidelines of the journals that published his work by not disclosing such funding. It seems to be a story of someone working outside the usual codes of modern science.
But Soon is not a singular aberration in the story of science’s relationship with the fossil fuel industry. It goes deeper than that.
Science and engineering is suffused with oil, gas and, yes, even coal. We must look this squarely in the eye if we’re going to tackle climate change.
The fossil fuel industry is sometimes labelled anti-science, but that’s far from the truth. It loves science — or at least particular bits of science — indeed it needs science. The fossil fuel industry needs the science and engineering community to train staff, to gather information and help develop new techniques. Science and engineering also provides the industry with cultural credibility and can open up powerful political spaces within which to lobby. [Continue reading…]
Too many worlds
Philip Ball writes: In July 2011, participants at a conference on the placid shore of Lake Traunsee in Austria were polled on what they thought the meeting was about. You might imagine that this question would have been settled in advance, but since the broad theme was quantum theory, perhaps a degree of uncertainty was to be expected. The title of the conference was ‘Quantum Physics and the Nature of Reality’. The poll, completed by 33 of the participating physicists, mathematicians and philosophers, posed a range of unresolved questions about the relationship between those two things, one of which was: ‘What is your favourite interpretation of quantum mechanics?’
The word ‘favourite’ speaks volumes. Isn’t science supposed to be decided by experiment and observation, free from personal preferences? But experiments in quantum physics have been obstinately silent on what it means. All we can do is develop hunches, intuitions and, yes, cherished ideas. Of these, the survey offered no fewer than 11 to choose from (as well as ‘other’ and ‘none’).
The most popular (supported by 42 per cent of the very small sample) was basically the view put forward by Niels Bohr, Werner Heisenberg and their colleagues in the early days of quantum theory. Today it is known as the Copenhagen Interpretation. More on that below. You might not recognise most of the other alternatives, such as Quantum Bayesianism, Relational Quantum Mechanics, and Objective Collapse (which is not, as you might suppose, just saying ‘what the hell’). Maybe you haven’t heard of the Copenhagen Interpretation either. But in third place (18 per cent) was the Many Worlds Interpretation (MWI), and I suspect you do know something about that, since the MWI is the one with all the glamour and publicity. It tells us that we have multiple selves, living other lives in other universes, quite possibly doing all the things that we dream of but will never achieve (or never dare). Who could resist such an idea?
Yet resist we should. We should resist not just because MWI is unlikely to be true, or even because, since no one knows how to test it, the idea is perhaps not truly scientific at all. Those are valid criticisms, but the main reason we should hold out is that it is incoherent, both philosophically and logically. There could be no better contender for Wolfgang Pauli’s famous put-down: it is not even wrong. [Continue reading…]
Ancient planets are almost as old as the universe
New Scientist reports: The Old Ones were already ancient when the Earth was born. Five small planets orbit an 11.2 billion-year-old star, making them about 80 per cent as old as the universe itself. That means our galaxy started building rocky planets earlier than we thought.
“Now that we know that these planets can be twice as old as Earth, this opens the possibility for the existence of ancient life in the galaxy,” says Tiago Campante at the University of Birmingham in the UK.
NASA’s Kepler space telescope spotted the planets around an orange dwarf star called Kepler 444, which is 117 light years away and about 25 per cent smaller than the sun.
Orange dwarfs are considered good candidates for hosting alien life because they can stay stable for up to 30 billion years, compared to the sun’s 10 billion years, the time it takes these stars to consume all their hydrogen. For context, the universe is currently 13.8 billion years old.
Since, as far as we know, life begins by chance, older planets would have had more time to allow life to get going and evolve. But it was unclear whether planets around such an old star could be rocky – life would have a harder time on gassy planets without a solid surface. [Continue reading…]
Trying to read scrolls that can’t be read
The Economist: In 1752 Camillo Paderni, an artist who had been put in charge of the growing pile of antiquities being dug up at Herculaneum, a seaside town near Naples, wrote to a certain Dr Mead, who then wrote to the Royal Society in London reporting that “there were found many volumes of papyrus but turned to a sort of charcoal, and so brittle, that being touched, it fell to ashes. Yet by His Majesty’s orders he made many trials to open them, but all to no purpose; excepting some scraps containing some words.”
The excavation at Herculaneum — which, like nearby Pompeii, was buried in AD 79 under ash from Mount Vesuvius — had uncovered a literary time capsule. What came to be called the Villa of the Papyri contained a library of perhaps 2,000 books, the only such collection known to have been preserved from antiquity.
Actually reading these scrolls has, however, proved both tricky and destructive — until now. A paper just published in Nature Communications, by Vito Mocella of the Institute for Microelectronics and Microsystems in Naples, describes a way to decipher them without unrolling them.
Did physics get sucked down a wormhole?
Bryan Appleyard writes: The greatest story of our time may also be the greatest mistake. This is the story of our universe from the Big Bang to now with its bizarre, Dickensian cast of characters – black holes, tiny vibrating strings, the warped space-time continuum, trillions of companion universes and particles that wink in and out of existence.
It is the story told by a long list of officially accredited geniuses from Isaac Newton to Stephen Hawking. It is also the story that is retold daily in popular science fiction from Star Trek to the latest Hollywood sci-fi blockbuster Interstellar. Thanks to the movies, the physicist standing in front of a vast blackboard covered in equations became our age’s symbol of genius. The universe is weird, the TV shows and films tell us, and almost anything can happen.
But it is a story that many now believe is pointless, wrong and riddled with wishful thinking and superstition.
“Stephen Hawking,” says philosopher Roberto Mangabeira Unger, “is not part of the solution, he is part of the problem.”
The equations on the blackboard may be the problem. Mathematics, the language of science, may have misled the scientists.
“The idea,” says physicist Lee Smolin, “that the truth about nature can be wrestled from pure thought through mathematics is overdone… The idea that mathematics is prophetic and that mathematical structure and beauty are a clue to how nature ultimately works is just wrong.”
And in an explosive essay published last week in the science journal Nature, astrophysicists George Ellis and Joe Silk say that the wild claims of theoretical physicists are threatening the authority of science itself.
“This battle for the heart and soul of physics,” they write, “is opening up at a time when scientific results — in topics from climate change to the theory of evolution — are being questioned by some politicians and religious fundamentalists. Potential damage to public confidence in science and to the nature of fundamental physics needs to be contained by deeper dialogue between scientists and philosophers… The imprimatur of science should be awarded only to a theory that is testable. Only then can we defend science from attack.”
Unger and Smolin have also just gone into print with a monumental book – The Singular Universe and the Reality of Time – which systematically takes apart contemporary physics and exposes much of it as, in Unger’s words, “an inferno of allegorical fabrication.” The book says it is time to return to real science which is tested against nature rather than constructed out of mathematics. Physics should no longer be seen as the ultimate science, underwriting all others. The true queen of the sciences should be history – the biography of the cosmos. [Continue reading…]
Before any physicists stop by to question whether I really understand what a wormhole is, I will without hesitation make it clear: I have no idea. It just seems like a suitable metaphor — better, say, than rabbit hole.
The Pillars of Creation
NBC News: In celebration of its upcoming 25th anniversary in April, the Hubble Space Telescope has returned to the site of what may be its most famous image, the wispy columns of the Eagle Nebula, and produced a stunning new picture. “The Pillars of Creation,” located 6,500 light-years away in the nebula (catalogued as M16), were photographed in visible and near-infrared light with the Hubble’s upgraded equipment, and the result is as astonishing now as the original was in 1995. Hubble went online in 1990.
Why has progress stalled?
Michael Hanlon writes: We live in a golden age of technological, medical, scientific and social progress. Look at our computers! Look at our phones! Twenty years ago, the internet was a creaky machine for geeks. Now we can’t imagine life without it. We are on the verge of medical breakthroughs that would have seemed like magic only half a century ago: cloned organs, stem-cell therapies to repair our very DNA. Even now, life expectancy in some rich countries is improving by five hours a day. A day! Surely immortality, or something very like it, is just around the corner.
The notion that our 21st-century world is one of accelerating advances is so dominant that it seems churlish to challenge it. Almost every week we read about ‘new hopes’ for cancer sufferers, developments in the lab that might lead to new cures, talk of a new era of space tourism and super-jets that can fly round the world in a few hours. Yet a moment’s thought tells us that this vision of unparalleled innovation can’t be right, that many of these breathless reports of progress are in fact mere hype, speculation – even fantasy.
Yet there once was an age when speculation matched reality. It spluttered to a halt more than 40 years ago. Most of what has happened since has been merely incremental improvements upon what came before. That true age of innovation – I’ll call it the Golden Quarter – ran from approximately 1945 to 1971. Just about everything that defines the modern world either came about, or had its seeds sown, during this time. The Pill. Electronics. Computers and the birth of the internet. Nuclear power. Television. Antibiotics. Space travel. Civil rights.
There is more. Feminism. Teenagers. The Green Revolution in agriculture. Decolonisation. Popular music. Mass aviation. The birth of the gay rights movement. Cheap, reliable and safe automobiles. High-speed trains. We put a man on the Moon, sent a probe to Mars, beat smallpox and discovered the double-helix key of life. The Golden Quarter was a unique period of less than a single human generation, a time when innovation appeared to be running on a mix of dragster fuel and dilithium crystals.
Today, progress is defined almost entirely by consumer-driven, often banal improvements in information technology. [Continue reading…]