Category Archives: technology

Technology doesn’t make us better people

Nicholas Carr writes: Welcome to the global village. It’s a nasty place.

On Easter Sunday, a man in Cleveland filmed himself murdering a random 74-year-old and posted the video on Facebook. The social network took the grisly clip down within two or three hours, but not before users shared it on other websites — where people around the world can still view it.

Surely incidents like this aren’t what Mark Zuckerberg had in mind. In 2012, as his company was preparing to go public, the Facebook founder wrote an earnest letter to would-be shareholders explaining that his company was more than just a business. It was pursuing a “social mission” to make the world a better place by encouraging self-expression and conversation. “People sharing more,” the young entrepreneur wrote, “creates a more open culture and leads to a better understanding of the lives and perspectives of others.”

Earlier this year, Zuckerberg penned another public letter, expressing even grander ambitions. Facebook, he announced, is expanding its mission from “connecting friends and family” to building “a global community that works for everyone.” The ultimate goal is to turn the already vast social network into a sort of supranational state “spanning cultures, nations and regions.”

But the murder in Cleveland, and any similar incidents that inevitably follow, reveal the hollowness of Silicon Valley’s promise that digital networks would bring us together in a more harmonious world.

Whether he knows it or not, Zuckerberg is part of a long tradition in Western thought. Ever since the building of the telegraph system in the 19th century, people have believed that advances in communication technology would promote social harmony. The more we learned about each other, the more we would recognize that we’re all one. In an 1899 article celebrating the laying of transatlantic Western Union cables, a New York Times columnist expressed the popular assumption well: “Nothing so fosters and promotes a mutual understanding and a community of sentiment and interests as cheap, speedy, and convenient communication.”

The great networks of the 20th century — radio, telephone, TV — reinforced this sunny notion. Spanning borders and erasing distances, they shrank the planet. Guglielmo Marconi declared in 1912 that his invention of radio would “make war impossible, because it will make war ridiculous.” AT&T’s top engineer, J.J. Carty, predicted in a 1923 interview that the telephone system would “join all the peoples of the earth in one brotherhood.” In his 1962 book “The Gutenberg Galaxy,” the media theorist Marshall McLuhan gave us the memorable term “global village” to describe the world’s “new electronic interdependence.” Most people took the phrase optimistically, as a prophecy of inevitable social progress. What, after all, could be nicer than a village?

If our assumption that communication brings people together were true, we should today be seeing a planetary outbreak of peace, love, and understanding. Thanks to the Internet and cellular networks, humanity is more connected than ever. Of the world’s 7 billion people, 6 billion have access to a mobile phone — a billion and a half more, the United Nations reports, than have access to a working toilet. Nearly 2 billion are on Facebook, more than a billion upload and download YouTube videos, and billions more converse through messaging apps like WhatsApp and WeChat. With smartphone in hand, everyone becomes a media hub, transmitting and receiving ceaselessly.

Yet we live in a fractious time, defined not by concord but by conflict. Xenophobia is on the rise. Political and social fissures are widening. From the White House down, public discourse is characterized by vitriol and insult. We probably shouldn’t be surprised. [Continue reading…]


Humans aren’t the only primates that can make sharp stone tools

 

The Guardian reports: Monkeys have been observed producing sharp stone flakes that closely resemble the earliest known tools made by our ancient relatives, proving that this ability is not uniquely human.

Previously, modifying stones to create razor-edged fragments was thought to be an activity confined to hominins, the family including early humans and their more primitive cousins. The latest observations rewrite this view, showing that monkeys unintentionally produce almost identical artefacts simply by smashing stones together.

The findings put archaeologists on alert that they can no longer assume that stone flakes they discover are linked to the deliberate crafting of tools by early humans as their brains became more sophisticated.

Tomos Proffitt, an archaeologist at the University of Oxford and the study’s lead author, said: “At a very fundamental level – if you’re looking at a very simple flake – if you had a capuchin flake and a human flake they would be the same. It raises really important questions about what level of cognitive complexity is required to produce a sophisticated cutting tool.”

Unlike those of early humans, the flakes produced by the capuchins were the unintentional byproduct of hammering stones – an activity that the monkeys pursued decisively, but whose purpose was not clear. Originally scientists thought the behaviour was a flamboyant display of aggression in response to an intruder, but after more extensive observations the monkeys appeared to be seeking out the quartz dust produced by smashing the rocks, possibly because it has a nutritional benefit. [Continue reading…]


Forget software — now hackers are exploiting physics

Andy Greenberg reports: Practically every word we use to describe a computer is a metaphor. “File,” “window,” even “memory” all stand in for collections of ones and zeros that are themselves representations of an impossibly complex maze of wires, transistors and the electrons moving through them. But when hackers go beyond those abstractions of computer systems and attack their actual underlying physics, the metaphors break.

Over the last year and a half, security researchers have been doing exactly that: honing hacking techniques that break through the metaphor to the actual machine, exploiting the unexpected behavior not of operating systems or applications, but of computing hardware itself—in some cases targeting the actual electricity that comprises bits of data in computer memory. And at the Usenix security conference earlier this month, two teams of researchers presented attacks they developed that bring that new kind of hack closer to becoming a practical threat.

Both of those new attacks use a technique Google researchers first demonstrated last March called “Rowhammer.” The trick works by running a program on the target computer that repeatedly accesses a certain row of transistors in its DRAM memory, “hammering” it until a rare glitch occurs: electric charge leaks from the hammered row of transistors into an adjacent row. The leaked charge then causes a certain bit in that adjacent row of the computer’s memory to flip from one to zero or vice versa. A carefully targeted bit flip can give an attacker access to a privileged level of the computer’s operating system.

It’s messy. And mind-bending. And it works. [Continue reading…]


Forget ideology, liberal democracy’s newest threats come from technology and bioscience

John Naughton writes: The BBC Reith Lectures in 1967 were given by Edmund Leach, a Cambridge social anthropologist. “Men have become like gods,” Leach began. “Isn’t it about time that we understood our divinity? Science offers us total mastery over our environment and over our destiny, yet instead of rejoicing we feel deeply afraid.”

That was nearly half a century ago, and yet Leach’s opening lines could easily apply to today. He was speaking before the internet had been built and long before the human genome had been decoded, and so his claim about men becoming “like gods” seems relatively modest compared with the capabilities that molecular biology and computing have subsequently bestowed upon us. Our science-based culture is the most powerful in history, and it is ceaselessly researching, exploring, developing and growing. But in recent times it seems to have also become plagued with existential angst as the implications of human ingenuity begin to be (dimly) glimpsed.

The title that Leach chose for his Reith Lecture – A Runaway World – captures our zeitgeist too. At any rate, we are also increasingly fretful about a world that seems to be running out of control, largely (but not solely) because of information technology and what the life sciences are making possible. But we seek consolation in the thought that “it was always thus”: people felt alarmed about steam in George Eliot’s time and got worked up about electricity, the telegraph and the telephone as they arrived on the scene. The reassuring implication is that we weathered those technological storms, and so we will weather this one too. Humankind will muddle through.

But in the last five years or so even that cautious, pragmatic optimism has begun to erode. There are several reasons for this loss of confidence. One is the sheer vertiginous pace of technological change. Another is that the new forces loose in our society – particularly information technology and the life sciences – are potentially more far-reaching in their implications than steam or electricity ever were. And, thirdly, we have begun to see startling advances in these fields that have forced us to recalibrate our expectations. [Continue reading…]


China launches quantum satellite for ‘hack-proof’ communications

The Guardian reports: China says it has launched the world’s first quantum satellite, a project Beijing hopes will enable it to build a coveted “hack-proof” communications system with potentially significant military and commercial applications.

Xinhua, Beijing’s official news service, said Micius, a 600kg satellite that is nicknamed after an ancient Chinese philosopher, “roared into the dark sky” over the Gobi desert at 1.40am local time on Tuesday, carried by a Long March-2D rocket.

“The satellite’s two-year mission will be to develop ‘hack-proof’ quantum communications, allowing users to send messages securely and at speeds faster than light,” Xinhua reported.

The Quantum Experiments at Space Scale, or Quess, satellite programme is part of an ambitious space programme that has accelerated since Xi Jinping became Communist party chief in late 2012.

“There’s been a race to produce a quantum satellite, and it is very likely that China is going to win that race,” Nicolas Gisin, a professor and quantum physicist at the University of Geneva, told the Wall Street Journal. “It shows again China’s ability to commit to large and ambitious projects and to realise them.”

The satellite will be tasked with sending secure messages between Beijing and Urumqi, the capital of Xinjiang, a sprawling region of deserts and snow-capped mountains in China’s extreme west.

Highly complex attempts to build such a “hack-proof” communications network are based on the scientific principle of entanglement. [Continue reading…]


A society staring at machines

Jacob Weisberg writes: “As smoking gives us something to do with our hands when we aren’t using them, Time gives us something to do with our minds when we aren’t thinking,” Dwight Macdonald wrote in 1957. With smartphones, the issue never arises. Hands and mind are continuously occupied texting, e-mailing, liking, tweeting, watching YouTube videos, and playing Candy Crush.

Americans spend an average of five and a half hours a day with digital media, more than half of that time on mobile devices, according to the research firm eMarketer. Among some groups, the numbers range much higher. In one recent survey, female students at Baylor University reported using their cell phones an average of ten hours a day. Three quarters of eighteen-to-twenty-four-year-olds say that they reach for their phones immediately upon waking up in the morning. Once out of bed, we check our phones 221 times a day — an average of every 4.3 minutes — according to a UK study. This number actually may be too low, since people tend to underestimate their own mobile usage. In a 2015 Gallup survey, 61 percent of people said they checked their phones less frequently than others they knew.
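The 221-checks figure and the quoted 4.3-minute interval only reconcile if the averaging is done over waking hours rather than the full day. A quick back-of-the-envelope check (the 16-hour waking day is my assumption, not a figure stated by the study):

```python
checks_per_day = 221

# Averaged over a full 24 hours, the interval would be longer:
minutes_per_day = 24 * 60
interval_24h = minutes_per_day / checks_per_day
print(round(interval_24h, 1))      # ~6.5 minutes

# Averaged over a 16-hour waking day, it matches the article's figure:
waking_minutes = 16 * 60
interval_waking = waking_minutes / checks_per_day
print(round(interval_waking, 1))   # ~4.3 minutes
```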

Our transformation into device people has happened with unprecedented suddenness. The first touchscreen-operated iPhones went on sale in June 2007, with the first Android-powered phones following a year later. Smartphones went from 10 percent to 40 percent market penetration faster than any other consumer technology in history. In the United States, adoption hit 50 percent only three years ago. Yet today, not carrying a smartphone indicates eccentricity, social marginalization, or old age. [Continue reading…]

It perhaps also indicates being at less risk of stumbling off a cliff.


Learning from nature: Record-efficiency turbine farms are being inspired by sealife

Alex Riley writes: As they drove on featureless dirt roads on the first Tuesday of 2010, John Dabiri, professor of aeronautics and bioengineering at the California Institute of Technology, and his then-student Robert Whittlesey, were inspecting a remote area of land that they hoped to purchase to test new concepts in wind power. They named their site FLOWE for Field Laboratory for Optimized Wind Energy. Situated between gentle knolls covered in sere vegetation, the four-acre parcel in Antelope Valley, California, was once destined to become a mall, but those plans fell through. The land was cheap. And, more importantly, it was windy.

Estimated at 250 trillion watts, the amount of wind on Earth has the potential to provide more than 20 times our current global energy consumption. Yet only four countries — Spain, Portugal, Ireland, and Denmark — generate more than 10 percent of their electricity this way. The United States, one of the largest, wealthiest, and windiest of countries, comes in at about 4 percent. There are reasons for that. Wind farm expansion brings with it huge engineering costs, an unsightly countryside, loud noise, disruption to military radar, and the death of wildlife. Recent estimates blamed turbines for killing 600,000 bats and up to 440,000 birds a year. On June 19, 2014, the American Bird Conservancy filed a lawsuit against the federal government asking it to curtail the impact of wind farms on the dwindling eagle populations. And while standalone horizontal-axis turbines harvest wind energy well, in a group they’re highly profligate. As their propeller-like blades spin, the turbines facing into the wind disrupt free-flowing air, creating a wake of slow-moving, infertile air behind them. [Continue reading…]


The growing risk of a war in space

Geoff Manaugh writes: In Ghost Fleet, a 2015 novel by security theorists Peter Singer and August Cole, the next world war begins in space.

Aboard an apparently civilian space station called the Tiangong, or “Heavenly Palace,” Chinese astronauts—taikonauts—maneuver a chemical oxygen iodine laser (COIL) into place. They aim their clandestine electromagnetic weapon at its first target, a U.S. Air Force communications satellite that helps to coordinate forces in the Pacific theater far below. The laser “fired a burst of energy that, if it were visible light instead of infrared, would have been a hundred thousand times brighter than the sun.” The beam melts through the external hull of the U.S. satellite and shuts down its sensitive inner circuitry.

From there, the taikonauts work their way through a long checklist of strategic U.S. space assets, disabling the nation’s military capabilities from above. It is a Pearl Harbor above the atmosphere, an invisible first strike.

“The emptiness of outer space might be the last place you’d expect militaries to vie over contested territory,” Lee Billings has written, “except that outer space isn’t so empty anymore.” It is not only science fiction, in other words, to suggest that the future of war could be offworld. The high ground of the global battlefield is no longer defined merely by a topographical advantage, but by strategic orbitals and potential weapons stationed in the skies above distant continents.

When China shot down one of its own weather satellites in January 2007, the event was, among other things, a clear demonstration to the United States that China could wage war beyond the Earth’s atmosphere. In the decade since, both China and the United States have continued to pursue space-based armaments and defensive systems. A November 2015 “Report to Congress,” for example, filed by the U.S.-China Economic and Security Review Commission, specifically singles out China’s “Counterspace Program” as a subject of needed study. China’s astral arsenal, the report explains, most likely includes “direct-ascent” missiles, directed-energy weapons, and also what are known as “co-orbital antisatellite systems.” [Continue reading…]


Artificial intelligence: ‘We’re like children playing with a bomb’

The Observer reports: You’ll find the Future of Humanity Institute down a medieval backstreet in the centre of Oxford. It is beside St Ebbe’s church, which has stood on this site since 1005, and above a Pure Gym, which opened in April. The institute, a research faculty of Oxford University, was established a decade ago to ask the very biggest questions on our behalf. Notably: what exactly are the “existential risks” that threaten the future of our species; how do we measure them; and what can we do to prevent them? Or to put it another way: in a world of multiple fears, what precisely should we be most terrified of?

When I arrive to meet the director of the institute, Professor Nick Bostrom, a bed is being delivered to the second-floor office. Existential risk is a round-the-clock kind of operation; it sleeps fitfully, if at all.

Bostrom, a 43-year-old Swedish-born philosopher, has lately acquired something of the status of prophet of doom among those currently doing most to shape our civilisation: the tech billionaires of Silicon Valley. His reputation rests primarily on his book Superintelligence: Paths, Dangers, Strategies, which was a surprise New York Times bestseller last year and now arrives in paperback, trailing must-read recommendations from Bill Gates and Tesla’s Elon Musk. (In the best kind of literary review, Musk also gave Bostrom’s institute £1m to continue to pursue its inquiries.)

The book is a lively, speculative examination of the singular threat that Bostrom believes – after years of calculation and argument – to be the one most likely to wipe us out. This threat is not climate change, nor pandemic, nor nuclear winter; it is the possibly imminent creation of a general machine intelligence greater than our own. [Continue reading…]


‘Gene drives’ that tinker with evolution are an unknown risk, researchers say

MIT Technology Review reports: With great power — in this case, a technology that can alter the rules of evolution — comes great responsibility. And since there are “considerable gaps in knowledge” about the possible consequences of releasing this technology, called a gene drive, into natural environments, it is not yet responsible to do so. That’s the major conclusion of a report published today by the National Academies of Sciences, Engineering, and Medicine.

Gene drives hold immense promise for controlling or eradicating vector-borne diseases like Zika virus and malaria, or in managing agricultural pests or invasive species. But the 200-page report, written by a committee of 16 experts, highlights how ill-equipped we are to assess the environmental and ecological risks of using gene drives. And it provides a glimpse at the challenges they will create for policymakers.

The technology is inspired by natural phenomena through which particular “selfish” genes are passed to offspring at a higher rate than is normally allowed by nature in sexually reproducing organisms. There are multiple ways to make gene drives in the lab, but scientists are now using the gene-editing tool known as CRISPR to very rapidly and effectively do the trick. Evidence in mosquitoes, fruit flies, and yeast suggests that this could be used to spread a gene through nearly 100 percent of a population.

The possible ecological effects, intended or not, are far from clear, though. How long will gene drives persist in the environment? What is the chance that an engineered organism could pass the gene drive to an unintended recipient? How might these things affect the whole ecosystem? How much does all this vary depending on the particular organism and ecosystem?

Research on the molecular biology of gene drives has outpaced ecological research on how genes move through populations and between species, the report says, making it impossible to adequately answer these and other thorny questions. Substantially more laboratory research and confined field testing is needed to better grasp the risks. [Continue reading…]

Jim Thomas writes: If there is a prize for the fastest emerging tech controversy of the century the ‘gene drive’ may have just won it. In under eighteen months the sci-fi concept of a ‘mutagenic chain reaction’ that can drive a genetic trait through an entire species (and maybe eradicate that species too) has gone from theory to published proof of principle to massively-shared TED talk (apparently an important step these days) to the subject of a US National Academy of Sciences high-profile study – complete with committees, hearings, public inputs and a glossy 216-page report. Previous technology controversies have taken anywhere from a decade to over a century to reach that level of policy attention. So why were gene drives put on the turbo track to science academy report status? One word: leverage.

What a gene drive does is simple: it ensures that a chosen genetic trait will reliably be passed on to the next generation and every generation thereafter. This overcomes normal Mendelian genetics where a trait may be diluted or lost through the generations. The effect is that the engineered trait is driven through an entire population, re-engineering not just single organisms but enforcing the change in every descendant – re-shaping entire species and ecosystems at will.

It’s a perfect case of a very high-leverage technology. Archimedes famously said “Give me a lever long enough and a fulcrum on which to place it, and I shall move the world.” Gene drive developers are in effect saying “Give me a gene drive and an organism to put it in and I can wipe out species, alter ecosystems and cause large-scale modifications.” Gene drive pioneer Kevin Esvelt calls gene drives “an experiment where if you screw up, it affects the whole world”. [Continue reading…]
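The inheritance arithmetic behind that leverage is easy to make concrete. In the toy deterministic model below (a sketch of my own, not any published gene-drive model), an allele that heterozygotes transmit at the Mendelian rate of 50% holds steady at its starting frequency, while one transmitted at, say, 95% sweeps through the population within about a dozen generations:

```python
def next_frequency(p, transmission):
    """Allele frequency among the next generation's gametes.

    Homozygotes (frequency p^2) pass the allele to all offspring;
    heterozygotes (frequency 2p(1-p)) pass it at the given
    transmission rate: 0.5 under ordinary Mendelian inheritance,
    ~0.95 for a highly effective CRISPR-based gene drive.
    """
    q = 1.0 - p
    return p * p + 2.0 * p * q * transmission

p_mendel = p_drive = 0.01  # engineered allele starts in 1% of the gene pool
for generation in range(20):
    p_mendel = next_frequency(p_mendel, 0.5)
    p_drive = next_frequency(p_drive, 0.95)

print(f"Mendelian after 20 generations:  {p_mendel:.4f}")  # still ~0.01
print(f"Gene drive after 20 generations: {p_drive:.4f}")   # ~1.0
```

The Mendelian case stays put because p² + 2p(1−p)·0.5 simplifies to exactly p; biasing transmission above 0.5 is what turns a rare introduction into near-fixation, which is the “mutagenic chain reaction” the excerpt describes.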


Has the quantum era begun?

IDG News Service reports: Quantum computing’s full potential may still be years away, but there are plenty of benefits to be realized right now.

So argues Vern Brownell, president and CEO of D-Wave Systems, whose namesake quantum system is already in its second generation.

Launched 17 years ago by a team with roots at Canada’s University of British Columbia, D-Wave introduced what it called “the world’s first commercially available quantum computer” back in 2010. Since then the company has doubled the number of qubits, or quantum bits, in its machines roughly every year. Today, its D-Wave 2X system boasts more than 1,000.

The company doesn’t disclose its full customer list, but Google, NASA and Lockheed-Martin are all on it, D-Wave says. In a recent experiment, Google reported that D-Wave’s technology outperformed a conventional machine by 100 million times. [Continue reading…]


The range of the mind’s eye is restricted by the skill of the hand


Jonathan Waldman writes: Sometime in 1882, a skinny, dark-haired, 11-year-old boy named Harry Brearley entered a steelworks for the first time. A shy kid — he was scared of the dark, and a picky eater — he was also curious, and the industrial revolution in Sheffield, England, offered much in the way of amusements. He enjoyed wandering around town — he later called himself a Sheffield Street Arab — watching road builders, bricklayers, painters, coal deliverers, butchers, and grinders. He was drawn especially to workshops; if he couldn’t see in a shop window, he would knock on the door and offer to run an errand for the privilege of watching whatever work was going on inside. Factories were even more appealing, and he had learned to gain access by delivering, or pretending to deliver, lunch or dinner to an employee. Once inside, he must have reveled, for not until the day’s end did he emerge, all grimy and gray but for his blue eyes. Inside the steelworks, the action compelled him so much that he spent hours sitting inconspicuously on great piles of coal, breathing through his mouth, watching brawny men shoveling fuel into furnaces, hammering white-hot ingots of iron.

There was one operation in particular that young Harry liked: a toughness test performed by the blacksmith. After melting and pouring a molten mixture from a crucible, the blacksmith would cast a bar or two of that alloy, and after it cooled, he would cut notches in the ends of those bars. Then he’d put the bars in a vise, and hammer away at them.

The effort required to break the metal bars, as interpreted through the blacksmith’s muscles, could vary by an order of magnitude, but the result of the test was expressed qualitatively. The metal was pronounced on the spot either rotten or darned good stuff. The latter was simply called D.G.S. The aim of the men at that steelworks, and every other, was to produce D.G.S., and Harry took that to heart.

In this way, young Harry became familiar with steelmaking long before he formally taught himself as much as there was to know about the practice. It was the beginning of a life devoted to steel, without the distractions of hobbies, vacations, or church. It was the origin of a career in which Brearley wrote eight books on metals, five of which contain the word steel in the title; in which he could argue about steelmaking — but not politics — all night; and in which the love and devotion he bestowed upon inanimate metals exceeded that which he bestowed upon his parents or wife or son. Steel was Harry’s true love. It would lead, eventually, to the discovery of stainless steel.

Harry Brearley was born on Feb. 18, 1871, and grew up poor, in a small, cramped house on Marcus Street, in Ramsden’s Yard, on a hill in Sheffield. The city was the world capital of steelmaking; by 1850 Sheffield steelmakers produced half of all the steel in Europe, and 90 percent of the steel in England. By 1860, no fewer than 178 edge tool and saw makers were registered in Sheffield. In the first half of the 19th century, as Sheffield rose to prominence, the population of the city grew fivefold, and its filth grew proportionally. A saying at the time, that “where there’s muck there’s money,” legitimized the grime, reek, and dust of industrial Sheffield, but Harry recognized later that it was a misfortune to be from there, for nobody had much ambition. [Continue reading…]


Why we need to tackle the growing mountain of ‘digital waste’

By Chris Preist, University of Bristol

We are very aware of waste in our lives today, from the culture of recycling to the email signatures that urge us not to print them off. But as more and more aspects of life become reliant on digital technology, have we stopped to consider the new potential avenues of waste that are being generated? It’s not just about the energy and resources used by our devices – the services we run over the cloud can generate “digital waste” of their own.

Current approaches to reducing energy use focus on improving the hardware: better datacentre energy management, improved electronics that provide more processing power for less energy, and compression techniques that mean images, videos and other files use less bandwidth as they are transmitted across networks. Our research, rather than focusing on making individual system components more efficient, seeks to understand the impact of any particular digital service – one delivered via a website or through the internet – and re-designing the software involved to make better, more efficient use of the technology that supports it.

We also examine what aspects of a digital service actually provide value to the end user, as establishing where resources and effort are wasted – digital waste – reveals what can be cut out. For example, MP3 audio compression works by removing frequencies that are inaudible or less audible to the human ear – shrinking the size of the file for minimal loss of audible quality.

This is no small task. Estimates have put the technology sector’s global carbon footprint at roughly 2% of worldwide emissions – almost as much as that generated by aviation. But there is a big difference: IT is a more pervasive, and in some ways more democratic, technology. Perhaps 6% or so of the world’s population will fly in a given year, while around 40% have access to the internet at home. More than a billion people have Facebook accounts. Digital technology and the online services it provides are used by far more of us, and far more often.

Continue reading


Google’s new YouTube analysis app crowdsources war reporting

Wired reports: In armed conflicts of the past, the “fog of war” meant a lack of data. In the era of ubiquitous pocket-sized cameras, it often means an information overload.

Four years ago, when analysts at the non-profit Carter Center began using YouTube videos to analyze the escalating conflicts in Syria and Libya, they found that, in contrast to older wars, it was nearly impossible to keep up with the thousands of clips uploaded every month from the smartphones and cameras of both armed groups and bystanders. “The difference with Syria and Libya is that they’re taking place in a truly connected environment. Everyone is online,” says Chris McNaboe, the manager of the Carter Center’s Syria Mapping Project. “The amount of video coming out was overwhelming…There have been more minutes of video from Syria than there have been minutes of real time.”

To handle that flood of digital footage, his team has been testing a tool called Montage. Montage was built by the human rights-focused tech incubator Jigsaw, the subsidiary of Google’s parent company Alphabet that was formerly known as Google Ideas, to sort, map, and tag video evidence from conflict zones. Over the last few months, it allowed six Carter Center analysts to categorize video coming out of Syria—identifying government forces and each of the slew of armed opposition groups, recording the appearance of different armaments and vehicles, and keeping all of that data carefully marked with time stamps and locations to create a searchable, sortable and mappable catalog of the Syrian conflict. “Some of our Montage investigations have had over 600 videos in them,” says McNaboe. “Even with a small team we’ve been able to go through days’ worth of video in a relatively short amount of time.” [Continue reading…]


Technology is not ruining our kids. Parents (and their technology) are ruining them

Jenny Anderson writes: Many of us worry about what technology is doing to our kids. A cascade of reports shows that their addiction to iAnything is diminishing empathy, increasing bullying, robbing them of time to play, and just be. So we parents set timers, lock away devices and drone on about the importance of actual real-live human interaction. And then we check our phones.

Sherry Turkle, a professor in the program in Science, Technology and Society at M.I.T. and the author, most recently, of Reclaiming Conversation: The Power of Talk in a Digital Age, turned the tables by imploring parents to take control and model better behavior.

A 15-year-old boy told her that “someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation.”

Turkle explains the cost of too much technology in stark terms: Our children can’t engage in conversation, or experience solitude, making it very hard for them to be empathetic. “In one experiment, many student subjects opted to give themselves mild electric shocks rather than sit alone with their thoughts,” she noted.

Unfortunately, it seems we parents are the solution. (Newsflash, kids aren’t going to give up their devices because they are worried about how it may influence their future ability to empathize.)

That means exercising some self-control. Many of us aren’t exactly paragons of virtue in this arena. [Continue reading…]


Yuri Milner is spending $100 million on a probe that could travel to Alpha Centauri within a generation

Ross Andersen writes: In the Southern Hemisphere’s sky, there is a constellation, a centaur holding a spear, its legs raised in mid-gallop. The creature’s front hoof is marked by a star that has long hypnotized humanity, with its brightness, and more recently, its proximity.

Since the dawn of written culture, at least, humans have dreamt of star travel. As the nearest star system to Earth, Alpha Centauri is the most natural subject of these dreams. To a certain cast of mind, the star seems destined to figure prominently in our future.

In the four centuries since the Scientific Revolution, a series of increasingly powerful instruments has slowly brought Alpha Centauri into focus. In 1689, the Jesuit priest Jean Richaud fixed his telescope on a comet, as it was streaking through the stick-figure centaur. He was startled to find not one, but two stars twinkling in its hoof. In 1915, a third star was spotted, this one a small, red satellite of the system’s two central, sunlike stars.

To say that Alpha Centauri is the nearest star system to Earth is not to say that it’s near. A 25 trillion mile abyss separates us. Alpha Centauri’s light travels to Earth at the absurd rate of 186,000 miles per second, and still takes more than four years to arrive. [Continue reading…]
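The travel-time figure checks out, using the rounded numbers quoted in the piece:

```python
distance_miles = 25e12      # the "25 trillion mile abyss"
light_speed = 186_000       # miles per second
seconds_per_year = 365.25 * 24 * 3600

years = distance_miles / light_speed / seconds_per_year
print(f"{years:.2f} years")  # ~4.26 -- "more than four years to arrive"
```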


Technology, the faux equalizer


Adrienne LaFrance writes: Just over a century ago, an electric company in Minnesota took out a full-page newspaper advertisement and listed 1,000 uses for electricity.

Bakers could get ice-cream freezers and waffle irons! Hat makers could put up electric signs! Paper-box manufacturers could use glue pots and fans! Then there were the at-home uses: decorative lights, corn poppers, curling irons, foot warmers, massage machines, carpet sweepers, sewing machines, and milk warmers all made the list. “Make electricity cut your housework in two,” the advertisement said.

This has long been the promise of new technology: That it will make your work easier, which will make your life better. The idea is that the arc of technology bends toward social progress. This is practically the mantra of Silicon Valley, so it’s not surprising that Google’s CEO, Sundar Pichai, seems similarly teleological in his views. [Continue reading…]
