Space.com reports: Life on Earth may owe its existence to incredibly powerful storms that erupted on the sun long ago, a new study suggests.
Potent and frequent solar eruptions could have warmed the planet enough for life to take root, and also provided the vital energy needed to transform simple molecules into the complex building blocks of life, such as DNA, researchers said.
The first organisms evolved on Earth about 4 billion years ago. This fact has long puzzled scientists, because in those days, the sun was only about 70 percent as bright as it is today.
“That means Earth should have been an icy ball,” study lead author Vladimir Airapetian, a solar scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, said in a statement. “Instead, geological evidence says it was a warm globe with liquid water. We call this the Faint Young Sun Paradox.” [Continue reading…]
The range of the mind’s eye is restricted by the skill of the hand
Jonathan Waldman writes: Sometime in 1882, a skinny, dark-haired, 11-year-old boy named Harry Brearley entered a steelworks for the first time. A shy kid — he was scared of the dark, and a picky eater — he was also curious, and the industrial revolution in Sheffield, England, offered much in the way of amusements. He enjoyed wandering around town — he later called himself a Sheffield Street Arab — watching road builders, bricklayers, painters, coal deliverers, butchers, and grinders. He was drawn especially to workshops; if he couldn’t see in a shop window, he would knock on the door and offer to run an errand for the privilege of watching whatever work was going on inside. Factories were even more appealing, and he had learned to gain access by delivering, or pretending to deliver, lunch or dinner to an employee. Once inside, he must have reveled, for not until the day’s end did he emerge, all grimy and gray but for his blue eyes. Inside the steelworks, the action compelled him so much that he spent hours sitting inconspicuously on great piles of coal, breathing through his mouth, watching brawny men shoveling fuel into furnaces, hammering white-hot ingots of iron.
There was one operation in particular that young Harry liked: a toughness test performed by the blacksmith. After melting and pouring a molten mixture from a crucible, the blacksmith would cast a bar or two of that alloy, and after it cooled, he would cut notches in the ends of those bars. Then he’d put the bars in a vise, and hammer away at them.
The effort required to break the metal bars, as interpreted through the blacksmith’s muscles, could vary by an order of magnitude, but the result of the test was expressed qualitatively. The metal was pronounced on the spot either rotten or darned good stuff. The latter was simply called D.G.S. The aim of the men at that steelworks, and every other, was to produce D.G.S., and Harry took that to heart.
In this way, young Harry became familiar with steelmaking long before he formally taught himself as much as there was to know about the practice. It was the beginning of a life devoted to steel, without the distractions of hobbies, vacations, or church. It was the origin of a career in which Brearley wrote eight books on metals, five of which contain the word steel in the title; in which he could argue about steelmaking — but not politics — all night; and in which the love and devotion he bestowed upon inanimate metals exceeded that which he bestowed upon his parents or wife or son. Steel was Harry’s true love. It would lead, eventually, to the discovery of stainless steel.
Harry Brearley was born on Feb. 18, 1871, and grew up poor, in a small, cramped house on Marcus Street, in Ramsden’s Yard, on a hill in Sheffield. The city was the world capital of steelmaking; by 1850 Sheffield steelmakers produced half of all the steel in Europe, and 90 percent of the steel in England. By 1860, no fewer than 178 edge tool and saw makers were registered in Sheffield. In the first half of the 19th century, as Sheffield rose to prominence, the population of the city grew fivefold, and its filth grew proportionally. A saying at the time, that “where there’s muck there’s money,” legitimized the grime, reek, and dust of industrial Sheffield, but Harry recognized later that it was a misfortune to be from there, for nobody had much ambition. [Continue reading…]
The ex-anarchist construction worker who became a world-renowned scientist
Daniel Gumbiner writes: “See these lichens here? I don’t know how you see them but, to me, I see them as a surrealist.”
I am sitting in the UC Riverside herbarium, speaking to Kerry Knudsen, Southern California’s only professional lichenologist. We are looking at his collection of lichens, which consists of over 16,000 individual specimens, all of them neatly organized in large green file cabinets. Knudsen has published over 200 peer-reviewed scientific papers on lichens, and discovered more than 60 species that are new to science. It is an extraordinary output, for any scientist, but Knudsen has achieved it in only fifteen years. Science is his second career. For more than two decades he worked in construction. Before that, he was a teenage runaway living in an anarchist commune in Chicago.
“He’s amazing,” said Shirley Tucker, a retired professor of botany at LSU. “He came out of nowhere and became an expert in the most difficult genera.”
A lichen is a fungus in a symbiotic relationship with an alga or a cyanobacterium. The fungus essentially farms the alga or cyanobacterium, which harvests energy from the sun through photosynthesis. In return, the fungus provides its partner with protection, but the relationship is a little one-sided.
“The algae is trapped,” Knudsen explained. “It has a lot of tubes going into it. It’s controlled by chemical signals … The first time I saw it under the microscope, I wanted to join the Algae Liberation Front. I mean, it looked bad.”
Scientists believe that lichen evolved over 500 million years ago, about the same time as fish. Although lichen make up 8 percent of the world’s biomass, they are rarely considered by the amateur naturalist, and therefore have very few common names. [Continue reading…]
Why physics is not a discipline
Philip Ball writes: Have you heard the one about the biologist, the physicist, and the mathematician? They’re all sitting in a cafe watching people come and go from a house across the street. Two people enter, and then some time later, three emerge. The physicist says, “The measurement wasn’t accurate.” The biologist says, “They have reproduced.” The mathematician says, “If now exactly one person enters the house then it will be empty again.”
Hilarious, no? You can find plenty of jokes like this — many invoke the notion of a spherical cow — but I’ve yet to find one that makes me laugh. Still, that’s not what they’re for. They’re designed to show us that these academic disciplines look at the world in very different, perhaps incompatible ways.
There’s some truth in that. Many physicists, for example, will tell stories of how indifferent biologists are to their efforts in that field, regarding them as irrelevant and misconceived. It’s not just that the physicists were thought to be doing things wrong. Often the biologists’ view was that (outside perhaps of the well established but tightly defined discipline of biophysics) there simply wasn’t any place for physics in biology.
But such objections (and jokes) conflate academic labels with scientific ones. Physics, properly understood, is not a subject taught at schools and university departments; it is a certain way of understanding how processes happen in the world. When Aristotle wrote his Physics in the fourth century B.C., he wasn’t describing an academic discipline, but a mode of philosophy: a way of thinking about nature. You might imagine that’s just an archaic usage, but it’s not. When physicists speak today (as they often do) about the “physics” of the problem, they mean something close to what Aristotle meant: neither a bare mathematical formalism nor a mere narrative, but a way of deriving process from fundamental principles.
This is why there is a physics of biology just as there is a physics of chemistry, geology, and society. But it’s not necessarily “physicists” in the professional sense who will discover it. [Continue reading…]
‘Nobody knew what you would see on the other side of a mountain’
Carl Zimmer writes: As a boy growing up in Denmark, Eske Willerslev could not wait to leave Gentofte, his suburban hometown. As soon as he was old enough, he would strike out for the Arctic wilderness.
His twin brother, Rane, shared his obsession. On vacations, they retreated to the woods to teach themselves survival skills. Their first journey would be to Siberia, the Willerslev twins decided. They would make contact with a mysterious group of people called the Yukaghir, who supposedly lived on nothing but elk and moose.
When the Willerslev twins reached 18, they made good on their promise. They were soon paddling a canoe up remote Siberian rivers.
“Nobody knew what you would see on the other side of a mountain,” said Eske Willerslev, who is now 44. “There were villages on the maps, and you wouldn’t even see a trace of them.”
Dr. Willerslev spent much of the next four years in Siberia, hunting moose, traveling across empty tundra and meeting the Yukaghirs and other people of the region. The experience left him wondering about the history of ethnic groups, about how people spread across the planet.
A quarter of a century later, Dr. Willerslev is still asking those questions, but now he’s getting some eye-opening answers.
As the director of the Center for GeoGenetics at the University of Copenhagen, Dr. Willerslev uses ancient DNA to reconstruct the past 50,000 years of human history. The findings have enriched our understanding of prehistory, shedding light on human development with evidence that can’t be found in pottery shards or studies of living cultures. [Continue reading…]
Ancient space dust hints at a mysterious period in Earth’s early history
Rebecca Boyle writes: Geologists tell a pretty broad-brush narrative of Earth’s 4.5 billion-year history. For its first half-billion years, the newly formed planet was a seething ball of lava constantly pelted by giant space rocks, including a Mars-sized object that sheared off a chunk that became the moon. Things calmed down when the Late Heavy Bombardment tapered off some 3.8 billion years ago, but volcanoes ensured Earth’s atmosphere remained a toxic stew of gases with almost no oxygen to speak of. It stayed that way for another billion years, when single-celled bacteria filled the oceans. Around 2.5 billion years ago, at the end of the Archean eon, algae figured out how to make energy from sunlight, and the Great Oxygenation Event gave Earth its lungs. Complex life took its time, finally exploding in the Cambrian period some 500 million years ago. Evolution moved a lot faster after that, resulting in dinosaurs, then mammals, then us.
It’s a great story, and scientists have been telling it for decades, but tiny fossilized space pebbles from Australia may upend it entirely, giving us a new narrative about Earth’s adolescence. These pebbles rained down on our planet’s surface 2.7 billion years ago. As they passed through the upper atmosphere, they melted and rusted, making new crystal shapes and minerals that only form where there is plenty of oxygen. A new paper describing the space pebbles will be published today in the journal Nature. It suggests the atmosphere’s upper reaches were surprisingly rich in oxygen during the Archean, when Earth’s surface had practically none.
“If they’re right, a lot of people have had misconceptions, or have been wrong,” says Kevin Zahnle, a planetary scientist at NASA’s Ames Research Center. Moreover, if the research holds up, geologists will have a new mystery on their hands: How did all that oxygen get there, and why didn’t it reach the ground? [Continue reading…]
Why we need to tackle the growing mountain of ‘digital waste’
By Chris Preist, University of Bristol
We are very aware of waste in our lives today, from the culture of recycling to the email signatures that urge us not to print them off. But as more and more aspects of life become reliant on digital technology, have we stopped to consider the new potential avenues of waste that are being generated? It’s not just about the energy and resources used by our devices – the services we run over the cloud can generate “digital waste” of their own.
Current approaches to reducing energy use focus on improving the hardware: better datacentre energy management, improved electronics that provide more processing power for less energy, and compression techniques that mean images, videos and other files use less bandwidth as they are transmitted across networks. Our research, rather than focusing on making individual system components more efficient, seeks to understand the impact of any particular digital service – one delivered via a website or through the internet – and re-designing the software involved to make better, more efficient use of the technology that supports it.
We also examine what aspects of a digital service actually provide value to the end user, as establishing where resources and effort are wasted – digital waste – reveals what can be cut out. For example, MP3 audio compression works by removing frequencies that are inaudible or less audible to the human ear – shrinking the size of the file for minimal loss of audible quality.
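The same principle can be sketched in a few lines of code. The toy example below is illustrative only – a crude magnitude threshold in the frequency domain, not the psychoacoustic masking model a real MP3 encoder uses – but it shows how discarding the least prominent frequency content shrinks what has to be stored or transmitted.

```python
# Toy sketch of lossy audio compression: keep only the strongest frequency
# components and discard the rest. Real MP3 encoders use psychoacoustic models
# of what listeners can actually hear; this shows only the bare idea.
import numpy as np

def toy_compress(signal, keep_ratio=0.05):
    """Zero out all but the strongest `keep_ratio` fraction of frequency bins."""
    spectrum = np.fft.rfft(signal)
    threshold = np.quantile(np.abs(spectrum), 1.0 - keep_ratio)
    return np.where(np.abs(spectrum) >= threshold, spectrum, 0)

def toy_decompress(spectrum, length):
    """Rebuild a time-domain signal from the thinned-out spectrum."""
    return np.fft.irfft(spectrum, n=length)

# Example: one second of a 440 Hz tone plus faint noise, sampled at 44.1 kHz.
rate = 44100
t = np.arange(rate) / rate
signal = np.sin(2 * np.pi * 440 * t) + 0.01 * np.random.randn(rate)

compressed = toy_compress(signal)
restored = toy_decompress(compressed, len(signal))
print(np.count_nonzero(compressed), "of", len(compressed), "frequency bins kept")
```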
This is no small task. Estimates have put the technology sector’s global carbon footprint at roughly 2% of worldwide emissions – almost as much as that generated by aviation. But there is a big difference: IT is a more pervasive, and in some ways more democratic, technology. Perhaps 6% or so of the world’s population will fly in a given year, while around 40% have access to the internet at home. More than a billion people have Facebook accounts. Digital technology and the online services it provides are used by far more of us, and far more often.
The spark of life and a burst of zinc fluorescence
For some religious believers, the idea that human life has a divine origin includes the notion that the biological event of conception has a divine component: the moment at which a soul enters a developing embryo.
It is now being claimed that this belief is supported by scientific evidence.
Citing a recently published study appearing in Scientific Reports, Catholic Online says:
Researchers discovered the moment a human soul enters an egg, which gives pro-life groups an even greater edge in the battle between embryonic life and death. The precise moment is celebrated with a zap of energy released around the newly fertilized egg.
Teresa Woodruff, one of the study’s senior authors and professor in obstetrics and gynecology at Northwestern University, delivered a press release in which she stated, “to see the zinc radiate out in a burst from each human egg was breathtaking.”
It’s easy to understand why images showing a burst of light as an egg is fertilized might appear to provide scientific validation of religious belief.
But attaching religious significance to these findings requires ignoring a key detail in what has been reported.
If the zinc spark that’s been observed — a burst of zinc fluorescence that occurs as millions of zinc atoms get dumped out of the egg — actually bore a relationship with the arrival of a soul enabling the emergence of life, then no such sparks would have been photographed. Why? Because the experiment involved staging a facsimile of fertilization using a sperm enzyme, not live sperm.
Either the experimenters fooled God into placing souls into unfertilized eggs, or these “sparks of life” can be understood as chemical events — though no less wondrous to behold.
Moreover, those who insist these zinc sparks are triggered by souls might need to make some theological revisions to accommodate the evidence that mice apparently possess souls too.
Google’s new YouTube analysis app crowdsources war reporting
Wired reports: In armed conflicts of the past, the “fog of war” meant a lack of data. In the era of ubiquitous pocket-sized cameras, it often means an information overload.
Four years ago, when analysts at the non-profit Carter Center began using YouTube videos to analyze the escalating conflicts in Syria and Libya, they found that, in contrast to older wars, it was nearly impossible to keep up with the thousands of clips uploaded every month from the smartphones and cameras of both armed groups and bystanders. “The difference with Syria and Libya is that they’re taking place in a truly connected environment. Everyone is online,” says Chris McNaboe, the manager of the Carter Center’s Syria Mapping Project. “The amount of video coming out was overwhelming…There have been more minutes of video from Syria than there have been minutes of real time.”
To handle that flood of digital footage, his team has been testing a tool called Montage. Montage was built by the human rights-focused tech incubator Jigsaw, the subsidiary of Google’s parent company Alphabet that was formerly known as Google Ideas, to sort, map, and tag video evidence from conflict zones. Over the last few months, it allowed six Carter Center analysts to categorize video coming out of Syria—identifying government forces and each of the slew of armed opposition groups, recording the appearance of different armaments and vehicles, and keeping all of that data carefully marked with time stamps and locations to create a searchable, sortable and mappable catalog of the Syrian conflict. “Some of our Montage investigations have had over 600 videos in them,” says McNaboe. “Even with a small team we’ve been able to go through days’ worth of video in a relatively short amount of time.” [Continue reading…]
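The article doesn’t describe Montage’s internals, but the kind of catalog it sketches – clips tagged with times, locations, groups and equipment, then filtered and mapped – can be illustrated with a hypothetical record type. Every name below is invented for illustration; it is not Montage’s actual data model.

```python
# Hypothetical sketch of a tagged-clip catalog of the kind described above.
# Field names and the helper function are invented for illustration only.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ClipRecord:
    url: str
    recorded_at: datetime
    location: tuple                                  # (latitude, longitude)
    groups: list = field(default_factory=list)       # armed groups identified
    equipment: list = field(default_factory=list)    # armaments, vehicles, etc.

def clips_mentioning(catalog, group):
    """Return only the clips tagged with a particular armed group."""
    return [clip for clip in catalog if group in clip.groups]

catalog = [
    ClipRecord("https://example.org/clip-001", datetime(2015, 3, 2, 14, 5),
               (36.20, 37.13), groups=["Group A"], equipment=["tank"]),
    ClipRecord("https://example.org/clip-002", datetime(2015, 3, 4, 9, 30),
               (35.95, 36.63), groups=["Group B"]),
]
print(len(clips_mentioning(catalog, "Group A")))  # -> 1
```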
Exploding the myth of the scientific vs artistic mind
By David Pearson, Anglia Ruskin University
It’s a stereotype, but many of us have made the assumption that scientists are a bit rigid and less artistic than others. Artists, on the other hand, are often seen as being less rational than the rest of us. Sometimes described as the left side of the brain versus the right side – or simply logical thinking versus artistic creativity – the two are often seen as polar opposites.
Neuroscience has already shown that everyone uses both sides of the brain when performing any task. And while certain patterns of brain activity have sometimes been linked to artistic or logical thinking, this doesn’t really explain who is good at what – and why. That’s because the exact interplay of nature and nurture is notoriously difficult to tease out. But if we put the brain aside for a while and just focus on documented ability, is there any evidence to support the logic versus art stereotype?
Psychological research has approached this question by distinguishing between two styles of thinking: convergent and divergent. The emphasis in convergent thinking is on analytical and deductive reasoning, such as that measured in IQ tests. Divergent thinking, however, is more spontaneous and free-flowing. It focuses on novelty and is measured by tasks requiring us to generate multiple solutions for a problem. An example may be thinking of new, innovative uses for familiar objects.
Studies conducted during the 1960s suggested that convergent thinkers were more likely to be good at science subjects at school. Divergent thinking was shown to be more common in the arts and humanities.
However, we are increasingly learning that convergent and divergent thinking styles need not be mutually exclusive. In 2011, researchers assessed 116 final-year UK arts and science undergraduates on measures of convergent and divergent thinking and creative problem solving. The study found no difference in ability between the arts and science groups on any of these measures. Another study reported no significant difference in measures of divergent thinking between arts, natural science and social science undergraduates. Both arts and natural sciences students, however, rated themselves as being more creative than social sciences students did.
What the European Union can learn from CERN about international co-operation
By Roger Barlow, University of Huddersfield
Can Europe work? This is the real question being asked of British people on June 23. Behind the details of subsidies, regulations and eurozones lies a more fundamental puzzle: can different nationalities retain their own identities and work together, without merging into some bland United States of Europe?
I would like to suggest that there may be an example to follow in the history of CERN, the international research organisation based in Switzerland, and home to the world-famous particle accelerators used recently by teams of thousands of scientists from many nations to confirm the existence of the Higgs boson.
There are many similarities between CERN and the EU. The former was founded in 1954 and the latter in 1957, when the Treaty of Rome was signed (although it was then called the European Economic Community). Both CERN and the EU have grown over the years. The EU started with six countries and now brings together 28. CERN has grown from an initial 12 members, including the UK, to 21.
Both also emerged as a response to a post-war world in which the two superpowers dominated, not only militarily but also economically and scientifically. The US and the USSR were supreme on either side of the iron curtain, and with their great resources they pushed ahead with prestige research: space travel, electronics, and nuclear physics.
The European nations were impoverished by the financial and human cost of the war. Many of their greatest (often Jewish) scientists had fled to the US and were slow to come back. None had the people or the capacity to compete on their own.
Technology is not ruining our kids. Parents (and their technology) are ruining them
Jenny Anderson writes: Many of us worry about what technology is doing to our kids. A cascade of reports shows that their addiction to iAnything is diminishing empathy, increasing bullying, and robbing them of time to play, and just be. So we parents set timers, lock away devices and drone on about the importance of actual real-live human interaction. And then we check our phones.
Sherry Turkle, a professor in the program in Science, Technology and Society at M.I.T. and the author, most recently, of Reclaiming Conversation: The Power of Talk in a Digital Age, turned the tables by imploring parents to take control and model better behavior.
A 15-year-old boy told her that “someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation.”
Turkle explains the cost of too much technology in stark terms: Our children can’t engage in conversation or experience solitude, making it very hard for them to be empathetic. “In one experiment, many student subjects opted to give themselves mild electric shocks rather than sit alone with their thoughts,” she noted.
Unfortunately, it seems we parents are the solution. (Newsflash: kids aren’t going to give up their devices because they are worried about how those devices may influence their future ability to empathize.)
That means exercising some self-control. Many of us aren’t exactly paragons of virtue in this arena. [Continue reading…]
Brain scans reveal how LSD affects consciousness
Researchers from Imperial College London, working with the Beckley Foundation, have for the first time visualised the effects of LSD on the brain: In a series of experiments, scientists have gained a glimpse into how the psychedelic compound affects brain activity. The team administered LSD (Lysergic acid diethylamide) to 20 healthy volunteers in a specialist research centre and used various leading-edge and complementary brain scanning techniques to visualise how LSD alters the way the brain works.
The findings, published in Proceedings of the National Academy of Sciences (PNAS), reveal what happens in the brain when people experience the complex visual hallucinations that are often associated with the LSD state. They also shed light on the brain changes that underlie the profound altered state of consciousness the drug can produce.
A major finding of the research is the discovery of what happens in the brain when people experience complex dreamlike hallucinations under LSD. Under normal conditions, information from our eyes is processed in a part of the brain at the back of the head called the visual cortex. However, when the volunteers took LSD, many additional brain areas – not just the visual cortex – contributed to visual processing.
Dr Robin Carhart-Harris, from the Department of Medicine at Imperial, who led the research, explained: “We observed brain changes under LSD that suggested our volunteers were ‘seeing with their eyes shut’ – albeit they were seeing things from their imagination rather than from the outside world. We saw that many more areas of the brain than normal were contributing to visual processing under LSD – even though the volunteers’ eyes were closed. Furthermore, the size of this effect correlated with volunteers’ ratings of complex, dreamlike visions.”
The study also revealed what happens in the brain when people report a fundamental change in the quality of their consciousness under LSD.
Dr Carhart-Harris explained: “Normally our brain consists of independent networks that perform separate specialised functions, such as vision, movement and hearing – as well as more complex things like attention. However, under LSD the separateness of these networks breaks down and instead you see a more integrated or unified brain.
“Our results suggest that this effect underlies the profound altered state of consciousness that people often describe during an LSD experience. It is also related to what people sometimes call ‘ego-dissolution’, which means the normal sense of self is broken down and replaced by a sense of reconnection with themselves, others and the natural world. This experience is sometimes framed in a religious or spiritual way – and seems to be associated with improvements in well-being after the drug’s effects have subsided.” [Continue reading…]
Amanda Feilding, executive director of the Beckley Foundation, in an address she will deliver to the Royal Society tomorrow, says: I think Albert Hofmann would have been delighted to have his “Problem child” celebrated at the Royal Society, as in his long lifetime the academic establishment never recognised his great contribution. But for the taboo surrounding this field, he would surely have won the Nobel Prize. That was the beginning of the modern psychedelic age, which has fundamentally changed society.
After the discovery of the effects of LSD, there was a burst of excitement in the medical and therapeutic worlds – over 1000 experimental and clinical studies were undertaken. Then, in the early 60s, LSD escaped from the labs and began to spread into the world at large. Fuelled by its transformational insights, a cultural evolution took place, whose effects are still felt today. It sparked a wave of interest in Eastern mysticism, healthy living, nurturing the environment, individual freedoms and new music and art among many other changes. Then the establishment panicked and turned to prohibition, partly motivated by American youth becoming disenchanted with fighting a war in far-off Vietnam.
Aghast at the global devastation caused by the war on drugs, I set up the Beckley Foundation in 1998. With the advent of brain imaging technology, I realised that one could correlate the subjective experience of altered states of consciousness, brought about by psychedelic substances, with empirical findings. I realised that only through the very best science investigating how psychedelics work in the brain could one overcome the misplaced taboo which had transformed them from the food of the gods to the work of the devil. [Continue reading…]
Just to be clear, as valuable as this research is, it is an exercise in map-making. The map should never be confused with the territory.
Yuri Milner is spending $100 million on a probe that could travel to Alpha Centauri within a generation
Ross Andersen writes: In the Southern Hemisphere’s sky, there is a constellation, a centaur holding a spear, its legs raised in mid-gallop. The creature’s front hoof is marked by a star that has long hypnotized humanity with its brightness and, more recently, its proximity.
Since the dawn of written culture, at least, humans have dreamt of star travel. As the nearest star system to Earth, Alpha Centauri is the most natural subject of these dreams. To a certain cast of mind, the star seems destined to figure prominently in our future.
In the four centuries since the Scientific Revolution, a series of increasingly powerful instruments has slowly brought Alpha Centauri into focus. In 1689, the Jesuit priest Jean Richaud fixed his telescope on a comet as it streaked through the stick-figure centaur. He was startled to find not one, but two stars twinkling in its hoof. In 1915, a third star was spotted, this one a small, red satellite of the system’s two central, sunlike stars.
To say that Alpha Centauri is the nearest star system to Earth is not to say that it’s near. A 25-trillion-mile abyss separates us. Alpha Centauri’s light travels to Earth at the absurd rate of 186,000 miles per second, and still takes more than four years to arrive. [Continue reading…]
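Those figures are easy to check with back-of-the-envelope arithmetic; the rounded numbers in the snippet below come straight from the passage above.

```python
# Rough check of the quoted figures: 25 trillion miles at 186,000 miles per
# second works out to a bit more than four years of travel time for light.
miles_to_alpha_centauri = 25e12          # ~25 trillion miles (rounded)
speed_of_light_miles_per_s = 186_000     # miles per second (rounded)
seconds_per_year = 365.25 * 24 * 3600    # ~31.6 million seconds

travel_years = miles_to_alpha_centauri / speed_of_light_miles_per_s / seconds_per_year
print(round(travel_years, 2))            # -> about 4.26 years
```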
Technology, the faux equalizer
Adrienne LaFrance writes: Just over a century ago, an electric company in Minnesota took out a full-page newspaper advertisement and listed 1,000 uses for electricity.
Bakers could get ice-cream freezers and waffle irons! Hat makers could put up electric signs! Paper-box manufacturers could use glue pots and fans! Then there were the at-home uses: decorative lights, corn poppers, curling irons, foot warmers, massage machines, carpet sweepers, sewing machines, and milk warmers all made the list. “Make electricity cut your housework in two,” the advertisement said.
This has long been the promise of new technology: that it will make your work easier, which will make your life better. The idea is that the arc of technology bends toward social progress. This is practically the mantra of Silicon Valley, so it’s not surprising that Google’s CEO, Sundar Pichai, seems similarly teleological in his views. [Continue reading…]
FBI backs off from its day in court with Apple this time – but there will be others
By Martin Kleppmann, University of Cambridge
After a very public stand-off over a terrorist’s encrypted smartphone, the FBI has backed down in its court case against Apple, stating that an “outside party” – rumoured to be an Israeli mobile forensics company – has found a way of accessing the data on the phone.
The exact method is not known. Forensics experts have speculated that it involves tricking the hardware into not recording how many passcode combinations have been tried, which would allow all 10,000 possible four-digit passcodes to be tried within a fairly short time. This technique would apply to the iPhone 5C in question, but not newer models, which have stronger hardware protection through the so-called secure enclave, a chip that performs security-critical operations in hardware. The FBI has denied that the technique involves copying storage chips.
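To see why a four-digit passcode offers so little protection once retry limits and escalating delays are out of the way, consider the sketch below. The try_passcode callback is purely hypothetical – a stand-in for whatever interface a forensic tool might use – but the arithmetic is the point: there are only 10,000 candidates.

```python
# Illustration of the search space, not of any real forensic technique: with
# retry counters and delays bypassed, a four-digit passcode allows at most
# 10,000 guesses. `try_passcode` is a hypothetical stand-in.
from itertools import product

def brute_force_pin(try_passcode):
    """Try every four-digit passcode until one is accepted."""
    for digits in product("0123456789", repeat=4):
        candidate = "".join(digits)
        if try_passcode(candidate):
            return candidate
    return None

# Toy demonstration against a known secret.
secret = "7291"
print(brute_force_pin(lambda guess: guess == secret))  # -> 7291
```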
So while the details of the technique remain classified, it’s reasonable to assume that any security technology can be broken given sufficient resources. In fact, the technology industry’s dirty secret is that most products are frighteningly insecure.
Why science and religion aren’t as opposed as you might think
By Stephen Jones, Newman University and Carola Leicht, University of Kent
The debate about science and religion is usually viewed as a competition between worldviews. Differing opinions on whether the two subjects can comfortably co-exist – even among scientists – are pitted against each other in a battle for supremacy.
For some, like the late paleontologist Stephen Jay Gould, science and religion represent two separate areas of enquiry, asking and answering different questions without overlap. Others, such as the biologist Richard Dawkins – and perhaps the majority of the public – see the two as fundamentally opposed belief systems.
But another way to look at the subject is to consider why people believe what they do. When we do this, we discover that the supposed conflict between science and religion is nowhere near as clear cut as some might assume.
Dramatic change in the moon’s tilt may help us trace the origin of water on Earth
By Mahesh Anand, The Open University
Astronomers have found evidence that the axis around which the moon spins shifted billions of years ago due to changes in the moon’s internal structure. The research could help explain the strange distribution of water ice near the lunar poles – the tilt would have caused some of the ice to melt by suddenly exposing it to the sun while shadowing other areas. It could also help us pinpoint craters that have been shadowed for so long that they contain water ice from early in the solar system.
Identifying recent and ancient water ice in specific craters will help scientists map the history of water on the moon. And as the moon likely formed from the Earth colliding with a planet 4.5 billion years ago, it may also help explain how the Earth got its water – a longstanding puzzle.