Brad Plumer writes: It was the early 1990s. Climate scientists had long known that humans were warming up the planet. But politicians were just beginning to grasp that it would take a huge coordinated effort to get nations to burn fewer fossil fuels and avoid sharp temperature increases in the decades ahead.
Those policymakers needed a common goal — a way to say, “Here’s how bad things will get, and this is what we need to do to stop it.” And that posed a dilemma. No one could really agree on how much global warming was unacceptable. How high did the seas need to rise before we had a serious problem? How much heat was too much?
Around this time, an advisory council of scientists in Germany proposed a stunningly simple way to think about climate change. Look, they reasoned, human civilization hasn’t been around all that long. And for the last 13,000 years, Earth’s climate has fluctuated within a narrow band. So, to be on the safe side, we should prevent global average temperatures from rising more than 2° Celsius (or 3.6° Fahrenheit) above what they were just before the dawn of industrialization.
Critics grumbled that the 2°C limit seemed arbitrary or overly simplistic. But scientists were already compiling evidence that the risks of global warming became especially daunting somewhere above the 2°C threshold: rapid sea-level rise, the risk of crop failure, the collapse of coral reefs. And policymakers loved the idea of a simple, easily digestible target. So it stuck.
By 2009, nearly every government in the world had endorsed the 2°C limit — global warming beyond that level was deemed “dangerous.” And so, every year, the world’s leaders meet at UN climate conferences to discuss policies and emissions cuts that they hope will keep us below 2°C. Climate experts churn out endless papers on how we can adapt to 2°C of warming (or less).
Two decades later, there’s just one major problem with this picture: the idea that the world can stay below 2°C looks increasingly delusional. [Continue reading…]