The Guardian reports: The International Monetary Fund has slashed its forecast for UK growth next year after warning that the decision to leave the EU has damaged the British economy’s short-term prospects and “thrown a spanner in the works” of the global recovery.
The IMF, which voiced strong misgivings about a vote for Brexit in the runup to the EU referendum, said it expected the UK economy to grow by 1.3% in 2017, 0.9 points lower than a previous estimate made in its April world economic outlook (WEO).
Jean Pisani-Ferry writes: By the time British citizens went to the polls on June 23 to decide on their country’s continued membership in the European Union, there had been no shortage of advice in favor of remaining. Foreign leaders and moral authorities had voiced unambiguous concern about the consequences of an exit, and economists had overwhelmingly warned that leaving the EU would entail significant economic costs.
Yet the warnings were ignored. A pre-referendum YouGov opinion poll tells why: “Leave” voters had no trust whatsoever in the advice-givers. They did not want their judgment to rely on politicians, academics, journalists, international organizations, or think tanks. As one of the Leave campaign’s leaders, justice secretary Michael Gove, who is now seeking to succeed David Cameron as Prime Minister, bluntly put it: “people in this country have had enough of experts.”
It is tempting to dismiss this attitude as a triumph of passion over rationality. Yet the pattern seen in the UK is oddly familiar: in the United States, Republican voters disregarded the pundits and nominated Donald Trump as their party’s presidential candidate; in France, Marine Le Pen, the leader of the far-right National Front, elicits little sympathy among experts, but has strong popular support. Everywhere, a significant number of citizens have become hostile to the cognoscenti.
Why this angry attitude toward the bearers of knowledge and expertise? The first explanation is that many voters attach little value to the opinions of those who failed to warn them about the risk of a financial crisis in 2008. Queen Elizabeth II spoke for many when, on a visit to the London School of Economics in the autumn of 2008, she asked why no one saw it coming. Furthermore, the suspicion that economists have been captured by the financial industry, expressed in the 2010 movie Inside Job, has not been dispelled. Ordinary people feel angry about what they regard as a betrayal by the intellectuals.
Most economists, let alone specialists in other disciplines, regard such accusations as unfair, because only a few of them devoted themselves to scrutinizing financial developments; yet their credibility has been seriously dented. Because no one pled guilty for the suffering that followed the crisis, the guilt has become collective. [Continue reading…]
While economists have begun to realise the failure of market orthodoxy, politicians remain in its thrall
Tony Karon writes: The policymaking elites of the industrialised West are panicking – and with good reason. The seismic shock of Britons voting to leave the European Union has sharpened awareness of the possibility that in November Donald Trump could ride a wave of xenophobia all the way to the White House. Voters in the advanced capitalist democracies appear more willing than ever to register a potentially catastrophic protest against a post-Cold War global economic order that has deified markets just as the fallen communist ideology deified the state.
A quarter century of market-driven globalisation and neoliberal orthodoxy has systematically deregulated finance, and led to tax cuts and trade deals that favour wealthy elites and leave most of the others to fend for themselves. Its response to economic crises is to adjust interest rates, bailing out capital markets (and the fortunes of the elites) while forcing endless austerity on the most economically vulnerable. The prevailing economic consensus among western governments has steadily increased inequality and diminished hopes, but such are the rules of capitalist democracies that the economically marginalised still get to vote.
“The real story of this election is that after several decades, American democracy is finally responding to the rise of inequality and the economic stagnation experienced by most of the population,” observed Francis Fukuyama recently. Fukuyama is the political scientist best known for declaring in 1989 that the collapse of the Soviet bloc heralded “the end of history”, with free-market capitalism now the undisputed ideological wisdom for the rest of time.
But the neoliberal order he proclaimed as eternal looks increasingly vulnerable, thanks to the very logic of the market economics he championed. “The gap between the fortunes of elites and those of the rest of the public has been growing for two generations, but only now is it coming to dominate national politics,” Fukuyama wrote in Foreign Affairs last month. “Now that the elites have been shocked out of their smug complacency, the time has come for them to devise more workable solutions to the problems they can no longer deny or ignore.” [Continue reading…]
The New York Times reports: With delivery trucks under constant attack, the nation’s food is now transported under armed guard. Soldiers stand watch over bakeries. The police fire rubber bullets at desperate mobs storming grocery stores, pharmacies and butcher shops. A 4-year-old girl was shot to death as street gangs fought over food.
Venezuela is convulsing from hunger.
Hundreds of people here in the city of Cumaná, home to one of the region’s independence heroes, marched on a supermarket in recent days, screaming for food. They forced open a large metal gate and poured inside. They snatched water, flour, cornmeal, salt, sugar, potatoes, anything they could find, leaving behind only broken freezers and overturned shelves.
And they showed that even in a country with the largest oil reserves in the world, it is possible for people to riot because there is not enough food.
In the last two weeks alone, more than 50 food riots, protests and mass looting have erupted around the country. Scores of businesses have been stripped bare or destroyed. At least five people have been killed. [Continue reading…]
As the end of the oil era approaches, Saudi Arabia is lining up a US$2 trillion sovereign wealth fund
The falling price of oil is beginning to have a real impact on the energy-fuelled economies of the Gulf. In 2014, after almost a decade of record highs, the price of a barrel of Brent crude began to collapse from a peak of US$140 to less than US$30.
Saudi Arabia is lining up a US$2 trillion sovereign wealth fund to see it through the twilight years of the oil era. But not all the countries of the Gulf Co-operation Council, or GCC, have this kind of cash. Indeed, even for Saudi Arabia, the new era of low oil prices spells increasing budget deficits, reductions in state subsidies and a slowdown of the energy and construction sectors, which the region’s economies have been built on.
Both private and state-owned firms are starting to restructure to reduce costs and increase efficiency now that the boom is over. They are merging divisions or outsourcing certain functions, introducing performance-related earnings, offering redundancies or smaller pay increases to staff. Qatar ought to be able to continue awarding annual salary increases given the continued investment in areas such as construction thanks to the 2022 football World Cup. But others, such as Saudi Arabia – most exposed to oil price fluctuations and subject to wide-ranging public sector cuts – will likely see redundancies at a time when the rate of inflation is high and subsidies are declining.
Climate Nexus reports: This past December, representatives from 195 nations gathered in Paris to negotiate an historic agreement to combat climate change and accelerate the transition to a sustainable, low-carbon future. After two weeks of negotiations, the nations unanimously agreed to adopt the international climate pact. On April 22, 2016, nations will again gather, this time at the United Nations headquarters in New York, to formally sign the Paris Agreement.
In the months since the Agreement's adoption, the world has already seen a significant shift towards stronger climate action.
The adoption of the Paris Agreement delivered a signal to governments, businesses and the global public: All parties, from national governments to small businesses, must do their part to minimize the risks and impacts of climate change.
This signal has mobilized actions by public and private sector institutions to move away from fossil fuels, which drive climate change, and towards an economy powered by renewable energy. [Continue reading…]
Lee Vinsel & Andrew Russell write: Innovation is a dominant ideology of our era, embraced in America by Silicon Valley, Wall Street, and the Washington DC political elite. As the pursuit of innovation has inspired technologists and capitalists, it has also provoked critics who suspect that its peddlers radically overvalue it. What happens after innovation, they argue, is more important. Maintenance and repair, the building of infrastructures, the mundane labour that goes into sustaining functioning and efficient infrastructures, simply have more impact on people’s daily lives than the vast majority of technological innovations.
The fates of nations on opposing sides of the Iron Curtain illustrate why innovation rose as a buzzword and organising concept. Over the course of the 20th century, open societies that celebrated diversity, novelty, and progress performed better than closed societies that defended uniformity and order.
In the late 1960s in the face of the Vietnam War, environmental degradation, the Kennedy and King assassinations, and other social and technological disappointments, it grew more difficult for many to have faith in moral and social progress. To take the place of progress, ‘innovation’, a smaller, and morally neutral, concept arose. Innovation provided a way to celebrate the accomplishments of a high-tech age without expecting too much from them in the way of moral and social improvement.
Before the dreams of the New Left had been dashed by massacres at My Lai and Altamont, economists had already turned to technology to explain the economic growth and high standards of living in capitalist democracies. Beginning in the late 1950s, the prominent economists Robert Solow and Kenneth Arrow found that traditional explanations – changes in education and capital, for example – could not account for significant portions of growth. They hypothesised that technological change was the hidden X factor. Their finding fit hand-in-glove with all of the technical marvels that had come out of the Second World War, the Cold War, the post-Sputnik craze for science and technology, and the post-war vision of a material abundance. [Continue reading…]
The Guardian reports: Climate change could cut the value of the world’s financial assets by $2.5tn (£1.7tn), according to the first estimate from economic modelling.
In the worst case scenarios, often used by regulators to check the financial health of companies and economies, the losses could soar to $24tn, or 17% of the world’s entire assets, and wreck the global economy.
However, the research also showed the financial sense in taking action to keep climate change under the 2C danger limit agreed by the world’s nations. In this scenario, the value of financial assets would fall by $315bn less, even when the costs of cutting emissions are included.
“Our work suggests to long-term investors that we would be better off in a low-carbon world,” said Prof Simon Dietz, at the London School of Economics, the lead author of the study. “Pension funds should be getting on top of this issue, and many of them are.” But he said awareness in the financial sector was low.
Mark Campanale, at the thinktank Carbon Tracker Initiative, said the actual financial losses from unchecked global warming could be higher than estimated by the financial model behind the new study: “It could be a lot worse. The loss of financial capital can be a lot higher and faster than the GDP losses [used to model the costs of climate change in the study]. Just look at value of coal giant Peabody Energy: it was worth billions just a few years ago and now it is worth nothing.” [Continue reading…]
Louis Hyman writes: In 1967, the celebrated economist and intellectual John Kenneth Galbraith argued in his best-selling book The New Industrial State that “we have an economic system which, whatever its formal ideological billing, is in substantial part a planned economy.” Though postwar American politicians juxtaposed US free markets to the centrally planned economies of the Soviet bloc, Galbraith recognized that the two were more similar than one might have thought. The private planning of corporations, whose budgets were sometimes bigger than those of governments, defined postwar American capitalism, not markets. Markets meant uncertainty, and postwar corporate planners eschewed risk above all else.
After the chaos of depression and war, corporate planners had worked in conjunction with federal policymakers to make a world that promoted stability. None of the top 100 postwar corporations had failed to earn a profit. This profitability was not an accident. Nor was it the result of seizing every lucrative prospect. Rather, it had come from minimizing risk in favor of long-term certainty.
This postwar economy had allowed employees and employers alike to plan for the future, assuring them steady wages and steady profits. Big business had to be big to contain all the functions it would not entrust to the market. Through their own five-year plans, Galbraith argued, corporations “minimize[d] or [got] rid of market influences.” This American planned economy — which had appeared to be the natural future of capitalism in 1967 — began to fall apart only two years later, in 1969, nearly twenty years before the fall of the Soviet Union.
The collapse of this postwar economy came from the overreach of its new corporate form—the conglomerate—whose rise was legitimated by the belief in managerial planning. But its essential moral underpinnings — stability for investment and, especially, stability for work — took more of an effort to dislodge. Yet in the 1970s and 1980s, this effort succeeded as corporations began to embrace risk and markets, undoing the stability of the postwar period. By the 1980s, the risk-taking entrepreneur had displaced the safe company man as the ideal employee. [Continue reading…]
John Quiggin writes: For most of us, the industrial economy is a thing of the past. In the entire United States, large factories employ fewer than 2 million people. Even adding China to the picture does not change things much. And yet the conceptual categories of the 20th century still dominate our thinking. We remain fixated on the industrial model of economic growth, where ‘growth’ means ‘more of everything’, and we can express our rate of development in a single number. This model leads naturally to the conclusion that economic expansion must eventually run up against constraints on the availability of natural resources, such as trees to make paper.
And yet in 2013, despite positive growth overall, the world reached ‘Peak Paper’: global paper production and consumption reached its maximum, flattened out, and is now falling. A prediction that was over-hyped in the 20th century and then derided in the early 2000s – namely, the Paperless Office – is finally being realised. Growth continues, but paper is in retreat. Why did this seem so unlikely only a decade ago?
The problem is a standard assumption of macroeconomics – namely, that all sectors of the economy expand at a roughly equal rate. If this ‘fixed proportions’ assumption does not hold, the theory used to construct GDP numbers ceases to work, and the concept of a ‘rate of growth’ is no longer meaningful. Until the end of the 20th century, these assumptions did in fact work reasonably well for paper, books and newspapers. The volume of information increased somewhat more rapidly than the economy as a whole, but not so rapidly as to undermine the notion of an overall rate of economic growth. The volume of printed matter grew steadily, to around a million new books every year, and the demand for paper for printing grew in line with demand for books. [Continue reading…]
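Quiggin’s point about the ‘fixed proportions’ assumption can be sketched numerically. The figures below are hypothetical, not drawn from the article: the economy-wide growth rate is just a value-weighted average of sector growth rates, and it can remain comfortably positive even while one sector, such as paper, shrinks outright.

```python
# Hypothetical illustration: an aggregate growth rate is a value-weighted
# average of sector growth rates. When sectors diverge, the single number
# hides the structural shift -- growth without 'more of everything'.

sectors = {
    # name: (share of GDP, annual real growth rate) -- made-up figures
    "services":      (0.70,  0.03),
    "manufacturing": (0.25,  0.01),
    "paper":         (0.05, -0.02),  # contracting even as the economy grows
}

aggregate_growth = sum(share * rate for share, rate in sectors.values())
print(f"aggregate growth: {aggregate_growth:.3%}")

# Under 'fixed proportions' every sector would grow at this same aggregate
# rate; here the paper sector shrinks while total output still expands.
```

The weighted average comes out positive, which is exactly why a single ‘rate of growth’ stops being informative once sectors stop moving in proportion.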
Financial Times reports: US builders do not suffer from too many Mexican workers, but too few; they stand to gain from immigration reform, not lose. The joke in Texas is that if Mr Trump really wants to put up a wall between the US and Mexico, he will have to open the border first to find enough workers to finish the job.
Across the US, the construction sector — which contributes 4 per cent to US gross domestic product — is suffering from chronic shortages of workers that are pushing up wages and slowing down activity. Of the 1,358 companies surveyed last year by the Associated General Contractors of America, 86 per cent had trouble filling positions, up three percentage points from 2014. More than seven in 10 contractors reported difficulty finding carpenters, 60 per cent reported difficulty finding electricians and 56 per cent roofers. In 2014, a builder called Camden Property Trust installed security guards at sites in Denver, Colorado, and Austin, Texas, to prevent competitors from poaching workers.
“I could be twice the size in terms of revenue if we had the flow of labour that we could be training,” says Chad Collins, owner of Bone Dry Roofing in Athens, Georgia. “We are handcuffed by the lack of a willing and skilled work force.”
The impact is particularly dramatic in Dallas. Even as US recession fears grow, the city and its suburbs are thriving as companies such as Toyota, Facebook and JPMorgan Chase build facilities. “There is a workforce issue,” says Keith Post, KPost chief executive. “People are stealing and overpaying [workers].” Steve Little, the company’s president, says labour costs have risen 15 per cent in the past two years.
Homebuilder Bruno Pasquinelli, president and founder of CB JENI Homes in Dallas, says delays are mounting. “It’s very difficult to predict when a house is going to get finished,” he says. “Houses that we used to build in 22 weeks are now taking upwards of 30 and bad ones could be 40.”
The labour shortages are counterintuitive. US construction employment fell from 7.7m to 5.4m during the downturn, and it was assumed there would be plenty of workers once business recovered. But some industry veterans retired. Others headed to the oilfields as the shale boom gathered pace. And many went home to Mexico — creating a problem, given the sector’s dependence on immigrants. [Continue reading…]
When Saudi Arabia led an OPEC decision to end a restraint put on oil production in November 2014, it marked the beginning of a new era in oil economics. It has given us a tumbling oil price, prompted huge losses and job cuts at oil firms like BP and might yet give us economic and political drama in the heart of Moscow. To understand why, it’s worth drilling down to the start of the whole process, and the costs of getting oil out of the ground in the first place.
Historically, the OPEC cartel of oil-producing nations has been able to manage oil prices because of the lack of flexibility in global supply. The whole business of setting up wells, operating pipelines and building rigs entails large and long-term investments which makes producers slow to respond to price movements. And a small cut in OPEC supply can have a significant impact on the global oil price.
The advent of the US shale oil boom changed this dynamic. The industry has lower fixed costs but higher variable costs and is more like an industrial process than a major one-off investment. That makes it more responsive to price movements and more flexible in adjusting short-term output.
Overall though, shale is a relatively high cost source of oil, especially compared to Middle East production. As a result, when US shale threatened OPEC’s market share, the cartel allowed a position of global oversupply to develop. It was a simple trick: make oil prices fall to make shale unprofitable.
The Washington Post reports: Air pollution from energy production in the U.S. caused at least $131 billion in damages in 2011 alone, a new analysis concludes — but while the number sounds grim, it’s also a sign of improvement. In 2002, the damages totaled as high as $175 billion, and the decline over the past decade highlights the success of more stringent emissions regulations on the energy sector while also pointing out the need to continue cracking down.
“The bulk of the cost of emissions is the result of health impacts — so morbidity and particularly mortality,” said the paper’s lead author, Paulina Jaramillo, an assistant professor of engineering and public policy at Carnegie Mellon University. Using models, researchers can place a monetary value on the health effects caused by air pollution and come up with a “social cost” of the offending emissions — in other words, the monetary damages associated with emitting an additional ton (or other unit) of a given type of pollutant. This social cost can then be used to calculate the total monetary damages produced by a certain amount of emissions in a given time period.
The new analysis, just published in the journal Energy Policy, did just that. Using an up-to-date model and a set of data acquired from the Environmental Protection Agency on emissions from the energy sector, the researchers set about estimating the monetary damages caused by air pollution from energy production between 2002 and 2011. [Continue reading…]
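The social-cost arithmetic Jaramillo describes reduces to a per-pollutant multiplication and sum: an assumed monetary damage per ton emitted, times tons emitted, summed across pollutants. The sketch below uses made-up per-ton values and tonnages purely for illustration; they are not the paper’s estimates.

```python
# Hedged sketch of the 'social cost' calculation described above.
# All figures are hypothetical placeholders, not the study's numbers.

social_cost_per_ton = {  # USD of damages per ton emitted (assumed)
    "SO2":   40_000,
    "NOx":    8_000,
    "PM2.5": 90_000,
}

emissions_tons = {  # tons emitted in the period (assumed)
    "SO2":   1_200,
    "NOx":   3_500,
    "PM2.5":   400,
}

# Total monetary damages = sum over pollutants of (cost per ton x tons).
total_damages = sum(
    social_cost_per_ton[p] * emissions_tons[p] for p in emissions_tons
)
print(f"total monetary damages: ${total_damages:,}")
```

The real model behind the study is far richer — it values morbidity and mortality impacts that vary by location and pollutant — but the aggregation step it feeds into has this simple shape.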