Cross-posted at Inter Press Service.
By Jomo Kwame Sundaram
Privatization has been central to the ‘neo-liberal’ counter-revolution from the 1970s against government economic interventions associated with Roosevelt and Keynes as well as post-colonial state-led economic development.
Many developing countries were forced to accept privatization policies as a condition for credit or loan support from the World Bank and other international financial institutions, especially after the fiscal and debt crises of the early 1980s. Other countries voluntarily embraced privatization, often on the pretext of fiscal and debt constraints, in their efforts to mimic new Anglo-American criteria of economic progress.

Demonizing SOEs
Globally, inflation was attributed to excessive government intervention, public sector expansion and state-owned enterprise (SOE) inefficiency. It was claimed, with uneven and dubious evidence, that SOEs were inherently likely to be inefficient, corrupt, subject to abuse, and so on.
In the 1970s, the motives of many involved in the preceding public sector expansion – enabled by high commodity prices and export earnings, as well as low real interest rates due to easy credit as banks sought to ‘recycle petro-dollars’ (invest revenues from petroleum exports) – were developmental and noble.
Regardless of their original rationale or intent, many SOEs became problematic and often inefficient. Yet privatization is not, and has never been, a universal panacea for the myriad problems faced by SOEs.
Only more pragmatic and appropriate approaches — recognizing their origins, roles, functioning, impacts and problems — can realistically be expected to address and overcome the burdens they have come to impose on many developing economies.

Various meanings
Privatization usually refers to a change of ownership from public to private hands. Over recent decades, the term has been used more loosely. For example, it may only involve minority private ownership after the corporatization of an SOE, and the sale of a minority share of its stock, or even a majority share with control remaining in state hands by various means such as the use of a ‘golden share’.
It sometimes also refers to contracting out services previously undertaken solely by the government. The definition may include cases where private enterprises are awarded licenses to participate in activities previously reserved for the public sector.
Strictly speaking, however, privatization involves the transfer of at least a majority share of, and a controlling interest in, a public enterprise or SOE and its assets, or an entity (such as a government department, a statutory body or a government company) previously controlled and typically at least majority-owned by the government, either directly or indirectly.

Mainstreaming privatization
Following the oil price shocks of the mid- and late 1970s, inflation spread through much of the world. US President Jimmy Carter appointed Paul Volcker as Chairman of the US Federal Reserve in 1979. The Fed then sharply raised interest rates to stem inflation, which precipitated the fiscal and debt crises of the early 1980s in many parts of the world, especially in Latin America, Africa and Eastern Europe.
The unexpected sovereign debt crises forced many countries to seek emergency financial support from the International Monetary Fund (IMF) and the World Bank (WB), both headquartered in Washington, DC. The IMF provided emergency credit facilities requiring (price) stabilization programmes to bring down inflation, typically blamed on ‘deficit financing’ due to ‘macroeconomic populism’.
Generally, the WB worked closely with the IMF to provide medium- and long-term credit to these governments on condition that they adopted structural adjustment programmes (SAPs). The SAPs generally prescribed economic globalization (especially of international trade and finance), national (or domestic) deregulation and privatization.
Since then, these international financial institutions have been more powerful in relation to developing countries than ever before. Soon, privatization became a standard requirement of SAPs. Thus, many governments of developing countries were forced to privatize by the SAPs’ loan conditions.
Many other governments voluntarily adopted such policies which became standard pillars of the emerging ‘Washington Consensus’ associated with the WB, the IMF and the US policy consensus of the 1980s. Privatization in developing countries was preceded by the political ‘counter-revolution’ associated with the rise and election of Margaret Thatcher as the Prime Minister of the United Kingdom and Ronald Reagan as the President of the United States of America.
Jomo Kwame Sundaram, a former economics professor, was United Nations Assistant Secretary-General for Economic Development, and received the Wassily Leontief Prize for Advancing the Frontiers of Economic Thought.
By Frank Ackerman
Climate change is at once a common problem that threatens us all, and a source of differential harms based on location and resources. We are all in the same boat, in perilous waters – but some of us have much nicer cabins than others. What is the relationship of inequality to climate policy?
The ultimate economic obstacle to climate policy is the long life of so many investments. Housing can last for a century or more, locking residents into locations that made sense long ago. Business investments often survive for decades. These investments, in the not-so-distant past, assumed continuation of cheap oil and minimally regulated coal – thereby building in a commitment to high carbon emissions. Now, in a climate-aware world, we need to treat all fossil fuels as expensive and maintain stringent regulation of coal. And it is impossible to repurpose many past investments for the new era: they are sunk costs, valuable only in their original location or industry.
If we could wave a magic wand and have a complete do-over on urban planning, we could create a new, more comfortable and more sustainable way of life. Transit-centered housing complexes, surrounded by green spaces and by local amenities and services, could offer convenient car-free links to major employment sites. Absent a magic wand, the challenge is how to get there from here, in a short enough time frame to matter for climate policy.
Space is the final frontier in energy use. Instead of shared public spaces for all, an ever-more-unequal society allows the rich to enjoy immense private spaces, such as McMansions situated on huge exurban lots. This leads to higher heating and cooling costs for oversized housing, and to higher infrastructure costs in general: longer pipes, wires and travel distances between houses. And it locks in a commitment to low population density and long individual commutes. Outside of the biggest cities, much of the United States is too sparsely settled for mass transit.

Pushing toward clean energy
Carbon prices and other incentives are designed to push people and businesses out of the most emissions-intensive locations and activities. Along with the wealthy exurbs, cold rural states, with high heating and transportation requirements per person, will become more expensive. So, too, will investment in emissions-intensive production processes, whether in electricity generation, heavy industry, or agriculture.
The art of policymaking requires a delicate balance. Too much pressure to make fuel expensive can produce a backlash, as in the Yellow Vests protests in France, which successfully blocked an increase in the price of gasoline. Too little pressure leads to complacency, to the false belief that enough is already being done. Subsidies to support the transition may be useful but must be time-limited to avoid becoming a permanent entitlement.
The green new deal, the hopeful, if still vague, political vision that is now drawing widespread attention, calls for a transition to clean energy, investment in low-carbon infrastructure, and a focus on equality and workers’ rights. It would create substantial net benefits for the country and the economy. A more fine-grained analysis is needed, however, to identify those who might lose from the transition. Their losses will loom large in the policy debate, regardless of the benefits to the rest of society.
For example, after years of seniority-based cutbacks, many of the remaining workers in legacy energy industries (coal mines, oil wells, fossil-fueled power plants) are nearing retirement age. Pension guarantees, combined with additional funding to allow early retirement, may be more important to these workers, while new green jobs could be important to their children or to the smaller number of younger workers in at-risk jobs.
Older residents who have spent their lives and invested their savings in a rural community, or have no assets except a farm, should be welcome to remain in those communities. But the lingering mystique of an almost-vanished rural America should not lead to new initiatives to attract younger residents back to an energy-intensive, emissions-intensive lifestyle.
Responding to inequality
Energy use and carbon emissions are quite unequally distributed, within as well as between countries. In all but the poorest countries, the rich spend more on energy in absolute dollar terms, but less than others as a percentage of income. As a result, any carbon price introduced in the United States or other high-income countries will be regressive, taking a greater percentage of income from lower-income households.
To address this problem, James Boyce proposes refunding carbon revenues to households on an equal per capita basis, in a cap-and-dividend system. Boyce’s calculations show that most people could come out ahead on a cap-and-dividend plan: only the richest 20 percent of U.S. households would lose from paying a relatively high carbon price, if the revenues were refunded via equal per capita dividends.
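Boyce’s distributional arithmetic can be illustrated with a toy calculation. All figures below are hypothetical, chosen only to show the mechanics of an equal per-capita refund, not drawn from his actual estimates:

```python
# Toy cap-and-dividend arithmetic (all numbers hypothetical).
PRICE = 50.0  # carbon price, $ per ton CO2

# Hypothetical annual household emissions by income quintile (tons CO2),
# rising with income since the rich spend more on energy in absolute terms
emissions = [8, 12, 16, 20, 48]

payments = [e * PRICE for e in emissions]    # carbon price paid per household
dividend = sum(payments) / len(payments)     # equal per-household refund
net = [dividend - p for p in payments]       # net gain (+) or loss (-)

for quintile, n in enumerate(net, start=1):
    print(f"Quintile {quintile}: net ${n:+,.0f} per year")
```

With these illustrative numbers, the four lower quintiles come out ahead and only the top quintile pays in more than it gets back, matching the qualitative pattern Boyce describes.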
Other authors have proposed that some of the revenues could go to basic research or to infrastructure development, accelerating the arrival of sustainable energy use. Any use of the revenues, except distribution in proportion to individual fuel use or emissions, preserves the incentive effect of a carbon price. The question of cap-and-dividend versus investment in sustainable energy is largely a debate about what will make a regressive carbon price politically acceptable.
It is not only households that have invested too heavily in now-obsolete patterns of energy use. The same pattern arises in a different context, in the energy sector itself. Electric utilities have often invested in fossil-fuel-burning plants, expecting to recover their investment over 20 to 30 years of use. Now, as changing prices and priorities shut some of those plants before the end of their planned lifetimes, the unrecovered investment is a stranded asset, no longer useful for producers or customers.
The problem is further complicated by the regulatory bargains made in many states. Depending on utility regulations (which differ from state to state), a utility may have formally agreed to allow state regulators to set its rates, in exchange for an opportunity to recover its entire investment over a long period of years. What happens to that regulatory bargain when a regulated plant becomes uneconomic to operate?
Businesses whose investments have gone badly do not elicit the same degree of sympathy as individuals stuck in energy-intensive homes and careers. Indeed, Milton Friedman, the godfather of modern conservative economics, used to emphasize that private enterprise is a profit and loss system, where losses are even more important than profits in forcing companies to use their resources effectively.
Despite Friedman’s praise of losses, demanding that a utility absorb the entire loss on its stranded assets could provoke political obstacles to clean energy and climate policy. Neither zero recovery nor full recovery of a utility’s stranded assets may be appropriate in theory. Given the urgency of a rapid and complete energy transition, it may be more expedient to negotiate a settlement that allows prompt progress. Once again, it is the political art of the deal, not any fixed economic formula, that determines what should be done. Offering utilities too little provokes opposition and delay; offering them too much is unfair to everyone else and could encourage similar mistaken investments in the future.
What does global sustainability look like?
Climate change is a global problem that can only be solved by cooperation among all major countries. The challenge for American policy is not only to reduce our own emissions, but also to play a constructive role in global climate cooperation. U.S. leadership, in cooperation with China and Europe, is crucial to the global effort to control the climate. Reviving that leadership, which had barely surfaced under Obama before being abandoned by Trump, is among the most important things we can do for the world today.
In the longer run, questions of climate justice and international obligations are among the most difficult aspects of climate policy. High-income countries such as the United States and northern Europe bear substantial responsibility for the climate crisis worldwide. Among other approaches, the Greenhouse Development Rights framework combines historical responsibility for emissions and current ability to pay for mitigation, in assigning shares of the global cost of climate stabilization.
In the current political climate there is no hope of achieving complete consensus about international burden-sharing before beginning to address the climate crisis. The urgency of climate protection requires major initiatives as soon as possible, in parallel with (not waiting for the conclusion of) discussions of international equity. U.S. actions on both fronts are essential for global progress toward climate stabilization. Significant steps toward equity and burden-sharing may be required to win the support of emerging economies such as India, Indonesia and Brazil.
Finally, assuming success, what would global sustainable development look like? In view of the rapid urbanization of emerging economies, the key question is, what kind of low-carbon urban life can the world afford? The sprawling, car-intensive and carbon-intensive expanse of Los Angeles, Phoenix, or Houston seems like an amazingly expensive mistake. The compact, energy-efficient, transit-based urbanism of Tokyo or Hong Kong is at least a contender, a high-income life with much lower resource use per person.
The American example matters around the world: if our vision of the good life remains one of extravagant sprawl, others will try to imitate it. If we develop a more sustainable vision of our own future, the whole world will be watching.
Frank Ackerman is principal economist at Synapse Energy Economics in Cambridge, Mass., and one of the founders of Dollars & Sense, which publishes Triple Crisis.
By Frank Ackerman
We need a price on carbon emissions. This opinion, virtually unanimous among economists, is also shared by a growing number of advocates and policymakers. But unanimity disappears in the debate over how to price carbon: there is continuing controversy about the merits of taxes vs. cap-and-trade systems for pricing emissions, and about the role for complementary, non-price policies.
At the risk of spoiling the suspense, this blog post reaches two main conclusions: First, under either a carbon tax or a cap-and-trade system, the price level matters more than the mechanism used to reach that price. Second, under either approach, a reasonably high price is necessary but not sufficient for climate policy; other measures are needed to complement price incentives.

Why taxes and cap-and-trade systems are similar
A carbon tax raises the cost of fossil fuels directly, by taxing their carbon emissions from combustion. This is most easily done upstream, i.e. taxing the oil or gas well, coal mine, or fuel importer, who presumably passes the tax on to end users. There are only hundreds of upstream fuel producers and importers to keep track of, compared to millions of end users.
A cap-and-trade system accomplishes the same thing indirectly, by setting a cap on total allowable emissions, and issuing that many annual allowances. Companies that want to sell or use fossil fuels are required to hold allowances equal to their emissions. If the cap is low enough to make allowances a scarce resource, then the market will establish a price on allowances – in effect, a price on greenhouse gas emissions. Again, it is easier to apply allowance requirements, and thus induce carbon trading, at the upstream level rather than on millions of end users.
If the price of emissions is, for example, $50 per ton of carbon dioxide, then any firm that can reduce emissions for less than $50 a ton will do so – under either a tax or cap-and-trade system. Cutting emissions reduces tax payments, under a carbon tax; it reduces the need to buy allowances under a cap-and-trade system. The price, not the mechanism, is what matters for this incentive effect.
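The firm’s decision rule can be sketched in a few lines; the abatement options and their costs here are hypothetical, invented only to illustrate the comparison:

```python
# A firm's abatement decision under a $50/ton carbon price (illustrative).
# The same rule applies whether the price comes from a tax or an allowance market.
CARBON_PRICE = 50.0  # $ per ton CO2

# Hypothetical abatement options: (description, cost in $ per ton avoided)
options = [
    ("efficiency upgrade", 20.0),
    ("fuel switching", 45.0),
    ("carbon capture retrofit", 90.0),
]

# The firm undertakes every option cheaper than paying the carbon price
adopted = [name for name, cost in options if cost < CARBON_PRICE]
print(adopted)
```

Raising the price to $100 would flip the third option from “pay for emissions” to “abate,” which is exactly the incentive effect the post describes: the price level, not the mechanism, does the work.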
A review of the economics literature on carbon taxes vs. cap-and-trade systems found a number of other points of similarity. Either system can be configured to achieve a desired distribution of the burden on households and industries, e.g. via free allocation of some allowances, or partial exemption from taxes. Money raised from either taxes or allowance auctions could be wholly or partially refunded to households. Either approach can be manipulated to reduce effects on international competitiveness.
And problems raised with offsets – along the lines of credits given too casually for tree-planting – are not unique to cap and trade. A carbon tax could emerge from Congress riddled with obscure loopholes, which could be as damaging to the integrity of carbon pricing as any of the poorly written offset provisions of existing cap-and-trade systems. More positively speaking, either approach to carbon pricing can be carried out either with or without offsets and tax exemptions.
Why taxes and cap-and-trade systems are different
Compared to the numerous similarities between the two approaches, the list of differences is a shorter one. A carbon tax is easier and cheaper to administer. In theory, a carbon tax provides certainty about the price of emissions, while a cap-and-trade system provides certainty about the quantity of emissions (in practice, these certainties can be undone by too-frequent tinkering with tax rates or emissions caps).
Cap-and-trade systems have been more widely used in practice. The European Union’s Emissions Trading System (EU ETS) is the world’s largest carbon market. Others include the linked carbon market of California and several Canadian provinces, and the Regional Greenhouse Gas Initiative (RGGI) among states in the Northeast.
Numerous critics have pointed to potential flaws in cap-and-trade, such as overly generous, poorly monitored offsets. Many recent cap-and-trade systems, introduced in a conservative era, began with caps so high and prices so low that they have little effect (leaving them open to the criticism that the administrative costs are not justified by the skimpy results). The price must be high enough, and the cap must be low enough, to alter the behavior of major emitters.
The same applies, of course, to a carbon tax. Starting with a trivial level of carbon tax, in order to calm opponents of the measure, runs the risk of “proving” that a carbon price has no effect. The correct starting price under either system is the highest price that is politically acceptable; there is no hope of “getting the prices right” due to the uncertain and potentially disastrous scope of climate damages.
Perhaps the most salient difference between taxes and cap-and-trade is political rather than economic: in an era when people like to chant “no new taxes”, the prospects for any initiative seem worse if it involves a new tax. This could explain why there is so much more experience to date with cap-and-trade systems.
Beyond price incentives
Some carbon emitters, for instance in electricity generation, have multiple choices among alternative technologies. In such cases, price incentives alone are powerful, and producers can respond incrementally, retiring and replacing individual plants when appropriate. Other sectors face barriers that an individual firm cannot usually overcome on its own. Electric vehicles are not practical without an extensive recharging and repair infrastructure, which is just beginning to exist in a few parts of the country. In this case, no reasonable level of carbon price can, by itself, bring an adequate nationwide electric vehicle infrastructure into existence. Policies that build and promote electric vehicle infrastructure are valuable complements to a carbon price: they create a combined incentive to move away from gasoline.
Yet another reason for combining non-price climate policies with a carbon price is that purely price-based decision-making can be exhausting. People could calculate for themselves the fuel saved by buying a more fuel-efficient car and subtract that from the sticker price of the vehicle, but it is not an easy calculation. Federal and state fuel economy standards make the process simpler, by setting a floor underneath vehicle fuel efficiency.
When buying a major appliance, it is possible in theory to read the energy efficiency sticker on the carton, calculate your average annual use of the appliance, convert it to dollars saved per year, and see if that savings justifies purchase of a more efficient appliance. But who does all that arithmetic? Even I don’t want to do that calculation, and I have a PhD in economics and enjoy playing with numbers. My guess is that virtually no one does the calculation consistently and correctly. On the other hand, federal and state appliance efficiency standards have often set minimum levels of required efficiency, which increase over time. It’s much more fun to buy something off the shelf that meets those standards, instead of settling in for an extended data-crunching session any time you need a new fridge, air conditioner, washing machine…
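For the curious, the arithmetic the sticker asks of shoppers looks roughly like this. All prices and usage figures are hypothetical, not taken from any actual energy label:

```python
# Illustrative payback comparison for an efficient vs. standard refrigerator.
# All figures are hypothetical, not from an actual appliance label.
ELECTRICITY_PRICE = 0.15       # $ per kWh

standard_kwh_per_year = 600    # annual use, standard model
efficient_kwh_per_year = 400   # annual use, efficient model
price_premium = 120.0          # extra sticker price of the efficient model

annual_savings = (standard_kwh_per_year - efficient_kwh_per_year) * ELECTRICITY_PRICE
payback_years = price_premium / annual_savings

print(f"Saves ${annual_savings:.2f}/year; premium pays back in {payback_years:.1f} years")
```

Even this stripped-down version requires knowing your electricity rate and guessing at annual usage, which is the author’s point: efficiency standards spare shoppers the calculation entirely.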
In short, the carbon price is what matters, not the mechanism used to adopt that price. And whatever the price, non-price climate policies are needed as well – both to build things that no one company can do on its own, and to make energy-efficient choices accessible to all, without heroic feats of calculation.
Frank Ackerman is principal economist at Synapse Energy Economics in Cambridge, Mass., and one of the founders of Dollars & Sense, which publishes Triple Crisis.
By Jomo Kwame Sundaram
Cross-posted at Inter Press Service.
For two centuries, all too many discussions about hunger and resource scarcity have been haunted by the ghost of Parson Thomas Malthus. Malthus warned that rising populations would exhaust resources, especially those needed for food production. Exponential population growth would outstrip food output.
Humanity now faces a major challenge as global warming is expected to frustrate the production of enough food as the world population rises to 9.7 billion by 2050. Timothy Wise’s new book Eating Tomorrow: Agribusiness, Family Farmers, and the Battle for the Future of Food (New Press, New York, 2019) argues that most solutions currently put forward by government, philanthropic and private sector luminaries are misleading.

Malthus’ ghost returns
The early 2008 food price crisis has often been wrongly associated with the 2008-2009 global financial crisis. The number of hungry in the world was said to have risen to over a billion, feeding a resurgence of neo-Malthusianism.
Agribusiness advocates fed such fears, insisting that food production must double by 2050, and that high-yielding industrial agriculture, under the auspices of agribusiness, is the only solution. In fact, the world is mainly fed by hundreds of millions of small-scale farmers, often called family farmers, who produce over two-thirds of developing countries’ food.
Contrary to conventional wisdom, neither food scarcity nor poor physical access are the main causes of food insecurity and hunger. Instead, Reuters has observed a ‘global grain glut’, with surplus cereal stocks piling up.
Meanwhile, poor production, processing and storage facilities cause food losses of an average of about a third of developing countries’ output. A similar share is believed lost in rich countries due to wasteful food storage, marketing and consumption behaviour.
Nevertheless, despite grain abundance, the 2018 State of Food Insecurity report — by the Rome-based United Nations food agencies led by the Food and Agriculture Organization (FAO) — reported rising chronic and severe hunger or undernourishment involving more than 800 million people.
Political, philanthropic and corporate leaders have promised to help struggling African and other countries grow more food, by offering to improve farming practices. New seed and other technologies would modernize those left behind.
But producing more food, by itself, does not enable the hungry to eat. Thus, agribusiness and its philanthropic promoters are often the problem, not the solution, in feeding the world.
Eating Tomorrow addresses related questions such as: Why doesn’t rising global food production feed the hungry? How can we “feed the world” of rising populations and unsustainable pressure on land, water and other natural resources that farmers need to grow food?

Family farmers lack power
Drawing on five years of extensive fieldwork in Southern Africa, Mexico, India and the US Mid-West, Wise concludes that the problem is essentially one of power. He shows how powerful business interests influence government food and agricultural policies to favour large farms.
This is typically at the expense of ‘family’ farmers, who grow most of the world’s food, but also involves putting consumers and others at risk, e.g., due to agrochemical use. His many examples not only detail and explain the many problems small-scale farmers face, but also their typically constructive responses despite lack of support, if not worse, from most governments:
• In Mexico, trade liberalization under the North American Free Trade Agreement (NAFTA), which took effect in 1994, swamped the country with cheap, subsidized US maize and pork, accelerating migration from the countryside. Apparently, this was actively encouraged by transnational pork producers employing ‘undocumented’ and un-unionised Mexican workers willing to accept low wages and poor working conditions.
• In Malawi, large government subsidies encouraged farmers to buy commercial fertilizers and seeds from US agribusinesses such as Monsanto (now owned by Bayer), but to little effect, as their productivity and food security stagnated or even deteriorated. Meanwhile, Monsanto took over the government seed company, favouring its own patented seeds at the expense of productive local varieties, while a former senior Monsanto official co-authored the national seed policy that threatens to criminalize farmers who save, exchange and sell seeds instead!
• In Zambia, greater use of seeds and fertilizers from agribusiness tripled maize production without reducing the country’s very high rates of poverty and malnutrition. Meanwhile, as the government provides 250,000-acre ‘farm blocks’ to foreign investors, family farmers struggle for title to farm land.
• In Mozambique too, the government gives away vast tracts of farm land to foreign investors. Meanwhile, women-led cooperatives successfully run their own native maize seed banks.
• Meanwhile, Iowa promotes vast monocultures of maize and soybean to feed hogs and bioethanol rather than ‘feed the world’.
• A large Mexican farmer cooperative launched an ‘agro-ecological revolution’, while the old government kept trying to legalize Monsanto’s controversial genetically modified maize. Farmers have thus far halted the Monsanto plan, arguing that GM corn threatens the rich diversity of native Mexican varieties.
Much of the research for the book was done in 2014-15, when Obama was US president, although the narrative begins with developments and policies following the 2008 food price crisis, during Bush’s last year in the White House. The book tells a story of US big business’ influence on policies enabling more aggressive transnational expansion.
Yet, Wise remains optimistic, emphasizing that the world can feed the hungry, many of whom are family farmers. Despite the challenges they face, many family farmers are finding innovative and effective ways to grow more and better food. He advocates support for farmers’ efforts to improve their soil, output and wellbeing.

Eating better
Hungry farmers are nourishing their life-giving soils using more ecologically sound practices to plant a diversity of native crops, instead of using costly chemicals for export-oriented monocultures. According to Wise, they are growing more and better food, and are capable of feeding the hungry.
Unfortunately, most national governments and international institutions still favour large-scale, high-input, industrial agriculture, neglecting more sustainable solutions offered by family farmers, and the need to improve the wellbeing of poor farmers.
Undoubtedly, many new agricultural techniques offer the prospect of improving the welfare of farmers, not only by increasing productivity and output, but also by limiting costs, using scarce resources more effectively, and reducing the drudgery of farm work.
But the world must recognize that farming may no longer be viable for many who face land, water and other resource constraints, unless they get better access to such resources. Meanwhile, malnutrition of various types affects well over two billion people in the world, and industrial agriculture contributes about 30% of greenhouse gas emissions.
Going forward, it will be important to ensure affordable, healthy and nutritious food supplies for all, mindful not only of food and water safety, but also of various pollution threats. A related challenge will be to enhance dietary diversity affordably to overcome micronutrient deficiencies and diet-related non-communicable diseases for all.
Jomo Kwame Sundaram, a former economics professor, was United Nations Assistant Secretary-General for Economic Development, and received the Wassily Leontief Prize for Advancing the Frontiers of Economic Thought.
By Frank Ackerman
Carbon dioxide (CO2) represents most, but not all, greenhouse gas emissions. In EPA’s Greenhouse Gas Inventory for 2016, CO2 represented 82 percent of gross U.S. GHG emissions, while methane represented 10 percent (measured as CO2-equivalents). The top three sources of methane are agriculture, the energy industry, and waste management.
As fascinating as some of us may find such details, the general public has a short attention span for new information about climate change. Within that constraint, what do we want to communicate? For methane, there are two choices, an introductory and an advanced message.
The introductory message emphasizes that methane, the principal component of natural gas, is an important cause of global warming under any version of the data. It is therefore crucial to reduce and eliminate all fossil fuels, gas included, as soon as possible, replacing them with efficiency, renewables and energy storage.
At a more advanced level, some new research suggests that conventional data understate methane emissions. And a different way of comparing methane to CO2 implies that methane should have a higher CO2-equivalent value, raising its relative importance in GHG accounting. Some combination of these factors might even make natural gas as bad as coal, from a global warming perspective.
The introductory message is the one that matters for public communication. It focuses discussion on the simple fact that natural gas, like other fossil fuels, has got to go: it is part of the problem, not the solution. The advanced message, in contrast, emphasizes technical controversies and interpretation of recent research. It tends to produce eyes-glazed-over responses along the lines of “I wasn’t that good at science in school.”
But if you’re reading this, you can probably follow the technical debates, at least partway down the rabbit hole. Consider three chapters of the story of methane: the time span for calculating CO2-equivalents; the issue of gas leaks; and the gas vs. coal comparison.
Thinking about tomorrow
Methane is a much more potent heat-trapping gas than CO2, but CO2 remains in the atmosphere and traps heat for much longer than methane. On balance, how much does a ton of methane emissions contribute to warming, relative to a ton of CO2?
The answer depends on the time period under consideration. Methane has an atmospheric lifetime of 12 years. CO2 is affected by several processes that operate at very different speeds: 50 percent of CO2 is removed from the atmosphere within 30 years of emission, while 20 percent persists in the atmosphere for thousands of years.
Zooming in on a timespan as short as 20 years after emissions means focusing mainly on years when methane is still present in the atmosphere, trapping a lot of heat. Over a longer interval such as 100 years after emissions, most of the years are ones when methane has faded away, while a significant fraction of the CO2 emissions remains in the atmosphere. As a result, the CO2-equivalent value of methane is 84 over a 20-year period, compared to only 28 over a 100-year period.
Early IPCC and other technical reports tended to standardize on the 100-year CO2-equivalent value, implying that methane is 28 times as bad per ton as CO2. More recent studies have often highlighted the 20-year CO2-equivalent value, making methane 84 times as bad.
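The two exchange rates amount to a simple multiplication when converting methane tonnage into CO2-equivalents; a minimal sketch, using the 84 and 28 values quoted above:

```python
# CO2-equivalent value of methane depends on the time horizon chosen.
# GWP values from the text: 84 over 20 years, 28 over 100 years.
GWP_CH4 = {20: 84, 100: 28}

def co2_equivalent(tons_ch4, horizon):
    """Convert tons of methane to tons of CO2-equivalent for a given horizon."""
    return tons_ch4 * GWP_CH4[horizon]

# One ton of methane, counted two ways:
print(co2_equivalent(1, 20))   # 84 tons CO2e over 20 years
print(co2_equivalent(1, 100))  # 28 tons CO2e over 100 years
```

The same physical emission thus carries a threefold difference in accounting weight, which is why the choice of horizon matters so much in GHG inventories.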
Neither one nor the other is the correct value. Climate change is a problem of both short-term urgency and long-term consequences, of 20-year impacts, 100-year impacts, and beyond. This produces an awkward situation for research and reporting on greenhouse gases: the “exchange rate” between the two leading gases is either 28 or 84. It is less of a problem for public policy, where either of the CO2-equivalent values for methane is enough to make the case: a low-carbon economy must eliminate methane emissions, not rely on natural gas as a bridge to anywhere we want to go.
Counting the leaks
Natural gas leaks from wells and pipelines, increasing the lifecycle methane emissions associated with gas-fired heating or electricity generation. EPA estimated that methane leaks represented 1.4 percent of gas production nationwide in 2015. But a new study based on extensive field measurements found that methane leaks were 2.3 percent of gas production that year. Other studies have reported even higher leakage rates in areas where fracking is widespread.
It would be a mistake, however, to pin the critique of natural gas solely to high levels of leaks. The same study that found leaks of 2.3 percent also found that “the higher estimates stem from a small number of so-called superemitters … most tied to [malfunctioning] hatches and vents in natural gas storage tanks at extraction wells.”
It is not hard to imagine the industry, under pressure from regulators, fixing the malfunctioning hatches and vents, and developing better ways to seal leaks in general. This would increase the amount of gas that could be delivered to customers, potentially increasing industry profits. The International Energy Agency, which estimates gas leaks of 1.7 percent worldwide, also finds that 40 to 50 percent of current methane emissions could be avoided at no net cost.
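How much do leaks at these rates add to gas’s warming footprint? A rough back-of-the-envelope sketch (the complete-combustion stoichiometry and the use of the 20-year GWP of 84 are my illustrative assumptions; the leak rates are the EPA and field-study figures above):

```python
# Rough lifecycle CO2e per kg of gas delivered, as a function of the leak rate.
# Illustrative assumptions: complete combustion (1 kg CH4 -> 44/16 = 2.75 kg CO2),
# 20-year GWP of 84 for methane, leak rate measured as a share of production.

def co2e_per_kg_delivered(leak_rate, gwp=84):
    combustion_co2 = 44 / 16                   # kg CO2 from burning 1 kg CH4
    leaked_ch4 = leak_rate / (1 - leak_rate)   # kg CH4 leaked per kg delivered
    return combustion_co2 + gwp * leaked_ch4

for rate in (0.014, 0.023):                    # EPA vs. field-study leak rates
    print(f"{rate:.1%}: {co2e_per_kg_delivered(rate):.2f} kg CO2e per kg delivered")
```

Under these assumptions, moving from the EPA’s leak estimate to the field-study estimate raises the per-kilogram footprint noticeably, but fixing the leaks shrinks the gap again, which is the point of the paragraph above.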
The key point about methane is not the current high levels of leaks. Rather, reducing methane emissions, from leaks and other sources, is one of the most cost-effective strategies for greenhouse gas mitigation.
Different shades of bad
Some environmental advocates now claim that burning gas is just as bad for the climate as burning coal. There are several strong counterarguments, which do not undermine the case against gas.
Above all, coal is an environmental disaster, causing havoc throughout its life cycle. Coal mining devastates local communities, on a level that equals or surpasses anything done by fracking. Mining itself releases methane, equal to 8 percent of U.S. methane emissions according to the 2016 greenhouse gas inventory. The canaries in the coal mines, back in the day, were brought in to detect carbon monoxide and methane, the deadly gases that threatened miners.
Coal combustion gives rise not only to CO2, but also to many toxic pollutants which kill people near the plants. Since coal plants are usually located in low-income and minority neighborhoods, plant siting raises issues of environmental justice. After combustion, coal ash must be disposed of, creating a whole new set of toxic risks and environmental justice concerns in the siting of disposal facilities. Gas does not cause local toxic emissions or leave ash behind when it burns.
Even restricting attention to greenhouse gas emissions, an extraordinary level of leakage is required to make gas as bad as coal from a 20-year perspective, let alone a 100-year perspective. The IEA has a graph displaying the relationship between leak rates, time horizon, and climate impacts from coal vs. gas.
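The break-even calculation behind that graph can be sketched with stylized emission factors (all of the numbers below are illustrative assumptions of mine, not the IEA’s): say coal-fired power emits about 1.0 kg CO2e per kWh, gas combustion about 0.4 kg CO2 per kWh, and a gas plant burns about 0.145 kg of methane per kWh generated.

```python
# Break-even leak rate at which gas-fired power matches coal, per kWh.
# All emission factors are stylized illustrations, not IEA figures:
#   coal: ~1.00 kg CO2e/kWh; gas combustion: ~0.40 kg CO2/kWh;
#   fuel use: ~0.145 kg CH4 per kWh of gas-fired electricity.

def breakeven_leak_rate(gwp, coal=1.00, gas=0.40, ch4_per_kwh=0.145):
    # Solve gas + gwp * ch4_per_kwh * f/(1-f) = coal for the leak rate f.
    x = (coal - gas) / (gwp * ch4_per_kwh)   # x = f / (1 - f)
    return x / (1 + x)

print(f"20-year GWP (84):  {breakeven_leak_rate(84):.1%}")   # roughly 5%
print(f"100-year GWP (28): {breakeven_leak_rate(28):.1%}")   # roughly 13%
```

Even on the methane-unfriendly 20-year horizon, leaks would have to be roughly double the highest field-study estimates before gas matched coal, which is the sense in which the required leakage is “extraordinary.”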
Finally, consider the emotional meaning of the statement that gas is as bad as coal. It often seems as if activists feel the need to show that gas is as bad as it gets, in order to support opposition to gas-fired power plants. This is surely a mistake.
It is not necessary to make something the worst ever, in order to establish that it is bad. George W. Bush was a bad president, for the environment and so much else; now it turns out that he was not the worst possible president. There is no reason to claim that Bush was as bad as Trump – or that a return to Bush-era policies would be a bridge to the future. It’s just a different shade of bad.
A gas-fired electric grid is different from a coal-fired one. But from a climate perspective, they are different shades of bad: both involve carbon emissions far above a sustainable, climate-friendly level. The need for a carbon-free alternative is the conclusion that matters, independent of the latest research details on methane.
Frank Ackerman is principal economist at Synapse Energy Economics in Cambridge, Mass., and one of the founders of Dollars & Sense, which publishes Triple Crisis.
By Frank Ackerman
Second in a series of posts on climate policy. Find Part 1 here.
According to scientists, climate damages are deeply uncertain, but could be ominously large (see the previous post). Alternatively, according to the best-known economic calculation, lifetime damages caused by emissions in 2020 will be worth $51 per metric ton of carbon dioxide, in 2018 prices.
These two views can’t both be right. This post explains where the $51 estimate comes from, why it’s not reliable, and the meaning for climate policy of the deep uncertainty about the value of damages.
A tale of three models
The “social cost of carbon” (SCC) is the value of present and future climate damages caused by a ton of carbon dioxide emissions. The Obama administration assembled an Interagency Working Group to estimate the SCC. In its final (August 2016) revision of the numbers, the most widely used variant of the SCC was $42 per metric ton of carbon dioxide emitted in 2020, expressed in 2007 dollars – equivalent to $51 in 2018 dollars. Numbers like this were used in Obama-era cost-benefit analyses of new regulations, placing a dollar value on the reduction in carbon emissions from, say, vehicle fuel efficiency standards.
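The conversion from 2007 to 2018 dollars is just an inflation adjustment; a one-line sketch (the 1.21 factor is my approximation of cumulative CPI inflation over that period, chosen to match the figures in the text):

```python
# The Working Group's headline number is in 2007 dollars; converting to 2018
# dollars applies a cumulative inflation factor of roughly 1.21 (illustrative,
# approximating CPI change from 2007 to 2018).
scc_2007_dollars = 42.0
inflation_factor = 1.21
scc_2018_dollars = scc_2007_dollars * inflation_factor
print(round(scc_2018_dollars))  # about $51 per metric ton of CO2
```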
To create these numbers, the Working Group averaged the results from three well-known models. These do not provide more detailed or in-depth analysis than other models. On the contrary, two of them stand out for being simpler and easier to use than other models. They are, however, the most frequently cited models in climate economics. They are famous for being famous, the Kardashians of climate models.
DICE, developed by William Nordhaus at Yale University, offers a skeletal simplicity: it represents the dynamics of the world economy, the climate, and the interactions between the two with only 19 equations. This (plus Nordhaus’ free distribution of the software) has made it by far the most widely used model, valuable for classroom teaching, for initial high-level sketches of climate impacts, and for researchers (at times including myself) who lack the funding to acquire and use more complicated models. Yet no one thinks that DICE represents the frontier of knowledge about the world economy or the environment. DICE estimates aggregate global climate damages as a quadratic function of temperature increases, rising only gradually as the world warms.
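The shape of that damage function can be sketched in a few lines (the coefficient below is an illustrative stand-in for Nordhaus’s calibration, chosen only to show how slowly a quadratic rises at low temperatures):

```python
# DICE-style damages: a quadratic function of the temperature increase T,
# expressed as a fraction of global output. The coefficient is an
# illustrative stand-in, not Nordhaus's calibrated value.
def damage_fraction(temp_increase_c, coeff=0.00236):
    return coeff * temp_increase_c ** 2

for t in (1, 2, 3, 4):
    print(f"+{t}C: {damage_fraction(t):.1%} of global output")
```

Because the function is quadratic, doubling the temperature increase quadruples the damages; but with a small coefficient, even 4°C of warming registers as only a few percent of output, which is why critics see DICE damages as rising “only gradually.”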
PAGE, developed by Chris Hope at Cambridge University, resembles DICE in level of complexity, and has been used in many European analyses. It is the only one of the three models to include any explicit treatment of uncertain climate risks, assuming the threat of an abrupt, mid-size economic loss (beyond the “predictable” damages) that becomes both more likely and more severe as temperatures rise. Perhaps for this reason, PAGE consistently produces the highest SCC estimates among the three models.
FUND, developed by Richard Tol and David Anthoff, is more detailed than DICE or PAGE, with separate treatment of more than a dozen damage categories. Yet the development of these damage estimates has been idiosyncratic, in some cases (such as agriculture) relying on relatively optimistic research from 20 years ago rather than more troubling, recent findings on climate impacts. Even in later versions, after many small updates, FUND still estimates that many of its damage categories are too small to matter; in some FUND scenarios, the largest cost of warming is the increased expenditure on air conditioning.
Much has been written about what’s wrong with relying on these three models. The definitive critique is the National Academy of Sciences study, which reviews the shortcomings of the three models in detail and suggests ways to build a better model for estimating the SCC. (Released just days before the Trump inauguration, the study was doomed to be ignored.)
Expected climate damages are uncertain over a wide range, including the possibility of disastrously large impacts. The SCC is a monetary valuation of expected damages per ton of carbon dioxide. Therefore, SCC values should be uncertain over a wide range, including the possibility of disastrously high values.
Look beyond the three-model calculation, and the range of possible SCC values is extremely wide, including very high upper bounds. Many studies have adopted DICE or another model as a base, then demonstrated that minor, reasonable changes in assumptions lead to huge changes in the SCC. To cite a few examples:
- A meta-analysis of SCC values found that, in order to reflect major climate risks, the SCC needs to be at least $125.
- A study by Simon Dietz and Nicholas Stern found a range of optimal carbon prices (i.e. SCC values), depending on key climate uncertainties, ranging from $45 to $160 for emissions in 2025, and from $111 to $394 for emissions in 2055 (in 2018 dollars per ton of carbon dioxide).
- In my own research, coauthored with Liz Stanton, we found that a few major uncertainties lead to an extremely wide range of possible SCC values, from $34 to $1,079 for emissions in 2010, and from $77 to $1,875 for 2050 emissions (again converted to 2018 dollars).
- Martin Weitzman has written several articles emphasizing that the SCC depends heavily on the unknown shape of the damage function – that is, the details of the assumed relationship between rising temperatures and rising damages. His “Dismal Theorem” article argues that the marginal value of reducing emissions – the SCC – is literally infinite, since catastrophes that would cause human extinction remain too plausible to ignore (although they are not the most likely outcomes).
Whether or not the SCC is infinite, many researchers have found that it is uncertain, with the broad range of plausible values including dangerously high estimates. This is the economic reflection of scientific uncertainty about the timing and extent of climate damages.
How much can we afford?
As explained in the previous post in this series, deep uncertainty about the magnitude and timing of risks stymies the use of cost-benefit analysis for climate policy. Rather, policy should be set in an insurance-like framework, focused on credible worst-case losses rather than most likely outcomes. Given the magnitude of the global problem, this means “self-insurance” – investing in measures that make worst cases less likely.
How much does climate “self-insurance” – greenhouse gas emission reduction – cost? Several early (2008 to 2010) studies of rapid decarbonization, pushing the envelope of what was technically feasible at the time, came up with mid-century carbon prices of roughly $150 to $500 per ton of carbon dioxide abated. Since then, renewable energy has experienced rapid progress and declining prices, undoubtedly lowering the carbon price on a maximum feasible reduction scenario.
Even at $150 to $500 per ton, the cost of abatement was comparable to or lower than many of the worst-case estimates of the SCC, or climate damages per ton. In short, we already know that doing everything on the least-cost emission reduction path will cost less, per ton of carbon dioxide, than worst-case climate damages.
That’s it: end of economic story about evaluating climate policy. We don’t need more exact, accurate SCC estimates; they will not be forthcoming in time to shape policy, due to the uncertainties involved. Since estimated worst-case damages are rising over time, while abatement costs (such as the costs of renewables) are falling, the balance is tipping farther and farther toward “do everything you can, now.” That was already the correct answer some years ago, and only becomes more correct over time.
That’s not the end of this series of blog posts, however. Three more are coming, addressing three policy problems that arise in climate advocacy: how to talk about methane and natural gas; taxes versus cap and trade systems; and the role of equity and economic obstacles to climate policy.
By Frank Ackerman
First in a series of posts on climate policy.
The damages expected from climate change seem to get worse with each new study. Reports from the IPCC and the U.S. Global Change Research Program, and a multi-author review article in Science, all published in late 2018, are among the recent bearers of bad news. Even more continues to arrive in a swarm of research articles, too numerous to list here. And most of these reports are talking about not-so-long-term damages. Dramatic climate disruption and massive economic losses are coming in just a few decades, not centuries, if we continue along our present path of inaction. It’s almost enough to make you support an emergency program to reduce emissions and switch to a path of rapid decarbonization.
But wait: isn’t there something about economics we need to figure out first? Would drastic emission reductions pass a cost-benefit test? How do we know that we wouldn’t be spending too much on climate policy?
In fact, a crash program to decarbonize the economy is obviously the right answer. There are just a few things you need to know about the economics of climate policy, in order to confirm that Adam Smith and his intellectual heirs have not overturned common sense on this issue. Three key points are worth remembering.
Worst-case risks matter more than likely outcomes
For uncertain, extreme risks, policy should be based on the credible worst-case outcome, not the expected or most likely value. This is the way people think about insurance against disasters. The odds that your house won’t burn down next year are better than 99 percent – but you probably have fire insurance anyway. Likewise, young parents have more than a 99 percent chance of surviving the coming year, but often buy life insurance to protect their children.
Real uncertainty, of course, has nothing to do with the fake uncertainty of climate denial. In insurance terms, real uncertainty consists of not knowing when a house fire might occur; fake uncertainty is the (obviously wrong) claim that houses never catch fire. See my Worst-Case Economics for more detailed exploration of worst cases and (real) uncertainty, in both climate and finance.
For climate risks, worst cases are much too dreadful to ignore. What we know is that climate change could be very bad for us; but no one knows exactly how bad it will be or exactly when it will arrive. How likely are we to reach tipping points into an irreversibly worse climate, and when will these tipping points occur? As the careful qualifications in the IPCC and other reports remind us, climate change could be very bad, surprisingly soon, but almost no one is willing to put a precise number or date on the expected losses.
One group does rush in where scientists fear to tread, guessing about the precise magnitude and timing of future climate damages: economists engaged in cost-benefit analysis (CBA). Rarely used before the 1990s, CBA has become the default, “common-sense” approach to policy evaluation, particularly in environmental policy. In CBA-world you begin by measuring and monetizing the benefits, and the costs, of a policy – and then “buy” the policy if, and only if, the monetary value of the benefits exceeds the costs.
There are numerous problems with CBA, such as the need to (literally) make up monetary prices for priceless values of human life, health and the natural environment. In practice, CBA often trivializes the value of life and nature. Climate policy raises yet another problem: CBA requires a single number, such as a most likely outcome, best guess, or weighted average, for every element of costs (e.g. future costs of clean energy) and benefits (e.g. monetary value of future damages avoided by clean energy expenditures). There is no simple way to incorporate a wide range of uncertainty about such values into CBA. The second post in this series will look more deeply at economists’ misplaced precision about climate damages.
Costs of emission reduction are dropping fast
The insurance analogy is suggestive, but not a perfect fit for climate policy. There is no intergalactic insurance agency that can offer us a loaner planet to use while ours is towed back to the shop for repairs. Instead, we will have to “self-insure” against climate risks – the equivalent of spending money on fireproofing your house rather than relying on an insurance policy.
Climate self-insurance consists largely of reducing carbon emissions, in order to reduce future losses. The one piece of unalloyed good news in climate policy today is the plummeting cost of clean energy. In the windiest and sunniest parts of the world (and the United States), new wind and solar power installations now produce electricity at costs equal to or lower than from fossil fuel-burning plants.
A 2017 report from the International Renewable Energy Agency (IRENA) projects that this will soon be true worldwide: global average renewable energy costs will be within the range of fossil fuel-fired costs by 2020, with on-shore wind and solar photovoltaic panels at the low end of the range. Despite low costs for clean energy, many utilities will still propose to build fossil fuel plants, reflecting the inertia of traditional energy planning and the once-prudent wisdom of the cheap-fuel, pre-climate change era.
Super-low costs for renewables, which would have seemed like fantasies 10 years ago, are now driving the economics and the feasibility of plans for decarbonization. Many progressive Democrats have endorsed a “green new deal”, calling for elimination of fossil fuels, massive investment in energy efficiency and clean energy, and fairness in the distribution of jobs and opportunities.
Robert Pollin, an economist who has studied green new deal options, estimates that annual investment of about 1.5 percent of GDP would be needed. That’s about $300 billion a year for the United States, and four times as much, $1.2 trillion a year, for the world economy. Those numbers may sound large, but so are the fossil fuel subsidies and investments that the green new deal would eliminate.
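Pollin’s figures are easy to check with rough GDP numbers (the $20 trillion U.S. and $80 trillion world GDP figures are my approximations, implied by the ratios in the text):

```python
# Checking the green new deal arithmetic: 1.5 percent of GDP per year.
# GDP figures are rough illustrations: ~$20 trillion (U.S.), ~$80 trillion (world).
share = 0.015
us_gdp = 20e12
world_gdp = 80e12
print(f"U.S.:  ${share * us_gdp / 1e9:.0f} billion per year")      # ~$300 billion
print(f"World: ${share * world_gdp / 1e12:.1f} trillion per year")  # ~$1.2 trillion
```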
In a 2015 study, my colleagues and I calculated that 80 percent of U.S. greenhouse gas emissions could be eliminated by 2050, with no net increase in energy or transportation costs. Since that time, renewables have only gotten cheaper. (Our result does not necessarily contradict Pollin’s estimate, since the last 20 percent of emissions will be the hardest and most expensive to eliminate.)
These projections of future costs are inevitably uncertain, because the future has not happened yet. The risks, however, do not appear dangerous or burdensome. So far, the surprises on the cost side have been unexpectedly rapid decreases in renewable energy prices. These are not the risks that require rethinking our approach to climate policy.
Costs of not reducing emissions may be disastrously large
The disastrous worst-case risks are all on the benefits, or avoided climate damages, side of the ledger. The scientific uncertainties about climate change concern the timing and extent of damages. Therefore, the urgency of avoiding these damages, or conversely the cost of not avoiding them, is intrinsically uncertain, and could be disastrously large.
It has become common, among economists, to estimate the “social cost of carbon” (SCC), defined as the monetary value of the present and future climate damages per ton of carbon dioxide or equivalent. This is where the pick-a-number imperative of cost-benefit analysis introduces the greatest distortion: huge uncertainties in damages should naturally translate into huge uncertainties in the SCC, not a single point estimate.
The next post in this series will examine the debates about the SCC, showing that there are indeed large uncertainties in its value, no matter how inconvenient that may be for economists and their models.
 Adaptation, or expenditure to reduce vulnerability to climate damages, is also important but may not be effective beyond the early stages of warming. And some adaptation costs are required to cope with warming that can no longer be avoided – that is, they have become sunk costs, not present or future policy choices.
By Timothy A. Wise
Cross-posted at Food Tank.
On December 17, the United Nations General Assembly took a quiet but historic vote, approving the Declaration on the Rights of Peasants and Other People Working in Rural Areas, by a vote of 121-8 with 52 abstentions. The declaration, which was the product of some 17 years of diplomatic work led by the international peasant alliance La Via Campesina, formally extends human rights protections to farmers whose “seed sovereignty” is threatened by government and corporate practices.
“As peasants we need the protection and respect for our values and for our role in society in achieving food sovereignty,” said Via Campesina coordinator Elizabeth Mpofu after the vote. Most developing countries voted in favor of the resolution, while many developed country representatives abstained. The only “no” votes came from the United States, United Kingdom, Australia, New Zealand, Hungary, Israel, and Sweden.
“To have an internationally recognized instrument at the highest level of governance that was written by and for peasants from every continent is a tremendous achievement,” said Jessie MacInnis of Canada’s National Farmers Union. The challenge now, of course, is to mobilize small-scale farmers to claim those rights, which are threatened by efforts to impose rich-country crop breeding regulations onto less developed countries, where the vast majority of food is grown by peasant farmers using seeds they save and exchange.
Seed sovereignty in Zambia
The loss of seed diversity is a national problem in Zambia. “We found a lot of erosion of local seed varieties,” Juliet Nangamba, program director for the Community Technology Development Trust, told me in her Lusaka office. She is working with the regional Seed Knowledge Initiative (SKI) to identify farmer seed systems and prevent the disappearance of local varieties. “Even crops that were common just ten years ago are gone.” Most have been displaced by maize, which is heavily subsidized by the government. She’s from Southern Province, and she said their survey found very little presence of finger millet, a nutritious, drought-tolerant grain far better adapted to the region’s growing conditions.
Farmers are taking action. Mary Tembo welcomed us to her farm near Chongwe in rural Zambia. Trained several years ago by Kasisi Agricultural Training Center in organic agriculture, Tembo is part of the SKI network, which is growing out native crops so seed is available to local farmers. Tembo pulled some chairs into the shade of a mango tree to escape the near-100-degree Fahrenheit heat, an unseasonable reminder of Southern Africa’s changing climate. Rains were late, as they had been several of the last few years. Farmers had prepared their land for planting but were waiting for a rainy season they could believe in.
Tembo didn’t seem worried. She still had some of her land in government-sponsored hybrid maize and chemical fertilizer, especially when she was lucky enough to get a government subsidy. But most of her land was in diverse native crops, chemical free for ten years.
“I see improvements from organic,” she explained, as Kasisi’s Austin Chalala translated for me from the local Nyanja language. “It takes more work, but we are now used to it.” The work involves more careful management of a diverse range of crops planted in ways that conserve and rebuild the soil: crop rotations, intercropping, conservation farming with minimal plowing, and the regular incorporation of crop residues and composted manure to build soil fertility. She has six pigs, seven goats, and twenty-five chickens, which she says gives her enough manure for the farm.
She was most proud of her seeds. She disappeared into the darkness of her small home. I was surprised when she emerged with a large fertilizer bag. She untied the top of the bag and began to pull out her stores of homegrown organic seeds. She laughed when I explained my surprise. She laid them out before us, a dazzling array: finger millet, orange maize, Bambara nuts, cowpeas, sorghum, soybeans, mung beans, three kinds of groundnuts, popcorn, common beans. All had been saved from her previous harvest. The contribution of chemical fertilizer to these crops was, clearly, just the bag.
She explained that some would be sold for seed. There is a growing market for these common crops that have all but disappeared with the government’s obsessive promotion of maize. Some she would share with the 50 other farmer members of the local SKI network. And some she and her family would happily consume. Crop diversity is certainly good for the soil, she said, but it’s even better for the body.
Peasant rights crucial to climate adaptation
We visited three other Kasisi-trained farmers. All sang the praises of organic production and its diversity of native crops. All said their diets had improved dramatically, and they are much more food-secure than when they planted only maize. Diverse crops are the perfect hedge against a fickle climate. If the maize fails, as it has in recent years, other crops survive to feed farmers’ families, providing a broader range of nutrients. Many traditional crops are more drought-tolerant than maize.
Another farmer we visited had already planted, optimistically, before the rains arrived. She showed us her fields, dry and with few shoots emerging. With her toe she cleared some dirt from one furrow to reveal small green leaves, alive in the dry heat. “Millet,” she said proudly. With a range of crops, she said, “the farmer can never go wrong.”
I found the same determination in Malawi, where the new Farm-Saved Seed Network (FASSNet) is building awareness and working with government on a “Farmers’ Rights” bill to complement a controversial Seed Bill, which deals only with commercial seeds. A parallel process is advancing legislation on the right to food and nutrition. Both efforts should get a shot in the arm with the UN’s Peasants’ Rights declaration.
The declaration now gives such farmers a potentially powerful international tool to defend themselves from the onslaught of policies and initiatives, led by multinational seed companies, to replace native seeds with commercial varieties, the kind farmers have to buy every year.
Kasisi’s Chalala told me that this narrative is fierce in Zambia, with government representatives telling farmers like Tembo that because her seeds are not certified by the government they should be referred to only as “grain.”
Eroding protection from GMOs
As if to illustrate the ongoing threats to farm-saved seed, that same week in Zambia controversy erupted over two actions by the government’s National Biosafety Board to weaken the country’s proud and clear stance against the use of genetically modified crops. The Board had quietly granted approval for a supermarket chain to import and sell three products with GMOs, a move promptly criticized by the Zambian National Farmers Union.
Then it was revealed that the Board was secretly drawing up regulations for the future planting of GM crops in the country, again in defiance of the government’s approved policies. The Zambian Alliance for Agroecology and Biodiversity quickly denounced the initiative.
The UN declaration makes such actions a violation of peasants’ rights. Now the task is to put that new tool in farmers’ hands. “As with other rights, the vision and potential of the Peasant Rights Declaration will only be realized if people organize to claim these rights and to implement them in national and local institutions,” argued University of Pittsburgh sociologists Jackie Smith and Caitlin Schroering in Common Dreams. “Human rights don’t ‘trickle down’—they rise up!”
Cross-posted at Inter Press Service.
The notion of the BRICS (Brazil, Russia, India, China, and later, South Africa) was concocted by Goldman Sachs’ Jim O’Neill. His 2001 acronym was initially seen as a timely, if belated, acknowledgement of the rise of the South.
But if one takes China out of the BRICS, one is left with little more than RIBS. While the RIBS have undoubtedly grown in recent decades, their expansion has been quite uneven and much more modest than China’s, while the post-Soviet Russian economy contracted by half during Boris Yeltsin’s first three years of ‘shock therapy’ in 1992-1994.
Unsurprisingly, Goldman Sachs quietly shut down its BRICS investment fund in October 2015 after years of losses, marking “the end of an era”, according to Bloomberg.
Growth spurts in South America’s southern cone and sub-Saharan Africa lasted over a decade until the Saudi-induced commodity price collapse from 2014. But the recently celebrated rise of the South and developing country convergence with the OECD has largely remained an East Asian story.
Increasingly, that has involved China’s and South Korea’s continued ascendance after Japan’s financial ‘big bang’ and ensuing stagnation three decades ago. They have progressed and grown rapidly for extended periods precisely because they have not followed rules set by the advanced economies.
Industrial policy — involving state owned enterprises (SOEs), technology transfer agreements, government procurement, strict terms for foreign direct investment and other developmental interventions — was condemned by the Washington Consensus, promoting liberalization, privatization and deregulation favouring large transnational corporations.
Well-managed SOEs, government procurement practices and effective protection conditional on export promotion accelerated structural transformation. When foreign corporations were allowed to invest, they were typically required to transfer technology to the host economy.
Countries have only progressed by using industrial policy judiciously when sufficient policy space was available, as was the norm in most developed countries. But such successful development practices have been denied to most developing countries in recent decades. Instead, the North now emphasizes the dangers of industrial policy, subsidies, SOEs and technology transfer agreements, to justify precluding their use by others.
Blocking the Alternative
Instead, corporate-led globalization continues to be sold as the way to develop and progress.
Some advocates insist that global value chain participation will provide handsome opportunities for sustained economic development despite the evidence to the contrary.
Major OECD economies appear intent on tightening international rules to further reduce developing countries’ policy space under the pretext of reforming the multilateral trading system in order to save it.
Trump and other challenges to this neoliberal narrative do not offer any better options for the South. Nevertheless, their nationalist and chauvinist rhetoric has undermined the pious claims and very legitimacy of their neoliberal ‘globalist’ rivals on the Right.
UNCTAD’s 2018 Trade and Development Report emphasizes the link between infrastructure and industrialization. It argues that successful industrialization since 19th century England has crucially depended on public infrastructure. Infrastructure investment is thus considered crucial for economic growth and structural transformation.
The ascendance of the neoliberal Washington Consensus agenda has not only undermined public interventions generally, but also state revenue and spending in particular, especially in the developing world. But even the World Bank now admits that it had wrongly discouraged infrastructure financing, which it now advocates.
Most Western-controlled international financial institutions have recently advocated public-private partnerships to finance, manage and implement infrastructure projects. The presumption is that only the private sector has the expertise and capacity to be efficient and profitable. In practice, states borrowed and bore most of the risk, e.g., of contingent liabilities, while private partners reaped much of the profit, often with state-guaranteed revenues.
Unexpected Policy Space
Infrastructure, including both its construction and financing, has been central, not only to China’s own progress, but also to its international development cooperation. China’s financial redeployment of its massive current account surplus has created an alternative to traditional sources of investment finance, both private and public.
The availability of Chinese infrastructure finance on preferential or concessionary terms has been enthusiastically taken up, not least by countries long starved of investible resources. Not surprisingly, this has led to over-investment in some infrastructure, resulting in underutilization and poor returns to investment.
The resulting debt burdens and related problems have been well publicized, if not exaggerated by critics with different motivations. Now threatened by China’s rise, Western governments and Japan have suddenly found additional resources to offer similar concessionary financing for their own infrastructure firms.
Thus, not unlike the US-Soviet Cold War, the perceived new threat from China has created a new bipolar rivalry. That has inadvertently created policy space and concessions reminiscent of the post-Second World War ‘Golden Age’ for Keynesian and development economics.
Second part of a series, 2018 TRADE AND DEVELOPMENT REPORT: Interview with UNCTAD’s Richard Kozul-Wright, from the Real News Network.
RICHARD KOZUL-WRIGHT: Here’s a big question that I think needs to be honestly and frankly addressed. If state ownership, technology transfer agreements and subsidies work to sustain growth and eliminate poverty and these policies have taken 500 million people out of poverty in China, and by implication, because of China’s connections to other developing countries, another 100 million people out of poverty in other parts of the developing world: Why do advanced economies want to deny their use to other developing countries? Given their success, given what they have achieved in terms of economic performance and social performance, why would you want, why would you want to eliminate these options from the policy toolkit of developing countries? It’s a very serious question that needs to be, I think, asked in a more frank and honest way than has so far been the case.
Obviously, for us at least in terms of the TDR, it’s about rethinking multilateralism in progressive ways. It’s a plague on both your houses. This is not an issue of do we support regressive nationalism, which we’ve already seen in various formations. Nor is it support for the kind of corporate cosmopolitanism that has dominated the multilateral discussion for the last… We need some kind of alternative that is neither of these options. And I’ll finish here. We, in the Trade and Development Report this year, have gone back to the Havana Charter, which as many of you know, was the forerunner of the GATT, indeed the more ambitious product of the multilateral discussions in 1947 and 1948, and was signed up to by 50-odd countries, both developed and developing. Indeed, the majority of countries that signed up to the Havana Charter were from the developing world. It was eventually rejected by the U.S. Congress and was shelved, but it is a remarkable document.
LYNN FRIES: It’s The Real News, I’m Lynn Fries.
And this is part two of a conversation with Richard Kozul-Wright on Power, Platforms and the Free Trade Delusion, UNCTAD’s 2018 Trade and Development Report. The opening clip featured Richard Kozul-Wright presenting the report at the Board meeting at the United Nations Conference on Trade and Development. As mentioned in part one, both the report and the UN institution are better known by their acronyms, the TDR and UNCTAD, respectively. Richard Kozul-Wright is lead author of the 2018 TDR and Director of the Division on Globalization and Development Strategies at UNCTAD. Thanks again for joining us, Richard, welcome.
RICHARD KOZUL-WRIGHT: Thank you.
LYNN FRIES: The report emphasizes the link between infrastructure and industrialization, and that since 19th century Great Britain through the 21st century China, a central building block of successful industrialization has been public spending on infrastructure. Talk about that.
RICHARD KOZUL-WRIGHT: I think one of the great tragedies of the neoliberal agenda as it’s eviscerated public spending, particularly in the developing world, has been the way in which infrastructure finance has been crowded out of the resource mobilization discussion. And it’s taken China, essentially, to put this back onto the agenda. Infrastructure has been a central feature of the Chinese development model. The Washington-based institutions, we think, have a fairly simplistic view of public-private partnerships as the way to manage infrastructure projects, which we don’t find to be particularly convincing. The report is really calling for a return to the kinds of thinking that were commonplace in the debates on infrastructure in the 1950s and the 1960s. That is, linking infrastructure projects to the wider issue of structural transformation and economic growth.
So the report is a plea to go back and revisit the kinds of debates that have unfortunately dropped off the development agenda under the very simplistic belief that if you allow the private sector and market forces free rein, then somehow infrastructure will spontaneously become a part of a healthy growth and development path, which is simply not the case.
LYNN FRIES: Enthusiasts in favor of global value chains in the structure of global trade argue global value chains provide developing countries with opportunities for economic advancement and development. You presented a far different narrative when presenting the TDR to the UNCTAD Board at the annual meeting. Here’s a brief clip.
RICHARD KOZUL-WRIGHT: The idea that value chains offer this opening to climb up the value chain, diversify your economy, upgrade, is not backed up by strong evidence outside of the East Asia region.
LYNN FRIES: Tell us more about that.
RICHARD KOZUL-WRIGHT: Yeah. It’s the big promise of global value chains linked to comparative advantage arguments, that somehow those countries with low-skilled, low-wage labor would be the big winners of this hyper-globalized world. And talk about the BRICS and emerging economies has been hard-wired into that narrative about the big winners from hyper-globalization. And as you said, we show that there is not very much evidence to support this, with one big exception. And the big exception, of course, is China. When you look at the share of the profits of the top 2000 TNCs in global income, the only part of the world that has actually made headway into that elite club, other than the advanced economies themselves, is, of course, corporations from East Asia, including China.
And so, if you take the BRICS as being an indication of the rise of the South and you take China out of the BRICS acronym and you’re left with the RIBS – Russia, India, Brazil, South Africa – their share of global income over the last 20, 25 years has increased, but not significantly and certainly not to have generated the kind of hype and, to some extent, hysteria that we have seen around the notion of emerging economies. So the story of the rise of the South, as we see it, still remains fundamentally an East Asia story, with China, in many respects, simply replicating the experience that began with Japan in the 1950s and was repeated by Korea and other countries in the 1960s and the 1970s.
Now, from our point of view of course, it’s an interesting story if China has been able to buck the trend of hyper-globalization. The question is, how did it do it? And it turns out it did it by not following the rules of hyper-globalization as promoted within the capitals of the advanced economies, whether that’s from Washington or Paris or Brussels. So we know now that the Chinese have been very successful in using industrial policy, including the use of subsidies. We know they’ve been very successful in using state-owned enterprises as part of their development model. We know that they’ve been very good at ensuring that if international corporations come to their country to make profits, they have certain responsibilities in terms of transferring technology to the host economy.
Now, what is worrying for us, of course, in the current discourse that we hear about reforming the multilateral system, is that all the things that worked for China and lifted 500 million people out of poverty in China, and to some extent even more people out of poverty outside of China given China’s connections with other parts of the developing world, these are the very elements that the advanced economies want to preclude other developing countries from using. So now, the endless rhetoric that we hear about the dangers of state-owned enterprises, the problems of industrial subsidies and the problems with technology transfer agreements is on the table for tightening the rules to prevent those tools being used in the future.
And that has to be a worrying trend, given the track record of successful use of these policies in the Chinese case. And it’s one of the great ironies here, that hyper-globalization has been sold to many as a way of alleviating poverty. But it turns out that the only way that it can work to alleviate poverty is if countries have sufficient policy space to be able to make use of its benefits and to mitigate its problems. And now, we have an agenda that is trying to reduce the policy space further of countries to ensure that they can’t do that in the future. It’s a very troubling and clearly hypocritical agenda that is emerging under the rhetoric of reforming the multilateral system.
LYNN FRIES: In a recent article you wrote, and I quote, “China’s success is exactly what was envisaged in the 1947 United Nations Conference on Trade and Employment in Havana, where the international community laid the groundwork for what would become the global trading system. The difference in discourse between then and now attests to how far the current multilateral order has moved from its original aims.”
RICHARD KOZUL-WRIGHT: It’s one of the great tragedies, I think, of the discussion around the crisis in multilateralism, just how much people have forgotten about the origins and evolution of this system. So the current tensions in the global trading system are essentially presented as a threat to a 70-year-old international economic order that was hatched in the late 1940s and gradually perfected over the subsequent seven decades. And that’s clearly not the case. The kind of international order that was designed in the immediate post-war period essentially died in the late 1970s and the early 1980s, when policymakers in the advanced economies gave up on full employment as a central goal of their agendas, when financial markets were rapidly and in many respects recklessly deregulated, and when the idea of a mixed economy, in which the public sector had a central role in the organization of a healthy and balanced economy, became a dirty phrase.
So that order died in the late 1970s and early 1980s. But one of the high points, in my opinion, of those early negotiations was the Havana Charter. The 53, 54, 55 countries that negotiated the Havana Charter signed up to it in 1948, so 70 years ago this year. The majority of those countries were actually developing countries, so it was a really genuine dialogue between the North and the South that gave rise to the Havana Charter. And of course, the charter eventually died when the U.S. Senate, the U.S. negotiators having originally signed up to it, essentially killed off the charter at the end of the 1940s, early 1950s. And all that remained were the GATT rules that became the basis for structuring trade rules in the postwar order.
LYNN FRIES: I just want to give viewers a very brief overview of the chronology of the history of the multilateral trade system that you’re going to be referring to. So from 1945 to 1947, over 50 countries began to create an International Trade Organization, the ITO. And in 1947 in Havana, 56 countries started negotiating the charter for the ITO. And the Final Act of that charter, the Havana Charter, was signed in 1948 in Havana. But the Havana Charter and its policy goals of full employment and domestic industrialization for structuring trade rules in the post-war order, as you said, got shelved. The International Trade Organization, along with its charter, the Havana Charter, was shelved.
Instead, from 1948 to 1994, a provisional arrangement, the GATT, the General Agreement on Tariffs and Trade, was the only multilateral instrument governing international trade. And as you said, the GATT rules became the basis for structuring trade rules. The last round of multilateral trade negotiations under the GATT, the Uruguay Round, was from 1986 to 1994. The Uruguay Round led to the end of the postwar order governed by the General Agreement on Tariffs and Trade, replacing it with the creation of the World Trade Organization, or WTO. The World Trade Organization was established by the Uruguay Round in 1995 and governs the multilateral trade system into the present. So what made the original aims of the Havana Charter so different from the current multilateral order?
RICHARD KOZUL-WRIGHT: The Havana Charter itself was far more ambitious than the GATT rules. As you said, it recognized that a healthy trading system could only evolve out of an economy that guaranteed full and stable employment in which aggregate demand was at a level that could generate full and decent work. It recognized that there were distributional issues always on the trade agenda that had to be managed at the international level. They were, at that time, seriously concerned about the use of restrictive business practices in distorting the international trading system and sought ways of dealing with those restrictive business practices by large international firms.
And as you said, they recognized this central role for industrialization in the South as part of a healthy trading system and they found ways to ensure that industrial policy and the use of various tools of industrial policy would be a legitimate part of the international trading rules. And all that was part of the Havana Charter that was negotiated by these 50 odd countries after the end of the Second World War. And to some extent, these features were retained in a more informal way in the post-war architecture, but not really as part of the rules of the system. Obviously, with the rise of neoliberalism, the Uruguay Round, the direction of trade rules was in the opposite direction. Concerns about unemployment were essentially removed from the discussion, industrial policy became a dirty word, particularly for developing countries.
LYNN FRIES: I have one more quote of something you wrote marking the 70th anniversary of the Havana Charter and then I’m going to ask you for a concluding thought. “The silence on the 70th anniversary of the Havana Charter speaks volumes about the current era’s approach to multilateralism, not simply in the contrasting levels of ambition, but in subverting its underlying logic by assuming that by opening up to trade and private capital flows, full employment and economic development will automatically follow. Evidence continues to favor the drafters in Havana.” And your concluding thought?
RICHARD KOZUL-WRIGHT: I strongly believe that despite all the changes that have happened in the global economy since 1948, there is a serious need to take a look again at what the Havana Charter was trying to do in terms of generating a more balanced and healthy trading system and to find ways of replicating those kinds of ambitions in line with the changes that we’ve seen in the second half of the 20th century and the beginning of the 21st century. And the report tries to make a case for revisiting the Havana Charter in light of the challenges of the digital economy, in light of the challenges of environmental and financial stress that we’ve seen over the last few years.
LYNN FRIES: Richard Kozul-Wright, thank you.
RICHARD KOZUL-WRIGHT: Thank you, Lynn.
LYNN FRIES: And thank you for joining us on The Real News Network.