WARWICK McKIBBIN. How should technocrats count the true cost of climate?

A bad model with transparent assumptions is better than arbitrary analysis based on wishful thinking, writes Warwick McKibbin. Published in the AFR on 26 March 2019.

The Brookings Institution recently published a report that I co-authored on the economic and environmental impacts of the Paris Climate Agreement on the global economy. In that report we use a multi-region model of the world economy. To model the Paris Agreement, we converted the disparate emission targets in each country’s or region’s Nationally Determined Contribution (NDC) into estimated reductions in CO2 emissions relative to a baseline scenario with no new climate policies. We then solved for a carbon price path in each region that achieves the NDC-consistent emissions reductions in the target year.
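The step of solving for a carbon price that delivers a target emissions cut can be illustrated with a toy root-finding sketch. To be clear, this is not the G-Cubed model: the emissions response function and every parameter below are invented purely for illustration.

```python
import math

# Toy illustration only -- NOT the G-Cubed model. The emissions
# response function and all parameters are invented for this sketch.

BASELINE = 100.0     # baseline emissions with no new climate policy
SENSITIVITY = 0.02   # assumed responsiveness of emissions to the carbon price

def emissions(price):
    """Stylised response: emissions fall smoothly as the carbon price rises."""
    return BASELINE * math.exp(-SENSITIVITY * price)

def solve_carbon_price(target_cut, lo=0.0, hi=1000.0, tol=1e-9):
    """Bisection: find the price that cuts emissions by target_cut vs baseline."""
    target = BASELINE * (1.0 - target_cut)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if emissions(mid) > target:  # emissions still too high: raise the price
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

price = solve_carbon_price(0.27)  # e.g. a 27 per cent cut below baseline
```

In a large multi-region model the same logic applies, except that the "emissions response" is the outcome of a full general-equilibrium solution, so the implied price depends heavily on the model's substitution assumptions, which is exactly where the models discussed below diverge.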

We found that if all regions achieve their NDCs, the Paris Agreement significantly reduces CO2 emissions relative to baseline. However, the Paris policy scenario suggests that global CO2 emissions would not decline in absolute terms relative to 2015 levels, let alone follow a path consistent with a 2 degrees stabilisation scenario. The Paris policies result in significant macroeconomic spill-overs across the global economy, meaning that macroeconomic outcomes across countries depend not only on their own commitments but also on those of the rest of the world.

The model used in the Brookings report was the G-Cubed model. It is used by researchers, governments and corporations around the world and participated in the Stanford University Energy Modelling Forum, which recently compared climate policies across all major publicly available models.

The Brookings report explored the current Australian government commitment on targets and not the specific policies, since we don’t yet know what these will be over coming years. We assumed the most efficient policy, which is a price on carbon.

Several results stood out for Australia.

First, the impact on Australia of the Paris Agreement largely depends on the actions of the rest of the world in reducing their demand for our fossil fuels and our carbon-intensive exports.

Around 80 per cent of the loss in GDP in Australia is caused by the policies of other countries. This clearly indicates that the debate on climate policy should be focused on making sure the world follows a sensible climate policy rather than debating the size of the Australian target.

Second, we find that the impact on the Australian economy is significant: real wages fall relative to trend by around 2 per cent to 3 per cent by 2030, and employment falls relative to trend by around 1 per cent in 2020, narrowing to 0.4 per cent by 2030 as the fall in real wages helps restore employment.

The loss of jobs by 2030 is about 72,000 relative to trend. GDP is estimated to be 2 per cent lower than otherwise, of which 0.4 per cent is due to Australia’s own policy. The carbon price in Australia by 2030 is $US5, or around $8, per ton. The highest tax in all countries by 2030 is $US40 per ton of CO2.

Last week’s Fisher report, using the BAEGEM model, explores this same scenario (scenario 1 in that study). The results for GDP, employment and real wages under the current government policy are larger than in the G-Cubed model but within the same ballpark.

Where there is a major difference is in the carbon price required to achieve the Paris target. In the Fisher report the carbon price is $263 per ton of CO2, which is vastly different to that estimated in G-Cubed. There are very few experts who would agree with the marginal abatement cost curve in the Fisher study. That does not mean it is wrong – but as an outlier the result needs to be understood and fully articulated.

The key difference across models must come down to the assumptions about rigidities in the BAEGEM model. A key assumption appears to be that there is low substitution between energy inputs. The Fisher report argues that a coal-fired power station cannot be used to produce wind power. This is obviously true but is not the point of substitution. The G-Cubed model assumes that electricity distributors can source power from wind or coal or other generators and can substitute this within the grid at low cost.

A higher carbon price makes coal less economic than renewables, so distributors will switch to lower-cost power sources – as they have done over the past decade. Without having seen the model code, I predict that the assumptions in the BAEGEM model about the ability to substitute between energy sources, and the cost of doing so, drive the results for the Paris simulations. The only way to get lower emissions in the BAEGEM model is to shrink the economy, because changing the price of carbon does not lead to substitution away from carbon-intensive energy. This is why the carbon price is so high across all scenarios in the Fisher study. The bigger the target, the larger the economic contraction. The plausibility of this assumption is one place where the debate should focus.

Both models agree that there are likely to be economic costs in the Paris Agreement but there are large quantitative differences and the Brookings study also shows there are quantifiable economic benefits that should not be ignored.

Where I believe there needs to be a wider debate is on the results for the Labor policy in the Fisher report. The Brookings study did not explore either a 45 per cent target or a 50 per cent renewable energy target. The Fisher report did consider these and found the costs to be six times larger than for the 26 per cent to 28 per cent target and 10 times larger than the estimate I made in an earlier 2015 study undertaken for the Department of Foreign Affairs and Trade. That earlier report was the basis of the negotiations on the target that Australia took to the Paris negotiations. It considered a range of cuts, including a 26 per cent target and a 45 per cent target. The cost of the deeper cuts was more than double the cost of the 26 per cent target. Sensitivity analysis showed that changes in the assumptions about the future price of renewable energy could halve these results if renewable prices fell quickly in the future.

It should be stressed that neither model gives the correct answer.

The models are simplifications of a complex world. There are many views of how the world works. The difference between model results is a healthy revelation that there is great uncertainty and that these issues cannot be resolved with sound bites but require detailed professional analysis. The critical issue is that the models, and the assumptions in the models, need to be transparent, peer reviewed and publicly available in order to have a sensible discussion.

One question to pose is how the BAEGEM model explains the past decade of experience in Australia and other countries of the significant penetration of renewables into the energy system. What is the assumption in the analysis about the role of renewables, absent any further policy intervention, in the future?

There needs to be a healthy debate in Australia on the economics of climate policy and not an attack on the credibility of any model builder. It is important to ensure that all policies are subject to a wide range of economic modelling and that the assumptions and sensitivities in each model are understood. This is necessary for good public policy design. Yes – the models will disagree – but a bad model with transparent assumptions is better than arbitrary analysis based on wishful thinking.

A debate about the key assumptions that become transparent when building a model-based evaluation is increasingly absent from many of the policies proposed by both sides of politics. This won’t be addressed in the period leading up to the election, but it should be a priority for whoever forms the next government.

Warwick McKibbin AO is the director of the Centre for Applied Macroeconomic Analysis in the ANU Crawford School of Public Policy and a non-resident senior fellow at the Brookings Institution in Washington.


This post was kindly provided to us by one of our many occasional contributors.


4 Responses to WARWICK McKIBBIN. How should technocrats count the true cost of climate?

  1. Mark Duffett says:

    “The G-Cubed model assumes that electricity distributors can source power from wind or coal or other generators and can substitute this within the grid at low cost.”

    I find this quite disturbing. There is considerable literature indicating this assumption may not be well founded at high levels of non-dispatchable renewables penetration, e.g. Trembath and Jenkins 2015 (introduced at https://www.greentechmedia.com/articles/read/how-wind-and-solar-will-blow-up-power-markets#gs.4ufcuk) and Heard et al 2017 (https://www.researchgate.net/publication/315745952_Burden_of_proof_A_comprehensive_review_of_the_feasibility_of_100_renewable-electricity_systems)

  2. Andrew Glikson says:

    The science of climate change is being ignored; any nexus between the politics of climate change and the physical realities of global warming has by now been disrupted. The Paris agreement is based on climate models which do not take the amplifying feedbacks of CO2 and other greenhouse gases into account. The close to 470 ppm CO2-equivalent now in the atmosphere is enough to trigger further release of carbon from land and oceans, including methane from permafrost.

  3. Henry Haszler says:

    McKibbin writes:

    “There are very few experts who would agree with the marginal abatement cost curve in the Fisher study. It does not mean it is wrong – but as an outlier the result needs to be understood and fully articulated.”

    If McKibbin is correct, this will not be the first time that expert analysis has questioned the choice of model details with which Brian Fisher has been associated. Relative to all the complexities of a CGE model, I am thinking of the very simple two-equation so-called wool policy model used by ABARE — Fisher was the Director at the time — to evaluate the options for dealing with the wool stockpile that Australia amassed because of stupid policy decisions and policy management from about 1987 to 1991.

    In the earlier case the issue was the short run (annual) own price elasticity of demand for Australian wool. Modelling by ABARE staff had put that number at around -0.35. The elasticity used in the ABARE model — I think Fisher may have been the joint author of some of the reports — was -0.8 and was never sourced to anything. I have estimated that there was less than about 1/500 chance of the -0.8 being consistent with the reported literature.

    The effect of using the larger [absolute] elasticity was to make selling the stockpile in competition with current production a viable option. The -0.8 seems to be a critical value for the parameter. At the lower absolute elasticity the stockpile is just a millstone around the industry’s and Australia’s neck — destruction or denaturing of the wool are better policies.

    The issue is that the simulation results based on the -0.8, which imply the stockpile is an asset rather than a liability, made the responsible minister, his Department, ABARE, the Wool Corporation and the Wool Council of Australia all look a lot less foolish than otherwise.

    These issues were covered in Haszler, H., Chisholm, A., Edwards, G. and Hone, P. (1996), “The Wool Debt, the Wool Stockpile and the National Interest: Did the Garnaut Committee Get it Right?”, Economic Record, 72(218), September, and the other papers referenced in that article. ABARE has ignored the criticism of its modelling.

  4. Henry Haszler says:

    Good to have McKibbin’s article and its comparison of the Brookings model results with those of Fisher’s model, which apparently gives results favouring the Coalition’s policies, which are rather less ambitious than Labor’s.

    Having just looked at Fisher’s paper, I see his results are the differences from a projected baseline in 2030. I wonder just how different that baseline is from what we have now, in terms of industry structure, energy demand, other consumption levels, etc. And what differences might that particular 2030 baseline imply for the costs of climate policies compared to some other baselines?

    I also see that Fisher’s impacts are calculated from the projected baseline, which seems to be the standard approach. The impacts seem to be a 2% lower GDP than “otherwise”. My understanding is that the routine statistical error in measuring our GDP is of the order of 2%, so just how significant are the Fisher and Brookings simulated results? Moreover, is the real issue that we will still grow, but not quite as fast?

    Finally, I don’t see anything explicit in Fisher’s analysis, nor I presume in the Brookings work, that factors in the costs of doing nothing. For example, if we don’t reduce our carbon emissions, what adaptation costs will we need to incur?

    A recent reported assessment is that we are looking at a temperature rise of 2 to 4 degC and sea level rises of 20m over the next two or three centuries. Those conditions will cause MASSIVE movements of people NEVER seen before, which I think are likely to see major conflicts around the world. I know 200 and 300 years are substantial time frames. But just think: many young people born in advanced countries today can expect to live to 85 years and older. So we are talking of just two “full” generations.

    Maybe the latter costs can be described as not strictly economic costs but boy oh boy will they ever have economic impacts.
