Monday, January 09, 2023

Technology and the Cost of Carbon: A Second Try (revised)

One of my chief criticisms of Rennert et al. 2022 is that it calculates the cost imposed by an additional ton of CO2 by summing costs from now to 2300 while almost entirely ignoring technological change, in effect assuming technological stasis. That raises the question of how one ought to model technological change in trying to calculate costs over a long period of time. My initial response was that you can't do it, that any estimate of effects centuries in the future is a wild guess, science fiction, not science. But that started me thinking about how, if I had to do it, I would.

Here is my answer.

Step 1: Create a simplified procedure for deducing the Social Cost of Carbon (SCC) from your preferred climate change scenario using causal relationships, such as the effect of temperature on mortality or on crop yield, calculated with data from a single decade. The causal relationships should take account of variables other than technology, such as income, that can be expected to change over time.

Step 2: Use the procedure to produce a value for SCC using data from the most recent decade for which suitable data are available, say 2010-2020. Call this SCC(2010).

Step 3: Repeat step 2, using data from the previous decade. Call this SCC(2000).

Step 4: Calculate the ratio R1 = SCC(2010)/SCC(2000).

This is essentially what Lay et al. 2021 did for the effect of technology on temperature-related mortality.

Suppose R1 = 0.9. That implies that changes over the decade were reducing the social cost of carbon implied by your climate change scenario at about 1% per year (since 0.99^10 ≈ 0.90).

Step 5: You now abandon your simplified procedure and substitute whatever you consider the best way of calculating SCC. But instead of discounting at the discount rate alone, you discount at the discount rate plus 1% (or whatever annual rate of decline R1 implies, roughly (1 - R1)/10). You have now allowed, as best you can, for the reduction in cost over time due to technological change.
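To make the arithmetic concrete, here is a minimal Python sketch of steps 1 through 5. The function scc_simplified is a hypothetical placeholder, not any particular model; the rest is just the bookkeeping above, using the exact annual rate 1 - R1^(1/10) rather than the (1 - R1)/10 approximation.

```python
def scc_simplified(decade_data):
    """Step 1 (hypothetical): a simplified SCC computed from one decade's
    data on causal relationships (temperature vs. mortality, crop yield,
    ...), controlling for non-technology variables like income."""
    raise NotImplementedError  # stands in for your preferred simple model

def tech_adjusted_rate(scc_2010, scc_2000, base_discount_rate):
    """Steps 4-5: convert the decadal ratio R1 into an addition to the
    discount rate that stands in for technological progress."""
    r1 = scc_2010 / scc_2000         # step 4: R1 = SCC(2010)/SCC(2000)
    annual_decline = 1 - r1 ** 0.1   # R1 = 0.9 -> about 1.05% per year
    return base_discount_rate + annual_decline

def present_cost(damages_by_year, rate):
    """Sum a stream of future annual damages, discounted to today."""
    return sum(d / (1 + rate) ** t for t, d in enumerate(damages_by_year, 1))
```

With R1 = 0.9 and a 3% base rate, the adjusted rate comes out at roughly 4%, which by itself cuts a cost dated a century out by an extra factor of about e^-1 relative to the unadjusted calculation.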

It is not a very good solution to the problem, but it is better than assuming stasis. One problem with it is that it ignores the fact that the farther into the future you go, the more uncertain your estimate of the effect of technological progress becomes. To solve that you need ...

The Improved Version

Step 3a: Repeat step 2 using data from each earlier decade for which the necessary data exist, generating SCC(1990), SCC(1980), ...

Step 4a: Calculate R2 = SCC(2000)/SCC(1990), R3 = SCC(1990)/SCC(1980), R4 = ...

If you are willing to model the effect of technological progress as a constant rate of decrease of costs, use the average of your ratios R1, R2, ... instead of R1 alone to estimate the rate at which cost is being reduced due to technological change. Alternatively, if you think the rate of change due to technological change is itself changing over time and you have enough ratios, you could try fitting them to a function linear in time and use that. 
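A minimal sketch of the two options, with made-up ratios purely for illustration. Averaging the logarithms of the ratios (a geometric mean) is the natural way to average multiplicative changes; the linear fit lets the rate itself drift over time.

```python
import numpy as np

# Hypothetical decadal ratios, oldest first: R4, R3, R2, R1.
ratios = np.array([0.95, 0.93, 0.92, 0.90])
midpoints = np.array([1985.0, 1995.0, 2005.0, 2015.0])  # decade midpoints

log_r = np.log(ratios)

# Option 1: constant rate. The geometric mean of the ratios gives the
# average decadal decline; divide by 10 for the annual rate.
constant_annual_rate = -log_r.mean() / 10

# Option 2: a rate changing linearly in time. Fit the log-ratio against
# the decade midpoint and extrapolate.
slope, intercept = np.polyfit(midpoints, log_r, 1)

def annual_rate(year):
    """Extrapolated annual rate of cost decline for a given year."""
    return -(intercept + slope * year) / 10
```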

In either case, use the variation of the ratios (from the average or from the linear fit) to calculate by how much you should increase the uncertainty in cost for the later years. If you estimate the rate as 1% when it is really 2% that will have little effect on costs in the near future but produce a large overestimate of costs a century or two later.
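Here is a rough sketch of how that growing uncertainty could be quantified, under a simplifying assumption I am adding for illustration: each future decade's log-ratio is an independent draw from the historical distribution, so the variance of the cumulative log-change grows linearly with the horizon.

```python
import numpy as np

log_r = np.log([0.95, 0.93, 0.92, 0.90])  # hypothetical decadal log-ratios
mean_d, sd_d = np.mean(log_r), np.std(log_r, ddof=1)

def multiplier_interval(years_ahead, z=1.96):
    """Rough 95% interval for the cumulative cost multiplier due to
    technological change, years_ahead years from now."""
    n = years_ahead / 10              # number of decades ahead
    center = mean_d * n               # cumulative mean log-change
    spread = z * sd_d * np.sqrt(n)    # SD grows with sqrt of the horizon
    return np.exp(center - spread), np.exp(center + spread)

# The asymmetry described above: mistaking a 2%/year decline for a
# 1%/year decline overstates costs 200 years out by exp(2), about 7.4x.
print(multiplier_interval(20))    # narrow near the present
print(multiplier_interval(200))   # much wider two centuries out
```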

Actually doing all of this would be a large and complicated project. Even turning my verbal sketch into a precise mathematical formulation would be quite a lot of work. I don't propose to do either, but perhaps someone more ambitious could. Once it had been done, the result could be applied to a variety of different approaches for estimating future costs, from CO2 or anything else.

7 comments:

Stephan Kolassa said...

Professional forecaster here. My proposal would be to use finer-grained data (yearly instead of decadal), if available, and then forecast the resulting annual time series R_t out using standard forecasting methodologies, e.g., as per Hyndman & Athanasopoulos, *Forecasting: Principles and Practice*, 3rd ed., also available online. (I'd post links, but that would get this flagged as spam.)

Specifically, I would go for simple standard methods like exponential smoothing, potentially with an automatically determined trend, rather than fancy AI/ML forecasting methods, which get all the glory these days but are far too complex for a long-term forecast based on very limited amounts of training data.

The advantage is that these methods also yield prediction intervals, or indeed full predictive densities (always assuming normal error distributions, which is dubious but a useful zeroth approximation). So you could actually simulate a couple thousand runs of the future, picking each future year's R_t from the predictive distribution. That way, at least one source of uncertainty would be included (perhaps better: "addressed").
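For instance, a minimal sketch in Python using statsmodels' exponential smoothing, on a made-up annual R_t series. The noise model here is deliberately crude: normal errors with the in-sample residual spread, added around the point forecast; a fuller treatment would propagate the errors through the smoothing state.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
r_t = 0.99 + 0.002 * rng.standard_normal(30)  # hypothetical annual ratios

fit = ExponentialSmoothing(
    r_t, trend="add", initialization_method="estimated"
).fit()
resid_sd = np.std(fit.resid, ddof=1)

horizon, n_runs = 280, 2000            # out to roughly the year 2300
point = fit.forecast(horizon)          # central forecast of R_t

# Draw each future year's R_t from a normal predictive density around
# the point forecast, then compound the yearly ratios along each run.
runs = point + resid_sd * rng.standard_normal((n_runs, horizon))
tech_multiplier = np.cumprod(runs, axis=1)

lo, hi = np.percentile(tech_multiplier[:, -1], [2.5, 97.5])
print(f"95% interval for the year-2300 multiplier: [{lo:.3g}, {hi:.3g}]")
```

Even this understates the spread, since it ignores parameter uncertainty in the fitted trend, which compounds badly over a 280-year extrapolation.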

I am virtually certain that once one looked at the sheer spread of the outcomes up to the year 2100 (let alone 2300), one would be wholesomely tempted towards *much* more humility as to such long-term forecasts. Then again, as a forecaster, I have long said that there is an egregious lack of precisely such humility in most if not all of forecasting.

Anonymous said...

Interesting idea. According to Tol (2022), "Estimates of the social cost of carbon have increased over time" (https://arxiv.org/pdf/2105.03656.pdf), SCC(2010)/SCC(2000) appears to be in the ballpark of 2x, and SCC(2020)/SCC(2010) appears to be around 2x as well. See Figure 1 of his paper.

Surprisingly, though, this seems like it would have the reverse effect of what you hypothesized--an SCC that increases over time--which seems to line up with the Tol (2022) paper title. Perhaps more things are changing than just technology?

David Friedman said...

@Stephan:
You obviously know more about the subject than I do. With regard to your final comment, in _Future Imperfect_ I wrote:
… with a few exceptions, I have limited my discussion of the future to the next thirty years or so. That is roughly the point at which both AI and nanotech begin to matter. It is also long enough to permit technologies that have not yet attracted my attention to start to play an important role. Beyond that my crystal ball, badly blurred at best, becomes useless; the further future dissolves into mist. (Friedman 2008)

David Friedman said...

@Anonymous:
I suspect that what you describe isn't the result of using more recent data in the same model but of changing the model — for example lowering the discount rate or lengthening the period over which costs are summed. If my criticism of Rennert is correct, it's a result of people being more willing to publish and believe work heavily biased towards finding a large cost.

David Friedman said...

I'm not sure I understand your question. I'm looking at the cost for a given time, say the year 2140, calculated using information on causal relations from different decades, not at the cost in different years. Calculating the cost in different years is something people, in particular the authors of Rennert, are already doing, whether correctly or not, and presumably they try to take account of the sort of effect you mention.

My objective is to see how the causal relations, such as the effect of an additional degree of temperature on mortality for people of a given income, change with technological progress. I can't actually observe that directly, because the year 2140 isn't here to be observed. So I assume that the cost produced by an additional degree decreases, year by year, at the rate it was decreasing in the recent past, which I can measure. Except I am not doing it just for temperature-related mortality, which Lay et al. have already done. I am doing it for the total cost of an additional ton of CO2 released today, which involves summing costs year by year from now to, in the case of Rennert, 2300.

I hope that's clearer.

Richard Alben said...

I may not understand exactly. So let me try to describe your (basic) proposal in my own words and ask if this is correct:
Choose a climate model and an emissions pathway and obtain climate predictions running out to the distant future.
Use this pathway to calculate SCC2000, with contributions for future years done with 2000-period technology.
With the same pathway, calculate SCC2010, with contributions for future years done with 2010-period technology.
Take the ratio SCC2010/SCC2000 and divide by 10. Use this as an added discount factor in adding up damages from future years to obtain an estimate of SCC that accounts for technology advance.

It seems to me that the above approach is problematic, since it conflates differences in SCC caused by technology with differences caused by the damage in a future year being discounted by a different amount in the two SCC calculations.

My suggestion would be to calculate SCC2000 with 2000-period technology, then calculate SCC2000 with 2010-period technology, and work from that difference.

Please let me know if I am missing something.

David Friedman said...

@Richard:
Perhaps I was unclear. You are calculating the SCC summed from (say) 2020 to 2300 twice: once using data from 2000 to 2010, once using data from 2010 to 2020. So the only change is due to the change over ten years in the factors, such as the relation of mortality or crop yield to temperature, that are used for your calculation of the SCC. That gives you a measure of how fast those factors are changing.