An updated Bayesian climate sensitivity estimate

I thought I would update my Bayesian climate sensitivity estimate, given the comments I received (peer-review in action). Based on James’s comment, I’ve removed the noise term and am now using the aerosol forcing as the forcing uncertainty. Based on Paul’s comment, I’m using the CMIP5 RCP forcing data, which you can get from here (specifically, I’ve used the RCP6 forcing data).

I’m still using the Cowtan and Way global surface temperature data, which you can get from here, but I’m using the full HadCRUT4 uncertainties, which you can access here. I’m also still using the 0-2000m Ocean Heat Content (OHC) data from Zanna et al. (2019), which you can get here. I’ve doubled the OHC uncertainties.

Just to remind people, I’m using a simple two-box model:

C_1 \dfrac{d T}{dt} = F(t) - \beta T(t) - \gamma \left[T(t) - T_o(t) \right],

C_2 \dfrac{d T_o}{dt} = \gamma \left[ T(t) - T_o(t) \right],

where the upper box is the ocean’s mixed layer and atmosphere, and the lower box is the ocean down to 2000m. I’m going to be lazy and not describe all the parameters and variables, which are described in my previous post. I’m fitting my model using Markov Chain Monte Carlo, which I’m doing using a python package called emcee.
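To make this concrete, here is a minimal sketch of how the two equations above can be stepped forward in time with a simple Euler scheme. The forcing ramp and the parameter values are illustrative placeholders of my own, not the data or values used in the actual fit.

```python
import numpy as np

def two_box_step(T, To, F, beta, gamma, C1, C2, dt=1.0):
    """One Euler step of the two-box model (dt in years)."""
    dT = (F - beta * T - gamma * (T - To)) / C1
    dTo = gamma * (T - To) / C2
    return T + dT * dt, To + dTo * dt

# Illustrative run: a linear forcing ramp and made-up parameter values.
years = np.arange(1850, 2019)
F = np.linspace(0.0, 2.5, years.size)   # W m^-2, placeholder forcing
beta, gamma, C1, C2 = 1.4, 0.5, 2.0, 50.0

T, To = 0.0, 0.0
Ts = []
for Fi in F:
    T, To = two_box_step(T, To, Fi, beta, gamma, C1, C2)
    Ts.append(T)

print(round(Ts[-1], 2))  # upper-box warming at the end of the ramp
```

With a one-year step and these parameter values the explicit Euler scheme is stable; a real fit would drive this with the RCP forcing series at each step instead of the placeholder ramp.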

The figure on the top right shows the fit to the surface temperature data, while the figure on the bottom right shows the fit to the 0-2000m OHC data. In both cases, the orange curve is the median result, while the grey lines sample the range.

Below is a corner plot showing the resulting distribution for the parameters. The parameter \beta is essentially climate sensitivity, \theta_1 is essentially an initial temperature difference between the two boxes, \gamma represents the exchange of energy between the two boxes, and C_1 is the heat capacity of the upper box.

The table below shows the results for the Equilibrium Climate Sensitivity (ECS), Transient Climate Response (TCR), TCR-to-ECS ratio, and the heat capacity of the upper box. Since the C_1 value seemed a little high (the median is equivalent to about 150m of ocean), I repeated the analysis, but used a fixed C_1 = 2 (67m). I should also make clear that the ECS here is really an effective climate sensitivity, because the model assumes a constant \beta.

Parameter 15th percentile median 84th percentile
ECS (K) 1.92 2.18 2.47
ECS (K) – C_1 = 2 2.06 2.25 2.47
TCR (K) 1.55 1.74 1.94
TCR (K) – C_1 = 2 1.48 1.59 1.73
TCR-to-ECS ratio 0.75 0.80 0.85
TCR-to-ECS ratio – C_1 = 2 0.68 0.71 0.75
C_1 3.85 4.73 5.70

This updated analysis has narrowed the range for both the ECS and TCR, and brought the upper end down somewhat. However, the median estimate for the ECS is still above 2K, and the lower limit (15th percentile) is still close to 2K. The figures on the right show the resulting ECS and TCR distributions.

Having now updated my analysis, I will probably stop here. I do have some other work I need to focus on. James has suggested that he is working on a similar analysis, so it will be interesting to see the results from this work and how it compares to what I’ve presented here.

Update:
As per Peter’s comment, I’ve redone the analysis using Lijing Cheng’s OHC data, which starts in 1940. I’ve also assumed a constant heat capacity for the upper box of C_1 = 2. Below are the figures and a table showing the results. I’ve also just realised that I forgot to correct the units on the OHC plot; it should be J, not ZJ.

Parameter 15th percentile median 84th percentile
ECS (K) 2.08 2.36 2.64
TCR (K) 1.44 1.58 1.72
TCR-to-ECS ratio 0.63 0.67 0.71

Update number 2:
I meant to mention that I’d had an email from Philip Goodwin, who has also done similar analyses; for example, this paper, which I discussed in this post. There is also a recent paper by Skeie et al. (2018) that also uses MCMC, as does this paper by Bodman and Jones.

Posted in Climate sensitivity, Research, Science, The scientific method | 40 Comments

An attempt to do a Bayesian estimate of climate sensitivity

Update (02/04/2019): I’ve updated this in a new post. The updated result suggests a slightly lower climate sensitivity and a narrower range. The main difference is – I think – how I was handling the forcing uncertainty. In this post, I was simply using some fraction of the total forcing, while a more appropriate thing to do is to use the aerosol forcing, which is what I’ve done in the updated analysis.

I’ve been spending some time working on a Bayesian estimate for climate sensitivity. This is somewhat preliminary and a bit simplistic, but I thought I would post what I’ve done. Essentially, I’ve used the Markov Chain Monte Carlo method to fit a simple climate model to both the surface temperature data and to the ocean heat content data.

Specifically, I’m using a simple two-box model which can be written as

C_1 \dfrac{dT}{dt} = F(t) - \beta T - \gamma (T - T_o) + \epsilon

C_2 \dfrac{dT_o}{dt} = \gamma (T - T_o).

In the above, C_1 is the heat capacity of the upper box (ocean mixed layer and atmosphere), T is the temperature of this box, C_2 is the heat capacity of the lower box (deep ocean), and T_o is this box’s temperature. The term \beta is essentially climate sensitivity, \gamma determines the exchange of energy between the two boxes, and \epsilon is a noise term that I’ve added.

In the above equations, F(t) is the radiative forcing. Unfortunately, I can’t seem to work out where I got this data from, but I will update this when I remember. Any forcing dataset would work, though. The term T in the top equation is the global surface temperature anomaly. I used the Cowtan and Way data, which you can access here. To complete this, I also needed an ocean heat content dataset. Laure Zanna very kindly sent the data from her recent paper, which can also be downloaded from here.

A couple of other things. I couldn’t find a forcing dataset that included uncertainties, so I assumed a 1\sigma uncertainty of 25%. I also initially had trouble getting a decent fit between the model and the temperature and ocean heat content data, so I have increased these uncertainties a little.

To actually carry out the fit, I used a python package called emcee. It’s well tested, quite commonly used in astronomy, and is what I used for the paper I discussed in this post. The model has 5 parameters: \beta, \gamma, \epsilon, C_1, and \theta_1. The priors for \beta and C_1 were uniform in log space, while all the others were simply uniform.

The term \theta_1 is essentially an initial value for the deep ocean temperature, relative to the global surface temperature anomaly. I also adjust C_2 so that C_1 + C_2 is the total heat capacity of the ocean down to 2000m, and the fit is based on the 0-2000m ocean heat content matching the combined heat content of the upper and lower boxes.
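For anyone curious about the shape of such a fit, below is a hedged sketch of the log-probability function that emcee samples. The prior bounds, the synthetic observations, and the run_model placeholder are all assumptions of my own; only the overall structure (log-uniform priors on \beta and C_1, uniform priors on the rest, a Gaussian likelihood over both the temperature and heat content series) follows the description above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the observations (placeholders only, not the
# actual Cowtan & Way or Zanna et al. series).
n = 150
t_obs = rng.normal(0.5, 0.1, n)        # surface temperature anomalies (K)
t_err = np.full(n, 0.1)
ohc_obs = rng.normal(100.0, 20.0, n)   # 0-2000m ocean heat content
ohc_err = np.full(n, 20.0)

def run_model(theta):
    # Placeholder: a real version would integrate the two-box equations;
    # flat series are returned here just so the sketch runs end to end.
    return np.full(n, 0.5), np.full(n, 100.0)

def log_prior(theta):
    beta, gamma, eps, C1, theta1 = theta
    # My reading of the post: log-uniform priors on beta and C1,
    # uniform priors on the rest; the bounds are guesses.
    if not (0.1 < beta < 10.0 and 0.01 < gamma < 5.0 and 0.5 < C1 < 20.0):
        return -np.inf
    if not (-1.0 < eps < 1.0 and -1.0 < theta1 < 1.0):
        return -np.inf
    return -np.log(beta) - np.log(C1)   # log-uniform density goes as 1/x

def log_prob(theta):
    lp = log_prior(theta)
    if not np.isfinite(lp):
        return -np.inf
    t_mod, ohc_mod = run_model(theta)
    return (lp
            - 0.5 * np.sum(((t_obs - t_mod) / t_err) ** 2)
            - 0.5 * np.sum(((ohc_obs - ohc_mod) / ohc_err) ** 2))

# log_prob is what would be handed to the sampler, e.g.
# sampler = emcee.EnsembleSampler(nwalkers, 5, log_prob)
```

The percentiles in the tables would then come from applying something like numpy’s percentile function to the flattened chain for each parameter.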

The figure on the top right shows the resulting fit to the global surface temperature anomaly. The orange line is the median result, while the lighter gray lines sample the range. The figure on the bottom right is the resulting fit to the 0-2000m ocean heat content data. The orange and gray lines are also the median result and a sampling of the range.

The figure below shows the resulting distributions for the 5 parameters. As will be discussed below, these can then be used to determine the equilibrium climate sensitivity (ECS) and the transient climate response (TCR).

The equilibrium climate sensitivity is simply given by the change in forcing due to a doubling of atmospheric CO2 divided by \beta (i.e., {\rm ECS} = 3.7/\beta), while the transient climate response (TCR) follows from the TCR-to-ECS ratio, which is \beta / (\beta + \gamma).
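As a quick worked example, using round parameter values close to the medians in the table below (the numbers are illustrative choices of mine, not the actual posterior medians):

```python
F_2X = 3.7   # W m^-2 forcing per doubling of atmospheric CO2

def ecs_tcr(beta, gamma):
    """ECS and TCR from the feedback and exchange parameters."""
    ecs = F_2X / beta
    tcr = ecs * beta / (beta + gamma)   # TCR = ECS x (TCR-to-ECS ratio)
    return ecs, tcr

ecs, tcr = ecs_tcr(beta=1.4, gamma=0.52)
print(round(ecs, 2), round(tcr, 2))  # ≈ 2.64 K and 1.93 K
```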

The resulting ECS distribution is shown in the top figure on the right, while the resulting TCR distribution is shown in the lower figure.

The table below shows the 15th percentile, median, and 84th percentile for the ECS, TCR, TCR-to-ECS ratio, and C_1 distributions. The results for the ECS and TCR are reasonably similar to what’s presented by the IPCC (although the lower limit for the TCR is a bit higher: ~1.5K, rather than ~1K). The only term that may not be clear is C_1, the heat capacity of the upper box. A value of C_1 = 3 is equivalent to an ocean depth of 100m. The values I get seem a little high but may not be unreasonable (I was expecting this box to have a heat capacity equivalent to an ocean depth of about 75m).

Parameter 15th percentile median 84th percentile
ECS (K) 1.99 2.65 3.63
TCR (K) 1.55 1.94 2.44
TCR-to-ECS ratio 0.65 0.73 0.80
C_1 2.77 3.57 4.47
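A note on the C_1 units: the post doesn’t state them, but the quoted equivalence (C_1 = 3 for about 100m of ocean) is consistent with heat capacities in units of 10^8 J m^-2 K^-1, averaged over the Earth’s surface with roughly 70% ocean coverage. That interpretation is an assumption of mine; under it, the conversion is:

```python
RHO = 1025.0     # seawater density, kg m^-3
CP = 4000.0      # specific heat of seawater, J kg^-1 K^-1 (approximate)
F_OCEAN = 0.7    # ocean fraction of Earth's surface

def depth_from_C(C):
    """Ocean depth (m) equivalent to heat capacity C in 1e8 J m^-2 K^-1."""
    return C * 1e8 / (F_OCEAN * RHO * CP)

print(round(depth_from_C(3.0)))  # ≈ 105 m, close to the quoted 100 m
print(round(depth_from_C(2.0)))  # ≈ 70 m, close to the quoted 67 m
```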

Anyway, that’s what I’ve been working on. There may be more that I could do, but I’ve probably spent enough time on this, so will probably leave it at this. I did find it interesting that a relatively basic analysis using a very simple model produces results that seem entirely consistent with much more complex analyses, and that are also consistent with various other lines of evidence.

I did try various ways to carry out this analysis; the results were all consistent with what I presented here. In some cases the median climate sensitivity estimates were actually higher, but in those cases the fits between the model and the data seemed poorer. In none of my analyses did I recover climate sensitivities that were substantively lower than what I’ve presented here.

Posted in Climate change, Climate sensitivity, Research | 50 Comments

Jonathan’s Carrot and Stick

Jonathan Gilligan is Associate Professor of Earth and Environmental Sciences and Associate Professor of Civil and Environmental Engineering at Vanderbilt University, or so starts his media blurb. To me, Jonathan is the ClimateBall fellow I met at Keith’s ages ago. His play style and his calm constructiveness make him a good fit for my fantasy draft. To top it all, he wrote a play and an opera libretto with his mom, Carol.

A gent. Here’s the first part of our chat. Italics are mine, Jonathan’s text is in roman.

***

OK. Let’s chat.

how’s everything, you seem busy

Everything’s good, but yeah. I’m busy. I tend to overcommit and then spend a lot of time trying to keep all my balls in the air. Just finished a two-day conference on the tenth anniversary of my “Behavioral Wedge” paper. It was fun and very productive, but getting everything organized for it and preparing my part was pretty exhausting.

nice – do you have to rewrite everything

I’m in the middle of calculating what’s changed since the original paper. A lot is very similar, particularly on the benefits of household weatherization and upgrading heating and air conditioning equipment, but there were some things we didn’t predict and other things that we knew were coming down the line that we couldn’t include back then.

found it

For one, we assumed that the 2007 Energy Independence and Security Act would push everyone toward efficient light bulbs and we didn’t think that LED bulbs were sufficiently available to include them. However, last year Lucas Davis at Berkeley showed that per-capita household energy consumption had dropped in the US and credited the adoption of efficient light bulbs for that. Meanwhile I find that only about half of US households have switched to mostly/all efficient bulbs, so there’s a lot more room for improvement beyond what the government regulations have produced. Another thing that wasn’t around for the original paper is electric cars, which look as though they have a lot of potential over the next decade or two.

good, you mention the LED example in the Carrot and Stick paper
i believe in those concepts, and liked the Walmart example

A big push going forward is going to be to see whether we can apply the same kind of analysis to businesses that we did to household and individual emissions.

is it harder to get corporate numbers?

The Walmart example is a good one, and there are others. The challenge is that there isn’t a lot of publicly available data, just as your question suggests. That’s a big challenge. There is some aggregated data from CDP and other sources, but it’s very hard to quantify the potential for efficiency improvement in industry. One of my big frustrations is that about a decade ago, McKinsey published a chart of potential emissions reductions versus cost that was very optimistic, but they weren’t at all transparent about their methods or sources of data, so there’s no way to examine what went into it if you want to ask, “Do I believe this analysis?”

things should have changed since then
otoh, what is the incentive for companies to keep track of all this

Now there are more detailed analyses, but with the corporate side, you’re always running into the challenge that these things are very sensitive business information that companies don’t want to share with competitors (for good reason), so it doesn’t become publicly available.

it could be something related to audits
(although our actual auditing powers are limited)

However, investors are increasingly interested, in part because they want to understand the risks to companies from climate change (vulnerabilities and adaptation stuff), and also from possible future emissions regulations. This has created an opening for organizations like CDP [Carbon Disclosure Project] to get companies to disclose their emissions in a credible way, but keeping the information confidential except to subscribing investors. Then they can also publish aggregated data that does not expose anyone’s sensitive information.

i had a similar idea for scientific data – a fiduciary could vet private sets
e.g. psychologists
ideally it could be as simple as a checksum

Also, consumers are interested in a company’s emissions footprint. What we see suggests that consumers won’t pay a big premium to buy the greenest product, but they will pay to avoid the worst product, so with consumers, companies often are like the hunters running away from the bear. It’s not important to be the fastest as much as it’s important not to be the slowest.

tell me about the Sullivan Principles, as AT is South African

The Sullivan Principles: Back in the 1970s and early 80s, when people were debating how to deal with apartheid, Rev. Leon Sullivan proposed that rather than boycotting South Africa altogether, investors and consumers look at a company’s record and distinguish companies that followed a set of principles that promoted racial equity versus ones that went along with apartheid uncritically. Many institutional investors preferred to continue to invest in companies that followed the Sullivan principles, but many activists called for an all-out boycott of companies doing business in South Africa.

oh

A big distinction I see between the calls for divestment from fossil fuel companies and divestment from companies doing business in South Africa is that the anti-Apartheid activism emphasized divestment and boycott, whereas I worry that with fossil fuels, divestment without boycott will have much less impact on the companies. As I argue in my paper, unless the divestment is very widespread, it may benefit the company by getting rid of annoying shareholders who raise uncomfortable questions without affecting the share prices.

but divestment is just a stick, tell me about the carrot
reputation?


The carrot involves both reputation and also potentially finding and addressing inefficiencies, where a company can reduce emissions and also save money.

i really liked the concept of solution aversion – the contrarian matrix is powered by it

Solution aversion is fascinating. That was discovered by Troy Campbell and Aaron C. Kay at Duke. But it’s also important to observe that everything will not be win-win, so we also have to look deeper than just the places where companies can profit from addressing climate change.

Starting with the win-win items makes sense, and looking down the road there are a number of other benefits for companies: many companies, both places that people traditionally think of as liberal, such as Google and Apple, and places more traditionally considered conservative, such as Ingersoll Rand, find that employee satisfaction improves when workers feel the company is doing well by society (e.g., cutting greenhouse gas emissions). And this helps a lot with recruitment and retention of skilled employees.

ohoh that sounds like your inner hippie talking
will we need to reform corporationhood?

I struggle between my inner hippie (who is very outspoken) and my sense that I’m well to the left of most people, so many things I want to do won’t win elections. If we have to reform corporationhood in order to address climate change (the Naomi Klein position) then we’re screwed, so as much as I would like to do that, I also look for how to work with corporations as they are today.

an S&P analyst told a friend that corporations can’t be sued because they would not seek profit for shareholders at any cost – courts upheld governance right

Corporations do need to make a profit, but they don’t need to maximize profits at the expense of other considerations. In my book with Mike Vandenbergh, we talk about Benefit Corporations, which include social benefit in their charters. Muhammad Yunus has also written extensively about Social Businesses that repay their investors the original investment, but then plough their profits into social good rather than dividends or shareholder value.

that seems to lead to this kind of corporatism –

i bet angels work hard to manage their social aura

One social business, Grameen Shakti in Bangladesh, operates on a business model (as opposed to a charitable one), but has provided affordable solar power for an enormous number of rural homes.

like here, perhaps

I can follow up later with links to more detailed reports on Grameen Shakti. [Here it is.]

reminds me of the study about bed nets
money is a good experience tool

I was about to point you to Duflo and Banerjee’s work. I really love their approach. Their book, “Poor Economics” is great and I would recommend that everyone read it. There’s a lot in Green Economics about not letting the perfect be the enemy of the good in development work.

satisficing is tried and true – we’re doing it right now
imagine if we had to communicate in an absolutely perfect manner

Also, related to that I really think highly of Nancy Cartwright and Jeremy Hardie’s “Evidence-Based Policy: A Practical Guide to Doing it Better,” which covers similar ground with good common sense. You’ve got a lot of interest and knowledge about philosophical matters, and Cartwright is a prominent philosopher of science, who writes clearly and sensibly about epistemology applied to policy.

i know about her How the Laws of Physics Lie, or something like that

Yes. That’s her. You also brought up Ramez Naam on Green New Deal [GND] and I have strong opinions about GND and the way it’s discussed.

very-good-star-wars.gif
what are these opinions?

I get frustrated at the way everyone focuses on GND as a fully-fleshed out policy that its supporters expect to become law. I see GND as planting a flag, similar to MLK’s “I Have a Dream” speech (which was also very thin on details and which faced enormous political opposition in the Senate).

A well-known problem with climate policy has been that a majority of voters want to address climate change, but very few make it one of their top priorities, so politicians ignore it.

go off, king – i go grab coffee

What GND does is connect the environment with jobs and prosperity, so instead of only being negative and warning people about looming catastrophe, we also inspire people with the benefits of massively rebuilding the nation’s energy infrastructure in a sustainable manner. That seems to me likely to get a lot more people excited than detailed spreadsheets of emissions tax rates and related wonkery.

If people become inspired by a vision of where they want to go, it seems to me that we’re more likely to get people putting priority on a policy—even if the end product isn’t much like the details spelled out in the early drafts—and getting something put into practice.

raising concerns has limits, as always

The end product of Obamacare lost many attributes that Obama emphasized in the 2008 campaign (e.g., a public option, which was the biggest thing differentiating it from Hillary Clinton’s plan), but something made it into law.

I have pushed back against the book “Break Through” for the last 12 years or so, but I have to admit that GND draws on a lot of the arguments made in that book, about the political power of hope over despair or fear. Many aspects of GND are straight out of the Breakthrough playbook, but there are also important distinctions, particularly in the sense of urgency about taking action. I have felt that there was unsupported optimism that technological fixes could eliminate tough choices. It’s like fighting obesity, diabetes, etc. by saying that instead of getting people to change their diets, we invest in developing technology that would allow someone to eat all they want of whatever they want without harming their health.

exactly
it’s the concerns that kill their brand, imho

In Break Through, I felt the book did not give adequate attention to the willingness of the public during WWII to make big sacrifices for the cause. My hope is that if we can give a hopeful and inspiring vision of where we’re going, people will be more willing to grapple with the tough choices (including temporary sacrifices) necessary to get there. I don’t, however, imagine that the public will make the same choices that I would myself.

ok, we’re almost there
there is this – have you looked?

OK. On Plumer: I have looked at that. Plumer’s analysis is very similar to the original Sustainability Wedges approach and to the approach I favor: Instead of looking for a single silver bullet to solve the emissions problem, look for many smaller things that are compatible with one another, and which can add up to a large enough reduction to matter (even if it doesn’t get us all the way there).
I really liked the interactive policy tool that Plumer and Migliozzi used for that article.

me too, i tried to hack it

There’s a very interesting literature on how interactive computational models can be very useful at helping people get an intuitive feel for managing complex systems (see, e.g., D. Dorner, “The Logic of Failure”). Dorner also expanded that article into a nice book. Moira Zellner at University of Illinois Chicago and her colleagues have done fascinating work on using interactive computational models to facilitate community-level decision-making about sustainability issues, such as groundwater management and flood risks – here and there.

I love what I’m doing.

i think it matters
https://andthentheresphysics.wordpress.com/2019/03/02/marios-room/

I also love being able to work with great colleagues on interdisciplinary projects.

Wow. When you put it that way, one thing that’s defined my career (which has been pretty unusual) has been that I have quickly jettisoned research projects when they stopped sparking joy.

it’s a great anti-depressant

I love working in teams. I get inspiration from working with other smart people, and I love getting constant constructive criticism and suggestions for how I can do better.

END OF THE FIRST PART

Posted in We Are Science | 45 Comments

The BBC’s lack of balance

Credit: John Cook

Just discovered that a new BBC Scotland news programme (The Nine) decided that it would invite Andrew Montford on to discuss the Youth Strike for Climate. Fortunately, no one else was willing to appear with him, so the segment didn’t air.

In case people don’t know, Andrew Montford runs a blog (largely inactive now) called Bishop Hill and is currently the Deputy Director of the Global Warming Policy Foundation (GWPF). I’ve written about him and the GWPF on a number of occasions. In fact, one of my most read posts is about a previous occasion when the BBC invited him onto a show to provide some kind of balance.

The problem is that he has virtually no relevant expertise and simply spouts what many would call denialist talking points. There are many cases where it would be appropriate to provide some kind of balance. However, if the only way to do so is to invite someone who clearly doesn’t know what they’re talking about, and who has authored a report suggesting that school children are being brainwashed, then that’s the kind of balance that – in my view – should be avoided. The BBC doesn’t have to provide a platform just because someone has a view; that’s what blogs are for.

Links:
Andrew Montford on De Smog Blog.
Posts I’ve written about Andrew Montford.
Post I’ve written about the GWPF.
The BBC and its balance!

Posted in ClimateBall, Global warming, Policy | 92 Comments

Open thread: Youth strike for climate

Since I haven’t had a post for a few days, I thought we could have an open thread about the youth strike for climate. I mostly think it’s quite a positive thing; it’s young people whose future is at stake making their voices heard. It’s also had quite a remarkable impact. On the other hand, we shouldn’t be leaving this to the youth, and I certainly don’t think they should regularly be taking days off school.

I have, however, come across some really remarkable responses. There’s a UK-based site where Greta Thunberg was described as radicalised and compared to Shamima Begum. Watts Up With That plumbed the depths with a post about Greta Thunberg’s Asperger’s. The Global Warming Policy Foundation have suggested that the striking youth are brainwashed. I’ve also been accused of child abuse by more than one person on Twitter, because I’m supportive of the climate strikes. We really do need a better class of climate “skeptic”.

I really don’t know how there can be any common ground with people who seem to think that this kind of rhetoric is acceptable, or even why there should really be any reason to find some. Anyway, I’ll stop there. I realise that I’ve focussed on some of the more objectionable responses to the youth strikes, but maybe we can try to keep the comments about the youth strike itself, rather than about the more bizarre responses to it.

Posted in Climate change, ClimateBall, Open Thread, Watts Up With That | 67 Comments

The Plausibility of RCP8.5 – part II

A while ago I wrote a post about the plausibility of RCP8.5. It was essentially pointing out that there are a range of emission pathways, and hence cumulative emissions, that could lead to an RCP8.5 concentration pathway. Some of them are low enough that we would need to start substantial emission reductions quite soon if we wanted a high chance of avoiding an RCP8.5 concentration pathway.

I’ve since found the paper that presented this and thought I would slightly update my post. The figure on the right is essentially the key one. The top panel shows the emission pathways that can lead to the 4 different concentration pathways (RCP2.6, RCP4.5, RCP6 and RCP8.5). The bottom panel shows the cumulative emissions from 2006-2100 for each of these concentration pathways, and also shows the historical cumulative emissions up till 2005.

The key thing to note is that there is a range of possible emission pathways for each concentration pathway, and that the range of cumulative emissions that can produce a specific concentration pathway is quite wide (represented by the black dots in the lower panel). In the case of RCP8.5, the mean from 2006-2100 is 1734GtC, but the 1\sigma range is ± 209 GtC (i.e., there’s a ~66% chance of it being between 1525 GtC and 1943 GtC). Just for comparison, we’re currently emitting about 10 GtC per year.

Something else to note is that for the higher concentration pathways (RCP6 and RCP8.5) the estimates from Integrated Assessment Models (IAMs) tend to be higher than the estimates from Earth System General Circulation Models (ES-GCMs). In other words, the ES-GCM results suggest that avoiding RCP8.5 would require emitting less than would be suggested by IAMs. This is all summarised in the Table below.


Just to be clear, I don’t think it is particularly likely that we will follow an RCP8.5 concentration pathway. However, I don’t think it’s as unlikely as some like to suggest. Based on the Figure above, we could end up on an RCP8.5 concentration pathway if we double our emissions from ~10GtC/year today to ~20GtC/year by 2050. Given that we expect the global economy to keep growing and that there has been a relationship between emissions and GDP, avoiding doubling our emissions in the next few decades will require decoupling emission growth from GDP growth (well, assuming that the global economy does indeed continue to grow). There are hints that this is starting to happen (we’ve seen economic growth over a period where emissions have grown little) but I do think we should be careful of assuming that this will continue.
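As a back-of-the-envelope check on that doubling scenario (the pathway shape here, a linear ramp from 10 GtC/yr in 2019 to 20 GtC/yr in 2050 and flat thereafter, is my own illustrative assumption, not one from the paper):

```python
def emissions(year):
    """Illustrative piecewise-linear emissions path, in GtC per year."""
    if year < 2019:
        return 10.0                                          # roughly today's rate
    if year < 2050:
        return 10.0 + 10.0 * (year - 2019) / (2050 - 2019)   # linear ramp to 20
    return 20.0                                              # flat after doubling

# Cumulative emissions over 2006-2100 (one-year steps)
total = sum(emissions(y) for y in range(2006, 2100))
print(round(total))  # → 1590 GtC
```

A total of roughly 1600 GtC sits comfortably inside the 1525-1943 GtC 1\sigma range quoted above, which is why a doubling of emissions by 2050 is enough to keep an RCP8.5 concentration pathway in play.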

Posted in Climate change, Global warming, GRRRROWTH | 31 Comments