## The social construction of science

Richard Dawkins posted a tweet that caused a bit of a furore in some sectors of Twitter. He did try to clarify, but it still didn’t go down well.

The problem with his tweet is that science clearly is socially constructed. It’s done by people who make decisions that are strongly influenced by our social norms. In some cases, as discussed in Angela Saini’s book Superior, this not only influences how we do science, but also influences the results of some research activities, or how we interpret research results.

If we want to deal with the issues highlighted in Angela Saini’s book, and also improve diversity and inclusion in science, then we need to recognise that science is socially constructed.

However, I also understand why some scientists push back against this framing. It’s either because they think it’s implying that scientific results are social constructs, or that they will be interpreted as being social constructs. The concern is that this can imply that scientific results are constructed (made up) by people, rather than tending towards properly representing whatever system is being studied.

I realise that the latter is not what those who highlight the social construction of science are actually suggesting, but I do get why it might sometimes seem that way. However, I do think it’s worth scientists trying to understand why it is important to recognise that societal factors play a big role in determining how we do science, and can – in some circumstances – influence how we interpret scientific results.

At the same time, I think it’s worth humanities scholars understanding why there can be pushback from scientists. It’s not that they think societal factors play no role in science, it’s more that they think that this doesn’t necessarily imply that societal factors will have a big influence on scientific results, or how we interpret these results. There’s a concern that this can lead to people undermining scientific results when these results seem inconvenient. I think this is a valid concern, even if this isn’t what the social construction of science actually implies.

## ‘Net zero’

There’s been some recent debate about the term ‘net zero’. Just to give some basic background: because the zero emission commitment is close to zero (i.e., once we get anthropogenic emissions to zero, global surface temperatures should soon stabilise), we can define a carbon budget. This tells us how much more we can emit if we want some chance of staying below some temperature target. It also tells us that our emissions must go to zero. The complication is that this could occur through emissions actually going to zero, or through some kind of negative emission technology offsetting some continued human-caused emissions (this could include some active land management).
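As a rough sketch of the budget arithmetic (the numbers below are placeholders for illustration, not figures from any assessment):

```python
# Hypothetical round numbers, for illustration only.
remaining_budget_gtco2 = 400.0   # assumed remaining budget for some temperature target
annual_emissions_gtco2 = 40.0    # assumed current annual emissions

# At constant emissions, the budget is exhausted in budget/rate years,
# after which staying below the target requires (net) zero emissions.
years_left = remaining_budget_gtco2 / annual_emissions_gtco2
print(years_left)  # 10.0
```

The point of the exercise is simply that a finite budget plus non-zero emissions implies a finite time before emissions must reach (net) zero.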

What some are concerned about is the possibility that some future negative emission technology could allow some to make ‘net zero’ promises that they may not be able to keep, or never actually intended to keep. In other words, they will claim that they’re aiming to get to a stage where they are offsetting all of their emissions despite it not yet being known if such technologies can actually operate at a suitable scale. Essentially, this becomes a form of greenwashing.

Another problem, though, is that some are interpreting ‘net zero’ in ways that aren’t consistent with what is intended. Mark Carney, for example, claimed that a company for which he is vice-Chair was ‘net zero’ because their enormous renewables business had avoided emissions that were comparable to their actual emissions. However, this isn’t ‘net zero’, it just means that they’ve ended up emitting about half of what they might have emitted.

‘Net zero’ requires actively sequestering an amount comparable to the amount actually emitted, not avoiding emitting an amount comparable to how much was actually emitted. I managed to come up with a reasonably popular tweet that illustrated the problem with Mark Carney’s suggestion.
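A toy bookkeeping sketch of the distinction (all figures invented; the function name is mine):

```python
def net_balance(emitted, sequestered):
    """'Net zero' compares actual emissions with active removals, nothing else."""
    return emitted - sequestered

emitted = 10.0   # hypothetical MtCO2 actually emitted
avoided = 10.0   # hypothetical MtCO2 avoided thanks to a renewables business
removed = 0.0    # hypothetical MtCO2 actively sequestered

# Avoided emissions never enter the balance, so this is not net zero:
print(net_balance(emitted, removed))    # 10.0
# Net zero requires removals that match actual emissions:
print(net_balance(emitted, emitted))    # 0.0
```

Note that `avoided` appears nowhere in the balance; that is exactly the error in counting avoided emissions as offsets.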

I also came across another complication today, where someone interpreted ‘net zero’ as being the point when actual emissions are offset by negative emission technologies and by natural sinks. If this means that there is no net emission of any kind (anthropogenic, or natural), then it would imply constant concentrations. However, the reason that the zero emission commitment is probably small is because the natural sinks continue to take up some of our emissions after our emissions have gone to zero so that atmospheric CO2 concentrations actually decrease.

If we get to a stage where atmospheric concentrations stabilise, then we would actually continue to warm (this is the constant concentration commitment, which is different to the zero emission commitment). Hence, ‘net zero’ in this context means ‘net zero’ anthropogenic, not ‘net zero’ anthropogenic and natural. I should make clear that the above interpretation was, I think, more some confusion about ‘net zero’ than any attempt to define it in some convenient way, but it does illustrate how this can be a tricky concept.

So, I can see why some are concerned about the term ‘net zero’ and, as Simon Lewis points out, we can’t solve climate change using accounting tricks. However, I do think that the term ‘net zero’ is fine and that we should be careful of changing terminology just because some aren’t using it appropriately. It is important to stress, though, that ‘net zero’ means net zero anthropogenic emissions and that, in the absence of negative emission technologies, ‘net zero’ is the same as real zero. In other words, if negative emission technologies are unlikely to operate effectively at scale, then ‘net zero’ requires simply getting anthropogenic emissions to zero.

Warming commitments – post of mine about warming commitments with a number of useful links at the end.
Mark Carney Walks Back Brookfield Net-Zero Claim After Criticism – article describing Mark Carney’s misinterpretation of ‘net zero’.
The climate crisis can’t be solved by carbon accounting tricks – Guardian article by Simon Lewis.

## Losing the sky

Andy Lawrence, who happens to be a colleague, has just published a book called Losing the Sky. Andy also gave a brief presentation about it, which is what motivated me to write this post. The book is very reasonably priced and very easy to read. It’s about Starlink, the constellation of low Earth orbit satellites being launched by SpaceX. There are currently just over 1000 in orbit, with plans for 12000, and a possible extension to 42000. The goal is to provide high-speed internet with low latency.

As the image on the right illustrates, the issue is that (especially during the orbit raising phase) these satellites can be very prominent in astronomical images. Since there will be so many of them, this could have a very large impact. This is not only a problem for ground-based observations; even images taken with the Hubble Space Telescope have been impacted.

It’s also not only optical astronomy; radio astronomy may be even more severely impacted. Currently, most communication satellites are in geosynchronous orbits. Consequently, radio observations can typically be planned to keep their transmissions out of the side-lobes. With this new constellation of low Earth orbit communication satellites, this may become essentially impossible, potentially ruining radio astronomy.

One concern with complaining about this is that the stated goal is to provide internet to regions of the planet that don’t currently have decent access. This is clearly a worthy goal, and so it can be tricky to object on the basis of how it will impact astronomical observations. There are, however, a few issues with this stated goal. One is that there are already solutions involving satellites in higher orbits, so it’s not clear that providing internet to under-served regions of the planet requires a constellation of low Earth orbit satellites. Also, the current price suggests that this may currently be out of reach of many in these regions.

What seems more likely is that the motivation is to reduce the latency (the data transfer time) which will be very attractive to the financial sector. This will require a constellation of low Earth orbit satellites. So, the actual goal may not be quite as magnanimous as suggested.

As my colleague’s book suggests, this does seem to be another example of a tragedy of the commons. Some get to benefit from using the environment in a way that negatively impacts many others, who don’t get compensated for how they are impacted.

Even if we would benefit from high-speed, low-latency internet access across the globe, I do think there would still be merit to a process that assesses the impact of the proposed solution and that has some ability to influence, and potentially regulate, this kind of activity. We can’t keep ignoring how our activities affect the environment in which we all live, not only for fairness reasons, but because there is a cost to such activities that someone will eventually have to pay.

Posted in Environmental change, Scientists | | 33 Comments

## Agricultural emissions

There’s a really nice recent paper by John Lynch, Michelle Cain, David Frame and Ray Pierrehumbert on Agriculture’s Contribution to Climate Change and Role in Mitigation Is Distinct From Predominantly Fossil CO2-Emitting Sectors. It’s largely discussing why there are important differences between carbon dioxide (CO2), which is a stock pollutant, and methane (CH4), which is predominantly a flow pollutant.

The basic point is that the emission of CO2 increases the stock, which leads to a long-term increase in atmospheric concentrations and, consequently, to warming that will persist for a very long time. Methane, on the other hand, has a short atmospheric lifetime, decaying within decades to CO2 and water. Given that – for agricultural emissions – the carbon comes from plants, this doesn’t add new carbon to the system, and hence doesn’t increase the stock. This isn’t strictly true for methane from natural gas, since that does add new carbon to the system, but this is relatively small when compared to direct CO2 emissions from fossil fuels.

The key figure in the paper is the one above. The left-hand panel shows an example of an emission pathway based on CO2-equivalents using the 100-year Global Warming Potential (GWP100). The right-hand panel shows the actual warming we would experience for different gas-specific compositions. CO2 warming (dark blue line) peaks when emissions get to zero, but then remains at this level well after emissions have ceased (it’s essentially irreversible without some kind of artificial negative emission technology).

Methane (yellow line) initially produces more warming than would be expected based on its CO2-equivalence. However, when emissions start to go down, there is cooling, which continues well after emissions have ceased (for completeness, the pink line is 50% methane, 50% CO2, while the green line is N2O which has a reasonably long atmospheric lifetime).

The key point is that if one uses GWP100 to estimate CO2-equivalence, one would predict warming profiles that are quite different to what would happen in reality. One would under-predict the impact of methane emissions initially, but then over-predict its impact later on.
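The stock-versus-flow distinction can be sketched with a toy model (this is not the paper's model; the unit emissions are invented and the ~12-year methane lifetime is a round-number assumption):

```python
# Toy stock-vs-flow comparison under constant unit emissions.
tau_ch4 = 12.0           # assumed CH4 atmospheric lifetime, years
years = 200
co2_stock, ch4_stock = 0.0, 0.0

for _ in range(years):
    co2_stock += 1.0                          # stock pollutant: accumulates forever
    ch4_stock += 1.0 - ch4_stock / tau_ch4    # flow pollutant: decays, so it stabilises

# CH4 approaches a steady state (~ emissions * lifetime = 12),
# while the CO2 burden just keeps growing with cumulative emissions.
print(round(ch4_stock, 1))   # 12.0
print(co2_stock)             # 200.0
```

This is why constant methane emissions give roughly constant warming, whereas constant CO2 emissions give ever-increasing warming.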

The reason this is important is because any emission reduction pathways are likely to involve trade-offs. Consequently, as the paper highlights,

reducing methane emissions at the expense of CO2 is a short-sighted approach that trades a near-term climate benefit with warmer temperatures for every year thereafter

and

If strong efforts are made to reduce agricultural emissions but prove expensive—in terms of monetary costs, political capital, public goodwill, or individual effort—and detract from efforts to eliminate fossil CO2 emissions then we will be climatically worse-off.

Essentially, the emission of a stock pollutant (CO2) leads to warming that will persist for a very long time, which is different to the impact of a flow pollutant (agricultural methane). The latter clearly does produce warming and, in fact, leads to more warming in the near-term than simple CO2-equivalent estimates would suggest. However, this warming would stabilise if emissions were to stabilise (unlike CO2) and can be reversed if these emissions are reduced (also, unlike CO2).

So, it would seem important to be aware of these differences when thinking of how best to decarbonise. Any strategy that prioritises short-lived pollutants over long-lived pollutants runs the risk of committing us to future warming that is essentially irreversible and that we could have avoided if we’d prioritised differently.

This isn’t to suggest that we should be ignoring the short-lived pollutants. They can have a large near-term impact which may be important if we wish to avoid crossing certain warming thresholds. There may also be other reasons for reducing these emissions (land use change, for example). I just happen to think that if we’re trying to assess the impact of different greenhouse gas emissions, it’s important to use a metric that properly represents this.

Agriculture’s Contribution to Climate Change and Role in Mitigation Is Distinct From Predominantly Fossil CO2-Emitting Sectors, new paper by Lynch et al. (2021)
Losing time, not buying time, Realclimate post by Ray Pierrehumbert making the same basic point (from 2010).
Methane, a post I wrote in 2019 about the impact of methane.
Guest post: A new way to assess ‘global warming potential’ of short-lived pollutants, Carbon Brief guest post by Michelle Cain.
Methane and things, another post I wrote last year trying to explain the difference between methane emissions and CO2 emissions.


## Deferential?

I was listening to a podcast interview with Steve Keen, whose work I’ve written about before. It was about his paper the appallingly bad neoclassical economics of climate change. I have a lot of sympathy with what he’s presenting. Some of the assumptions being made by economists in this context seem rather odd, and I’ve been critical of Integrated Assessment models (IAMs) myself.

I also make an appearance in the podcast, as an example of a scientist who is too deferential towards neoclassical economists. I don’t know if deferential is quite the right word (some of you may recall interactions I’ve had with a prominent climate economist), but I see what they mean and they do have a point. The point being made in the podcast is that some of the assumptions made in the neoclassical economics of climate change are so obviously nonsensical that they really should be being called out by scientists.

I agree that many of the assumptions seem odd. Economic growth is often assumed to be baked in. The damage estimates for high levels of warming seem ridiculously low. However, I’m also aware that it’s easy to look at some problem outside your area of expertise, think you’ve seen some obvious glaring error, and be wrong. Retired engineers are sometimes noted for this when it comes to climate change.

Also, if you think it’s important to listen to experts, then you’d need to have pretty strong reasons for arguing that we should ignore some of them. So, I am indeed reluctant to vocally call out neoclassical economists who work on climate change, mostly because there may well be (certainly are) aspects that I don’t understand, but partly because I do think expertise matters.

The suggestion was also not just that some scientists were too deferential, but that they should really be pushing back strongly against what is being presented by neoclassical economists.

I don’t really see why this should be the responsibility of scientists. I certainly think that it’s utterly bonkers that we could end up warming the climate (this century) by an amount comparable to the difference between a glacial and an inter-glacial, but I don’t have a good way to quantify the societal, and ecological, impact. It just seems obviously a silly thing to do.

I think scientists have done a great job of highlighting the risks. If others are still buying low-ball estimates from neoclassical economists, I don’t think this is the fault of scientists. I’m not suggesting scientists shouldn’t continue to highlight, and stress, these risks, but I don’t think they should be expected to sort out failings in another discipline. Feel free to disagree in the comments, though 🙂

## Anti-Virus

There’s a new site called Anti-Virus: The Covid-19 FAQ. It’s a little like Skeptical Science, with articles that respond to common arguments made by Covid sceptics (what Skeptical Science would call Climate Myths). On a related note, I have been trying to help another group with a site called Simple Covid and we have been talking about doing something similar. Although we (mostly others) have produced some infographics, we haven’t had time to write any myth-busting articles.

One thing I did find interesting about the Anti-Virus site was that they also list prominent Covid sceptics, including academics, journalists, and online sceptics. Such lists have been somewhat controversial in climate circles. Admittedly, this is partly to do with labelling, but the principle is clearly the same: create a list of people who have regularly promoted arguments that are wrong, and highlight what they’ve said and why they were wrong.

Although Skeptical Science has been remarkably successful, it’s not without its critics, partly because of a focus on consensus messaging and partly because of their climate misinformers page (now called misinformation by source). It would be interesting to know if some of these critics are also concerned about the tactics of the Anti-Virus site.

Personally, it seems to me that using consensus science to rebut common “skeptic” talking points is not only a reasonable thing to do, but can also be very effective. Skeptical Science has actually numbered their responses to the Climate Myths, and we used this to rebut the recently released climate science (mis)information brief (which, amusingly, led to the re-assignment of two of the authors).

I also think that if people regularly promote arguments in public that are obviously wrong, then there’s nothing wrong with highlighting these errors and associating these people with others who also regularly promote such erroneous arguments. If those listed don’t like this, they could be more careful about what they say in public, could correct their past errors, or simply not care if they still think their arguments are valid/defensible.

Anyway, I don’t really know where I’m going with this post, so will wrap up. I mostly just found it interesting that the Anti-Virus site is using a similar strategy to that used in the climate context – debunking myths and naming and shaming prominent sceptics. I think it’s an interesting development.


## Alan’s Bottle

Ken and I just had a talk about the Science Kerfuffle of the moment, featuring a physics and maths teacher known to pwn fashionable nonsense fans. He recently suggested that POMO weakened our herd immunity to combat objective untruths. He also wonders what to do now that the genie is out of the bottle. What Alan really means by these metaphors remains unclear.

Follows a slightly edited transcript.

[Willard, thereafter W]

[Ken, or AT in what follows]
That’s quite good. May motivate me to write a post.

[W]
thanks
the whole idea that people believe in fraud because of POMO looks ridiculous

[AT]
Do you agree with the suggestion that even if PoMo isn’t responsible it has undermined our ability to combat misinformation?

[W]
on the contrary, POMO tries to explain how misinformation can happen

Postmodernism is generally defined by an attitude of skepticism, irony, or rejection toward what it describes as the grand narratives and ideologies associated with modernism […]

[AT]
Okay, maybe I’ll have to rethink my post. Maybe I misunderstand PoMo, but if some of what goes on in STS falls with PoMo it certainly doesn’t seem to have helped, even if the goal is to explain how misinformation can happen.

[W]
we can disagree, that’s fine
it’s just small talk nobody will read

alan makes an important error:
indeterminacy should not lead to denial
and POMO could guard us against conspiracy ideation

the problem you got with STS is different:
for instance, MikeH’s main problem is that he has no idea of what he’s talking about
he has no business making metrological points without studying metrology
so we can agree that people say stuff without paying due diligence

[AT]
I guess I’m not a fan of over-generalizing. I guess my issue is more to do with STS, for example, claiming they have all sorts of tools for helping to deal with misinformation, while prominent people seem to either promote, or defend, misinformation. Grundmann with his “climate science is like race science”, Pearce with his criticism of consensus messaging without actually providing an alternative and publishing papers on climategate that repeat the myths, etc. So, if the tools are there, it feels that some people in that field are going to have to do a better job of explaining what they are and how to use them.

[W]
agreed
that’s not POMO tho, that’s editorializing or criticism, which is indeed a bane
STS sucks because it’s an interdisciplinary discipline whose practitioners know little about everything and therefore are dangerous enough almost everywhere
it may have inherited from POMO bad scholarship practices

[AT]
That’s what I was wondering. Isn’t there at least a PoMo element to some of STS. Weren’t they part of the Science Wars?

[W]
STS, as a discipline, is a result of older science wars
it tried to “sciencize” its output
instead of using abstract and unrealistic models like the old philosophers of science did,
it promised to look under the scientific hood
but if all you do is play pretend by recycling kuhn this and popper that,
you get the worst of both worlds
(warren only adds “let’s find an exotic framework nobody will buy because it’s $150”)

[AT]
Okay, yes, that probably does describe it pretty well.

[W]
so i would conclude two things
first, if one wishes to say something,
one has to study it with all the evidential responsibility it requires
due diligence, an idea that generalizes
me, you, alan, STS, POMO, everyone
second, it’s easier to be led astray by a lack of work in conceptual frameworks,
because words are just words–we need constructions

[AT]
I certainly agree with the first part of that. Don’t quite get what you mean by “words are not constructions”.
Why construct?

[W]
an old idea that viktor recently retooled for his opinionated podcast
one can define impossible objects
one can’t construct them
empirical science prevents us from making claims that we can’t operationalize
scientists can’t pretend operationalization forces us to conclude one and only one thing
that’s just not what science affords us

that’s the main point from say bruno, whose framework is very good for climateball
once we accept that scientific theories evolve and are not to be taken for granted, all fits

[AT]
Okay, I think I get that.

[W]
so when i say that POMO isn’t responsible for our predicament, all i’m saying is that even if POMO did not exist, we’d still be stuck with that indeterminacy
(the inscrutability of reference is one of the indeterminacies attributed to van)

that said, you might be right on the historical point
warren peirce, gunter reiner grundmann, and mike hulme are not exactly helping
but even then, that’s just a guess
to show it would take some work
so as long as you keep clear that you’re editorializing, all should be fine, up to a point

[AT]
I’ll have to think a bit more. Alan’s point about PoMo not being responsible but also not helping resonated. Maybe that’s just too simple.

[W]
it resonates, but it rings hollow to me
after all these years, he’s just saying stuff, and that’s sad
his editorial exemplifies very well our predicament
we say stuff, and if it sounds good enough, we buy it

in fact the converse of his bottle hypothesis looks more plausible to me:
by amplifying the threat of POMO on the fate of western civilization, alan’s reactionary stance has been recycled by newscorp and has weaponized people with mental issues
conceptual boi has become a truther,
same for EricW

[AT]
That’s possible. I guess I have always thought that we don’t consider how what we say can then influence what we’re commenting on.
James Lindsay has always seemed a bit bonkers to me.

[W]
i learn from your posts because you express an attitude
you helped me keep my cool
in retrospect, toning down ages better
alan’s point is an old one, in fact as old as plato
philosophy is the history of how humans dealt with relativism and skepticism

[AT]
Yes, I am trying to tone down. Maybe I should ponder this a bit more.

[W]
as long as you can support what you’re saying, you should be fine
more so if your point is “if everyone supported their claims, that’d be great”
that’s just a more consistent approach
imo, alan fails that test
i could write a post if you prefer

[AT]
If you’re keen, go for it. I’m probably going to take it easy this evening, so if you have some time, feel free.

[W]
i’ll see what i can do
we could post that chat

[AT]
If you like, that’s fine with me.

[W]
good

[AT]
Thanks, you too.

## On baselines and climate normals

Mike Hulme, Professor of Human Geography at the University of Cambridge, has a somewhat bizarre article published in Academia Letters called Climates Multiple: Three Baselines, Two Tolerances, One Normal. It’s basically a discussion of the recent World Meteorological Organisation (WMO) decision to re-define the present day climate as the period 1991-2020, replacing the period 1961-1990.

The article starts by suggesting that this means that

Climate will ‘change’, one might say, in an instant; the world’s climate will ‘suddenly’ become nearly 0.5°C warmer. It is somewhat equivalent to re-setting Universal Time or adjusting the exact definition of a metre.

Well, from the mid-1970s to the early 2000s we actually warmed by about 0.5°C. This has nothing to do with how the baseline is defined. It’s also hard to see how it’s equivalent to adjusting the exact definition of a metre. I also wonder if Mike Hulme has got this the wrong way around. If we make the baseline period more recent, then the anomaly values actually go down, not up. You might argue that the change in baseline has caused the world to suddenly become 0.5°C cooler, rather than warmer (it hasn’t, obviously, but the change has reduced the anomaly values by about 0.5°C).
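To make the anomaly arithmetic concrete (the temperatures below are invented; only the direction of the effect matters):

```python
# Invented illustrative temperatures, in deg C.
old_baseline_mean = 14.0   # assumed mean over 1961-1990
new_baseline_mean = 14.5   # assumed mean over 1991-2020 (a warmer period)
today = 15.0               # assumed present-day temperature

# An anomaly is just temperature minus the baseline mean,
# so a more recent (warmer) baseline lowers the anomaly values:
print(today - old_baseline_mean)  # 1.0
print(today - new_baseline_mean)  # 0.5
```

The actual temperature (`today`) is untouched by the choice of baseline; only the reported anomaly shifts.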

The rest of the article discusses the various baselines (present day, pre-industrial, historical) and what we might mean by a climate normal, but I don’t really get the overall point. Clearly we have to be careful about how we discuss climate change, be clear about what baseline we’re using, and be aware that what might be regarded as normal is changing. But this is a feature of the topic; it’s not something that can really be avoided.

It may be also technically true that

The adoption of particular baselines and tolerances is an overtly political process with geopolitical, ethical and technological consequence

but it’s also the case that none of these decisions change physical reality. Changing the baseline does not change how much we’ve warmed, how fast we’ve warmed, and how much we will warm if we continue to emit greenhouse gases into the atmosphere. If this is carefully communicated, it’s hard to see how these changes have any real political significance (on top of the political significance of climate change itself, of course).

In some sense, Mike Hulme’s article seems to be doing the very thing it’s cautioning against. The only way that changing the baseline, or what we regard as a climate normal, can have any broader political significance is if people overplay the significance of making these changes. Suggesting that redefining a baseline has geopolitical implications would seem to be an example of doing so.

## Warming commitments

There’s been quite a lot of recent discussion about warming commitments. It started with an article by Bob Berwyn called Net Zero Emissions Would Stabilize Climate Quickly Says UK Scientist, followed soon after by one saying [w]arming already baked in will blow past climate goals, study finds. The first article is (I think) based on a recent multi-model analysis which suggests that the most likely value of Zero Emission Commitment (ZEC) on multi-decadal timescales is close to zero. The second article is reporting on results from another recent paper suggesting that [g]reater committed warming after accounting for the pattern effect.

So, why are we being presented with what appear to be inconsistent results? The simple answer is that we’re not really being careful enough to define what we mean by a warming commitment. The first article, and paper, are considering what would happen when we get emissions to zero. The second article, and paper, are essentially considering what would happen if atmospheric greenhouse gas concentrations remained at today’s levels. These are clearly two different scenarios.

When we get emissions to zero, the first paper indicates that – on multi-decade timescales – the zero emission warming commitment (ZEC) would be close to zero. On the other hand, if atmospheric CO2 concentrations were to remain constant, then we would continue warming to equilibrium. At today’s atmospheric CO2 concentrations, this would lead to additional warming of around 0.5°C or even more, according to the paper being highlighted in the second article above. However, it is important to realise that constant concentrations require continued emissions, as illustrated by the second figure in this Steve Easterbrook post.
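A back-of-the-envelope version of the constant-concentration commitment, using the standard logarithmic forcing approximation (the sensitivity, concentrations, and observed warming below are round-number assumptions, not values from either paper):

```python
import math

# Round-number assumptions for illustration.
ecs = 3.0              # assumed equilibrium climate sensitivity, deg C per CO2 doubling
c0, c = 280.0, 415.0   # pre-industrial and roughly present-day CO2, ppm
current_warming = 1.2  # approximate observed warming, deg C

# Equilibrium warming for concentration c: ECS * log2(c / c0).
equilibrium_warming = ecs * math.log(c / c0) / math.log(2)

# Holding concentrations fixed commits us to the remaining warming:
print(round(equilibrium_warming - current_warming, 2))  # 0.5
```

With these assumptions the committed extra warming comes out at roughly the 0.5°C scale discussed above, which is why the two "commitments" can sound contradictory when the scenario isn't specified.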

I should also stress that our understanding that there is little warming commitment associated with zero emissions has been understood for quite some time. The first paper to point this out was probably Matthews and Caldeira (2008), followed by Solomon et al. (2009), and Cao and Caldeira (2010). There’s also a Realclimate post pointing this out in 2010, the Steve Easterbrook post I mentioned above from 2013, and a post I wrote in 2016.

There are, however, a number of important caveats. That the zero emission warming commitment is probably small only applies on multi-decade timescales. The models that demonstrate this typically don’t include slower processes (such as ice sheet retreat, sea level rise, and permafrost release) that may lead to additional warming on longer timescales.

Also, even though there is probably little committed warming on multi-decade timescales once we get emissions to zero, without negative emissions global surface temperatures will remain at an elevated level (relative to pre-industrial times) for a very long time. It does, however, indicate that our future warming depends mostly on future emissions. We can still influence how much future warming we are likely to experience, even if we can’t turn everything off right now.

So, I think it’s good that there is more recognition that the ZEC is probably small. It addresses claims that there’s nothing we can do to avoid a lot of future warming and illustrates that, in the context of future warming, most of the inertia is societal, rather than inertia in the climate system.

Net Zero Emissions Would Stabilize Climate Quickly Says UK Scientist, article by Bob Berwyn.
Warming already baked in will blow past climate goals, study finds, Associated Press article.
Is there warming in the pipeline? A multi-model analysis of the Zero Emissions Commitment from CO2, MacDougall et al. (2020).
Greater committed warming after accounting for the pattern effect, Zhou et al. (2021).
Stabilizing climate requires near‐zero emissions, Matthews and Caldeira (2008).
Irreversible climate change due to carbon dioxide emissions, Solomon et al. (2009).
Atmospheric carbon dioxide removal: long-term consequences and commitment, Cao and Caldeira (2010).
Climate Change Commitments, Realclimate (2010).
How Big is the Climate Change Deficit?, Steve Easterbrook (2013).
Committed Warming, my post from 2016.

## Have CO2 emissions peaked?

I noticed, as has Stoat, that Ken Caldeira and Ted Nordhaus have a bet about whether or not we’ve reached peak CO2 emissions. Specifically, the bet is

Between 2021 and the end of 2030, annual fossil fuel emissions (excluding carbonation) will not exceed annual fossil fuel emissions (excluding carbonation) from 2019.

Carbonation is essentially emissions from cement production.

As with many others, I’m hoping that Ted Nordhaus wins, but expecting that Ken Caldeira will do so. In truth, though, that’s a bit simplistic. Even if Ted Nordhaus were to win, what would emissions having peaked actually imply?

Consider a simplified form of the Kaya Identity:

$CO_2 = GDP \times \dfrac{Energy}{GDP} \times \dfrac{CO_2}{Energy}$

CO2 emissions essentially depend on Gross Domestic Product (GDP), energy intensity (energy per GDP) and carbon intensity (CO2 per energy).
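A minimal sketch of the identity with invented numbers:

```python
# Kaya-identity sketch; all values are arbitrary illustrative units.
gdp = 100.0            # economic output
energy_per_gdp = 2.0   # energy intensity (energy per unit GDP)
co2_per_energy = 0.25  # carbon intensity (CO2 per unit energy)

co2 = gdp * energy_per_gdp * co2_per_energy
print(co2)  # 50.0

# Halving carbon intensity halves emissions even with unchanged GDP,
# which is why knowing *that* emissions fell doesn't tell you *why*:
print(gdp * energy_per_gdp * (co2_per_energy / 2))  # 25.0
```

The same 50-percent cut could instead come from halving GDP or halving energy intensity, with very different implications, which is the point of the questions below.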

So, if emissions this decade do not exceed those from 2019, why would that be? Would it be because GDP growth had stalled? Would it be because of improvements in energy efficiency? Would it be because we’d reduced emissions through using more alternative energy sources? Would it be because we’d developed, and deployed, carbon capture and storage technologies? A bit of everything?

Also, what would it imply about the developed and developing worlds? Will the developed world have accelerated their emissions reduction so that the developing world can have a more gradual transition? If it is partly due to slower, or stalled, GDP growth, would that imply that some have benefitted far less than they might otherwise have done?

I don’t know the answers to any of these questions, but I do sometimes wonder if we don’t always consider the potential implications of some of the scenarios we might be hoping for. I’ll leave it there, but if anyone has any answers to these questions, feel free to post them in the comments.