I’ve been trying to avoid discussing the whole comparison between atomic bombs and global warming. I initially didn’t have a problem with it as I always perceived it as an energy comparison. However, I recognise that there are aspects that make the comparison poor (entropy, for example) and I can also see that using a horrific event, such as Hiroshima, to get people’s attention is questionable. I will add, however, that when the impacts of global warming/climate change do become obvious, people will not be blaming those who tried to get everyone’s attention.
Thanks to a tweet from Barry Woods, I’ve become aware of a post by Jo Nova that attempts to put the Hiroshima comparison into context. Jo Nova attempts to do this by comparing the rate at which we’re accruing energy (which is probably the most fundamental aspect of global warming) and the rate at which the Sun is depositing energy into our climate system. The post says
Since 1998, Global Warming has been occurring at 4 Hiroshima Bombs per second, not that we can measure that rate to a statistically significant value*, or that it means anything at all. Every second the sun pours 2700 Hiroshima bombs of energy on the Earth at the top of the atmosphere.
Well, we may not be able to measure it accurately over a single second, but we can measure the energy accumulation over much longer timescales and then determine the average rate. It is currently equivalent to the energy of 4 Hiroshima bombs per second. The Sun may indeed be pouring 2700 Hiroshima bombs of energy every second into the top of the atmosphere but, typically, the Earth then radiates 2700 Hiroshima bombs worth of energy every second back into space.
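As a rough sanity check (a sketch of mine, not part of either post), both numbers can be recovered from standard values: an assumed Hiroshima yield of about 15 kt of TNT (roughly 6.3 × 10¹³ J), the solar constant (~1361 W m⁻²), Earth’s radius, and the ~0.6 W m⁻² energy imbalance I quote further down.

```python
import math

HIROSHIMA_J = 15e3 * 4.184e9   # ~15 kt TNT in joules (assumed yield)
R_EARTH = 6.371e6              # Earth's radius in metres
SOLAR_CONSTANT = 1361.0        # W/m^2 at the top of the atmosphere
IMBALANCE = 0.6                # W/m^2, current energy imbalance

# Sunlight intercepted by Earth's circular cross-section
solar_power = SOLAR_CONSTANT * math.pi * R_EARTH**2        # ~1.7e17 W

# Excess energy retained, integrated over the whole spherical surface
imbalance_power = IMBALANCE * 4 * math.pi * R_EARTH**2     # ~3e14 W

sun_bombs_per_s = solar_power / HIROSHIMA_J       # ~2800, close to the quoted 2700
retained_bombs_per_s = imbalance_power / HIROSHIMA_J  # ~5, close to the quoted 4
```

The small differences from the quoted figures reflect rounding in the assumed yield and in the imbalance itself.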
That’s not to say that the system has never been out of energy balance before. There are periods in the past when we’ve undergone global warming and global cooling (Milankovitch cycles, for example). However, the evidence suggests (see Marcott et al. 2013, for example) that the current rate at which we’re accruing energy is likely faster than it’s been for the last 11,000 years and that the temperature today is likely higher than it’s been for 11,000 years.
Joanne Nova also says
As well as missing the big-picture, Cook and Nuccitelli show us they don’t have a good grip on cause and effect. The world may have been warming, but that does not mean that CO2 caused it. Though they would very much like you to think that.
No, it is now very clear that we can attribute the warming to anthropogenic influences. The IPCC did not say that it was extremely likely that more than half of the warming since 1951 was anthropogenic because they are trying to mislead people. It’s because it is extremely likely.
Jo Nova’s post then includes the following figure. It shows the total amount of energy received from the Sun since 1998 (red) and the amount that’s accumulated (black), the implication being that the accumulated amount is insignificantly small. Of course, it fails to point out that, on average, the black portion would be expected to be zero – or close to zero – over timescales of many decades. The correct comparison would be between what we’d expect the accumulation to be (i.e., even smaller) and what it actually is (i.e., quite large compared to what we’d expect based on our past climate history).
I’ve come across these claims before. Currently the energy imbalance is around 0.6 W m⁻². In an earlier post I quote Anthony Watts as saying
So imagine the output of a 0.6 watt light bulb, 1/100th the power of a common household 60 watt light bulb.
Could you even see it?
As I point out in the same post, if a typical house were retaining 0.6 Joules per square metre per second, the temperature in the house would rise by around 4.5°C per day, and you’d reach the boiling point of water within a month.
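To see roughly where a number like that can come from, here’s a minimal sketch with hypothetical house parameters – 100 m² of floor area collecting the 0.6 W m⁻², and an effective heat capacity of about 1.2 × 10⁶ J K⁻¹ for the air plus light furnishings. Both parameters are my assumptions, since the earlier post doesn’t state them.

```python
FLUX = 0.6             # W/m^2 retained (from the text)
AREA = 100.0           # m^2 of floor area (hypothetical)
HEAT_CAPACITY = 1.2e6  # J/K, air plus light furnishings (hypothetical)
SECONDS_PER_DAY = 86400

power = FLUX * AREA                                        # 60 W retained
warming_per_day = power * SECONDS_PER_DAY / HEAT_CAPACITY  # ~4.3 degC per day

# Starting from ~20 degC, days until the air reaches 100 degC
days_to_boiling = (100 - 20) / warming_per_day             # ~19 days, i.e. within a month
```

Different assumptions about the house would shift the exact rate, but the point – that a “tiny” flux integrated over time produces a large temperature change – is robust.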
We can do a similar calculation for the Earth to see if Joanne Nova is indeed correct that it’s essentially insignificant. The climate system is accruing energy at a rate of about 5 × 10²² J per decade. Most (about 93%) goes into the oceans, about 4% heats the land and atmosphere, and the rest is associated with melting polar ice. The land and atmosphere have a total mass of around 10¹⁹ kg and a heat capacity of 1000 J kg⁻¹ K⁻¹. Therefore it would take 10²² J to increase the temperature of the land and atmosphere by 1°C (or 1 K).
If the total energy is increasing at 5 × 10²² J per decade and 4% (2 × 10²¹ J per decade) is associated with heating the land and atmosphere, that means – on average – we’d expect the temperature of the land and atmosphere to increase by 0.2°C per decade. Not far off what we’re actually seeing. That gives 2°C per century and 20°C per millennium.
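The arithmetic in the last two paragraphs can be written out explicitly, using exactly the numbers given above:

```python
E_PER_DECADE = 5e22          # J per decade accrued by the climate system
LAND_ATMOS_FRACTION = 0.04   # ~4% goes into heating the land and atmosphere
MASS = 1e19                  # kg, combined land and atmosphere
HEAT_CAPACITY = 1000.0       # J per kg per K

# Energy needed to warm the land and atmosphere by 1 K
joules_per_kelvin = MASS * HEAT_CAPACITY   # 1e22 J/K

# Expected warming per decade from the 4% share
dT_per_decade = LAND_ATMOS_FRACTION * E_PER_DECADE / joules_per_kelvin  # 0.2 K
```

Multiplying out gives the 2°C per century and 20°C per millennium quoted above.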
The above also ignores that, as we continue to add CO2 to the atmosphere, the energy imbalance will likely increase and the rate at which we’re accruing energy will therefore also increase. Hence, 2°C per century is a conservative lower limit. It’s much more likely to be in excess of 3°C by the end of this century.
The basic point is that just because one number happens to be small relative to another number does not mean that the smaller number is insignificant. A basic physics calculation suggests that the current rate of energy increase will warm the land and atmosphere at an average rate of around 0.2°C per decade. This is likely faster than at any time in human history. I find it hard to believe that anyone can argue that this is insignificant (actually, I don’t really find anything surprising anymore, but you know what I mean).