The length of the “pause”?

In a recent post, Judith Curry highlighted a new paper by Shaun Lovejoy called Return periods of global climate fluctuations and the pause. This is a follow-up of an earlier paper that concluded that

the probability of a centennial scale giant fluctuation was estimated as ≤0.1%, a new result that allows a confident rejection of the natural variability hypothesis.

In other words, the rise in temperature over the last 100 years or so is almost certainly anthropogenic. This new paper also says

The hypothesis is that while the actual series Tnat(t) does depend on the forcing, its statistics do not. From the point of view of numerical modeling, this is plausible since the anthropogenic effects primarily change the boundary conditions not the type of internal dynamics and responses,

which is interesting given the discussion that prompted an earlier post. The basic result of this paper is essentially that the variability we’ve seen in the instrumental temperature record is simply a consequence of natural variability around a long-term anthropogenic trend. Additionally, the estimated climate sensitivities are entirely in line with IPCC estimates.

So, I should quite like this paper as it is not only a follow-up to a paper that ruled out that the rise in temperature over the last 100 years or so could be natural, but also illustrates that the variability in the instrumental temperature record is simply natural variability around a long-term anthropogenic trend; something I’ve been stressing on this blog. But I don’t really. This is a rather odd paper and – unless I’m missing something – am rather surprised that it got published.

Why? Well the fundamental equation is essentially the one below

T(t) = \lambda_{2 \times CO_2, eff} \, \log_2 \left( \rho_{CO_2}(t) / \rho_{CO_2, pre} \right) + T_{nat}(t).

This seems to be a rather unusual form for such an equation, as it suggests that the temperature rise is a linear function of the increasing forcing, plus some term representing natural variability. So, if the forcing stops rising, the forced response stops instantly, rather than continuing to rise to equilibrium. The coefficient in front of the first term on the right-hand side is therefore really only some kind of effective transient response. The bigger issue, in my view, is that the forcing is CO2 only. The paper actually says
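
To see the structure of this more concretely, here is a minimal sketch (in Python, with an entirely made-up CO2 series and temperature anomalies; this is not Lovejoy’s code or data) of how one could estimate the effective coefficient by least squares, with everything the CO2 term fails to capture ending up in the T_nat residual.

    import numpy as np

    # Hypothetical, made-up data purely for illustration; a real analysis would
    # use an observed CO2 record and an instrumental temperature series.
    years = np.arange(1880, 2015)
    co2 = 280.0 + 0.0067 * (years - 1880) ** 2          # rough CO2 growth, ppm
    rng = np.random.default_rng(0)
    temp = 2.0 * np.log2(co2 / 280.0) + 0.15 * rng.standard_normal(years.size)

    co2_pre = 280.0                                     # pre-industrial CO2, ppm
    x = np.log2(co2 / co2_pre)                          # log2 of the CO2 ratio

    # Least-squares fit of T(t) = lambda_eff * log2(CO2/CO2_pre) + constant.
    A = np.column_stack([x, np.ones_like(x)])
    (lambda_eff, offset), *_ = np.linalg.lstsq(A, temp, rcond=None)

    # In this framing, everything the CO2 term does not explain is "T_nat",
    # whether it is internally generated, naturally forced, or anthropogenic.
    t_nat = temp - (lambda_eff * x + offset)
    print(f"effective sensitivity ~ {lambda_eff:.2f} K per CO2 doubling")
    print(f"std of residual 'natural' term ~ {t_nat.std():.2f} K")

The fitted coefficient here is only an effective transient response, and the residual lumps together anything the single CO2 term misses, which is precisely the concern discussed below.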

Two things should be noted: first, Tnat includes any temperature variation that is not anthropogenic in origin, i.e., it includes both “internal” variability and responses to any natural (including solar and volcanic) forcings.

So, as stated, the natural variability being investigated in this paper includes both natural external forcings and internal variability. What’s more, this isn’t even quite right, because if the only forcing included is CO2, then the natural variability term also includes the other anthropogenic influences (aerosols, black carbon, land use). Therefore, not only is the natural variability in this paper not what Judith would regard as natural variability (i.e., it normally refers to unforced natural influences), it’s not even natural in any reasonable interpretation of the term, as it includes anthropogenic influences.

So, what this paper seems to have done is determine the variability associated with non-CO2 anthropogenic influences and with both forced and unforced natural influences. Given that some of these influences are stochastic, some are cyclical (solar), and some are monotonically increasing or decreasing (aerosols, land use, black carbon), how can any kind of pattern really make sense? If there is a pattern, it surely has to be purely coincidental. Furthermore, what’s of real interest is the magnitude of the influence of internal variability, which this paper appears entirely unable to determine (and which is what I think some have assumed it is doing).

What it really should be doing, I think, is using a standard one-dimensional model

C \frac{d T}{dt} = dF(t) - \lambda T + T_{nat}(t),

where C is the effective heat capacity of the system, dF(t) is the change in external forcing (natural and anthropogenic), \lambda is the climate sensitivity term, and T_{nat}(t) could be some term representing internal variability. If this were done (and it may already have been), I think one would find that the unforced variability is much smaller than indicated in this paper.
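
For comparison, here is a minimal forward-Euler sketch of that one-dimensional model; the forcing history, noise term and parameter values below are invented purely for illustration and are not fitted to anything.

    import numpy as np

    # Illustrative parameter values only; nothing here is fitted to observations.
    C = 8.0              # effective heat capacity, W yr m^-2 K^-1
    lam = 1.2            # feedback / sensitivity parameter, W m^-2 K^-1
    dt = 1.0             # time step, years
    years = np.arange(1880, 2015)

    # Made-up forcing: a slow ramp plus two crude volcanic-like dips.
    dF = 0.018 * (years - 1880.0)
    dF[years == 1963] -= 2.0
    dF[years == 1991] -= 3.0

    rng = np.random.default_rng(1)
    t_nat = 0.1 * rng.standard_normal(years.size)   # stand-in for the T_nat(t) term

    # Forward Euler integration of C dT/dt = dF(t) - lam * T + T_nat(t).
    T = np.zeros(years.size)
    for i in range(1, years.size):
        dTdt = (dF[i - 1] - lam * T[i - 1] + t_nat[i - 1]) / C
        T[i] = T[i - 1] + dt * dTdt

    print(f"warming by {years[-1]}: {T[-1]:.2f} K")
    print(f"equilibrium response to the final forcing: {dF[-1] / lam:.2f} K")

The key difference from the earlier equation is that T here lags the forcing and keeps rising towards dF/\lambda even after the forcing stops increasing, which is why the coefficient in the Lovejoy form can only be an effective transient response.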

So, I think Judith likes the paper because it suggests that natural variability could be quite large and because it suggests “pauses” could return every 20 years or so. Given that “natural” in this paper doesn’t really mean natural and that any kind of pattern is presumably entirely coincidental, it’s a rather unconvincing result. The paper finishes with

To be fully convincing, GCM-free approaches are needed: we must quantify the natural variability and reject the hypothesis that the warming is no more than a giant century scale fluctuation.

I don’t really agree with this. There’s only so much one can do with simple models. They’re very useful, but the idea that we can completely characterise the anthropogenic and natural influences using simple models seems a little unrealistic. In my view, the role of simple models is to provide a way of checking that the results of more complex models make sense. Of course, those who don’t like GCMs appear to like this conclusion.


20 Responses to The length of the “pause”?

  1. WebHubTelescope says:

    Lovejoy actually responds to Curry and others in the comments. Both he and I seem to agree that a simple multiple factor model can easily describe the global temperature trend, which will also reveal the effective aCO2 contribution.
    http://judithcurry.com/2014/09/01/how-long-is-the-pause/#comment-623694

    Nothing earth-shattering in this approach other than the fact it is very effective.

  2. WHT,

    Both he and I seem to agree that a simple multiple factor model can easily describe the global temperature trend

    I tend to agree that a simple model can do a good job of describing the temperature trend. I’m just not convinced that you need to have resolved the natural vs anthropogenic issue using simple models before you consider the results of GCMs.

    What’s your view of Lovejoy’s approach? Why would one run a simple model in which the only forcing is CO2 and all the other effects (natural forcings, other anthropogenic forcings, internal variability) are bundled into a single term? I can’t see what this really tells us. Surely what we’d like to understand is the influence of internal variability and the influence of external forcings. We have no evidence to suggest that the response to a change in a natural external forcing is somehow different to the response to a change in an anthropogenic forcing, so doing it the way that Lovejoy has doesn’t make much sense to me.

  3. I sure hope in the future these models will account for methane concentration as a separate item.

    By the way, I don’t want to sound stupid, but would it be useful to account for orbital variations in long term projections? The other night I watched the movie Kon Tiki and I remembered just how much water there is in the southern versus the northern hemisphere. Does it make a difference over the next three hundred years?

  4. Fernando,

    By the way, I don’t want to sound stupid, but would it be useful to account for orbital variations in long term projections?

    As far as I know, the shortest Milankovitch period is about 4000 years, so over the next few hundred years this will have virtually no effect.

  5. I guess kARsTeN did not review this paper. Ignoring aerosols. 😦

    That being said, to describe the climate signal up to now, the Lovejoy equation (BEST Muller) is a reasonable approximation. Maybe not for the future, because aerosols will not keep on increasing as much.

    Were Lovejoy a blogger, he would probably call his blog Fractal Variability. Like me, he does not see the slow change in the mean as very interesting. The interesting part for him is the statistical (multi-fractal) modelling of the variability around the mean. Calling that variability “natural” is not the best choice of words. Any suggestions for a better term? Just climate variability?

  6. Victor,
    So, you think it’s reasonable to represent the anthropogenic forcings as CO2 only?

    Additionally, it would seem that the natural variability that the paper gets is not really even natural, and is certainly not unforced. It’s not that this isn’t obvious from the paper; it’s more that it’s not obvious that this is what we would like to understand.

  7. Victor,
    I thought I might comment further on this,

    The interesting part for him is the statistical (multi-fractal) modelling of the variability around the mean. Calling that variability “natural” is not the best choice of words. Any suggestions for a better term? Just climate variability?

    Why would one separate the influence of the different external forcings, though? Surely if you have a separate dataset for these, and the climate responds equally to all the different changes in external forcings, you can then truly separate out the forced response from the unforced response (or am I missing something?).

  8. The Stadium Wave factor, with a 40-to-60-year period, has an amplitude of +/- 0.1 C over the last 130 years. In contrast, the effective log(CO2) trend has a peak-to-peak amplitude of at least 0.8 C with a half-wave period of at least 130 years, with no signs of cresting yet.

    Lovejoy is suggesting one look in the paleo evidence for Fourier components of that large a magnitude and that long a period. He can find plenty of the weaker Stadium Wave evidence but nothing like the CO2 profile — components with that amplitude and period are statistically very rare. By applying a conservative fat-tailed PDF, he gives a 1-in-1000 chance of seeing it.

    Moreover, the phase alignment of the log(CO2) curve with the rising trend, after removing the Stadium Wave, ENSO, volcanic, TSI, etc. factors, is yet another statistical measure that can be quantified for significance.

    He is essentially asking: what are the odds of this all lining up so tightly with a TCR of 2 C per doubling of CO2? Very remote, is his answer.

  9. This is an example of a model that includes the orbital forcings, particularly the ones that Scafetta and his followers are intrigued by. One aspect that skeptics do not seem to comprehend is that the lunar forcings will exert an influence at the 8.85-, 18.6-, and 6-year periods along with the longer trends. But the even longer periods are more likely to be the weaker factors, just as tides are stronger at diurnal intervals than at lunar monthly, and lunar monthly stronger than multi-year, and so on.

    When incorporated, these can easily compensate for a warming trend, thus creating a virtual pause on the time scales we are observing.

    The way I play ClimateBall is to use the suggestions of skeptics such as Scafetta (orbital periods), Curry (Stadium Wave), Bob Carter (ENSO), Archibald and the “It’s the sun” gang (TSI), etc., and throw them back in their faces. Of course, established climate scientists don’t particularly want to do this because they may consider it a waste of time. OTOH, I find it instructive because it shows how the skeptics are trying to hide the rather obvious pea in their uncertainty shell game.

  10. So, you think it’s reasonable to represent the anthropogenic forcings as CO2 only?

    The Lovejoy-Muller term parametrizes (models in a simplified way) the influence of CO2 and aerosols relatively well for the last century, even if it is not physically right and thus will not stay that way in future.

    The other anthropogenic terms are in the variability, would be my guess without having read the paper.

    “Why would one separate the influence of the different external forcings, though?”

    A true multi-fractal believer would not do so. To generate power law power spectra over all temporal and spatial scales, you normally need multiple processes (because different processes work on different spatial or temporal scales). If you would separate the different processes, you would destroy the beautiful multi-fractal nature of nature.

  11. Victor,
    Ahhh, if their forcing term includes aerosols, then I’m slightly less critical than I was.

    The other anthropogenic terms are in the variability, would be my guess without having read the paper.

    As I understand it, yes, which was one of my issues. Some of the variability is presumably anthropogenically forced and some is naturally forced (volcanoes, solar). So, the confusion then is that the term natural variability is not quite what many would expect it to be. So, as I understand it, this paper doesn’t really tell us anything about the role of internal variability, just the role of non-GHG/aerosol forcings/variability.

  12. Tom Curtis says:

    Anders:

    “Therefore, not only is the natural variability in this paper not what Judith would regard as natural variability (i.e., it normally refers to unforced natural influences), it’s not even natural in any reasonable interpretation of the term, as it includes anthropogenic influences.”

    I have seen Curry make the same conflation herself, so I am not sure whether or not she would disagree with the tacit definition of “natural” implied by the formula. (She should reject the inclusion of anthropogenic forcings in addition to CO2, but I am not sure that means she will.)

    You quote Lovejoy as writing:

    “The hypothesis is that while the actual series Tnat(t) does depend on the forcing, its statistics do not. From the point of view of numerical modeling, this is plausible since the anthropogenic effects primarily change the boundary conditions not the type of internal dynamics and responses,”

    I do not think this follows at all.

    For discussion, imagine there is a warm current that penetrates deep into the Arctic, where increased salinity and cooling cause the incoming water to sink to the ocean depths, where it then heads south; and imagine there is an oscillation in the northerly limit of the warm current. The oscillation is driven by the rate at which water sinks in the Arctic, which is in turn driven by the warmth of the water, with warmer water being more buoyant but also evaporating more quickly and hence being more saline, the former effect being the stronger.

    The consequence is that the warmer the water initially, the further into the Arctic it will penetrate, and the slower the current. Because the current is slower, the water travelling north loses more heat to the atmosphere, thereby increasing regional land temperatures and OLR. The reduced temperature gradient from equator to pole and the increased regional land temperatures both serve to increase global temperatures. The increased cooling of the water, however, results in the water sinking earlier and a switch to the other phase of the cycle.

    Now throw in global warming.

    The current heading north will be gaining more heat due to the TOA energy imbalance. In the warm cycle, a greater loss of heat is therefore needed to overcome the existing energy imbalance and result in net cooling. The consequence should be that the warm cycles will be longer in duration and greater in amplitude. That is, it will change the statistics of purportedly one of the most important modes of natural temperature variability on the planet.

    It also raises a question of how much of “natural variability” is actually natural.

  13. Tom Curtis, good catch. Yes, the variability will also change due to climate change.

    That is the interesting part of the changes in extreme weather (the trivial part of changes in extremes is the change due to the change in the mean). The daily and seasonal cycles will become weaker, as will the contrast between the poles and the equator. The storm tracks will change. Some recent papers study the change in the blocking of highs and lows (if they move more slowly, we will get more extremes and variability at a specific location). The hydrological cycle will become stronger. There is already a modelling paper that expects less variability of the annual mean temperature from year to year. Zeke Hausfather has a nice post about these model results.

    There are a lot of reasons to think that the assumption that the variability is natural and does not change is likely wrong. It is not only the “boring” mean that will change.

  14. Mack says:

    …”the variability will also change due to climate change”
    “(the trivial part of changes in extremes is the change due to the change in the mean)”
    Yes, Victor… As Obama says, ”it’s changes you can believe in”.

  15. Tom,
    That is a good catch. Also, that paragraph clearly makes it seem that Tnat represents internal dynamics, when it very clearly represents much more than this.

    You’re right about the term natural. It sometimes appears to be being used as some kind of catch-all that means not anthropogenic.

  16. izen says:

    @-“You’re right about the term natural. It sometimes appears to be being used as some kind of catch-all that means not anthropogenic.”

    Sometimes it is used to mean Anything But CO2.

    Perhaps it is useful to view the approach used in this paper in this way. The only thing we know with any certainty is the amount of anthropogenic CO2 we have added to the atmosphere, because we can measure it. That it is not some vast chaotic burp of the natural carbon cycle is attested to by the isotopic fingerprint, the oxygen depletion and the past stability of the levels.

    There are undoubtedly other forcings acting on the climate which are anthropogenic. But the aerosols and particulates are uncertain in both magnitude and effect. There is overlap between natural wildfires creating soot and warming causing droughts that increase wildfire severity. Everything other than CO2 is open to uncertainty and attribution problems.

    CO2, however, has been measured and quantified, and the effect it has on the thermodynamics of the atmosphere calculated with the radiative transfer equations. Climate science tends to be pretty smug about that knowledge, which allows it to project various degrees of rise in surface temperature depending on the prevailing fashion in TCR and ECS.

    Some of those who might wish to minimise the impact of CO2-driven AGW point to the highly variable and chaotic nature of the climate without any CO2 forcing as evidence that the effects of AGW are lost in the natural uncertainty. This can involve an emphasis on the unpredictable aspects of chaotic dynamic systems, with the implication that the values, the state of the climate, can ‘randomly’ reach any magnitude as part of the chaos.

    The deterministic part of chaotic systems is less acknowledged: that the values, while specifically unpredictable, will fall within the constraints of the system.

    This paper might be seen as considering the forcing from rising CO2 as a well-measured change in the boundary conditions of the chaotic dynamic system that forms the climate. It identifies that the observed temperature trend is not only consistent with the calculated effect of the CO2 forcing, but also that it is highly unlikely that any chaotic excursion could have mimicked this observed trend.

    Everything else, whether natural, modified by the warming, or purely anthropogenic, can be regarded as the dynamic system that responds to the changing boundary conditions. Chaotic systems have a habit of increasing the envelope of possible states and becoming much more variable and less predictable when the boundary conditions, in this case the thermodynamics of the system, are changed. For instance, the ENSO cycle shows three states: negative, neutral and positive. This might be seen as typical of a chaotic system: the timing and magnitude of the states are constrained (determined by ocean dynamics etc.) but unpredictable. In many simple chaotic systems, increasing the energy would produce increased variability, increases in the extreme magnitudes, and a bifurcation of the neutral state into two slightly +/- possibilities.

    This is what makes the invocation of chaos a rather less reassuring comfort as a source of variation that will overwhelm the impacts of AGW. While climate science can calculate the surface temperature rise from the increased gradient in energy density from rising CO2, the big uncertainty is how the chaotic climate system will respond other than by just the average surface temperature going up a few degrees.

    It can be viewed as alarmist to point out that the LEAST serious effect that science can predict from the rising CO2 is the rising temperature. What that might do to the stability of the climate, given that it is a chaotic system with geological evidence of major changes of state from small triggers, SHOULD be alarming.

  17. An effective way to destroy an opponent’s denial of a truth is to grant the opponent’s premises and then show that what the opponent denies still holds.

    It seems to me that Curry promoting Lovejoy’s results could be one whale of an “own goal”, in that she pushes results that destroy her denial of mainstream climate science – her denial of CO2 as the control knob, i.e., that, given continued CO2 spewing, the very long-term upward *trend* that started in the late 1800s, and will continue for the next century or perhaps much longer, is and will be essentially entirely from this increasing CO2.

    It seems to me she does not recognize that what she promotes turns her evident “anything but CO2” position on the very long-term upward trend into toast. If she eventually does recognize this, then it will be interesting to see what she then says.

  18. Another non-GCM paper making the rounds has the WUWT crowd up in arms:

    [1] P. Kokic, S. Crimp, and M. Howden, “A probabilistic analysis of human influence on recent record global mean temperature changes,” Climate Risk Management, 2014.

    http://www.sciencedirect.com/science/article/pii/S2212096314000163

    They are also complaining that the authors got a windfall deal to go open access and bypass the publication charges! Whine!

  19. Marco says:

    Iza cospirazee, WHT! Not paying publication charges? How dare they not use our tax money!

  20. Marco, only big oil should have access to the truth! It was humorous witnessing their indignation over the article being made available as open-access.

    The key method of the article is quoted below:


    To construct the statistical model we use GHG concentration, solar radiation, volcanic activity and the El Niño Southern Oscillation cycle as these are key drivers of global temperature variance (IPCC, 2007, IPCC, 2013, Meinshausen et al., 2011, Allan, 2000, Benestadt and Schmidt, 2009, Gohar and Shine, 2007 and Wang et al., 2005). This analysis uses recorded data (NOAA National Climate Data Centre, 2011) avoiding the uncertainties that can arise in the complementary climate model-based fingerprint studies (Hegerl and Zwiers, 2011).

    This looks like a common approach that others such as Tamino, Lean, Cowtan, Lovejoy, etc. also use. I call my own model CSALT, with the letters standing for CO2, SOI, Aerosols (volcanic), LOD (for the long-term natural variability), and TSI (for solar); a bare-bones sketch of this kind of regression is included at the end of this comment.
    The twist on this approach is that I believe the missing ingredient that other like-minded modelers need to consider is the LOD or “Stadium Wave” part. Add this to the model and it improves the fit immensely — helping to explain various pauses during the 20th century.

    Of course as KeefeAndAmanda suggest above, thanks go out to Judith Curry for the “own goal” stadium wave assist.
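
    For anyone curious, here is a bare-bones sketch of this kind of multi-factor regression. Everything in it is made up for illustration: the regressors are synthetic stand-ins merely named after the CSALT factors, and this is not the actual CSALT code or data.

        import numpy as np

        # Entirely synthetic stand-ins for the CSALT-style regressors; a real
        # analysis would use observed series for each factor.
        rng = np.random.default_rng(2)
        n = 135                                     # roughly the years 1880-2014
        log_co2 = np.linspace(0.0, 0.5, n)          # log2(CO2/CO2_pre) stand-in
        soi = rng.standard_normal(n)                # ENSO (SOI) stand-in
        volc = -0.1 * np.abs(rng.standard_normal(n))                # volcanic aerosol stand-in
        tsi = 0.1 * np.sin(np.linspace(0.0, 2.0 * np.pi * 12, n))   # ~11-yr solar cycle stand-in
        lod = 0.1 * np.sin(np.linspace(0.0, 2.0 * np.pi * 2, n))    # slow multidecadal stand-in

        # Synthetic "observed" temperature built from the same drivers plus noise.
        temp = (2.0 * log_co2 + 0.05 * soi + 0.5 * volc + 0.3 * tsi + 0.4 * lod
                + 0.05 * rng.standard_normal(n))

        # Ordinary least squares: temp ~ intercept + scaled sum of the drivers.
        X = np.column_stack([np.ones(n), log_co2, soi, volc, tsi, lod])
        coefs, *_ = np.linalg.lstsq(X, temp, rcond=None)
        for name, c in zip(["intercept", "log2(CO2)", "SOI", "volcanic", "TSI", "LOD"], coefs):
            print(f"{name:>10s}: {c:+.3f}")

    The coefficient on the log2(CO2) term then plays the role of an effective transient sensitivity, while the other coefficients absorb the quasi-periodic and episodic factors.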
