## The ECS is probably above 2K

I have quite a large conference starting tomorrow, so will probably be too busy to write. To keep things ticking over I thought I would post this seminar given by Andrew Dessler, discussing his recent work on constraining the Equilibrium Climate Sensitivity (ECS). I’ll note that JCH posted a comment about this yesterday, but I had already seen it.

I’ve written a couple of posts about Andrew’s recent papers, but the seminar explains it all very nicely.

The bottom line is essentially that we have strong reasons to think that the lower bound for ECS is probably above 2K, despite some recent work suggesting that it might be below 2K (i.e., we largely understand why this work gets that result and why it’s probably wrong, in the sense that the ECS is more likely to be above 2K than below it). Specifically, this work suggests that the likely range (17% to 83%) for ECS is about 2.4K to 4.5K, with a median of 3.3K.


### 110 Responses to The ECS is probably above 2K

1. Dave_Geologist says:

i.e., we largely understand why this work gets that result and why it’s probably wrong
For multiple meanings of why, I suspect 😉 .

And of course nicely in line with the PALAEOSENS 2.2 – 4.8 K.

Although I’m still not overly keen on ECS, as I still get the impression that what’s “in” and what’s “out” of the “equilibrium” is more to do with how tractable or not the slower feedbacks are in GCM implementations. And not necessarily because there is a large timescale difference between them. Especially as some of the ESS ones fall into the category of possible tipping points, where we won’t know we’ve tipped until we’ve passed the point of no return. Maybe I’d be happier if it was called something like “model” or “medium-term” rather than “equilibrium”. Must be the chemist in me 🙂 .

2. Dave,

And of course nicely in line with the PALAEOSENS 2.2 – 4.8 K.

Indeed, and I think it’s one reason why it’s worth questioning results that seem somewhat inconsistent with other analyses. If most analyses suggest ECS > 2K, and one suggests ECS < 2K, it's worth understanding why. If you then find reasons why the latter might be biased low, that probably explains the discrepancy.

3. Dave_Geologist says:

Exactly. Consilience. Once you throw away those ESS estimates which clearly incorporate slow feedbacks that are not implemented in standard climate models, and the ones from long ago when the continents were very different, and throw away the overly-simplistic “observational” ECS estimates, you meet in the middle. Where physics-based GCMs have been all the time.

To plagiarise The A Team, “I love it when a plan comes together!”.

I worry about focusing on ECS making us complacent (although it shouldn’t!) because (a) “slow” might still mean “on human-civilisation timescales” (speaking as a European who’s lived in cities with 500-year-old buildings still in use, and visited Roman docks still at current sea level), and (b) tipping points.

And I have a nagging doubt about interglacials being taken out because they’re so close in time to today (yes huge continental ice sheets enabling much more albedo change than a confined Arctic Ocean can), and the PETM because it’s probably the closest analogue for speed and CO2/CH4 source. But yes, consilience 🙂 .

4. I seem to remember that analyses of ECS have had a median of around 3K for some time. What’s the betting that it eventually narrows down to that level, whatever attempts are made to suggest it’s lower?

5. John,

What’s the betting that it eventually narrows down to that level, whatever attempts are made to suggest it’s lower?

Andrew Dessler’s analysis does narrow the range somewhat, but mostly the lower limit (from about 1.5K to around 2.4K). However, he suggests it’s more difficult to reduce the upper limit, which is still around 4.5K. One thing he does comment on, though, is that the high-end tail extends to ECS values that do seem physically implausible (i.e., something like 7K isn’t ruled out by the analysis, but does seem too high to be physically plausible).

6. angech says:

The ECS is probably above 2K.
“The warming generated by CO2 itself. This comes from increased infra-red radiation (IR) from increased quantities of CO2 in the atmosphere. The accepted value of this component is 1.2, i.e., the forcing from doubled CO2 on its own would raise global temperature by 1.2 degrees.”
Try as I might this has to be true and is a starting point for ATTP’s assertion.
I would say it is fairly accepted and extremely narrow in range for the current earth atmosphere composition.
It would not be hard to push it over 2 C with any reasonable positive feedbacks and if no negative feedbacks existed.
So, apart from cussedness, why fight the assertion?
Particularly if you get, when all the competing and contradictory records are removed, consilience, as DG said.
The reasons for questioning the assumption are simple.
It is not happening at the required rate, in line with the CO2 increases, as we currently observe it.
Is all.
If it was, Clive Best, JC, Roy Spencer and the rest would be warmists.
I would be [not that that would be of any consequence].
ATTP is possibly right, probably right is just a step too far at this moment.

7. Steven Mosher says:

Seems like there is a nice test for disqualifying models, based on the comparison of theta-iv to observations.

Dr. D mentioned they did comparisons; I wonder if he would be willing to discuss that.

8. Andrew E Dessler says:

Steve: comparison of models to observations of theta-iv are in Fig. 7 of the paper: https://www.dropbox.com/s/so4k4ljlusdphl5/Dessler_et_al-2018-Journal_of_Geophysical_Research%253A_Atmospheres.pdf?dl=0

We’ve had some electrical issues in our building and my computers are down, but I’ll also answer your question about variability in lambda in the 4xCO2 runs as soon as I can ssh into them.

9. Dave_Geologist says:

ATTP, I think it’s reasonable to say “7K is unlikely because physics”, just as we say “1.5K is unlikely because physics”. Otherwise we’re indulging in double standards. ISTM that one rationale for the 7K-plus values in the palaeo record is that they were triggered by some tipping point such as destabilisation of tundra, permafrost, clathrates etc., which would fall into ESS not ECS (in the sense that they’re not usually modelled in GCMs, although Antarctic permafrost, for example, has been incorporated in bespoke PETM models). And where CH4 is implicated, it depends very much on release rate and oxidation rate. If you think all the CH4 gets oxidised too quickly to matter, and it doesn’t, you’ll get the wrong ESS. Part of my discomfort with focusing too much on ECS is that unlike rock-weathering CO2 drawdown, which we know has a 100 ka timeframe, we don’t know the timeframe of those or other tipping points. It might be 10 ka, it might be 500 years, or it might be 50 years, which would bring it right into ECS territory even if it is properly an earth system process.

Because such processes will be highly contingent, it would not be right to extrapolate directly from the past. If the permafrost model for the PETM is correct, it would not have happened had Antarctica not been parked over the South Pole. And it would have been much smaller had there not been a prior period of warm climate to build up the carbon store. It presumably couldn’t happen today, even if we lose the ice-sheets, because unlike the Arctic there’s no significant carbon store. But they serve as a warning about the sort of thing that might happen. We can take some comfort in the fact that some tipping points won’t tip everywhere at once. For example, shallow-water Arctic clathrates won’t go at the same time as mid-latitude shelf/slope clathrates. But were previous clathrate outbursts global? Probably not: the physics of the stability field and the general profile of continental margins hasn’t changed. If the PETM was enhanced by clathrates, maybe the Arctic was enough. Or maybe it was all shelf/slope clathrates triggered by a change in deep ocean currents. Or maybe a huge submarine landslide released just enough shelf/slope CH4 to trigger Arctic destabilisation. (I know there are some ocean acidification pointers to a North Atlantic origin, so maybe we don’t have a completely free hand there.)

10. The history of best estimates of ECS

11. Steven Mosher says:

Thanks Dr. D.

I’m a little surprised at GISS not doing so well and GFDL (one version at least) being bad.
meh.
It would be cool if the next IPCC report had a table of all the theta-iv values like they do for ECS.
and the ratio as well.

Also wondering about it as a screening tool/ tuning target.

thx

12. Steven Mosher says:

angech.

you need a better argument than that. you can:

a. show how the pattern of warming argument is wrong.
b. show how theta iv approach is over confident.

you cant just dodge.

13. Hyperactive Hydrologist says:

Would we not also expect an element of natural variability in the theta iv value?

If so I’m not sure it can be used for discounting models (or model runs) except when you are trying to find a natural variation pattern similar to the observed.

14. Dave_Geologist says:

It is not happening at the required rate, in line with the CO2 increases, as we currently observe it.

Masterful argument angech! Except for one minor flaw. It is happening at the required rate, in line with the CO2 increases, as we currently observe it.

15. JCH says:

you cant just dodge.

That he can’t doesn’t mean he won’t.

The hiatus completely fooled a lot of very smart people. I mention this at Climate Etc. on a regular basis. On the other hand, the scientists with their heads in the clouds appear to have the wind at their backs.

16. Observed temperature trends imply low rates of warming.
Presumably, this is because of increases in oceanic heat content.
But this implies disequilibrium and it’s not clear that equilibrium is either necessary or will ever occur.
The entire discussion of ECS obscures the facts that equilibrium may never occur and that observed rates of warming are low.

17. Andrew E Dessler says:

Hyperactive Hydrologist: We investigated the variability in theta-iv in the Dessler et al. ACP paper (https://www.atmos-chem-phys.net/18/5147/2018/). We find that theta-iv has a much smaller internal variability than lambda, which is an additional advantage of the modified energy balance framework.

18. Hyperactive Hydrologist says:

Hi Andrew,

Thanks for that and the video, very informative. Out of interest is the data from the CIMP6 MPI-ESM1.1 ensemble available yet and did they also do a large ensemble of future runs? My wife would certainly be interested in the data for impact studies. She is particularly interested in internal variability.

19. Everett F Sargent says:

From the Dessler paper (p. 5) …

“Overall, our calculated ECS distributions overlap substantially with the IPCC’s range, although our distributions are shifted to higher values: we see a ~30% chance that ECS exceeds 4.5 K, while the IPCC assigns a 17% chance. And we see less support for low values of ECS: the chance of an ECS below 2 K is 6–15%, while the IPCC assigns a 17% chance that is below 1.5 K.”

AFAIK, the above is an all too common misinterpretation of the three IPCC AR5 WG1 conditional ECS criteria, which are as follows:
(1) IF(ECS.LE.1)THEN(P.LE.0.05)
(2) IF(ECS.GE.1.5.AND.ECS.LE.4.5)THEN(P.GE.0.66.AND.P.LE.1.00)
(3) IF(ECS.GE.6)THEN(P.LE.0.1)

Technical Summary IPCC AR5 WG1 (p. 81)
https://www.ipcc.ch/pdf/assessment-report/ar5/wg1/WG1AR5_TS_FINAL.pdf
TS.5.3 Quantification of Climate System Response (1st full paragraph)

Estimates of the equilibrium climate sensitivity (ECS) based on observed climate change, climate models and feedback analysis, as well as paleoclimate evidence indicate that ECS is positive, likely in the range 1.5°C to 4.5°C with high confidence, extremely unlikely less than 1°C (high confidence) and very unlikely greater than 6°C (medium confidence). Earth system sensitivity over millennia time scales including long-term feedbacks not typically included in models could be significantly higher than ECS (see TFE.6 for further details). {5.3.1, 10.8; Box 12.2}

Box TS.1 | Treatment of Uncertainty (p. 36)
Extremely unlikely 0–5% probability (1st conditional)
Likely 66–100% probability (2nd conditional)
Very unlikely 0–10% probability (3rd conditional)

That lets you assign up to 34% uncertainty outside the likely range (I’d choose the fat/heavy tail on the RHS for all of that available deep uncertainty).

I’ve fitted some 13 different PDFs (Beta, Burr, Fréchet, Johnson SB, Generalized Gamma, Pearson 6, Pearson 5, Dagum, Fatigue Life, Lognormal, Inverse Gaussian, Log-Logistic and Gamma) using those three conditionals (with exponential or power-law fat/heavy RHS tails).
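For what it’s worth, the kind of consistency check this fitting involves can be sketched in a few lines of stdlib Python. This is purely illustrative: the lognormal (one of the families listed above) with median 3 K and log-sd 0.4 is a hypothetical choice of mine, not a fitted result from any paper.

```python
# Check whether a candidate ECS distribution satisfies the three
# IPCC AR5 conditionals. Lognormal with median 3 K is illustrative only.
from math import erf, log, sqrt

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def lognorm_cdf(x, mu, sigma):
    """CDF of a lognormal with log-mean mu and log-sd sigma."""
    return norm_cdf((log(x) - mu) / sigma)

mu, sigma = log(3.0), 0.4  # median 3 K; spread is a hypothetical choice

p_below_1 = lognorm_cdf(1.0, mu, sigma)                              # conditional (1)
p_likely  = lognorm_cdf(4.5, mu, sigma) - lognorm_cdf(1.5, mu, sigma)  # conditional (2)
p_above_6 = 1.0 - lognorm_cdf(6.0, mu, sigma)                        # conditional (3)

# All three IPCC conditionals should hold for an acceptable fit
print(p_below_1 <= 0.05, p_likely >= 0.66, p_above_6 <= 0.10)
```

With these particular parameters all three conditionals are comfortably satisfied, which is consistent with the point that the constraints leave a lot of freedom in the RHS tail.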

I might comment further as long as the discussion remains extremely civil.

20. TE,

The entire discussion of ECS obscures the facts that equilibrium may never occur and that observed rates of warming are low.

Define low. Not only is the observed warming consistent with expectations, the observed warming is also consistent with it depending linearly on emissions. So, how much we end up warming will probably depend on how much we end up emitting.

21. Observed temperature trends range from 1.5 to 1.8 C/century.
Observed temperature trends are low compared to AR4 projections ( ~ 1.8 to 4.0 C/century )
Observed temperature trends are low compared to AR5 projections ( ~ 1.1 to 4.1 C/century )
Observed temperature trends are low.

22. Dave_Geologist says:

Observed temperature trends imply low rates of warming.
No they don’t. As I already pointed out to angech, since the “faux pause” we’re back on trend with predictions. Not only that, the trend remains on prediction with the “faux pause” included. Unsurprisingly, because if you use the correct null hypothesis, there never was a pause. Just a period when we were less than 95% confident (but still pretty damn confident) that the trend was increasing. The confidence level that there was an actual pause was so laughably small as not to be worth mentioning.

In one sense we’ll never reach equilibrium (Milankovitch etc.), but given that overall feedbacks need to be positive to explain glacials and interglacials, any disequilibrium other than ludicrous special pleading must be in the direction of those ECS estimates which assume equilibrium being underestimates.

23. BBD says:

Witness the legacy of Lewis and Curry’s mischief making.

24. Andrew E Dessler says:

Hyperactive Hydrologist: Yes, the MPI group has 100 runs of 2000-2100 using the RCP4.5 scenario and 100 using the RCP8.5 scenario.

Turbulent Eddie: “Warming rates are low”: I’d disagree with that. In the MPI ensemble, the coolest member has warming of about 0.65 K from 1850-2005, while the hottest one has warming of 0.95 K from 1850-2005. Both of these trajectories come from a model with an ECS of 2.9 K. So it’s clearly possible to get a relatively small amount of warming, even with an ECS in the middle of the IPCC range.

25. TE,

Observed temperature trends are low compared to AR5 projections ( ~ 1.1 to 4.1 C/century )
Observed temperature trends are low.

Are you referring to projections to 2100? These depend mostly on how much we emit in future, not on how much we’ve already emitted.

26. BBD says:

27. angech says:

Dave_Geologist says: ” It is not happening at the required rate, in line with the CO2 increases, as we currently observe it.’
“Masterful argument angech! Except for one minor flaw. It is happening at the required rate, in line with the CO2 increases, as we currently observe it”.
JCH says: “The hiatus completely fooled a lot of very smart people.”
Turbulent Eddie says: “The entire discussion of ECS obscures the facts that equilibrium may never occur and that observed rates of warming are low.”
Dave_Geologist says: “No they don’t. as I already pointed out to angech, since the “faux pause” we’re back on trend with predictions.
Everett F Sargent says: [heavily and selectively redacted by me] the Dessler paper (p. 5) … And we see support for low values of ECS: the chance of an ECS below 2 K is 6–15%, while the IPCC assigns a 17% chance that is below 1.5 K.”
None of this obviates ATTP’s assertion, what it does do is illuminate the smaller counter assertion.
DG, what is the required rate, in line with the CO2 increases, as we currently observe it?
If we have a warming for CO2 doubling of 3.5 C, and doubling is expected over a hundred years from whichever starting point you wish to choose, e.g. 1918, 1968, 1998 then logically one expects 0.35 C a decade.
Where is it, observationally?

28. angech,

If we have a warming for CO2 doubling of 3.5 C, and doubling is expected over a hundred years from whichever starting point you wish to choose, e.g. 1918, 1968, 1998 then logically one expects 0.35 C a decade.

No, you don’t. Equilibrium would occur a long time after we’ve doubled atmospheric CO2. At the time at which CO2 is doubled, we’d have the transient response, which is probably between 1K and 2.5K, and which is entirely consistent with the observed warming. It’s almost as if you haven’t read any of my posts.
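To put rough numbers on the transient point: this is a back-of-envelope sketch, not a projection. The 5.35 ln(C/C0) expression is the standard simplified CO2 forcing formula, and the 400 to 420 ppm decade is an illustrative round-number version of recent CO2 growth rates; warming from other forcings is ignored here.

```python
# Transient CO2-only warming per decade scales as TCR * dF / F_2x,
# where F_2x is the forcing from a CO2 doubling.
from math import log

F_2x = 5.35 * log(2)  # ~3.7 W/m^2 per CO2 doubling (standard approximation)

def co2_decadal_warming(tcr, c_start, c_end):
    """Transient CO2-only warming over one decade, in K."""
    dF = 5.35 * log(c_end / c_start)  # forcing change, W/m^2
    return tcr * dF / F_2x

# CO2 rising roughly 400 -> 420 ppm over a decade (illustrative)
low  = co2_decadal_warming(1.0, 400.0, 420.0)   # TCR at low end of range
high = co2_decadal_warming(2.5, 400.0, 420.0)   # TCR at high end of range
print(round(low, 2), round(high, 2))
```

Even with a TCR at the top of the 1-2.5K range, the CO2-only transient expectation is well below the 0.35 C/decade figure derived from treating the ECS as realised instantly.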

29. Everett F Sargent says:

angech sez …
“Everett F Sargent says: [heavily and selectively redacted by me] the Dessler paper (p. 5) … And we see support for low values of ECS: the chance of an ECS below 2 K is 6–15%, while the IPCC assigns a 17% chance that is below 1.5 K.””

… well, you dropped a rather important word in the above misquote: “we see support” should be “we see LESS support”

“None of this obviates ATTP’s assertion, what it does do is illuminate the smaller counter assertion.”

My post was to explain the IPCC AR5 WG1 three conditional ECS statements.

Taken in that light, the Dessler results are, in fact, much closer to the three conditional ECS statements than as stated in their own text.

30. Leaving aside angech’s predictable botch-up of confusing ECS for TCR, he’s also asserting a whopper with the claim that “(CO2) doubling is expected over a hundred years from whichever starting point you wish to choose, e.g. 1918, 1968, 1998” *and* that we can compare our observations of (TCR) temperatures to a trend (0.35° C/decade) based on this “observed” 100 years/doubling of CO2…

Of course, in order for CO2 concentrations to be “doubling over a hundred years”, starting from his dates of 1918, 1968, 1998, observed concentrations in 2018 would have to be ~590ppm, 495ppm and 455ppm. Which, um, they’re not.

Spare us, ok?

31. The correlation of global mean surface temperature anomalies with RF ( NOAA ):

This yields a response of about 1.8 °C per effective CO2 doubling.

The context of 1.8 °C in this range of NASA potential sensitivities:

Now, I’d agree that according to observations, oceanic heat content is increasing, meaning disequilibrium. But speculating on ECS is wandering away from empiricism because one doesn’t know when or even if there will be equilibrium, so ECS is not something that will ever be validated. The surest way for ocean heat to emerge from the oceans will be when atmospheric temperatures are lower. But the corollary is that when atmospheric temperatures are higher, the oceans continue to take up heat, making them global buffers in yet another context.

32. TE,

But speculating on ECS is wandering away from empiricism because one doesn’t know when or even if there will be equilibrium

Technically, ECS is a model metric but your suggestion that it is wandering away from empiricism is simply wrong. You can make estimates for ECS using paleoclimate data.

The surest way for ocean heat to emerge from the oceans will be when atmospheric temperatures are lower. But the corollary is that when atmospheric temperatures are higher, the oceans continue to take up heat, making them global buffers in yet another context.

I don’t think this makes any sense, but I can’t really tell.

33. angech says:

ATTP in essence.
“An important distinction is made between the equilibrium sensitivity — the temperature change realized after allowing the climate system to equilibrate with a higher value of CO2 — and the response on shorter time scales, before the deep oceans have had time to equilibrate, that is of more direct relevance to the changes we are likely to see in the 21st century. The latter is often quantified by raising the carbon dioxide in a model at the rate of 1% per year and examining the response at the time when carbon dioxide concentration has doubled, referred to as the transient climate sensitivity or response. (At a rate of 1% per year, doubling requires 70 years.)
Equilibrium sensitivities in global climate models typically range from 2 to more than 4C, while the transient climate responses are smaller, in range of 1.0-2.5C.”

34. angech says:

“the Dessler paper (p. 5) low values of ECS: the chance of an ECS below 2 K is 6–15%, while the IPCC assigns a 17% chance that is below 1.5 K.”
I have read a lot of your posts. They put forward in so many ways the myriad, compelling arguments for AGW, and its consequences.
The argument about ECS taking a long time to reach, if ever, its final value ” Equilibrium would occur a long time after we’ve doubled atmosphere CO2.” avoids the claim I made about expecting 0.35C a decade [averaged *] over a hundred years.
I take rustneversleeps comment “Leaving aside angech’s predictable botch-up of confusing ECS for TCR,” on board [TCR is for 70 years, but never mind].
A simple thought experiment.
If CO2 was doubled today and stayed at that level for a year.
How high would the temperature be tomorrow, in one month and in 1 year?
I would argue that according to theory with an ECS of 3.5 that it should be basically 3.5C warmer on all 3 dates. Some heat must slowly go into the oceans each day but the air must heat up to the temperature specified by that concentration every day.

35. Harry Twinotter says:

Angech.

“It is not happening at the required rate, in line with the CO2 increases, as we currently observe it.”

References to back up your claim?

36. angech,

If CO2 was doubled today and stayed at that level for a year.
How high would the temperature be tomorrow, in one month and in 1 year?
I would argue that according to theory with an ECS of 3.5 that it should be basically 3.5C warmer on all 3 dates.

You can argue whatever you like, but you’d be completely wrong. If we doubled atmospheric CO2 today, we wouldn’t warm to equilibrium by tomorrow, in a month, or even in a year.

37. angech says:

rustneversleeps says:
“Of course, in order for CO2 concentrations to be “doubling over a hundred years”, starting from his dates of 1918, 1968, 1998, observed concentrations in 2018 would have to be ~590ppm, 495ppm and 455ppm. Which, um, they’re not.”
Thanks for pulling me back into line.
In a very poor defense I did use the word, If, to preface my remarks but that is dodging.

38. angech says:

“You can argue whatever you like, but you’d be completely wrong. If we doubled atmospheric CO2 today, we wouldn’t warm to equilibrium by tomorrow, in a month, or even in a year.”
True.
It would be interesting for an atmospheric physicist to say how warm it would be, though, 1 day, 1 month and 1 year after an instantaneous and maintained doubling of CO2. Some heat would be constantly going into the sea, seeking equilibrium over a long time frame. Would the same rules of physics, however, dictate a temperature level of the atmosphere not too far off equilibrium?
CO2 double Temp up 3.5 C [or close]?

39. angech,

It would be interesting for an atmospheric physicist to say how warm it would be though, 1 day, 1 month and 1 year after an instantaneous and maintained doubling of CO2.

No one is claiming that these are the types of questions that we could answer, or even why we’d want to. Predicting changes on specific dates probably isn’t possible. On those timescales, variability would play a big role. What we’re interested in is how perturbing the system changes the average/underlying state.

40. verytallguy says:

If CO2 was doubled today and stayed at that level for a year.
How high would the temperature be tomorrow, in one month and in 1 year?
I would argue that according to theory with an ECS of 3.5 that it should be basically 3.5C warmer on all 3 dates.

If we add one to one today, you would argue that according to theory the answer should be three on all three dates.

You keep on asserting things as fact that are simply and obviously incorrect.

You should ask yourself why you do this.

41. “You should ask yourself why you do this.”

As with the redneck, that’s a good question to ask the larrikin. I suspect it’s in their nature.

42. Joshua says:

angech | September 4, 2018 at 7:31 am |
The distance of the sun from the earth is obviously another potential factor in the different seasons…

Oy.

43. Chubbs says:

TE’s 1.8C per CO2 doubling is actually spot on with model mean predictions. Note that the adjustments described in Richardson et al. (2016) for observation coverage, and for the use of SST instead of air temperature over the ocean, lower model-predicted temperature trends by roughly 20%. Factoring in the current energy imbalance, the observed warming trend makes ECS<2C very unlikely. So recent observed warming, climate models, and paleo are all in alignment.

In hindsight, the flurry of EBM papers was well timed for the latter portion of the hiatus, leading to a brief period of prominence. Subsequent research and the end of the hiatus however shows they don't have much predictive power.

44. Dave_Geologist says:

angech, a day, a month or a year are all so short as to be transient. TCS not ECS.

45. Andrew E Dessler says:

If CO2 was doubled today and stayed at that level for a year.
How high would the temperature be tomorrow, in one month and in 1 year?

If you go to 1:20 in the video linked to this post, you can see the answer to your question (as predicted by a model, at least). It shows an abrupt 4xCO2 run, where CO2 is abruptly quadrupled and the system is then run for 150 years. You can see rapid warming for the first 10 years, as the ocean’s mixed layer equilibrates, then slower warming, which is paced by warming of the deep ocean. Even when the run ends after 150 years, the system is still warming.
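The qualitative behaviour Andrew describes here (fast mixed-layer adjustment for roughly a decade, then much slower warming paced by the deep ocean, still incomplete after 150 years) can be sketched with a standard two-box energy-balance model. All parameter values below are illustrative round numbers of my own choosing, not those of the MPI model.

```python
# Two-box energy-balance model under abrupt 4xCO2 forcing:
# a shallow mixed layer coupled to a large deep-ocean reservoir.
F      = 2 * 3.7    # abrupt 4xCO2 forcing, W/m^2 (two doublings)
lam    = 1.23       # feedback parameter, W/m^2/K (ECS ~ 3 K per doubling)
gamma  = 0.7        # mixed-layer/deep-ocean heat exchange, W/m^2/K
C_mix  = 8.0        # mixed-layer heat capacity, W yr/m^2/K
C_deep = 100.0      # deep-ocean heat capacity, W yr/m^2/K

def run(years, dt=0.01):
    """Euler-integrate the two-box model; return surface temperature anomalies."""
    T_m = T_d = 0.0
    out = []
    for _ in range(int(years / dt)):
        dT_m = (F - lam * T_m - gamma * (T_m - T_d)) / C_mix
        dT_d = gamma * (T_m - T_d) / C_deep
        T_m += dT_m * dt
        T_d += dT_d * dt
        out.append(T_m)
    return out

T = run(150)
T10, T150 = T[int(10 / 0.01) - 1], T[-1]
print(round(T10, 1), round(T150, 1))
```

Most of the surface warming happens in the first decade, yet at year 150 the system is still below its equilibrium value of F/lambda and still warming, which is the pattern visible in the abrupt-4xCO2 run in the video.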

46. Magma says:

…speaking as a European who’s lived in cities with 500-year-old buildings still in use… — Dave_Geologist

One of the things that drive me mad is when climate change deniers scoff at the idea of change by 2100 as if that was sometime in the unimaginably distant future. One of my grandfathers was born in 1900, and one of *his* grandfathers in 1798. I would like my own grandchildren, if and when they make an appearance on the planet, to have a future that is not torn by avoidable environmental damage and political strife.

47. Phil says:

Given Prof. Dessler’s comments in the video about the energy balance estimates of ECS being poor due to the natural variability in the GMST record, I wonder if anyone has thought to see what the energy balance calculation gives with data that has had natural variability removed using the Foster and Rahmstorf method (Figures 4 and 5 here).

It doesn’t seem an unreasonable thing to try – if only to gauge how other factors (non-linearities in the feedback response ?) also contribute to the divergence between the different methods of calculating ECS.

48. Andrew E Dessler says:

Phil: I don’t think that would be a particularly productive way to approach the problem. When I talk about internal variability, it’s not variability in the global average, but rather variability in the spatial pattern. This matters because two Earths with the same amount of global average warming but different patterns of warming can give you quite different responses in top-of-atmosphere energy balance. That’s what introduces spread in the lambda that we calculate in the ensemble.

49. Dave_Geologist says:

…speaking as a European who’s lived in cities with 500-year-old buildings still in use… Americans probably have a shorter perspective. Even in the Antebellum South. I don’t just mean cathedrals and stuff BTW. ATTP would probably know, but I think Edinburgh University had, at least until recently, student accommodation dating back to the 1600s or 1700s. The town where I grew up, perfectly ordinary, less than 100k population and not a county seat or bishopric, has a 300-year old inn with a mounting-stone outside for stagecoach passengers. Still in daily use as a pub (the stables were converted to a lounge bar and function suite when horses went the way of Stone and Bronze). Covenanters were mustered at the location in the 1640s, so there was probably an earlier inn on the site. The 18th century tollbooth for the turnpike road is still in commercial use (although not as a tollbooth).

The central church dates from the late 1700s, as do weavers’ cottages which are still in use as residential homes. The earliest church record, however, is pre-Reformation and dates back to 1175. (The town name, that of a small hill just outside the original village, and the radial pattern of ancient roads indicates that it was a religious centre since Celtic times. So in fairness it may have a bit more old stuff than average.) A couple of years ago I went to a relative’s wedding reception in a hotel dating from the 1780’s. Very flash it was inside too, an LCD screen above each urinal so you don’t miss the sports action when taking a necessary break 🙂 . Another local hotel was the shooting lodge for a now-derelict castle. And now I think about it, there’s another derelict castle a few miles downriver. OK, so either my ancestors were very warlike, or it used to be a bit more of a local centre and fell on hard times, probably in the Industrial Revolution for lack of coal or iron 😉 .

The old Laird’s mansion dating from 1737 is now a private home, but is the latest of a succession of buildings on the site since 1383. And of course there’s a castle dating from the 1400s. Still in pretty good shape. Someone lived in it until the 1980s and there have been various abortive attempts at commercial development since. The remains of a much older motte-and-bailey are in the grounds. The old castle was demolished to provide stone for the “new” one. Presumably the old one was outdated and not cannon-proof. Another castle/mansion dates from 1620 and was converted to offices in the 1940s and used until the 1990s (the downside of having lots of old stuff around is that it doesn’t always get the respect it deserves 😦 ).

To a European (at least this one), 2100 is the day after tomorrow. And to a geologist, 10,000 years hence is next week and a million years hence next month.

50. angech says:

Andrew E Dessler says: ” If CO2 was doubled today and stayed at that level for a year.
How high would the temperature be tomorrow, in one month and in 1 year?
If you go to 1:20 in the video linked to this post, you can see the answer to your question (as predicted by a model, at least). ”
Thanks. Appreciated.
Will do so and comment no more until I have taken it in.

51. izen says:

@-“To a European (at least this one), 2100 is the day after tomorrow. And to a geologist, 10,000 years hence is next week and a million years hence next month.”

But to the financial derivatives market (which is larger than the market in physical goods) the longest time horizon is five years, and most trades look to hedge the future for a year or less.
IOW a dominant component of the economic system has the foresight of a squirrel.

52. Steven Mosher says:

angech

remember i told you you had 2 choices.

attack the pattern argument
or
theta iv.

this is basic.

watch.

https://judithcurry.com/2018/09/05/warming-patterns-are-unlikely-to-explain-low-historical-estimates-of-climate-sensitivity/#more-24303

if the argument turns on which version of ice is used,
well, that’s a structural uncertainty that needs to be shown in the main text.

53. Dave_Geologist says:

Nic should submit a Comment to the journal. If he wants public peer review first, he should post it on https://eartharxiv.org/ or https://arxiv.org/, where it will be read by people a thousand times more informed and objective than the inhabitants of a famously partisan blog. One which is frequented by ill-mannered people who drive away serious readers. My suggested edits from a first look:

the authors’ key claim, that climate sensitivity estimates based on observed historical warming are too low, are highly sensitive to the SST and sea-ice dataset used. Results using the more recent dataset ~~contradict~~ confirm their claims, ~~largely due to~~ in that differences between the two datasets in the evolution of sea-ice more than counteracting the effects of evolving patterns of SST change over the open ocean. I therefore think it is difficult to draw any strong conclusions from ~~the simulation results presented in the paper~~ simple, global-average energy-balance models, on the grounds that the spatial distribution of warming, sea-ice and probably other parameters can drastically bias the conclusions. Alternative approaches should be considered which more realistically represent the spatial variation of temperature and other parameters.

54. Dave_Geologist says:

Oops, fixed bad html (I hope 🙂 ).

Nic should submit a Comment to the journal. If he wants public peer review first, he should post it on https://eartharxiv.org/ or https://arxiv.org/, where it will be read by people a thousand times more informed and objective than the inhabitants of a famously partisan blog. My suggested edits from a first look:

the authors’ key claim, that climate sensitivity estimates based on observed historical warming are too low, are highly sensitive to the SST and sea-ice dataset used. Results using the more recent dataset ~~contradict~~ confirm their claims, ~~largely due to~~ in that differences between the two datasets in the evolution of sea-ice more than counteracting the effects of evolving patterns of SST change over the open ocean. I therefore think it is difficult to draw any strong conclusions from ~~the simulation results presented in the paper~~ simple, global-average energy-balance models, on the grounds that the spatial distribution of warming, sea-ice and probably other parameters can drastically bias the conclusions. Alternative approaches should be considered which more realistically represent the spatial variation of temperature and other parameters.

55. Hyperactive Hydrologist says:

Nic’s method is very dependent on the historic period chosen. Coincidentally, the periods chosen tend to give low estimates of sensitivity. Nic tries to justify this choice by attempting to match natural variability by looking at ENSO and the AMO, but does not, as far as I am aware, undertake a quantitative analysis including any type of spatial assessment. Estimating the range of natural (internal) variability is not possible from the observed record, particularly with the climate change signal superimposed.

56. Dave_Geologist says:

Shorter version of the above:

Even if Lewis’s scientific criticism is correct (and his needless ad-hom imputation of dishonesty does rather cast doubt on his objectivity), it doesn’t demonstrate that his ECS is right. Only that it may be too high, too low, or right by accident; other than the right-by-accident possibility, we can be very confident that it’s wrong. And the mountain of other evidence that it’s higher makes the right-by-accident possibility extremely unlikely.

57. Hyperactive Hydrologist says:

One way to minimise natural variability would be to use a much longer assessment period for the historic and present periods. That is what I tried in the estimates of TCR below, for periods of 10, 20, 30, and 50 years for the historic and present, using CRU temp data. I also sampled all periods between 1850 and 1950 for the historic period. Unsurprisingly, the results are very sensitive to periods with high volcanic forcing, and recent research has suggested the historic volcanic forcing is too high. Reducing this forcing by 50% gives quite interesting results.

https://ibb.co/gqBFES
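HH’s period-differencing calculation can be sketched roughly as follows. This is purely my own toy illustration, not HH’s actual code: the series are fabricated smooth ramps standing in for observations, and the F_2x value and ramp rates are assumptions, so only the mechanics (TCR ≈ F_2x·ΔT/ΔF between two window averages) are meaningful.

```python
# Toy sketch of a sliding-window TCR estimate (not HH's actual code):
# TCR ~ F_2x * (T_present - T_historic) / (F_present - F_historic),
# where each quantity is a mean over a 'window'-year period.
F_2x = 3.7  # W/m^2 per CO2 doubling (approximate)

def tcr_estimate(temps, forcings, hist_start, pres_start, window):
    """Difference two window-year means of temperature and forcing."""
    def mean(xs, start):
        return sum(xs[start:start + window]) / window
    dT = mean(temps, pres_start) - mean(temps, hist_start)
    dF = mean(forcings, pres_start) - mean(forcings, hist_start)
    return F_2x * dT / dF

# Fabricated smooth series standing in for observations, 1850-2017.
years = list(range(1850, 2018))
forcing = [0.015 * (y - 1850) for y in years]  # W/m^2, illustrative ramp
temp = [0.008 * (y - 1850) for y in years]     # K, illustrative ramp

# Historic window starting 1850, present window starting 1988, 30-year means.
print(f"TCR ~ {tcr_estimate(temp, forcing, 0, 138, 30):.2f} K")
```

With real data the answer moves around with the choice of windows, which is exactly the sensitivity HH describes; with these smooth ramps it is constant by construction.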

58. Hyperactive Hydrologist says:

Plots again for CRU data:

and BEST:

59. Hyperactive Hydrologist says:

Try again

Plots again for CRU data:

and BEST:

60. Phil says:

Andrew Dessler:

it’s not variability in the global average, but rather variability in the spatial pattern.

Ah, OK, I understand. Thanks for responding !

61. Dave_Geologist says:

it’s not variability in the global average, but rather variability in the spatial pattern.

That’s what I was getting at. There have been a bunch of papers in the past year showing that you get a different ECS estimate from EBMs if you sector the Earth and add the results, rather than simply working with global averages. And different results if you use the same sectors and distribute temperature or other parameters differently. So even if you ignore aspects such as cherry-picked time intervals or datasets that “~~hide~~ under-represent the Arctic warming”, it’s no longer tenable to claim that any simple, global-average EBM gives a robust ECS estimate. In fairness to LC18, they probably did their work and wrote the paper before that stuff came out. But instead of doubling down, they should wake up and acknowledge that the thing they’re now smelling is indeed coffee.

62. BBD says:

No patience with L or C. L definitely knew – because I have pointed it out to him at least twice, here – that his results are not compatible with palaeo-derived estimates.

63. HH said:

“… looking at ENSO and AMO but does not, as far as I am aware, undertake a quantitative analysis including any type of spacial assessment. Estimating the range of natural (internal) variability is not possible from the observed record particularly with the climate change signal superimposed.”

I think we can estimate this with greater precision than you would think. ENSO is characterized as a standing wave and thus always reverts to a mean value of zero. The standing wave characteristics can be deduced from the SOI measure.

64. Everett F Sargent says:

So wrt Nic Lewis and all his blog posts, on his own papers but, more importantly, on papers by all others: will those blog posts ever see the light of day in IPCC AR6 WGI?

I can see some of them appearing indirectly via NL citing those blog posts as references in his peer-reviewed papers.

I don’t think blog posts even qualify as grey literature.

As the One Man Gang for low ECS (in terms of a single most prolific author publishing low ECS values), I’m afraid NL has an uphill (or steep cliff) battle, as the sum total of all his low-ECS papers is but one single author’s worth of work.

My predictions for the next IPCC AR6 WGI three (or more) ECS conditional statements:

(1) IF(ECS.LE.1)THEN(P.LE.0.01)
(2) IF(ECS.GE.1.5.AND.ECS.LE.4.5)THEN(P.GE.0.75.AND.P.LE.1.00)
(3) IF(ECS.GE.6)THEN(P.LE.0.05)

65. JCH says:

1993 to 2010, the Eastern Pacific was blowin’ in the wind:

66. izen says:

Is there something odd about the fact that a paper which reduces the PDF of climate sensitivity at both the bottom and top of the range garners most attention, and contention, for shaving a few tenths of a degree off the low end, rather than relief and acknowledgement of its reduction of the probability of ECS over 4C?

It is almost as if there was an assumption that climate sensitivity will be at the lower end of any estimate we make and therefore losing ground below 2C is much more significant than gaining ground below 4.5C.

67. Andrew E Dessler says:

Is it just me or do other people find everything Nic Lewis writes to be impenetrably hard to read? I usually just give up after a few paragraphs.

From what I can understand, it seems that the main issue Nic is raising is that using different SST/ice data sets leads to very different inferred ECS values. I haven’t looked into the situation, but that just seems very unlikely to be true, although I suppose it’s possible if one of them has a much bigger reduction in sea ice than the other one.

68. Andrew,

Is it just me or do other people find everything Nic Lewis writes to be impenetrably hard to read? I usually just give up after a few paragraphs.

It’s not just you. I have read more than a few paragraphs, but it does take some effort.

69. Nic also seems to be claiming that there are two pattern effects

1. that arising from the difference between the simulated spatial pattern in response to long-term CO2-forcing and the spatial pattern simulated when GCMs respond autonomously to evolving forcing over the historical period; and
2. that arising from the difference between the spatial pattern over the historical period simulated when GCMs respond autonomously to evolving forcing and the spatial pattern when they are driven instead by a specified, observationally-based, evolution of SST and sea-ice, with unchanging forcing.

This doesn’t really make sense to me. I thought the pattern effect was simply that the feedback response depends on the pattern of the surface warming.

70. Steven Mosher says:

Anyway, the spatial argument has always been obscure (as in, no images) and so now it is obscurity about the obscure. meh. read harder.

The ice dataset seems to be a simple question to answer.
Dr. D should not be using out-of-date data. If the main paper does, then an update is required.
Funny that Nic L makes this complaint, since he refused to use our latest data!
I will tell Dr. D exactly what I told Nic:
Use the most current data please; until such time, I suspend judgement on your claims.
goose gander and all that.

71. Steven Mosher says:

Other folks can now respond.

Do the work yourself with the newest data (Nic L used this one too!!!, but supplied code).

I do wish Nic would post stuff on arXiv instead of Judith’s.

72. JCH says:

Maybe I’m wrong, but this isn’t about one paper; it’s a wall of papers. I’m just a cowboy, but in my town a tornado, which is sort of what the Eastern/Equatorial Pacific experienced during the hiatus, ain’t the average weather:

73. Steven Mosher says:
74. “Anyway, the spatial argument has always been obscure.”

Not necessarily in the ocean, where the spatial aspects are often cleanly separable. For example, ENSO is a standing wave that has a stable (or IOW a stationary) spatial geometry. The boundary conditions of the ocean effectively pin down the standing wave to specific wavenumbers (i.e. spatial wavelengths referenced to the earth’s circumference). Of course the intriguing part of this is the massive impact that ENSO has on temperature variability across the planet. Big breakthroughs coming soon on this topic — the fact that the spatial aspect is so stable is making scientists realize that the seemingly chaotic nature can be factored.

75. Steven Mosher says:

funny

76. obscure

77. BBD says:

Use the most current data please, until such time, I suspend judgement on your claims.
goose gander and all that.

Palaeoclimate.

Consilience and all that.

78. Chubbs says:

One of Isaac Held’s early blogs is relevant and consistent with the video above. The polar oceans, with their large heat uptake, slow the transient response in climate models; thereby fooling simple energy-balance models. Interestingly Nic Lewis asks a question in the blog comment section.

https://www.gfdl.noaa.gov/blog_held/time-dependent-climate-sensitivity/

79. Chubbs says:

I posted the numbers below in a comment on ATTP’s post on L&C18

Temp delta between L&C18 main periods 1869–1882 and 2007–16:

C+W – 0.86C
BEST – 0.96C
RCP6 (surface temperature) – 0.98C
RCP6 blended (model mean from KNMI explorer, blended 71% SST / 29% land, like obs) – 0.81C

So the temp obs used in OBM are well predicted by climate models.

80. qwertie says:

Nic Lewis’s 2018/09/05 post seems to be talking about a different paper (“Accounting for Changing Temperature Patterns Increases Historical Estimates of Climate Sensitivity”) than the ones this video is based on (“The influence of internal variability on Earth’s energy balance framework and implications for estimating climate sensitivity” and “An estimate of equilibrium climate sensitivity from interannual variability”). So why are we talking about it in this thread?

81. qwertie says:

As a non-scientist I’ve been wondering…

1. If ΔR=ΔF+λΔTs doesn’t give correct results, where did the formula come from and why wasn’t it noticed to be wrong/dubious earlier?
2. Why we are using average temperatures in these calculations anyhow? Given Stefan-Boltzmann (j*=σT^4), why not, instead of defining ΔT as change in average temperature, define it as the change in the fourth root of the average T^4?

82. JCH says:
83. qwertie,
1. The point is that that equation assumes that the response to a radiative perturbation is linear (i.e., introduce a change in forcing of $\Delta F$ and the resulting change in temperature, $\Delta T$, when equilibrium is reached – $\Delta R = 0$ – will be $\Delta T = \Delta F/\lambda$). This is approximately what we’d expect, but it seems that the response is not only potentially slightly non-linear, but the actual response can be influenced by internal variability, so that on shorter timescales the temperature response can vary.

2. I explain a bit of that in this post. If the temperature is $T$ then the flux is $\sigma T^4$. If the temperature changes slightly, then the change in flux is $\Delta F = 4 \sigma T^3 \Delta T$. As long as we’re talking about small changes in $T$, then the change depends on $\Delta T$ (i.e., $4 \sigma T^3 \sim constant$).
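To put a number on point 2 (my own illustrative check, not part of the comment; the 288 K surface temperature and 1 K perturbation are assumed values):

```python
# How good is the linearization Delta_F ~ 4*sigma*T^3*Delta_T compared
# with the exact Stefan-Boltzmann change sigma*(T+dT)^4 - sigma*T^4?
sigma = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
T = 288.0               # representative surface temperature, K (assumed)
dT = 1.0                # a 1 K perturbation

exact = sigma * (T + dT)**4 - sigma * T**4
linear = 4 * sigma * T**3 * dT

print(f"exact  = {exact:.3f} W/m^2")
print(f"linear = {linear:.3f} W/m^2")
print(f"relative error = {abs(exact - linear) / exact:.2%}")
```

The linear estimate is within about half a percent of the exact change for a 1 K perturbation, which is why treating $4 \sigma T^3$ as roughly constant is reasonable for small $\Delta T$.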

84. niclewis says:

Dave_Geologist says:

“Nic should submit a Comment to the journal.”

Perhaps you are ignorant of the fact that the journal Andrews et al published in (Geophysical Research Letters) has a policy of not accepting Comments?

85. niclewis says:

Andrew E Dessler says:

“When I talk about internal variability, it’s not variability in the global average, but rather variability in the spatial pattern. This matters because two Earths with the same amount of global average warming but different patterns of warming can give you quite different responses in top-of-atmosphere energy balance. That’s what introduces spread in the lambda that we calculate in the ensemble.”

It is of course such variability in spatial pattern that the Andrews et al (2018) paper implies is the cause of low historical-period energy-balance estimates of climate sensitivity that use observational temperature and heat uptake data.

What is interesting is that the largest (absolute) lambda estimate from the MPI-ESM1.1 100-member historical ensemble (using 20-year averages at the start and end of the simulation) is only 16.6% greater than the average estimate. By contrast, the increase in lambda that Andrews et al. seek to attribute to the effect of spatial pattern differences is 37% (mean lambda_amip / lambda_hist per Table 1 in my article at Climate Etc), over twice as large – and that is for a one-in-a-hundred possibility.

86. Nic,
Do you at least get the point that we’ve only experienced one reality and that there is a range of possible warming we could have experienced for the same change in external forcing and even a range of possible planetary energy imbalances, given this temperature change. Hence using the historical period to estimate climate sensitivity may give a best estimate that is not necessarily representative of what we will experience over longer timescales?

87. angech says:

…and Then There’s Physics says:
“Do you at least get the point that we’ve only experienced one reality ”
Only one team wins the Superbowl, ATTP.
Despite all the possible permutations that possibly could exist there is only ever one right answer.
If the theories cannot accept this answer then [Feynman – sorry] the theories were and are wrong. Somewhere, somehow.
ECS might be lower than 2.0C.

88. BBD says:

Hence using the historical period to estimate climate sensitivity may give a best estimate that is not necessarily representative of what we will experience over longer timescales?

The long shadow of climates past…

Just won’t go away.

89. Steven Mosher says:

“Palaeoclimate.

Consilience and all that.

Consilience: one set of evidence, say from documents, is the same as other evidence
from a different area, say from measurement systems.
So, the thermometer tells you it’s hot, and other bits of evidence, say from
plants, animals, journals, etc., tell you the same kind of thing.

Paleo: Tells us X
Modeling study tells us close to X.

Now comes the question: did the modelling study use the latest best data?
In other words, DOES the apparent consilience hold?
Merely pointing to the apparent agreement doesn’t actually address the question.
It’s the kind of logic I see at WUWT all the time. It’s rather Angechian.

But I get your reaction. When I was pushing Nic to consider using the Berkeley dataset he had
a bunch of reasons why the simple task could not or should not be done.

My take: if asked to review paper X, from ANYONE, studying ANY subject, taking ANY position,
IF there is more than one credible source of data, and if the conclusions could change as a result,
then the importance of CONSILIENCE and the importance of structural uncertainty actually
demand that I take a look at the alternatives.

IN the end if we show that important conclusions turn on data selection, then the guys who do data will actually have good reason to go back and redo their work.

Hint: long before the pause, long before the general understanding that HadCRUT was biasing the record by its treatment of the Arctic, long before Cowtan and Way, gavin told me nothing would be gained by re-looking at the temperature series. Who would have thunk it?

Anyway. Liked Dr D’s talk, liked his paper. Cool approach. Just one question; easy to put a nail in the coffin.

“If the temperature changes slightly, then the change in flux is $4 \sigma T^3 \Delta T$. As long as we’re talking about small changes in $T$, then the change depends on $\Delta T$.”

Take it one more step with this differential calculus tip: divide by the total flux and $\sigma$ drops out.

$dF/F = 4\, dT/T$
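A quick numeric check of the tip above (my own illustration; the 1% temperature change is an assumed example):

```python
# dF/F = 4 dT/T: a 1% change in T should give roughly a 4% change in flux.
T = 288.0      # K (any value works; sigma has dropped out of the ratio)
dT = 0.01 * T  # a 1% temperature change

exact_fraction = ((T + dT) / T)**4 - 1  # exact fractional flux change
linear_fraction = 4 * dT / T            # the differential-calculus estimate

print(f"exact: {exact_fraction:.4f}, linear: {linear_fraction:.4f}")
```

The exact fractional change is 4.06%, against the linear estimate of 4%, matching the rule of thumb.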

91. dikranmarsupial says:

angech says ” Will do so and comment no more until I have taken it in.”

a short time later angech says “Only one team wins the Superbowl, ATTP.”

indicating that he STILL hadn’t taken the time to understand the issue. The point is that if we roll a die and get a four, does that tell us whether it is a d4 (a four-sided die), a d6, a d8, a d12 or a d20? No. There is a limit to what you can discover from a single realisation of any chaotic process. This has been explained many many times on this blog and it is a shame that angech still hasn’t picked up on this basic issue.

92. qwertie says:

@ATTP
1. Okay, I still don’t know where the formula came from, but let me reformulate the second part of my question: is solving for λ in ΔR=ΔF+λΔTs a new technique? If this is just a new thing people are doing in the last few years, that would explain why we are only just now noticing problems in that approach.
2. I found that post to be unhelpful. For example, from my perspective the formula $N_t = N_0 + \Delta F - 4 \epsilon \sigma T^3 \Delta T + W_{\rm feed} \Delta T$ pops out of nowhere without a derivation, and I don’t really see how it answers my question anyhow.

Here’s how I look at it: all physics is local. Perhaps this formula makes sense to a first approximation where Earth’s surface is a point source of radiation (at least, it makes sense to those who understand the formula in the first place, ie not me as a non-climate-scientist), but that’s where climate science was in the 19th century. By now the field should be looking at things a little more carefully. So the way I’m looking at it, many places on Earth have 20°C temperature swings every day; that’s not a small change in T, let alone T^3 or T^4. Or, looking at climate change over hundreds of years, we’re expecting global mean temperature to change 3°C which is about 1% of T, so the T^4 changes by about 4%. So, I don’t understand: from what perspective can we treat T as a constant?

93. Dave_Geologist says:

Is it just me or do other people find everything Nic Lewis writes to be impenetrably hard to read? I usually just give up after a few paragraphs.

Nic needs to learn how to use the Oxford comma. or write shorter sentences.

94. Dave_Geologist says:

Or, if you prefer: Nic needs to learn how to use the Oxford comma, or write shorter sentences. 😉 .

95. Dave_Geologist says:

Geophysical Research Letters has a policy of not accepting Comments

I was unaware of that. Not a policy I’d agree with. I did wonder why what looks like it should be a Comment sometimes appears as a paper in another journal. My comment still applies though. As long as it stays in the grey literature it will be a second-class citizen as far as the scientific community goes. One of the arXiv servers is still better than a blog. You’ll attract a more informed readership and be taken more seriously.

96. qwertie,
1. I don’t think it’s new. I think the new thing is people claiming that results using this simple energy balance approach are somehow superior to results using different methods.

2. Okay, I’ll try and explain where that comes from. Imagine the planet is in energy balance. Then $N_0 = 0$. Now we instantaneously add a change in forcing $\Delta F$. This will produce a planetary energy imbalance $N$. So, at this stage,

$N = N_0 + \Delta F = \Delta F.$

Now consider what happens if the surface warms in response to this by an amount $\Delta T$, but that there are no other feedbacks. The surface flux is $\sigma T^4$. If we warm by an amount $\Delta T$ then we can write the change in surface flux as $4 \sigma T^3 \Delta T$. However, not all of this is radiated into space, so the change in outgoing flux is $4 \epsilon \sigma T^3 \Delta T$. Now this will change the planetary energy imbalance so that we now have

$N(t) = N_0 + \Delta F - 4 \epsilon \sigma T^3 \Delta T.$

Now imagine that other feedbacks operate and that they depend linearly on the change in temperature. We can write this as $W_{feed} \Delta T$. This will also change the planetary energy imbalance so that we now have

$N(t) = N_0 + \Delta F - 4 \epsilon \sigma T^3 \Delta T + W_{feed} \Delta T.$

I’m leaving $N_0$ in even though we said it was zero just to show the more general case of it not being zero (I could have said this at the beginning but that would require rewriting this comment 🙂 ).

We can rewrite this as

$N(t) - N_0 = \Delta F - 4 \epsilon \sigma T^3 \Delta T + W_{feed} \Delta T.$

If $\Delta T$ is small, then we can treat $T$ as constant and then rewrite this as

$\Delta N = \Delta F + \lambda \Delta T,$

where $\lambda = -4 \epsilon \sigma T^3 + W_{feed}.$
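To connect this to an ECS number (my own illustrative arithmetic, with assumed values for the doubled-CO2 forcing and the feedback parameter, not figures from the papers discussed): at equilibrium $\Delta N = 0$, so the linearized balance gives $\Delta T = -\Delta F / \lambda$.

```python
# At equilibrium the linearized balance Delta_N = Delta_F + lambda*Delta_T
# gives Delta_T = -Delta_F / lambda. Illustrative, assumed numbers:
dF_2x = 3.7  # W/m^2, approximate forcing for doubled CO2
lam = -1.2   # W/m^2/K, assumed net feedback parameter (negative = stable)

ecs = -dF_2x / lam
print(f"implied ECS ~ {ecs:.2f} K")  # prints "implied ECS ~ 3.08 K"
```

A less negative (weaker) $\lambda$ implies a larger ECS, which is why the whole argument about pattern effects on $\lambda$ matters for the bottom line.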

97. BBD says:

Steven

Hint: Long before the pause, Long before the general understanding that hadcrut was biasing the record by its treatment of the artic, Long before Cowatn and Way, gavin told me nothing would be gained by re looking at the temperature series. who would have thunk it?

Nothing was ‘gained’ by re-looking at the temp reconstructions. Certainly not from a ‘sceptic’ POV. They were essentially validated, with very minor tweaks.

As for consilience, all the evidence points to an ECS of ~3C per doubling of CO2 or equivalent forcing change except NL’s stuff. Something he seems reluctant to acknowledge but is painfully evident to everyone else in the room except the ‘sceptics’. The tail does not get to wag the dog.

98. Dave_Geologist says:

Use the most current data please, until such time, I suspend judgement on your claims.

Is that the new “pause”? Since every new paper inevitably uses data at least a couple of years old, due to calculation time and publishing lead times, no publication will ever be valid. Except for mickey-mouse back-of-envelope calculations done in a spreadsheet and posted directly onto a blog.

Surely the more important question is “does the new data make any difference?”. In general, apart from UAH, we’re not talking blanket recalculations from the year dot. Just adding a couple more years to a multi-decade time series. In a consilient field such as climate science it’s extremely unlikely that a couple more years of data will make any real difference. They don’t contradict the mountain of data that existed previously, just add a small cairn to the top.

99. Everett F Sargent says:

SM sez …
“Now, Comes the question, did the modelling study use the latest best data?”

I would not fathom what the “latest best data” is, but I am pretty sure that the “so called” holy grail of latest best data stops at yesterday, as it always will.

So, paleo not so much.

Who gets to decide the latest best data?

Or does one do something called a sensitivity analysis?
https://en.wikipedia.org/wiki/Sensitivity_analysis

So, for example, assume the GMST (the single global index) is applied globally (appropriate fractions for ocean and land time series rates of change). If that in any way-shape-form gives a lower best estimate for ECS then game over. The spatial pattern of global temperature matters in real time.

But we can’t do that with paleo, but we do have time, all the time, as there is no tomorrow or today or yesterday.

The only recourse we have for the future, that we can use today, are found in climate models.

100. izen says:

This may be a dumb version, please correct its errors. I am just trying to get a clear, simple view on what the issue is.

1) GCM’s with an emergent ECS of 3C for CO2 doubling can provide a GMST record during a few decades of the increasing CO2 period.

2) The pattern of that record can vary between different runs of the GCM, (because internal variations) although the final ECS is about the same.

3) Using an EBM on the derived records from the GCM can give a range of ECS depending on the pattern of warming in the run used.

4) Therefore using an EBM on the single run we have on the real Earth is not a reliable way to constrain the possible lower limit of ECS. We MAY be following a pattern of warming that would cause ECS to be underestimated as seen with some of the GCM data.

5) Then there is the problem of how accurate our observational record may be from sources that were only intended as a topical report, not a longitudinal survey. So there are several versions with dispute over which is BEST…

6) GCMs can indicate alternative measures of the pattern of warming that can better constrain estimates of ECS if adapted for use in an EBM. (?)

I have not been able to integrate this with the method of deriving ECS from the annual cycle… ?
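Izen’s points 2 to 4 can be illustrated with a toy Monte Carlo sketch. This is entirely my own construction, not the method of any paper discussed here: every synthetic run shares the same underlying feedback parameter, but year-to-year noise in the energy imbalance makes the regression-style lambda, and hence the EBM-style ECS, scatter from run to run.

```python
import random

random.seed(0)
true_lambda = -1.2  # W/m^2/K, assumed "true" feedback parameter
dF_2x = 3.7         # W/m^2, approximate doubled-CO2 forcing
n_years = 50

estimates = []
for run in range(100):
    # Linear forced warming plus year-to-year internal variability.
    T = [0.02 * yr + random.gauss(0, 0.1) for yr in range(n_years)]
    F = [0.04 * yr for yr in range(n_years)]  # steadily rising forcing
    N = [F[yr] + true_lambda * T[yr] + random.gauss(0, 0.3)  # noisy imbalance
         for yr in range(n_years)]
    # EBM-style estimate of lambda from the last 20 years of each run.
    late = range(n_years - 20, n_years)
    lam_est = sum(N[yr] - F[yr] for yr in late) / sum(T[yr] for yr in late)
    estimates.append(-dF_2x / lam_est)

print(f"ECS spread across 100 runs: {min(estimates):.2f} to {max(estimates):.2f} K")
```

Every run here has a “true” ECS of about 3.1 K, yet the run-by-run estimates scatter around it, which is the sense in which a single realisation (point 4) cannot tightly constrain the lower bound.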

101. Andrew E Dessler says:

If people are interested in the linearized energy balance equation, this is a good review article: https://journals.ametsoc.org/doi/abs/10.1175/BAMS-D-13-00167.1

102. Steven Mosher says:

“Or does one do something called a sensitivity analysis?
https://en.wikipedia.org/wiki/Sensitivity_analysis

That is what I am suggesting. Only been doing them since 1985.

103. Steven Mosher says:

“Is that the new “pause”? Since every new paper inevitably uses data at least a couple of years old, due to calculation time and publishing lead times, no publication will ever be valid. Except for mickey-mouse back-of-envelope calculations done in a spreadsheet and posted directly onto a blog.”

Not really: it appears they had the data in question (originally published in 2014).

But in practical terms it is a non-issue when you publish your code. So, for example, even though Nic refused to use the Berkeley data even as a sensitivity test, he at least supplied the code, so that anyone who had a new dataset that was published “in the gap” could do the work for themselves.

So not the new pause, just another example of how publishing your code and data helps and doesn’t hurt.

Folks can disagree about the necessity of doing sensitivity to dataset selection. They can disagree about whether it should appear in the main text or supplement. But I certainly do not suggest that a publication IS NOT VALID. I am saying this: it is rational, and defensible, and warranted, to WITHHOLD JUDGEMENT if there were newer datasets AT THE TIME OF SUBMISSION that were not explored. Withholding judgement is logically distinct from rejecting or accepting.
It is merely saying, “I don’t decide. Do newer datasets make a difference?” If so, why; if not, cool! Even better for the paper. Now of course YOU may have another rational standard. Take Nic for example. He would argue that CRU is the “accepted” standard used by the IPCC, so why even look at other data sets? I’ve heard that appeal a bunch of times: use the accepted, recognized standard.
Meh.
That’s a rational standard. It depends upon accepting a consensus decision about the preferred data. Please note: rational people can have differing rational standards. So by your standard, you accept the paper, but my decision is to merely withhold judgement. And to withhold judgement according to a predetermined rule. More importantly, the decision of withholding judgement is related to the question of what work one chooses to rely on as a foundation.

104. Dave_Geologist says:

So by your standard, you accept the paper

No, I accept the paper as another brick in the wall.

I base my view of the science not on the latest paper, newest dataset or not, but on the preponderance of evidence from all the papers that have been published. Where it is outside my area of expertise, I rely on the consensus view of those with expertise. They could of course be lying to me, but when the claims that they are come from sources which I can tell from my own personal knowledge have lied publicly in the past, or who have been fact-checked on more than 4,000 lies, I take it with a large bucket of salt. When it is a controversial (not scientifically, of course) issue like AGW, where hundreds of millions, probably billions of dollars have been spent on trying to find fault or fraud, all coming up with diddly-squat, I regard the possibility I’m being lied to by the consensus as vanishingly small. If there had been a there, there, it would have been uncovered. Instead the only there’s which have been uncovered are covert donations from right-wing think-tanks to deniers and lukewarmers.

105. JCH says:

There is a statement that Andrews does not agree, so we go straight to fraud, dishonesty, retraction, and prison time.

106. Joshua says:

JCH –

But of course, no one could have anticipated that this would result from a purely objective analysis that focuses only on scientific analysis!

Plausible deniability is a high art.

107. Chubbs says:

*) Surface temperature observations in the 19th century are uncertain. The biggest issues are lack of coverage and different methods and correction factors for SST. Plus there are no 19th-century obs of the earth’s energy imbalance.
*) Aerosol forcing is uncertain, not globally uniform, and correlated with 19th-century surface obs.
*) Climate models generally predict that EBM will be biased low. The main issue appears to be the Southern Ocean, where warming is delayed by large ocean heat capacity, impacting cloud and sea-ice feedback.
*) The warming over the past 40 years, plus the current energy imbalance, suggests OBM are underestimating TCR and ECS.

108. izen says:

@-Chubbs
Thank you for the sharper points on this issue.

I have recently encountered push-back against the idea that aerosols could have caused cooling, or the idea that there was any global dimming in the 50s-90s.

Recently Lanser/Pedersen is invoked as showing that there has been no warming inland, just at the coast.
http://journals.sagepub.com/doi/10.1177/0958305X18756670

And the suggestion this may be aerosol dimming (cleaner air at the coasts) gets dismissed by Stanhill et al which is quoted as refuting any global dimming from aerosols.
https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1002/2013JD021308

109. JCH says:

I am not a big believer in aerosols causing wide fluctuations in the GMST record. For instance, the cool period from ~1945 to ~1970: I think that was most likely caused by a regime change in the Eastern Pacific. Skeptics love the PDO because they think it is about to cause a cooling like ~1940 to ~1970 to happen again (which I too thought was obviously about to happen), but instead it likely caused the warming from 2013 to the present.
