A recent comment asked about James Hansen’s latest paper, which claims that equilibrium global warming for today’s GHG level is 10°C. I’ve finally had a chance to look at it and I think I understand what is being suggested.
One suggestion in the paper is that the GHG forcing today is already 4 W/m^2, which is equivalent to a doubling of atmospheric CO2. It also argues that the aerosol forcing is large enough that this implies an equilibrium climate sensitivity (ECS) of 4°C. This is somewhat higher than the typical best estimate of 3°C, but within the likely range (2°C – 4.5°C).
The paper then considers the glacial cycles and argues that an ECS of 4°C is consistent with the glacial cycles, given global temperature changes of 6°C and total changes of radiative forcing (ice sheets plus GHGs) of 6 W/m^2.
The paper then argues that if the GHG forcing at the LGM is 2.5 W/m^2, this implies an equilibrium response to the GHG forcing of 2.4°C per W/m^2, giving an equilibrium temperature change of about 10°C for a 4 W/m^2 GHG forcing.
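For concreteness, the arithmetic as summarised above can be written out as a short sketch (the numbers are the ones quoted in this post, not anything taken directly from the paper):

```python
# Sketch of the arithmetic summarised above; all values are as quoted in this post.
F_2X = 4.0                     # W/m^2, forcing for a doubling of CO2

# LGM-based ECS: ~6 C colder with ~6 W/m^2 of total forcing (GHGs + ice sheets)
dT_lgm, dF_total = 6.0, 6.0
ecs = (dT_lgm / dF_total) * F_2X           # 1 C per W/m^2 -> 4 C per doubling

# GHG-only attribution: the same 6 C assigned to the ~2.5 W/m^2 GHG forcing alone
dF_ghg = 2.5
ghg_sensitivity = dT_lgm / dF_ghg          # 2.4 C per W/m^2
ghg_only_warming = ghg_sensitivity * F_2X  # ~9.6 C, i.e. the ~10 C headline number

print(ecs, ghg_sensitivity, ghg_only_warming)
```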
As I understand it, the problem with this is explained in this Realclimate post. The glacial cycles are driven by Milankovitch cycles (orbital variations) that don’t, by themselves, have a big impact on global temperatures, but can have large regional impacts that influence ice sheets. Changes to the ice sheets can influence temperatures, which can then change atmospheric GHG concentrations, which then also influence temperatures, and then ice sheets, etc.
Hence, these are coupled in a complex way. If you want to estimate the ECS, you can combine the longer-term changes (GHG concentrations, ice sheets, etc.) into a single forcing and you get an ECS of around 3°C. To determine the response to the GHGs alone would require separating the influence of the orbital variations and GHGs on the ice sheets, which isn’t straightforward.
It will also depend on the climate state; if there are large ice sheets the response will probably be larger than when the ice sheets are much smaller, as they are today. So, I don’t think it’s correct to claim that equilibrium global warming for today’s GHG level is 10°C. There are a few other comments I could make about the paper, but since I’m trying to keep these posts short, I’ll stop there.
Forcing due to orbital variations is overlooked. One of the co-authors of the Hansen paper is Leon Simons and I responded to a tweet of his on Lindzen’s arguments:
10C does sound high. It is early days but the paper has good co-authors, e.g. Matthew Osman. Matthew was the lead author for the paper responsible for Fig. 8. He worked with Jessica Tierney at the U of Arizona but is now at Cambridge. Based on the acknowledgements it also sounds like she had “eyes” on the paper. As she has pointed out, the ECS is a function of temperature and higher for warmer climates, e.g. ECS for the PETM is 6.5 C, https://twitter.com/leafwax/status/1579563795277828098 .
The work by Osman et al. has implications for the shape of the “hockey stick” as described here, https://arstechnica.com/science/2021/11/scientists-extend-and-straighten-iconic-climate-hockey-stick/
Just Dean,
I hadn’t seen that comment from Jessica Tierney about the PETM ECS. That is higher than I had realised.
Regardless of the proposed median ECS for the PETM of 6.5 degrees, the 95% confidence range of 5.7 – 7.4 (max just 30% above min) seems extraordinarily narrow.
It implies that we know the PETM ECS far more accurately than the current ECS (IPCC 90% confidence level 2.0-4.5, max 125% above min), which to me at least is implausible.
Can anyone explain this?
The full paper describing the work that is not behind a pay wall can be found here, https://research-repository.st-andrews.ac.uk/handle/10023/26261 .
Let me add that the Hansen paper has not yet been peer reviewed or formally published. Hansen purposely put it out on arXiv for review and discussion by the broader scientific community, http://www.columbia.edu/~jeh1/mailings/2022/Pipeline.arXiv.13December2022.pdf .
Maybe because he/they anticipated that it would be somewhat controversial.
“Eventual global warming due to today’s GHG forcing alone — after slow feedbacks operate — is about 10°C. Human-made aerosols are a major climate forcing, mainly via their effect on clouds.”
Like the comment "follow the money", the question here is not why the forcing is 10C, but why is the forcing claimed to be 10C?
The answer is simple.
This is the answer needed to fit climate model prognostications.
Only with a high CS included could one possibly claim that the models would be both right and right with horrible consequences.
Put a CS of only 2 in the models or affecting the outcomes and no Friday the 13th horror effect.
People know intuitively such a high level is impossible but must defend an indefensible outcome.
Yuk. That should be: why the high forcing will cause a temperature rise of 10C, not a forcing of 10C.
Would that I could have a rubber or just be more organised.
Isn’t this just his “sensitivity is 6 oC”, re-heated and exaggerated somewhat further? https://wmconnolley.wordpress.com/2007/11/23/hansen-again/ and https://wmconnolley.wordpress.com/2008/03/17/hes-at-it-again/
Pingback: He’s at it again – wmconnolley: scienceblogs.com/stoat archive
A couple of comments:
1) We are not locked in to today’s GHG levels. If we stop emitting, the oceans will take up CO2. Methane has a short lifetime and could be drawn down more quickly than CO2.
2) Regarding ice sheets and ice ages: what would have happened if the Milankovitch cycles didn’t exist, but CO2 still varied up and down as in the ice age cycles? Since GHGs are more potent than Milankovitch cycles for changing global temperatures, I am guessing we would have had similar ice age cycles, i.e. 280 -> 190 ppm CO2 would have resulted in 6C of cooling, once ice sheets and vegetation had equilibrated. Similar warming going the other way, 190 -> 280 ppm. That results in a high number for climate sensitivity once all the long-term feedbacks have fully kicked in.
WMC,
I can’t remember the details of the earlier work, but I think it is just an “update” to that. Increased estimate of LGM temperature change and reduced GHG forcing giving larger GHG-only equilibrium warming.
Chubbs,
1) Yes, and I was thinking of writing another post about this.
2) This seems to be the key point. In the absence of Milankovitch cycles, would a 190 ppm – 280 ppm change in atmospheric CO2 lead to the same amount of warming/cooling? My understanding is that it isn’t quite that simple, but maybe there are arguments to the contrary.
vtg,
I thought James Annan might have commented on this, but all I can find is a post from last year where he suggests that Tierney et al.’s estimate for the LGM temperature anomaly was too large and too certain (6.1C ± 0.4C). His estimate is 4.5C ± 1.7C which – if the total change in forcing is ~6 W/m^2 – gives an ECS of 3C.
http://julesandjames.blogspot.com/2022/05/blueskiesresearchorguk-egu-2022-how.html
Thanks AT – but my comment was about the PETM rather than the LGM.
I find it hard to believe that the kind of proxies available for both CO2 and temperature allow a more precise estimate from 56 million years ago for ECS than current measurement and recent proxies do for current conditions…
“Maybe because he/they anticipated that it would be somewhat controversial.”
He had a very similar so-called controversial paper in 2015/2016. As a research coastal engineer, that one was an insult, as in: is it easier to roll a rock or lift a rock? And do they know the difference in dynamic pressure when presented with a recurved surface? Doubling times were also a very dubious exercise without any confidence intervals whatsoever.
But why listen to a subject matter expert or subsequent papers by several other authors in climate science when you have Three Days Before The Day After Tomorrow logic.
Hansen is like the Old Testament God in that he is trying to scare you all into doing his right thing.
Remember the children, the 50th or 100th generation children, as long means millennial.
Tierney PETM ECS paper is here …
https://www.pnas.org/doi/10.1073/pnas.2205326119
Like VTG, I too find the 95% CI a bit hard to believe.
JA work on LGM has been published …
https://cp.copernicus.org/articles/18/1883/2022/cp-18-1883-2022.html
Which suggests that the Tierney ECS for the LGM was too high and too narrow (if I recall correctly).
YMMV
ATTP, I’m not so convinced of your Milankowitch argument, after all the authors did include an I’ve sheet forcing in their estimate of ECS.
As someone wrote above, their large ECS is due to using the Tierney et al. (2020) reconstruction. Annan et al. (https://doi.org/10.5194/cp-18-1883-2022) show that their estimate is cold-biased and overconfident. You end up with an ECS around 3 K if you use Annan’s reconstruction.
The other major issue is that they assume aerosols go away, but greenhouse gases stay constant forever. This is of course only possible if one continues to emit, and so the resulting warming isn’t solely due to past emissions. The true committed warming due to past emissions is much smaller than 10 degrees.
There is a lot more that could be said about this paper, but overall I would always welcome what could be viewed as attacks on the mainstream. I think this is good for the science.
Does the IPCC AR6 WG1 really present the PETM ECS as follows (top hat from 4-5C)?

… or is the PETM better represented, for example, such as this, by other reconstructions …

You know, something that looks more so-called curvy? Just call me curious.
Is not a PDF supposed to be unitary? Meaning the area under the curve sums to one? Or are all my many graduate level courses just, you know, wrong?
A quick guide to paleoclimate in the IPCC AR6 2021 report
https://pastglobalchanges.org/news/quick-guide-paleoclimate-ipcc-ar6-2021-report
Perhaps the best summary of IPCC AR6 WG1 wrt paleoclimate?
Thorsten,
Thanks for the comment.
As I understand it, the Hansen et al. paper is arguing that the LGM is consistent with an ECS of 4C, which they get by using the Tierney et al. temperature anomaly (6C) and combining the GHG and ice sheet forcings (2.5 W/m^2 and 3.5 W/m^2). In other words, the total forcing is 6 W/m^2 and the temperature change is 6C, so the ECS is 1C/W/m^2, which gives a temperature change of 4C for a doubling of CO2 (4 W/m^2). As you say, if you use the Annan et al. temperature, you get an ECS of 3C.
The 10C, I think, comes from trying to estimate the equilibrium warming due to GHGs only, which they do by assuming that they can then use the GHG forcing (2.5 W/m^2) and the temperature change (6C). This gives a sensitivity of 2.4C/W/m^2, or a warming of 10C per 4 W/m^2.
The problem – as I understand it – is that this essentially assumes that the ice sheet forcing is a feedback on the GHG warming, which may not be correct given that some of this is due to the Milankovitch forcing. This is what I thought Gavin’s Realclimate post was also suggesting.
So, not only are they possibly using too large a temperature anomaly, they may also be assuming a higher GHG sensitivity than is warranted.
I would also be interested in your views on Tierney et al.’s analysis suggesting that the PETM ECS is 6.5C (5.7C – 7.4C). As I understand it, it’s calculated by assuming that all the forcing is due to CO2. However, this would seem to be ignoring other possible forcing agents and might also be more indicative of the ESS than the ECS.
I’ve only glanced at the paper – because it seems very similar to prior (and, by now, very old) arguments from Hansen. As William Connolley asks/suggests above.
And I think it’s a barely relevant line of research anyway.
First of all, it’s working from a scenario that – as Thorsten notes – assumes that (non-aerosol) GHG concentrations are held constant. Which means that CH₄ emissions would need to stay roughly constant, and CO₂ and N₂O emissions would need to be maintained indefinitely at some significantly non-zero positive rate.
But we’ve essentially already agreed (long ago, at this point) that we are not aiming to stabilize concentrations (original terms of reference of the UNFCCC notwithstanding!). The Paris Agreement makes it clear that we are trying to “avoid dangerous interference in the climate system” by stabilizing *temperatures* instead, and that this temperature target is “well below 2.0°C”.
So, stabilizing *temperatures* at this level is really the question of critical interest. And the very strong scientific consensus is that this requires (approximately) net-zero emissions of the long-lived GHGs (chiefly CO₂ and N₂O), which in turn implies *declining* concentrations over time (roughly an 80-100 ppmv decline in the first century after net-zero CO₂).
But the *reason* we want to keep warming well below 2.0°C is to *avoid* things like significant ice sheet melt.
So, *assuming* we will (a) continue emitting at levels assured to (eventually) melt all the ice, and then (b) adding the temperature feedback from the melted ice back into the calculation of the near-term temperature response is not particularly relevant to the sensitivity or TCRE of policy or scientific relevance.
*Maybe* it’s of some academic interest.
But as is abundantly clear from the uptake/interpretation/dissemination of this pre-review, pre-publication paper by primarily alarmist/doomist/activist social media types (and *not*, as far as I can tell, by climate scientists such as, say, IPCC AR6 WGI authors), its main takeaway seems to be stoking hysteria.
I’ve often pointed to this 2009 presentation by Myles Allen taking down the “but Hansen, et al., say real ECS is 6.0°C!” argument as both an attempted answer to the wrong question *AND* circularly invoking earth state/regime changes that essentially presume we ignore the imperative to act in the first place! Like, what’s the point, dude?
And just to add, yes, I do realize that the value of the ECS (at approximately the current initialized Earth system state) is an important input to deriving the TCRE value.
But from a relevant policy point of view, it’s less important what the very long-term equilibrium is to constant concentration ghg’s.
A point Myles Allen returns to time and again in his 6-part lecture series on the science, origins, implications and “why” of net-zero emissions.
Part 3 of which – “The Ocean Physics Behind Net Zero” – resumes January 31.👇
https://www.gresham.ac.uk/watch-now/series/net-zero
@ATTP, I suppose their logic is that assuming greenhouse gases stay constant forever would eventually activate the ice sheet feedbacks as well. They also assume the ice sheet feedback (though you would argue it is a forcing, and I will not argue with that, though probably the truth is a combination) is the same going into warmer territory as it is in the glacial cycles. But in reality it should be substantially smaller, and it would eventually end when the ice sheets are gone. We do know that ice sheets formed at about 5 K above current temperatures, so they do exert a feedback of unknown magnitude, although it should weaken to zero over the discussed range of warming.
The underlying problem is the “forever”, which means all of the above is not relevant.
Thorsten,
Thanks, yes that makes sense. I agree that the underlying problem is the effective assumption of “forever”. I might try and expand on that in another post, if I get a chance.
Mark Lynas must be feeling a bit whipsawed after a decade of living down the illustration of St Paul’s apse-deep in the Thames estuary on the cover of Six Degrees.
Let’s see if the asking price for the remainders rises on Amazon
@-tm
“The underlying problem is the “forever”, which means all of the above is not relevant.”
The assumption that humans continue to burn fossil fuels long enough to maintain or even increase CO2 levels over the present day does not seem extreme, at least for as long as they are geologically accessible.
The assumption that we can get to net zero in less than 50 years looks less likely. At least not without a major breakthrough in both an energy source and a fundamental change in the economic-political functioning of society.
How credible the ice sheet feedbacks are is beyond my pay-grade.
But considering that during some interglacials and outside the ice-ages the poles were ice-free and warm, it may not be beyond the realms of possibility.
Although probably it is beyond the lifetime of anyone alive today.
@izen, the argument is about committed warming from past emissions, warming in the pipeline
@-tm
“I suppose their logic follows from assuming greenhouse gases stay constant forever will eventually activate also ice sheet feedbacks.”
I do not find the assumption that CO2 levels remain or increase to be that outlandish.
Given that over the 40 years human society has known about the danger of burning fossil fuels, usage has increased. Fossil fuels are too convenient a source of energy to be abandoned until the energy used in extraction far exceeds the energy that they provide.
At least while human society ‘values’ the immediate profit for users over the long term cumulative damage.
izen,
Sure, it’s certainly possible that humans will continue burning fossil fuels for some amount of remaining time, but sustaining a forcing for millennia would require continuing to burn fossil fuels on that timescale. I’d hope that we’d get pretty close to net zero at some point in the not too distant future, at which point the forcing should decay slightly as the oceans continue to take up some of what has been emitted.
I caution people from jumping to conclusions, including myself, about the broader CC community’s thoughts on this paper by Hansen, e.g. suggesting Jessica Tierney is in agreement or supports the conclusions drawn here.
Jessica Tierney was a lead author for WG1, and in this video, https://www.youtube.com/watch?v=Fa7-5KlisnU, her NSF Waterman award presentation from October 25, 2022, she points out that their LGM modeling led to constraining the ECS adopted in WG1 AR6 and therefore helped rein in some of the large ECS values from previous “hot models.” That particular presentation was given a day after their PNAS PETM paper came out. She makes no reference or suggestion that that work could or should influence future estimates toward higher values or bounds on ECS. In addition, at the end of the video she explicitly cautions against using aerosols to cool the earth, which I believe is in contrast to what Hansen is suggesting should be reconsidered.
I am not an expert. I am a retired engineer from a national laboratory and have a new found passion for climate change and learning about the science and doing what I can at a personal level to help mitigate global warming.
Welcome to Climateball, Dean!
Thanks. As my manager used to say, “Big science is a contact sport!”
@-ATTP
“I’d hope that we’d get pretty close to net zero at some point in the not too distant future,”
So do I. But I see little evidence of that happening beyond greenwashing, and some additional energy supplied from renewables.
The link in the last post was intended to show the Keeling curve from the 1950s to the present.
Fixed.
The URL needs to end with an extension like png or jpeg for the WP parser to kick in.
izen,
That Keeling CO2 curve will continue to grow at least at its current quasi-linear rate as long as emissions remain high, or even if they were to plateau or level off at, say, our current level of about 40 GtCO2. Only if we enter something akin to a true net zero by 2050, or even say net zero by 2100, would we eventually see that curve begin to slow down. But even then, to get that curve to even flatline does require a true net zero by XXXX, for several decades even, because CO2 is such a long-lived GHG and we will never completely wean ourselves off of some FF usage for the foreseeable future.
So as not to fool anyone, one would think there would be papers showing the evolution of this curve under realistic long-term CO2 emissions reductions (and then realistic global removal technologies). Meaning, forget about all the talk about net zero by whenever, as that will never happen in reality, at least not with our current knowledge. The net zero movement is so back-loaded as to be literally facetious, as in another joke by humanity for humanity. As in, then a miracle occurs. Because there are no removal technologies that are efficient and even remotely possible on the horizon. So you need a real removal technology and a realistic ramp or build-out.
Hypothetical scenarios or pathways are just that: hypotheticals. All the foreseeable renewable technologies need something called storage (like hydro, which has a very large reservoir of water). Large-scale build-out of storage technologies has its own downsides (basically mining and removal and/or reuse), something that we cannot even begin to imagine at this current point in time (the ironing board law of unintended consequences, because as we all know there are always unintended consequences in any, and all, human endeavors).
Sort of like ending world hunger or whatever Miss America wishes for the last several decades. If wishes were fishes then maybe.
If you have something other than hypotheticals to investigate the future, Everett, please do share. And yes – beliefs powered by incredulity counts as hypotheticals.
Just Dean,
Care to clue me in on this twit? Because it is an outright lie! As if that is what the IPCC AR6 WG1 actually meant and presented, an overtly simplistic top hat profile of all previous estimates of the PETM ECS. A plot that does not even adhere to the most basic of concepts of a unitary PDF (please note the lack of y-axis units, as what is shown is not a unitary PDF). Not even wrong!
[Chill, Everett. -W]
EFS is right on, in my opinion. The net zero talk has always seemed like a cosmic joke to me. EFS talks about true net zero by 2050 or 2100 in his comment. I would love to actually be around to hear discussion about whether we have hit true net zero, or net zero, or approximate net zero yet, but I don’t expect to be around for those arguments. I am amused by the arguments and agreements as to whether a term or goal like approximate net zero is more useful. Sure, use that one.
It’s similar to the peak emission discussion that is always going on. Oh, we are just about to hit peak emissions, or we hit peak emissions this past year but we have to wait for the data… or, if you extrapolate electric cars and heat pumps, we will definitely hit peak emissions by 2025.
All such criticism or commentary is definitely provocative and political, but I am not sure that I can avoid or manage that problem. As Aristotle said, It is what it is.
I think the current level of extreme weather events is sufficient evidence that our efforts need to crank up to reduce emissions, or grab them directly out of the air or oceans, right away, but we have to spend a lot of $$ helping defeat tyrants around the globe and defending our way of life, so there’s only so much we can do.
Well, so be it. I am actively leaving a record as best I can to my friends who survive me that I put up a good fight to turn this around. My efforts are in the range of a stubborn ant trying to stop a foolish kid on a trike from riding into traffic. I am pushing back but the kid keeps pedaling away.
as Plato said, $h1t happen$.
Cheers
Mike
Hey, Everett. I hope you are finding things you can do each day that make your days worthwhile. Walking is pretty painful for me, so I am getting to a lot of tasks that have piled up in my collection of broken down musical gear. Hoping to resurrect a rather cool old no-name strat copy today. Started on it yesterday. Might finish today. I think it will be a player. Anybody need an interesting approximate strat?
I will miss your voice, Everett. Maybe we can time our exits to maximize our chats.
Cheers, brother
Mike
“beliefs powered by incredulity counts as hypotheticals”
Well d’oh!
Hansen’s incredulity counts just as much as others too, as we all do it, so d’oh again.
[Mod: As Willard would say, let’s Chill.]
@-EFS & sbm
It dawned on me some time ago that reducing CO2 levels was a purely political problem.
And that given the disparate local forms of governance there was no simple way of solving such a global problem.
I reckon we should regret we won’t be around to see how things turn out if there is some tech advance and social change that makes a significant reduction possible.
And be glad we won’t be around to see things if our worst fears come to pass.
ATTP,
This is my last entry. I will not tolerate or accept the lack of civility and respect that is exhibited by certain individuals on this forum. Life is too short.
Regards,
Dean
“I am amused by the arguments and agreements as to whether a term or goal like approximate net zero is more useful.”
All terms and goals are useful for somebody; the only unreasonable thing is to expect there to be a term or goal that will guarantee success (at least in probability). The science is easy compared to the sociology or the politics.
Having said which, the net zero goal is not just a motivational goal. IF we want to stop warming, it is what we have to do to achieve that. That is true whether it motivates us to do something or not, so there is a point in discussing net zero that is independent of policy choices, but which ought to inform those choices.
As a technical point, the IPCC AR6 range is clearly labelled as a range rather than a p.d.f. It is also not at all unusual to see un-normalised probability distributions in statistical analyses, especially if they are Bayesian posterior distributions.
I find the posterior distributions to be quite disturbing.
Amen, brother.
Finished a quick read. It’s a long paper that covers many topics. At the end I didn’t feel that the 10C headline was that important. More important, and new to me: the paper argues that the larger the ECS, the longer the ocean response time and the less informative the current energy imbalance is about our climate state. This has major implications for energy imbalance models and the amount of forcing reduction needed to get the climate system back in balance.
Throw in the cooling offset from aerosols and we are in a larger warming hole than anticipated, no matter what emission pathway is followed. It’s almost as if physics has conspired to take advantage of our human tendency to procrastinate. The paper does make a testable prediction – warming will accelerate by 50% between 2010 and 2050 as aerosol forcing decreases. That’s large enough to be detectable in the near term and hopefully will motivate the procrastinators if it shows up.
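For intuition on why a larger ECS goes together with a longer ocean response time, here is a minimal one-box energy-balance sketch (my own illustration under simple assumptions, not the paper's model). The absolute numbers depend on the assumed heat capacity; the point is the scaling tau ∝ ECS.

```python
# One-box energy balance: C dT/dt = F - lambda*T, with lambda = F_2x / ECS
# and e-folding response time tau = C / lambda. Larger ECS -> smaller lambda
# -> longer tau. (Assumed, illustrative heat capacity; not from the paper.)
F_2X = 4.0      # W/m^2 per doubling of CO2
C = 8.0         # W yr m^-2 K^-1, assumed effective (mixed-layer) heat capacity

for ecs in (2.0, 3.0, 4.0, 10.0):
    lam = F_2X / ecs          # feedback parameter, W/m^2 per K
    tau = C / lam             # e-folding response time, years
    print(f"ECS = {ecs:4.1f} K -> lambda = {lam:.2f} W/m^2/K, tau = {tau:5.1f} yr")
```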
Having a stretch goal of net zero is not really a bad thing–as long as everyone realizes how unlikely it is, so they won’t be furious if/when it doesn’t happen.
We hit 602 quads last year, a decade earlier than was predicted a decade ago. Net zero is getting tougher, not easier, further away, not nearer.
Mechanisms matter. If car batteries keep catching fire in EVs, it won’t help. If modular nuclear reactors make it to market, it will. In other words, what we do now makes a difference, even if we miss our stretch goal.
And if fission never happens, it never happens.
And if renewables prices continue to beat nuclear pipe dreams to a pulp, so be it.
Meanwhile, Climateball players will focus on the wrong instrument to monitor progress from the energy industry.
Willard
Much ingenuity is called for in making renewables pay for their supersized externalities— as surely as dead automobiles filled North America’s back yards in the 21st century, defunct wind turbines and silicon roofs will feature in the Maine Decor and Red Green Show scripts of the very near future.
At best we can expect desert turbine boneyards, or groves of former turbines like the ones in Livermore Pass, for unlike airplane aluminum, fiberglass and balsa turbine blades defy recycling, and already are a drug on the market.
You are three months too late, Russell:
https://www.energy.gov/eere/wind/articles/carbon-rivers-makes-wind-turbine-blade-recycling-and-upcycling-reality-support
Normalizing constant
https://en.wikipedia.org/wiki/Normalizing_constant
“The concept of a normalizing constant arises in probability theory and a variety of other areas of mathematics. The normalizing constant is used to reduce any probability function to a probability density function with total probability of one.”
“Bayes’ theorem says that the posterior probability measure is proportional to the product of the prior probability measure and the likelihood function. Proportional to implies that one must multiply or divide by a normalizing constant to assign measure 1 to the whole space, i.e., to get a probability measure.”
Posterior probability
https://en.wikipedia.org/wiki/Posterior_probability
“The posterior probability distribution of one random variable given the value of another can be calculated with Bayes’ theorem by multiplying the prior probability distribution by the likelihood function, and then dividing by the normalizing constant … “
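As a toy numerical illustration of those two quotes (with a made-up prior and likelihood over a grid of ECS values; this has nothing to do with the IPCC figure being argued about), the un-normalised product prior × likelihood only becomes a proper PDF once divided by the normalising constant:

```python
import numpy as np

ecs = np.linspace(1.0, 8.0, 701)                      # grid of ECS values (K)
prior = np.exp(-0.5 * ((ecs - 3.0) / 1.5) ** 2)       # made-up prior
likelihood = np.exp(-0.5 * ((ecs - 3.5) / 1.0) ** 2)  # made-up likelihood

unnormalised = prior * likelihood       # proportional to the posterior
Z = np.trapz(unnormalised, ecs)         # normalising constant
posterior = unnormalised / Z            # proper pdf: area under the curve is one

print(np.trapz(posterior, ecs))         # ~1.0
```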
Chubbs,
Figure 19 of said paper one would assume?
Not so much of a prediction/projection as a guess: 0.18C/decade (the last, say, 53 years of warming, 1970-2022 inclusive) multiplied by 1.5 and 2.0, with no CI (note again, as per the 2015/2016 doubling-time paper) or any sort of analysis to support said guess, as far as I know. We are now at 2023, or 13 years past 2010, so currently a bad guess, by quite a lot actually, meaning that to reach their guess, hmm, err, so-called theory, they have 37 years left in which to do so.
50%, 66%, 80%, 90%, 95%, or 99% at least try to look credible.
EFS, I’d give it another 5 years. Three straight years of La Nina have slowed warming. Ten years ago, we were also exiting a La Nina similar to today. Here is the ten-year change in temps: 2012 – 0.65, 2022 – 0.89; so more than a 0.18C increase in 10 years, but not a 50% increase. Just a snapshot of course. We’ll see what happens in the upcoming El Nino.
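Running the numbers in that snapshot (this just reproduces the arithmetic already quoted in the comment above):

```python
baseline_rate = 0.18            # C/decade, the long-term rate cited upthread
t_2012, t_2022 = 0.65, 0.89     # C, the ten-year snapshot quoted above

observed_rate = t_2022 - t_2012               # 0.24 C over one decade
print(observed_rate)                          # 0.24 C/decade
print(observed_rate / baseline_rate)          # ~1.33: above the 0.18 baseline,
                                              # but short of a 50% acceleration
```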
I expect I will need to update this graph soon:
Don’t get me wrong, the only real difference between what Hansen says and what others say is the x-axis timeline with a coefficient attached. Hansen’s coefficient is like 0.5, whereas others’ is somewhere between, say, 1 and 10, but mostly closer to 1, in my honest opinion.
“37 years” above should be “27 years” or somewhat roughly 1/3 of the way between 2010 and 2050. So that applying an inflection point today, instead of say 2010, would lead to, say very roughly 1.35/2.7 = 0.5C/decade.
Also, playing connect the dots is not a very good idea, someone else should tell Hansen that, as that is something that deniers do.
EFS, if that was directed at me, you are quoting Wikipedia stats pages to a statistician. As I pointed out, the IPCC range is correctly labelled as a range, not a distribution.
And I already posted the new (or one of the new, in case there are several) IPCC AR6 WG1 PETM peer reviewed paper with associated PDF graphic.
In other words, if one posts what looks like a distro then it is incumbent on that poster to show like-for-like, in my honest opinion.
That means, and by all means, do so for all other PETM distros if you are showing your own distro. That’s called being honest, in my honest opinion.
EFS I pointed out your specific error, and you have repeatedly ignored it. The IPCC range is labelled as a range, so there is nothing to integrate to one. There is nothing remotely deceptive about the diagram. The style of plotting makes that even more clear – the red distribution has a line with shading underneath, the IPCC range does not, because it does not depict a distribution.
This is the problem with calling an expert a “twit”: when your error is pointed out, the loss of face is too great to simply admit it (no big deal, we all make mistakes), and instead you double down and make yourself even more ridiculous.
EFS,
I don’t think this is your main point (although it is important with respect to the Hansen paper estimating climate futures that assume a constant composition of atmospheric CO₂ concentrations), but this statement of yours above, that the Keeling Curve would only begin to slow down under something akin to a true net zero, is incorrect.
It is expected that the Keeling Curve would begin to fall *well* before we reduced annual CO₂ emissions to zero.
Consider that the airborne fraction of current emissions is about 47%. Which means that about 47% of annual emissions of ~40 GtCO₂ is currently added to the stock of atmospheric CO₂.
This means that the huge *cumulative* perturbation of atmospheric carbon is *already* “forcing” about 53% * 40 = 21.2 GtCO₂/yr into the ocean and land carbon reservoirs.
This is driven almost entirely by the *already existing* huge disequilibrium between the three broad carbon reservoirs, *not* by the annual emissions.
Now, this rate of carbon uptake by the land and oceans would tend to taper over time as the sinks moved towards equilibrium. But, at least at the outset, if we *quickly* cut annual CO₂ emissions by ~50%, we’d expect the Keeling Curve to approximately flatten. And it would begin falling if emissions dropped significantly below ~50% of current levels, but definitely well before hitting net zero.
There are complications to the above related to what happens if you cut emissions and then stay on some plateau for an extended period of time, or if you are only *very* gradually reducing emissions on the pathway to zero.
But the conclusion above broadly holds: the Keeling Curve should begin to flatten and fall *well* before we get to net-zero CO₂ emissions. (Albeit a point many people are confused about, 🤷)
Note well, however, that merely getting the Keeling Curve to flatten or slowly fall is not sufficient to stop further warming. For that rate of fall in the Keeling Curve to happen – i.e. sufficient to stop further *warming* – we need (approximately) net-zero CO₂ emissions.
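A back-of-envelope sketch of the argument above, holding the land/ocean sink fixed at today's rate (an assumption that only holds near-term, since the sinks weaken as the reservoirs re-equilibrate):

```python
GT_CO2_PER_PPM = 7.8        # approximate GtCO2 per ppm of atmospheric CO2
sink_uptake = 0.53 * 40     # GtCO2/yr absorbed by land + ocean today (~21.2)

for emissions in (40, 30, 21, 10, 0):         # GtCO2/yr emission scenarios
    net = emissions - sink_uptake             # net addition to the atmosphere
    print(f"{emissions:>2} GtCO2/yr -> {net / GT_CO2_PER_PPM:+.2f} ppm/yr")
# ~+2.4 ppm/yr at today's emissions, roughly flat near a ~47% cut,
# and a falling Keeling Curve for deeper cuts, i.e. well before net zero.
```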
And we disagree. So not an error. I have ignored nothing, you have ignored the bleeding obvious, it is called like-for-like.
Oh and please point me to the actual IPCC PETM graphic. Thank you so very much.
Sorry, if you think disagreeing means that it isn’t an error, then you are truly impervious to correction and there is no better recipe for error than that.
Ranges do not integrate to one, they only have values on the x-axis, not the y-axis, so there is nothing to integrate.
rustneversleeps,
Yes to all you had to say above, but I do not expect anything even remotely close to net something by some date. I expect at least 50% FF emissions in 2050 even if we truly reach a peak before 2050.
Right now, there is every reason to believe continued increases in FF emissions, I am afraid.
Actually, I will offer up a mea culpa: I was wrong to confuse someone else’s use of IPCC ranges with any sort of distro that they also show on the same graphic.