Guest post: A ‘new’ measurement of climate sensitivity?

This is a guest post by Mark Richardson, who is currently a Caltech Postdoctoral Scholar at the NASA Jet Propulsion Laboratory. Mark has a particular interest in the role of clouds in climate change. This post is a response to a suggestion that it is possible to more tightly constrain Equilibrium Climate Sensitivity (ECS). This article is all personal opinion and does not represent NASA, JPL or Caltech in any way.

A ‘new’ measurement of climate sensitivity?

The oceans are massive and their deeper layers haven’t caught up with today’s fast global warming. Unfortunately we don’t know exactly how far behind they are so it’s hard to pin down “equilibrium climate sensitivity” (ECS), or the eventual warming after CO2 in the air is doubled.

Blogger Clive Best proposes that data support an ECS range of 2–3 °C, with a best estimate of 2.5 °C. The 2013 Intergovernmental Panel on Climate Change (IPCC) consensus range was 1.5–4.5 °C with a best estimate of 3 °C. He asks “why is there still so much IPCC uncertainty?” Here we’ll see that part of the reason relates to the oceans, and that surprisingly Best’s results actually agree with IPCC climate models.

Clive Best mixes temperature data with a record of heating due to changes in gases in the air, solar activity, volcanic eruptions, air pollution and so on. Apparently without realising it, he accurately reproduced a textbook calculation including a reasonable way to try and account for the oceans lagging behind surface warming. This is a good start!

This calculation is often called a “one-box energy balance model” but by 2010 it was known to have issues with calculating ECS. Clive Best misses some of these because he uses a 1983 climate model to estimate that the oceans lag about 12 years behind the surface, which combined with the HadCRUT4 data gives an ECS of about 2.5 °C.
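For readers who want to see the machinery, a one-box model fits in a few lines. This is a minimal sketch with illustrative numbers, not Best's exact calculation: it uses a constant step forcing rather than the real historical forcing record, and the feedback parameter is chosen so that ECS comes out near 2.5 °C.

```python
import numpy as np

def one_box(forcing, lam=1.48, tau=12.0, dt=1.0):
    """One-box energy balance model:  C dT/dt = F(t) - lam*T.

    C = lam*tau, so temperature relaxes toward F/lam with an
    e-folding time of tau years.  forcing is annual-mean radiative
    forcing in W m^-2; lam is the feedback parameter in W m^-2 K^-1.
    """
    C = lam * tau                        # effective heat capacity, W yr m^-2 K^-1
    T = np.zeros(len(forcing))
    for i in range(1, len(forcing)):
        T[i] = T[i-1] + dt * (forcing[i-1] - lam * T[i-1]) / C
    return T

F2x = 3.7                                # forcing from doubled CO2, W m^-2
T = one_box(np.full(500, F2x))           # hold the doubling for 500 years
print(round(F2x / 1.48, 2))              # ECS = F2x/lam, 2.5 K by construction
print(round(T[-1], 2))                   # the simulation converges to the same value
```

With a real forcing time series in place of the step, and observed temperatures to compare against, this is essentially the textbook calculation being discussed.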

But in a like-with-like comparison HadCRUT4 warms about as much as the IPCC climate model average since 1861. Given this agreement, anything that uses HadCRUT4 and gets a lower ECS than the model average 3.2 °C has some explaining to do!

Figure 1: Temperature change over 150 years in abrupt 4xCO2 simulations of four climate models. Black lines are a one-box fit with ECS and response time (τ) allowed to vary. Legend lists model name, true ECS and fit parameters.

The reliance on a 1983 model is the explanation. The 1983 NASA GISS Model II was mostly designed for the atmosphere and had a simple ocean. For example, its ocean currents couldn’t change. Modern models are more realistic and the graphs to the right (Figure 1) show their temperature after an immediate 300 % increase in CO2. Each legend has the known model ECS, along with the ECS and time lag (labelled τ) calculated for the one-box model.

The ECS is off and the time lag can be as long as 21 years instead of 12! On top of that the fits are bad because the oceans aren’t just 12 years “behind”, instead the system acts as if the ocean has multiple layers and each one can respond on a different timescale. Now let’s look at simulations of the climate since 1861 and the one-box fits.
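The layered-ocean behaviour can be sketched with a "two-box" model in the spirit of Held et al. (2010) and Geoffroy et al. (2013): a small-capacity surface box coupled to a large-capacity deep-ocean box. The parameter values below are illustrative, not fitted to any particular climate model.

```python
import numpy as np

def two_box(forcing, lam=1.2, gamma=0.7, C_s=8.0, C_d=100.0, dt=1.0):
    """Two-box energy balance model.

    C_s dT/dt  = F(t) - lam*T - gamma*(T - Td)   (surface / mixed layer)
    C_d dTd/dt = gamma*(T - Td)                  (deep ocean)

    Heat capacities in W yr m^-2 K^-1; lam, gamma in W m^-2 K^-1.
    Values are illustrative, roughly in the range Geoffroy et al.
    (2013) report for CMIP5 models.
    """
    n = len(forcing)
    T, Td = np.zeros(n), np.zeros(n)
    for i in range(1, n):
        T[i] = T[i-1] + dt * (forcing[i-1] - lam*T[i-1] - gamma*(T[i-1] - Td[i-1])) / C_s
        Td[i] = Td[i-1] + dt * gamma * (T[i-1] - Td[i-1]) / C_d
    return T

F2x = 3.7
T = two_box(np.full(1000, F2x))
# Fast adjustment in the first couple of decades, then a slow
# multi-century creep toward the true equilibrium F2x/lam (~3.1 K):
print(round(T[30], 2), round(T[200], 2), round(T[-1], 2))
```

The surface does most of its adjusting within a couple of decades, but the deep box keeps pulling it upward for centuries afterwards, which is exactly the behaviour a single 12-year lag cannot represent.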

Figure 2: Simulated temperature change from 1861–2015 inclusive in 4 climate models using historical-Representative Concentration Pathway 8.5 scenarios (RCP8.5, blue). Model output is sampled in the same way as HadCRUT4. The thicker lines are fits using a one-box model with either the lag from Figure 1 or assuming a 12-year lag. Radiative forcing is the Forster et al. historical-RCP8.5 in all cases.

Consider the Figure on the left (Figure 2). Imagine living in the world of the top left panel. In this world we might read a blog that says ECS is around 1.7 °C but in reality it would be 3.8 °C. Now let’s compare the one-box and true ECS values for 18 models.

Figure 3: Model true equilibrium climate sensitivity (True ECS) as a function of that calculated as in Figure 2, using historical-RCP8.5 temperature change with the Forster forcing and a one-box model with a 12-year lag. All of the points are above the 1:1 black dashed line, showing that the one-box model underestimates true ECS in all 18 cases. The red line is a best fit to the models, although the fit is weak.

If this one-box calculation works, then it should give the right answer when applied to complex climate models where we know the answer (e.g. Geoffroy et al. (2013) do this sort of test). With these data freely available online, anyone can work out that climate models with ECS from 2.3–3.8 °C are consistent with the data and the one-box approach. A little exploration shows us that the climate’s response time matters, and measured ocean heating shows a single 12-year lag doesn’t make sense (Figure 3).
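That validation test can be sketched end to end: build a synthetic 150-year abrupt-CO2 response with two response timescales and a known ECS, then fit a one-box curve to it. The numbers below (timescales, amplitudes, the 3.1 °C "true" ECS) are invented for illustration, loosely inspired by two-box fits to CMIP5 models.

```python
import numpy as np

# Synthetic "truth": a 150-year abrupt-CO2 response with two timescales
# (fast ~4 yr, slow ~230 yr) and a known ECS of 3.1 K.  The amplitudes
# (0.63 fast, 0.37 slow) are invented for illustration.
ECS_TRUE = 3.1
t = np.arange(150.0)
T_true = ECS_TRUE * (1 - 0.63*np.exp(-t/4.0) - 0.37*np.exp(-t/230.0))

# Brute-force least-squares fit of a one-box response ECS*(1 - exp(-t/tau)):
best = (np.inf, None, None)
for ecs in np.arange(1.5, 4.0, 0.01):
    for tau in np.arange(1.0, 40.0, 0.25):
        sse = np.sum((ecs*(1 - np.exp(-t/tau)) - T_true)**2)
        if sse < best[0]:
            best = (sse, ecs, tau)

_, ecs_fit, tau_fit = best
print(f"true ECS = {ECS_TRUE} K, one-box ECS = {ecs_fit:.2f} K, tau = {tau_fit:.2f} yr")
# The one-box fit underestimates ECS: the slow response is still
# unfinished at year 150.
```

The fitted ECS comes out below the true value because the slow ocean response has not finished within the 150-year window, the same pattern Figures 1 and 3 show for the real models.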

Clive Best asked why the IPCC give a range for ECS that’s bigger than his calculated 2–3 °C. This post shows that partly this is because his approach missed lots of uncertainty related to ocean layering. A 2013 paper found that the way in which oceans delay warming could even affect future sea ice and clouds while a 2017 study brought together the key physics and data. The conclusion? Observational data support a “best estimate of equilibrium climate sensitivity of 2.9 °C”, with a range of 1.7–7.1 °C.


149 Responses to Guest post: A ‘new’ measurement of climate sensitivity?

  1. Griff says:

    Nature Climate Change 7, 331–335 (2017)

    Emergent constraint on equilibrium climate sensitivity from global temperature variability

    Peter M. Cox, Chris Huntingford & Mark S. Williamson
    Abstract
    ……/

Here we present a new emergent constraint on ECS that yields a central estimate of 2.8 degrees Celsius with 66 per cent confidence limits (equivalent to the IPCC ‘likely’ range) of 2.2–3.4 degrees Celsius. Our approach is to focus on the variability of temperature about long-term historical warming, rather than on the warming trend itself. We use an ensemble of climate models to define an emergent relationship between ECS and a theoretically informed metric of global temperature variability. This metric of variability can also be calculated from observational records of global warming, which enables tighter constraints to be placed on ECS, reducing the probability of ECS being less than 1.5 degrees Celsius to less than 3 per cent, and the probability of ECS exceeding 4.5 degrees Celsius to less than 1 per cent.

  2. Griff,
    Thanks. Yes, I just saw that this evening.

  3. Clive Best says:

    Thanks for writing a response to my post !
    Just how realistic are modern models if they increase uncertainty rather than reduce it ?
    late at night here 😉

  4. John Hartz says:

    Speaking of Climate Sensitivity, this just in…

    Climate scientists on Wednesday suggested that they may be able to rule out some of the most dire scenarios of what would happen if greenhouse gas levels in the atmosphere were to double.

    Unfortunately, the same scientists say the best-case scenarios are also probably unrealistic.

    How a doubling of atmospheric greenhouse gases would affect the climate is of tremendous importance, as humans are running out of time to avoid that outcome. With current atmospheric concentrations at 405 parts per million, as opposed to about 280 parts per million before the dawn of the industrial era, the planet is already about halfway there.

    In the new study in the journal Nature, Peter Cox and Mark Williams of the University of Exeter and Chris Huntingford of the United Kingdom’s Centre for Ecology and Hydrology attempt to recalculate the “equilibrium climate sensitivity,” a highly influential metric that describes how much the planet will warm if carbon dioxide doubles and the Earth’s climate then adjusts to the new state of the atmosphere.

    Climate scientists say they may be able to rule out the worst-case scenarios — and the best ones by Chris Mooney, Energy & Environment, Washington Post, Jan 17, 2018

  5. Steven Mosher says:

    “Thanks for writing a response to my post !
    Just how realistic are modern models if they increase uncertainty rather than reduce it ?”

    SMH.

    When you first start war modelling you use simple old models

    https://en.wikipedia.org/wiki/Lanchester%27s_laws

    Then you add detail and your answers get more realistic… and more fuzzy.

    Then you understand which exact components require more attention to reduce the uncertainty.

  6. Steven Mosher says:

    war and marketing
    http://lanchester.com/

  7. MarkR says:

    Griff,

    You can see in that paper they use some similar techniques – their Figure 2(b) is basically Figure 3 from this post with some added stats. They then use their observation-based numbers to restrict ECS based on the CMIP5 models.

    They report a “likely” range, for comparison their 5-95 % range is about 1.8-3.7 C, versus the 2.3-3.8 C implied by Figure 3 in this post. I didn’t calculate it rigorously though so it’s only a rough look.

    I’m skeptical about the “true” uncertainty being this narrow right now though. There are other emergent constraints and at least the satellite & physics based ones tend to support higher sensitivities. Until all these are reconciled the believable range is going to stay quite large.

  8. MarkR says:

    Clive Best: “Just how realistic are modern models if they increase uncertainty rather than reduce it ?”

    I don’t think they increase uncertainty! They only increase the amount of uncertainty that we can see and (hopefully) understand. The importance of an old versus a new model also depends on your question: “what is ECS?” is a different question to “what are the typical response timescales?”

    It’s totally possible for models to give the same ECS but different response timescales. ECS is set largely by atmospheric physics mixed with warming patterns while response timescales are largely set by ocean circulation, mixing, layering etc. It’s possible to improve the ocean bit and find out that it doesn’t have much of an effect on the combined atmospheric physics & patterns.

  9. Griff says:

    I posted that new paper as a question for comment rather than a definitive answer.
    The paper has been spun as not as bad.
Smegged up vs really smegged up is not good.
    It would seem that the most likely value is still around 3C. Risk enough to drive significant corrective action even without a long tailed uncertainty monster.

    It was my understanding that aerosols could be the largest uncertainty?

  10. Griff,
In terms of trying to estimate climate sensitivity from observations, then I think that aerosols are a big uncertainty. However, I think this does not really factor into the type of analysis presented in that paper that you highlighted. I’d agree with MarkR’s earlier comment about being surprised if we really could constrain the ECS quite that tightly at this stage. There are, for example, potential non-linearities that I think even the method in that paper would not be able to include (i.e., I think their ECS estimate is under the assumption that it’s not state dependent, and it probably is).

  11. dikranmarsupial says:

    testing…

  12. dikranmarsupial says:

    Clive Best asks “Just how realistic are modern models if they increase uncertainty rather than reduce it ?”

    I suspect that is a rhetorical question, but here goes: If I make a “spherical cow” model of some physical system (for instance the Earth’s climate) that doesn’t include some of the relevant physical processes involved, then the uncertainty on the predictions made by the model will probably be too narrow (i.e. it will be over-confident). This is because the spherical cow model doesn’t include the uncertainties in the processes that it doesn’t include. Normally I would expect the uncertainties in the model predictions to increase as I made the model more realistic (and hence more complex), unless I had a sufficiently large increase in the volume and types of observations to adequately constrain the parameters of the model. This crops up again and again in statistical inference, sometimes an “unrealistic” model gives better performance than a more “realistic” one because we don’t have enough data to constrain the model, however the unrealistic model tends to be over-confident in its predictions. For example, naive Bayes models often give good performance on text classification problems (e.g. spam filtering) even though most texts are not constructed by picking words from a bag at random with a fixed distribution (although with some blog comments, I do wonder ;o).

    To paraphrase Rumsfeld, there are things we know, there are things we know we don’t know and there are things we don’t know we don’t know. We make our models more realistic by incorporating the things we don’t know we don’t know, and that can increase the uncertainty in the conclusions.

    There is a branch of computational statistics about this sort of thing called “uncertainty quantification”, and it is important because quite often using decision theory to decide a course of action we need an accurate quantification of the uncertainties.

  13. dikranmarsupial says:

    sorry should be “We make our models more realistic by incorporating the things we know we don’t know,” the things we don’t know we don’t know are usually rather harder to do anything about! ;o)

  14. Clive Best says:

    Mark,
    Some comments.
    1. Yes model comparisons should be made against what is actually measured, which in the case of HADSST3 is SST rather than air temperatures. Concerning coverage bias in HadCRUT4, I have another solution which in my opinion is better than kriging into the Arctic, especially when there is no satellite data available – Spherical Triangulation. http://clivebest.com/blog/?p=8014
2. I used this old model (GISS Model II) simply because it is the only one I can download and run on my iMac. The time dependence I get is a 15-year e-folding time lag of the oceans. This is exactly the same as you show in the figure for HADGEM2-ES. So the only real difference is that the CMIP5 models do not seem to stabilise at any fixed temperature but continue to rise well after 100 years or more. The only reason I can think of, off-hand, why that might happen, is that this is due to models assuming some long term cryosphere/albedo feedback from slow melting ice caps. What perhaps you should also mention is that each model has its own built-in assumptions about carbon cycle feedbacks, cloud feedbacks and so on, and this results in such a large spread in ECS values.
3. So yes I am using a zero-dimensional energy-balance-type model to fit a zero-dimensional global temperature record. That sounds a reasonable thing to do to me, because even if you use an ESM you still have to integrate over all dimensions to get just one annual global temperature. The only real difference is the temperature stabilisation time dependence.
    4. Do all CMIP5 models stabilise temperature following a step doubling of CO2?
    Assuming the answer is yes. How long do they take to stabilise?
    Does the same model give different values depending on initialisation state?
    Is ECS ‘temperature’ dependent?
    So if ECS means anything at all then it must have a fixed value. This means that some models are wrong, while others are approximately correct. The next AR6 report must make a much clearer statement about climate sensitivity and future warming. Otherwise the public will simply lose interest.

  15. Clive,

    So the only real difference is that the CMIP5 models do not seem to stabilise at any fixed temperature but continue to rise well after 100 years or more. The only reason I can think of, off-hand, why that might happen, is that this is due to models assuming some long term cryosphere/albedo feedback from slow melting ice caps.

I think this can simply be because of the time it would take to essentially bring the entire ocean to equilibrium. Heating the deep ocean is very slow, so while this is happening there will be a planetary energy imbalance and the system will continue to warm. This can, as far as I’m aware, take centuries.

  16. Clive,

    So if ECS means anything at all then it must have a fixed value.

    No, I don’t think this is correct. It is probably state dependent. So, the ECS we get starting from an average surface temperature of 288K is potentially different to that we would get if we started at some other initial surface temperature.

  17. BBD says:

    What perhaps you should also mention is that each model has its own built-in assumptions about carbon cycle feedbacks

    I’m not sure that CCFs are modelled in current AOCGMs.

    The next AR6 report must make a much clearer statement about climate sensitivity and future warming. Otherwise the public will simply lose interest.

    It’s about 3C per doubling. Now let’s stop buggering around and start mitigating.

  18. BBD,
    Good point. An estimate of ECS does not depend on carbon cycle feedbacks (it’s purely based on the equilibrium temperature after a doubling of atmospheric CO2 – specifically, if that is increased at 1% per year for 70 years).

  19. Roger Jones says:

    If you correlate the 1861 to 2005 warming in the CMIP5 ensemble with the publicly available model sensitivities (we picked up 91 simulations), the correlation between the amount of warming and ECS is -0.01. Now, there may be a way to disaggregate this, but I don’t think there is without getting into the physics of the models, because aerosols are a shortwave effect and greenhouse gases are a long-wave effect. However, because of their warming properties in individual model runs, the feedbacks cancel each other out to the degree that the ensemble does not have anything useful to say (see the above correlation).
We will know by 2030, but then the 1.5 or 2C argument will be settled, answering which side we can fall on for any of these thresholds. Every time there is another ECS study, I am reminded of the difficulty in aligning the statistical attributes of a blue whale with that of a bowl of petunias.
    For any of these results to be valid, the climate system has to be working in a certain way, and I am certain it is not.

  20. dikranmarsupial says:

    “Do all CMIP5 models stabilise temperature following a step doubling of CO2?”

Given the Stefan-Boltzmann law is a component of the models, I’d be extremely surprised if they didn’t!

    “Assuming the answer is yes. How long do they take to stabilise?”

How long would you expect the physical Earth to take to reach full equilibrium? Given the deep ocean circulation takes a matter of thousands of years to overturn, fully equilibrating the oceans seems likely to take a long time. A simple model that equilibrates in decades is the sort of over-confident prediction you get from a spherical cow (or perhaps a “point mass cow” in this case? ;o) model.
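The timescale intuition here can be put in back-of-envelope numbers via τ = C/λ, where C = ρ·c_p·h is the heat capacity per unit area of an ocean layer of depth h and λ is the climate feedback parameter. Every number below is an assumed round estimate, just to show the orders of magnitude:

```python
# Back-of-envelope e-folding time tau = C / lam for an ocean layer of
# depth h: C = rho * c_p * h is the heat capacity per unit area.
# All values are rough assumed estimates, not measurements.
RHO = 1025.0              # seawater density, kg m^-3
CP = 3990.0               # seawater specific heat, J kg^-1 K^-1
LAM = 1.3                 # assumed feedback parameter, W m^-2 K^-1
YEAR = 3.156e7            # seconds per year

def tau_years(depth_m):
    return RHO * CP * depth_m / LAM / YEAR

print(round(tau_years(100)))    # ~100 m mixed layer: about a decade
print(round(tau_years(3700)))   # mean ocean depth: several centuries
```

A shallow mixed layer alone equilibrates in about a decade, while involving the full ocean depth pushes the timescale to centuries, which is why a simple model that settles in decades is over-confident.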

  21. dikranmarsupial says:

    “This means that some models are wrong, while others are approximately correct.”

How do you distinguish between “wrong” and “approximately correct”? Personally I would say that a model is “wrong” if the correct value is outside the credible interval for its prediction, so proper representation of the uncertainties is important.

  22. angech says:

    “I think this can simply be because of the time it would take to esentially bring the entire ocean to equilibrium. Heating the deep ocean is very slow, so while this is happening there will be a planetary energy imbalance and the system will continue to warm. This can, as far as I’m aware, take centuries.”

While technically true, would not the bulk of the warming be expected to occur in the first 100 years? Arguing over a 0.01 increase in ECS over another thousand years seems a little pedantic.
    As pointed out by several commentators on both sides, ongoing sizeable increases in warming over hundreds of years (high ECS) are physically highly if not extremely unlikely.
    Models that fail to show constraint should be regarded with suspicion.

  23. angech,
Yes, most would probably happen in the first century, but I don’t think the residual is as low as 0.01K.

  24. JCH says:

I suspect the deep ocean will continue to look pretty much like it does today: profoundly cold, just a little less so.

  25. Pingback: O's digest - Nature e le vaccinazioni - Ocasapiens - Blog - Repubblica.it

  26. John Hartz says:

    Recommended supplemental reading:

    While global climate models do a good job of simulating the Earth’s climate, they are not perfect.

    Despite the huge strides taken since the earliest climate models, there are some climatic processes that they do not simulate as accurately as scientists would like.

    Advances in knowledge and computing power mean models are constantly revised and improved. As models become ever more sophisticated, scientists can generate a more accurate representation of the climate around us.

    But this is a never-ending quest for greater precision.

    In the third article in our week-long climate modelling series, Carbon Brief asked a range of climate scientists what they think the main priorities are for improving climate models over the coming decade.

    These are their responses, first as sample quotes, then, below, in full:

    In-depth: Scientists discuss how to improve climate models, Carbon Brief, Jan 17, 2018

  27. Clive Best says:

    “BBD,
    Good point. An estimate of ECS does not depend on carbon cycle feedbacks (it’s purely based on the equilibrium temperature after a doubling of atmospheric CO2 – specifically, if that is increased at 1% per year for 70 years).”

No but mitigation does. What matters is just how much room for carbon emissions remains before we commit to 2C warming. It looks like we have 4 times as much left as was previously thought (Millar et al.)

  28. dikranmarsupial says:

    angech wrote “Arguing over a 0.01 increase in ECS over another thousand years seems a little pedantic.”

The diagram in the original post suggests that the changes in question are much larger than 0.01C (relative to the point at which equilibrium is reached by the one-box model, which is a few decades).

    “As pointed out by several commentators on both sides, ongoing sizeable increases in warming over hundreds of years (high ECS) are physically highly if not extremely unlikely.”

    Citation required.

    ECS is the response to a doubling of CO2 after equilibrium is reached however long it takes, so I think you may have misunderstood. I would have thought you can have a model with low ECS where a lot of the warming is “in the pipeline” (emerges on long timescales), provided it also has low TCR (transient climate response).

    “Models that fail to show constraint should be regarded with suspicion.”

I would argue that very simple one-box models that suggest equilibrium is reached after only a decade or so should be regarded with suspicion (as pointed out in our comment paper on Loehle’s low ECS paper).

  29. Catalin C says:

Interesting, is this the same Clive Best who was trying a few years back to explain AGW as coming from moon cycles, or who did not know about aerosol impacts? Who claimed to have done research, I believe at CERN or somewhere like that, though nothing of the kind could be traced?

  30. JCH says:

    For any of these results to be valid, the climate system has to be working in a certain way, and I am certain it is not.

    New:

    Big Jump of Record Warm Global Mean Surface Temperature in 2014-2016 Related to Unusually Large Oceanic Heat Releases

    Abstract

    A 0.24°C jump of record warm global mean surface temperature (GMST) over the past three consecutive record-breaking years (2014-2016) was highly unusual and largely a consequence of an El Niño that released unusually large amounts of ocean heat from the subsurface layer of the northwestern tropical Pacific (NWP). This heat had built up since the 1990s mainly due to greenhouse-gas (GHG) forcing and possible remote oceanic effects. Model simulations and projections suggest that the fundamental cause, and robust predictor of large record-breaking events of GMST in the 21st century is GHG forcing rather than internal climate variability alone. Such events will increase in frequency, magnitude and duration, as well as impact, in the future unless GHG forcing is reduced.

  31. dikranmarsupial says:

    angech, see also this from the IPCC WG1 FAR report (source):

    In this case there appears to be over a degree of warming that happens after the 100 year point.

  32. Clive,

No but mitigation does. What matters is just how much room for carbon emissions remains before we commit to 2C warming. It looks like we have 4 times as much left as was previously thought (Millar et al.)

I think the Millar et al. result was for 1.5C, not 2C. The large relative difference is mostly because we’re close enough to 1.5C that a small change can produce a big relative difference. I don’t think you can use their result to infer we have 4 times as much as previously thought for 2C.

  33. John Hartz says:

    Speaking of heat in the Earth’s ocean system…

    The consequences for Alaska were stark: dozens of whales died, as did thousands of common murres and tufted puffins, while sealife native to the tropics came up in nets pulled from sub-Arctic seas.

    But an unusual mass of warm water nicknamed “the blob,” which appeared off Alaska and hung around through 2016, didn’t occur in isolation. In northern Australia in 2016, high ocean heat bleached hundreds of miles of corals, killed mangroves, and destroyed giant clams. Off New Zealand, an ocean hot spell wiped out black abalone and brought an oyster-killing disease.

    Just as atmospheric shifts can bring droughts and nasty heat waves on land, shifts in weather or ocean circulation also can spark deadly marine heat waves, which can thoroughly scramble life at sea. But until recently scientists understood little about what role climate change might play in these extreme sea events.

    Human Emissions Made Ocean Heat Wave 53 Times More Likely by Craig Welch, National Geographic, Jan 16, 2018

  34. Clive Best says:

    Incidentally, there is some important physics that climate models don’t include – namely tidal mixing in deep oceans. This must stabilise equilibrium temperature faster than envisaged.

    Open-ocean tides are important in mixing deep-ocean water. Ocean scientists long assumed that wind was the principal mixing agent of the open ocean, but satellite altimeter data now show that tidal mixing in the deep ocean is about as important as the wind. Perhaps as much as half of the tidal energy in the ocean is dissipated in mixing processes when tidal currents in the deep ocean flow over seamounts, ridges, and other rugged features on the ocean floor or weave through passages between islands.

    Tidal currents flowing over topographic irregularities on the ocean floor generate internal waves that propagate away from their source. These internal waves arise from the fact that water density increases gradually with increasing depth. As tidal currents encounter a seamount or submarine ridge, relatively dense water is forced upward into slightly less dense water. Then to the lee of the obstacle gravity pulls the denser water downward. However, the descending water gains momentum and over shoots its equilibrium level and descends into denser water. The water then ascends thereby forming an oscillating wave that propagates horizontally. Because these waves are generated by tides, they occur at tidal frequencies and are called internal tides. Internal tide waves can travel thousands of kilometers beyond the obstruction that formed them and can have very large wave heights. They also break, like surf on a beach but under water, locally mixing waters above and below the internal wave. Internal tides are important in mixing cold bottom waters with warmer surface waters as part of the global oceanic conveyer belt circulation.

  35. John Hartz says:

    JCH: Looks like we cross-posted about the same study. Keep up the good work. 🙂

  36. Pingback: A ‘new’ measurement of climate sensitivity? | Climate Change

  37. Clive,

    Incidentally, there is some important physics that climate models don’t include – namely tidal mixing in deep oceans. This must stabilise equilibrium temperature faster than envisaged.

That certainty again? I’m not sure what you say is strictly true. As I understand it, some of the energy transfer to the deep ocean is modelled as a diffusion process. Maybe have a look at the discussion around Figure 7 here. My understanding is that if we can mix energy into the deep ocean quickly, then we’d expect a larger planetary energy imbalance, but we’d approach equilibrium quickly (although the surface would initially respond slowly – I think). On the other hand, if we mix energy into the deep ocean very slowly, then the planetary imbalance would remain small and it would take much longer to reach equilibrium (the surface response would be fast). There is some suggestion that the observations support something in between (intermediate).

  38. dikranmarsupial says:

    ” This must stabilise equilibrium temperature faster than envisaged. ”

But how much faster? Any evidence that it makes a non-negligible difference?

    The quote appears to be from here and the only reference it makes to heat is:

    Internal tide waves can travel thousands of kilometers beyond the obstruction that formed them and can have very large wave heights. They also break, like surf on a beach but under water, locally mixing waters above and below the internal wave. Internal tides are important in mixing cold bottom waters with warmer surface waters as part of the global oceanic conveyer belt circulation.

    Note that while internal tides are important in mixing cold bottom waters and warmer surface waters, that doesn’t mean that “internal tide waves” has a big effect on that.

    This is the joy of skepticism, you can quote anything you like from the web and leave it to others to do the work of finding out whether the argument has any validity. Good rhetoric, but bad science.

  39. MarkR says:

    1) That’s a nice demonstration. Have you tested the full record when you know the answer? You could try spherical triangulation on CMIP5 historical runs with HadCRUT4 masking and see if it works all the way back.

    2) There are good reasons to start with the 1983 model – that’s what GISS did! This post shows that the one-box model doesn’t validate though since the blue dots aren’t snug around the dashed black line in Figure 3. The Armour, Bitz & Roe (2013) paper I linked to shows that the delayed warming in these simulations is mainly due to the oceans.

3) Since the early energy balance studies we’ve seen that some “hidden” assumptions matter, like how there’s no single time response. Also, global feedback f = integral(f(φ,θ)T(φ,θ) sinθ dθ dφ) / integral(T(φ,θ) sinθ dθ dφ). Since feedbacks change with location (e.g. the Arctic has sea ice feedback, the tropical Pacific doesn’t) the result of the integral is not unique for a given global temperature change. Any test that relies on data where T(φ,θ) is not the same as the equilibrium temperature pattern is likely to get the wrong answer without additional information.

    4) The models go to equilibrium, but if you’re using a simple energy budget model then the behaviour is much better explained if you allow multiple time responses. See e.g. the Held et al. (2010), Geoffroy et al. (2013) & Armour, Bitz & Roe (2013) papers I link to.
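The pattern-dependence in point 3 can be illustrated numerically: hold the local feedback map fixed and change only the shape of the warming pattern, and the area-weighted global feedback changes. The feedback map and warming patterns below are invented for illustration (latitude-only, with cos(lat) as the area weight, equivalent to the sinθ colatitude weighting in the integral):

```python
import numpy as np

# Zonal-mean sketch: same local feedback map, two warming patterns with
# different shapes -> different area-weighted global feedbacks.
# f_local and the patterns are invented for illustration.
lat = np.deg2rad(np.linspace(-89.5, 89.5, 180))
w = np.cos(lat)                                   # area weight per latitude band

# Stronger feedback poleward of 60 deg (e.g. a sea-ice feedback):
f_local = np.where(np.abs(lat) > np.deg2rad(60.0), 2.0, 1.0)

def global_feedback(T_pattern):
    """Warming-pattern-weighted global feedback (the integral in point 3)."""
    return np.sum(f_local * T_pattern * w) / np.sum(T_pattern * w)

T_uniform = np.ones_like(lat)                     # spatially uniform warming
T_polar = 1.0 + 2.0*np.abs(lat)/(np.pi/2)         # polar-amplified warming

print(round(global_feedback(T_uniform), 2))
print(round(global_feedback(T_polar), 2))
# Polar amplification puts more weight on the high-feedback region, so
# the inferred global feedback differs even though f_local is unchanged.
```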

    “The next AR6 report must make a much clearer statement about climate sensitivity and future warming. Otherwise the public will simply lose interest.”

    I’m not sure what you mean by “clearer”? 3 C with a possible range of 1.5-4.5 C seems about as clear as you can make a complicated PDF. And the IPCC’s job is to best report the scientific understanding, not keep the public interested!

  40. JCH says:

    In the 21st century there were trade winds so powerful they blew a bunch of Pacific heat all the way from Peru into the Indian Ocean.

    That just might have driven some extra heat into the deep ocean a bit faster than normal. Think hydraulic hammer.

    They’ve subsided. The current La Niña doesn’t have enough cold punch to fill the thimble the 2016 La Niña failed to fill. 2017 is the 2nd warmest year. Why? Because the trades are weak and the PDO is, while weak, positive.

    We are in the midst of a fairly hard hitting warm regime. This is my confidence in observation-based ECS: -1.45 to -4.5 c (Confidence).

    The PDO has been in positive territory for a record number of consecutive months, 4 full years:

  41. MarkR says:

    Roger Jones:
    “For any of these results to be valid, the climate system has to be working in a certain way, and I am certain it is not.”

    For simple one-box models I totally agree. The long-term near-global-mean temperature change *alone* doesn’t contain anywhere near enough information to say much about ECS, except that we know beyond reasonable doubt that ECS > transient climate response (TCR) and our best estimate of TCR is ~1.6-1.7 C right now.

    I’m personally a fan of the studies that use observations to constrain key physical processes. I liked DeAngelis et al. (2015, doi: 10.1038/nature15770) and studies like Tan, Storelvmo & Zelinka (2016, doi: 10.1126/science.aad5300). Not that I think that Tan’s reported ECS is right, but the fundamental physics mixed with data improve our understanding.

  42. Windchaser says:

    dikran:

    This is the joy of skepticism, you can quote anything you like from the web and leave it to others to do the work of finding out whether the argument has any validity. Good rhetoric, but bad science.

    Sure, but if your goal is to try to bolster your own beliefs, rather than to develop a careful and well-informed opinion, it’s more than sufficient. In other words, that’s motivated reasoning — you develop a belief, then look for evidence or reasoning to back it up.

    Clive:

    Just how realistic are modern models if they increase uncertainty rather than reduce it ?

    Modern models moved a lot of stuff from the “unknown unknowns” column to the “known unknowns” column. In other words, they do a better job at capturing what’s uncertain and what isn’t, rather than assuming that whatever we leave out isn’t significant.

    If you ignore uncertainties, say, by using a one-box model that leaves out our uncertainty about ocean equilibration, then you will end up with a model result that looks more certain than it really is. That’s bad.

  43. Windchaser says:

    MarkR

    We are in the midst of a fairly hard hitting warm regime. This is my confidence in observation-based ECS: -1.45 to -4.5 c (Confidence). [emphasis added]

    Negative 1.45 to negative 4.5 C? Whew!

  44. Windchaser says:

    (sorry, that should be to JCH, not MarkR)

  45. JCH says:

    MarkR doesn’t say stuff that stewpudd. c is confidence, not ℃.

  46. Willard says:

    Just in case it’s not obvious to everyone: MarkR is the author of that post.

  47. Hyperactive Hydrologist says:

    Thermohaline circulation transit time is around 1000 years. Would it not take at least that long until the oceans are at equilibrium?

  48. “This is the joy of skepticism, you can quote anything you like from the web and leave it to others to do the work of finding out whether the argument has any validity. Good rhetoric, but bad science.”

    Wunsch and Munk were the main proponents of deep tidal mixing. There are GCMs that include this feature. The puzzling feature is that tidal forces clearly have an effect on the surface (conventional tidal analysis) and clearly have an effect on the deep ocean, yet the amount of research reported on the middle reaches of the ocean, such as the thermocline, is almost non-existent. And the thermocline is where there is a highly reduced effective gravity, so one would imagine that any gravitational forcing at that point would be significantly magnified.

    I would suggest that the null hypothesis for a behavior like ENSO is to include the lunisolar forcing, as it is entirely a thermocline effect – whereby hot and cold ocean water is exposed to the surface as the thermocline tilts. AFAICT, no one has actually done the work on applying the appropriate lunisolar gravitational cycles to see if it has an effect. I have done it, reported at last month’s AGU, and the modeled effect is striking.

  49. Clive Best says:

    @dikranmarsupial

    Lunar Tides clearly have a huge effect on sea level every 6 hours, yet somehow we dismiss them as everyday occurrences ignoring any longer term effects on climate. I am sure that’s untrue. Certainly there is an 18.6 year modulation on the polar reach of tides, accentuated in long term with orbital changes of the earth. There are also short term and medium term oscillations in tidal forcing, which most likely play a role in oceanic oscillations, and influencing Jet Stream flow in the atmosphere.

    http://clivebest.com/blog/?p=7278

  50. Ragnaar says:

    New:
    Big Jump of Record Warm Global Mean Surface Temperature in 2014-2016 Related to Unusually Large Oceanic Heat Releases

    We’re doing fine. There was a hiatus or a slow down or a warming slog and it was caused by GHG forcing. GHGs warm things like the oceans until that deal takes a hiatus. So first we had the oceans ain’t giving it up hiatus then a hiatus of that hiatus and the warmest ever years in the modern record. What next? GHGs still being able to hold more joules than before in the oceans. A different point of view is that GHGs will fail to do that because of, see WUWT. Or that it was temporary natural variability that really did that CO2 warms the oceans deal. CO2 is our go to warmer. And oceans are our go to thing to be warmed. These extra joules were supposed to be outside my window in Minnesota. But they prefer being in the oceans.

  51. izen says:

    @-Ragnaar
    ” These extra joules were supposed to be outside my window in Minnesota. But they prefer being in the oceans.”

    Also the Arctic circle, Siberia and large portions of the Southern hemisphere

  52. Everett F Sargent says:

    RJ and CB are so wrong that they’re not even wrong.

    CB better post a link to whatever it is that he appears to quote with respect to the ‘so called’ deep ocean tidal velocities. The tides are very long waves and fall into the category of shallow water waves (Airy theory, e.g. c = sqrt(gd), with a whole table of linear relationships), but you really want u, v, w at the bottom, from that linearized table of formulas; see the online version of the CEM (note: I predict the E will very likely be inferred in an unkind way by at least one individual here, SOP and all that).
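Everett’s shallow-water relation is easy to check numerically. A sketch assuming a 4000 m abyssal depth and a 0.5 m open-ocean tidal amplitude (both illustrative round numbers, not taken from any source he cites):

```python
import math

g = 9.81          # m/s^2
d = 4000.0        # illustrative abyssal depth, m

# Tides are long waves (wavelength >> depth), so linear shallow-water
# (Airy) theory applies: phase speed c = sqrt(g*d).
c = math.sqrt(g * d)               # ~198 m/s, i.e. ~713 km/h

# Linear long-wave horizontal velocity at the bottom for amplitude a:
# u = (a/d)*c = a*sqrt(g/d), small for open-ocean tidal amplitudes.
a = 0.5                            # illustrative open-ocean amplitude, m
u_bottom = a * math.sqrt(g / d)    # ~2.5 cm/s
```

The point of the exercise: the tide wave itself races across basins at jet-airliner speed, but the associated bottom velocities are only centimetres per second, which is why the mixing question is not settled by the existence of deep tides alone.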

    This link is mostly BS …
    http://oceanmotion.org/html/background/tides-ocean.htm

    We’ve known about deep ocean tides like forever even, at least at the USACE ERDC CHL. There is a standard boundary condition (starts with the letter K) applied to all local circulation models like ADCIRC at the offshore deep water boundary (e. g. the deep water tidal harmonics).

    It also helps to look into the densimetric Froude number, gradient Richardson number and Reynolds number. FFS! The oceans are, in general, highly stratified; except for the overturning zones and upwelling and downwelling regions, the ocean density does increase with depth because the bulk modulus of salt/fresh water (say ~300 ksi) is finite (also known like forever).

    Most global ocean circulation models only pick up the 8-10 primary longer period deep water tidal harmonics from either GRACE (now dead) or satellite altimetry (JASON/1/2/3, et al.; Jason does a complete global cycle in just under 10 days, as such most of the harmonics are strongly aliased, and that has to be accounted for and is). But you only need to consider the two major harmonics, S2 and M2, because these have the largest deep water tidal amplitudes (necessary to calculate u, v, w at the ocean bottom).
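The aliasing he mentions can be verified directly for M2, assuming the standard ~9.9156-day TOPEX/Jason exact-repeat cycle and the M2 period of 12.4206 hours:

```python
# Alias period of the semidiurnal M2 tide sampled at the Jason repeat cycle.
dt_days = 9.9156                     # satellite exact-repeat period, days
f_m2 = 24.0 / 12.4206012             # M2 frequency, cycles per day

cycles_per_sample = f_m2 * dt_days   # ~19.16 M2 cycles between overflights
folded = abs(cycles_per_sample - round(cycles_per_sample))
alias_period = dt_days / folded      # ~62 days: the well-known M2 alias
```

Because the satellite sees the same spot only every ~10 days, the twice-daily M2 signal masquerades as a ~62-day oscillation, and tidal analyses of altimetry have to unfold that.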

    Or FFS, just look at some deep ocean bottom water video. 😦

    The AMOC takes ~800 years. Or so I’ve been told.

    I’m bored. But not bored enough to be bothered with these very simple deep (meaning the abyssal depths) ocean tidal bottom velocities.

  53. Everett F Sargent says:

    The ‘so called’ internal waves, in the strictly 2D sense, have to occur at an INTERFACE! Rossby waves, Kelvin–Helmholtz instabilities and Kelvin waves are all forms of internal waves that occur at the thermocline in our oceans. These waves can and do break. I did OTEC research (published even, twice even, and a DOE report to boot) in the late 70’s. Instead of g you use g’ and instead of h you use h’; g is to g’ as h’ is to h. g’ = delta rho times g, top hat (discrete) density profile assumed.

  54. Everett F Sargent says:

    There should be a rho in the denominator, but to a first approximation, for water rho ~1. Dimensional analysis for consistency of units. My bad. Sorry.
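With the rho-in-the-denominator correction above, the reduced-gravity substitution gives the classic two-layer internal wave speed c′ = sqrt(g′h′). A sketch with illustrative thermocline numbers (the density jump and layer depth are made up, though of the right order):

```python
import math

g = 9.81
rho = 1025.0          # sea-water density, kg/m^3
d_rho = 2.0           # illustrative density jump across the thermocline
h_upper = 100.0       # illustrative upper-layer thickness, m

g_prime = (d_rho / rho) * g                 # reduced gravity, ~0.02 m/s^2
c_internal = math.sqrt(g_prime * h_upper)   # ~1.4 m/s

# For comparison, the surface long-wave speed in a 4000 m ocean:
c_surface = math.sqrt(g * 4000.0)           # ~198 m/s
ratio = c_surface / c_internal              # internal waves are ~100x slower
```

This is the “highly reduced effective gravity” mentioned upthread made quantitative: the interface feels gravity weakened by the density contrast, so internal waves are slow and can grow to large amplitudes.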

  55. Everett F Sargent says:

    Internal Tide Generation in the Deep Ocean
    http://www.annualreviews.org/doi/pdf/10.1146/annurev.fluid.39.050905.110227
    (paywalled, but …)

    “It does seem, though, that some of the internal tide energy flux from large topographic features in mid-ocean propagates across ocean basins to break on distant continental slopes (Nash et al. 2004). Thus, continental slopes may be significant sinks, rather than just sources, of internal tide energy! It is possible that dissipation on continental slopes partly occurs in homogenous boundary layers and is relatively inefficient at producing a vertical buoyancy flux, making internal tides less of a contributor to global ocean mixing than is sometimes assumed. This needs further investigation. The internal tides generated by the other class of deep-sea bottom topography, the rough regions of fracture zones, are in low-to-intermediate modes, and therefore more likely to cascade to turbulence in the open ocean, either by further bottom scattering or by nonlinear interactions in the ocean interior. Further discussion of the fate of internal tides is beyond the scope of this review.”

    That full paragraph appears to suggest: (1) “making internal tides less of a contributor to global ocean mixing than is sometimes assumed” and (2) “therefore more likely to cascade to turbulence in the open ocean, either by further bottom scattering or by nonlinear interactions in the ocean interior”.

    My conclusion from this 2007 review paper? Whatever internal deep ocean tidal dissipation occurs stays mostly in the deep abyssal ocean (Las Vegas gambit: what happens in Vegas stays in Vegas). YMMV.

  56. JCH says:

    Well, OHC to the end of 2017 is out for 0 to 700 meters, but for whatever reason, 0 to 2000 meters is not yet updated. Anyway, OHC is not down:

    https://imgur.com/r7EmDNv

  57. Everett F Sargent says:

    A few more references for those who are truly technically interested (I do get really tired of the hand waving of one particular individual hereabouts).

    Density Stratification, Turbulence, but How Much Mixing?
    http://centerforoceansolutions.org/sites/default/files/publications/Ivey%20et%20al%202008-Density.pdf

    Ocean Circulation Kinetic Energy: Reservoirs, Sources,and Sinks
    http://ferrari.mit.edu/wp-content/uploads/publications/FerrariWunsch2008.pdf
    http://arjournals.annualreviews.org/article/suppl/10.1146/annurev.fluid.40.111406.102139?file=fl.41.ferrari.pdf

    Internal Wave Breaking and Dissipation Mechanisms on the Continental Slope/Shelf
    http://www.annualreviews.org/doi/abs/10.1146/annurev-fluid-011212-140701
    (paywalled, but …)

    From Topographic Internal Gravity Waves to Turbulence
    http://cfdlab.ucsd.edu/cfdpapers/Sarkar_Scotti_ARFM_2016.pdf

    “Although these findings help us understand how turbulence generated during generation and reflection influences mixing in the proximity of boundaries, the way internal waves propagating in the water column affect mixing remains more obscure. Over complex topography, nonlinearity and turbulence can be generated by the interaction of beams coming from multiple generation sites, but internal waves can also interact with critical layers, or with a region of strongly varying background stratification. Most modeling work in this area has been 2D (with exceptions) and thus not able to directly address the role of turbulence.”

    All five papers are from the Annual Review of Fluid Mechanics (2007, 2008, 2009, 2014 and 2017) which, along with JFM, was always on my reading list.

  58. Everett F Sargent says:

    Ragnaar sez …

    “A different point of view is that GHGs will fail to do that because of, see WUWT.”

    You mean like the person over there, with two posts now, who thinks that there are 13 months in a year. The same person who takes GMST, removes said trend and then sez … wait for it … look, PROOF of no trend at 99% confidence, no, make that 90% confidence (proof should never be located anywhere near a probability statement, the irony is killing me)! The same person who plots 10 year trends (121 months mind you, not 120 months) at the end of the 10.0833333333333333333333333 year periods.

    [Chill. -W]

  59. Steven Mosher says:

    “I’m personally a fan of the studies that use observations to constrain key physical processes.”

    ditto

  60. Clive,

    Lunar Tides clearly have a huge effect on sea level every 6 hours, yet somehow we dismiss them as everyday occurrences ignoring any longer term effects on climate. I am sure that’s untrue.

    Are you sure this isn’t just another example of you asserting that something is important and that it’s being ignored by the science community, only to discover that it’s either not important and/or it’s not being ignored by the science community?

  61. dikranmarsupial says:

    Clive wrote: “Lunar Tides clearly have a huge effect on sea level every 6 hours, yet somehow we dismiss them as everyday occurrences ignoring any longer term effects on climate. “

    The whole body of water going up and down together doesn’t necessarily result in mixing. The point was that you are making a claim without bothering to find out whether there is any evidence to support the intuition. Fortunately others have done it for you. In scientific discussion, the onus is on you to provide the evidence to support your assertions, not to leave it to your interlocutors to shoot them down for you. Skepticism ought to start with self-skepticism, otherwise you are on the road to hubrisville.

  62. dikranmarsupial says:

    By the way Clive, I note that you did not respond to the point that the source you quote does not actually explicitly support there being a thermal mixing effect from tide waves, rather than just from the tides themselves. Again this is good rhetorical technique, but bad science.

    Note that while internal tides are important in mixing cold bottom waters and warmer surface waters, that doesn’t mean that “internal tide waves” has a big effect on that.

    “The ‘so called’ internal waves, in the strictly 2D sense, have to occur at an INTERFACE! Rossby waves, Kelvin–Helmholtz instabilities and Kelvin waves are all forms of internal waves that occur at the thermocline in our oceans. These waves can and do break. I did OTEC research (published even, twice even, and a DOE report to boot) in the late 70’s. Instead of g you use g’ and instead of h you use h’; g is to g’ as h’ is to h. g’ = delta rho times g, top hat (discrete) density profile assumed.”

    Great reading what Everett has to say. This is the schematic of his description, with the black wavy line indicating the thermocline:

    I think the research on topological insulators that Brad Marston is applying to climate science is going to lead to some new breakthroughs. The key thing to understand about this physics is that it applies to boundaries, which could be stratified surfaces (such as the thermocline or the stratosphere) and also to reduced dimensionality regions, such as along the equator. There is much interesting math that can be potentially applied.

    Clive said:

    “Lunar Tides clearly have a huge effect on sea level every 6 hours, yet somehow we dismiss them as everyday occurrences ignoring any longer term effects on climate. I am sure that’s untrue. Certainly there is an 18.6 year modulation on the polar reach of tides, accentuated in long term with orbital changes of the earth. There are also short term and medium term oscillations in tidal forcing, which most likely play a role in oceanic oscillations, and influencing Jet Stream flow in the atmosphere.”

    No doubt. Forcing plays a huge role in climate variations, whether it is due to excess CO2, the annual and daily solar variation, or to the strong monthly and fortnightly lunar tidal forcing. Yet not enough emphasis is placed on the latter.

  64. angech says:

    dikranmarsupial (January 18, 2018 at 3:19 pm) makes a number of interesting points in that comment and in his following graph (January 18, 2018 at 3:51 pm).
    He says angech wrote “Arguing over a 0.01 increase in ECS over another thousand years seems a little pedantic.”
    “The diagram in the original post suggests that the changes in question are much larger than 0.01C (relative to the point at which equilibrium is reached by the one-box model, which is a few decades).”
    The original post was for a x4 increase whereas my comment was directed at a doubling. The author suggested on average a 43% increase over the simple model which none of the graphs show [time frames too short] so obviously there is a long slow fat tail implied over a thousand or more years as it is not obvious in the truncated graphs given.
    For an ECS of 3.0 this would imply a 1.3C increase over a thousand years for a x4 model or 0.3 C for a simple doubling over a thousand [or so] years.
    So he is right.

    “In this case there appears to be over a degree of warming that happens after the 100 year point”
    Only for the x4. The x2 shows very little increase after 100 years, possibly 0.3C.

  65. angech says:

    Mark .
    The graphs all show that the models do a much quicker initial rise in temp, then go below Clive’s simple model for quite a while and then slowly accumulate temperature and obviously must continue doing so for a lot longer than the 150 years shown on the graphs.
    Would it hurt to show a longer time frame so the true increase of these models and the time to near true ECS for them could be shown? [say 1000 years.]
    One presumes the initial rise is due to the poor mixing and the delayed cross over and eventual ongoing rises are all due to the layer mixing theory.

    “Consider the Figure on the left (Figure 2) for the years 1861-2015. Imagine living in the world of the top left panel. In this world we might read a blog that says ECS is around 1.7 oC but in reality it would be 3.8 oC”
    This is a bit unfair.
    The change in CO2 over that time period is quite small compared to a doubling or a x4 scenario.
    No matter what reality ECS you put in in a 1.5 to 4.5 range it would tend to match the historical graph. Arguing that 3.8 from a model is more real than 1.7 from a model [when you yourself said the model average is 3.2 C] is not a goer.
    I could as easily say in the graph of 1.7 we could read a blog saying it was 3.8 when in the reality of the 1.7C graph world it was 1.7C.

  66. BBD says:

    @JCH

    Well, OHC to the end of 2017 is out for 0 to 700 meters, but for whatever reason, 0 to 2000 meters is not yet updated. Anyway, OHC is not down:

    Seems to be up now:

  67. Clive Best says:

    Tidal forcing certainly seems to provoke some strong opinions!

    No-one has yet mentioned the new ECS study in Nature so I will. I can’t actually read the paper behind the paywall, but AFAIK Cox et al. compared observed natural variability to that generated by different CMIP5 models. Only those with low sensitivity matched the temperature data. As a result they are able to put a limit on ECS. They find:

    ECS = 2.8 ± 0.4 C

    which is almost exactly what I found. Piers Forster writes in Nature:

    “In my view, Cox and colleagues’ estimate and the estimates produced by analysing the historical energy budget carry the most weight, because they are based on simpler physical theories of climate forcing and response, and do not directly require the use of a climate that correctly represents cloud.”

  68. Olof R says:

    It’s interesting that Clive Best’s and other estimates suggest a temperature lag of around 15 years.
    I have been playing with Gistemp loti vs ln CO2 and found that a lag of 14 years gives the best linear fit. So here’s a chart that we can call the “Layman’s estimate of ECS”:

    FWIW, A doubling from preindustrial 280 ppm to 560 ppm (from left to right wall) suggests a temperature increase of precisely 3.05 C.
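Olof R’s procedure (regress temperature on ln CO2 shifted by a lag, then multiply the slope by ln 2) can be sketched on synthetic data where the true sensitivity is known. The CO2 growth curve, noise level and seed below are invented; this only demonstrates the mechanics, not the GISTEMP result:

```python
import math
import random

random.seed(0)
years = list(range(1880, 2018))
co2 = [280.0 * math.exp(0.0002 * (y - 1880) ** 1.5) for y in years]  # toy growth
true_sens = 3.0 / math.log(2)     # K per unit ln(CO2): a "true" ECS of 3.0 K

# Temperature responds to CO2 from `lag` years earlier, plus noise.
lag = 14
temp = [true_sens * math.log(co2[max(i - lag, 0)] / 280.0) + random.gauss(0, 0.05)
        for i in range(len(years))]

# Ordinary least squares of T against lagged ln(CO2/280).
x = [math.log(c / 280.0) for c in co2[:-lag]]
y = temp[lag:]
xm, ym = sum(x) / len(x), sum(y) / len(y)
slope = (sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
         / sum((xi - xm) ** 2 for xi in x))
ecs_est = slope * math.log(2)     # recovers ~3 K, by construction
```

The caveat raised throughout the thread applies here too: the regression recovers the number that was baked in, so on real data it measures something closer to a transient sensitivity than the true ECS unless the assumed lag really captures the ocean response.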

  69. Clive,

    Tidal forcing certainly seems to provoke some strong opinions!

    What strong opinions? Just because not everyone bows down to your assertion doesn’t imply some kind of strong opinion.

    As a result they are able to put a limit on ECS. They find.

    ECS = 2.8 ± 0.4 C

    which is almost exactly what I found.

    I wondered if you would mention this. It’s an interesting result, but that doesn’t mean that it’s now accepted (see James’s post, for example). That they get a similar result to what you got doesn’t somehow validate your analysis, or invalidate the critiques. It’s been clear for quite some time that the middle is probably somewhere around 3C. Therefore any attempt to narrow the range is likely to produce similar results, even if some analyses are more robust than others.

  70. BBD says:

    Presumably the Cox et al. methodology doesn’t take into account nonlinear feedbacks to future warming? And may therefore yield a low estimate?

  71. BBD,
    Yes, I think their method assumes that the ECS is not state dependent.

  72. dikranmarsupial says:

    angech wrote “The original post was for a x4 increase whereas my comment was directed at a doubling.

    Goalpost shift: Your original comment did not mention a doubling, quoted below for convenience:

    While technically true would not the bulk of the warming be expected to occur in the first 100 years. Arguing over a 0.01 increase in ECS over another thousand years seems a little pedantic.
    As pointed out by several commentators on both sides, ongoing sizeable increases in warming over hundreds of years (high ECS) are physically highly if not extremely unlikely.
    Models that fail to show constraint should be regarded with suspicion.

    Moving on:

    The author suggested on average a 43% increase over the simple model which none of the graphs show [time frames too short] so obviously there is a long slow fat tail implied over a thousand or more years as it is not obvious in the truncated graphs given.
    For an ECS of 3.0 this would imply a 1.3C increase over a thousand years for a x4 model or 0.3 C for a simple doubling over a thousand [or so] years.
    So he is right.

    And clearly that is nothing like the 0.01 increase mentioned in your original comment.

    “In this case there appears to be over a degree of warming that happens after the 100 year point”
    Only for the x4. The x2 shows very little increase after 100 years, possibly 0.3C.

    Again clearly that is nothing like the 0.01 increase mentioned in your original comment.

    If you want to know why you tend to irritate people, this sort of disingenuous goalpost shifting is a good example. You claimed that “ongoing sizeable increases in warming over hundreds of years (high ECS) are physically highly if not extremely unlikely.”; a warming of 0.3C after the 100 year mark is a sizeable increase by any reasonable standard. It would be better just to admit the claim was incorrect, it isn’t a big deal, it is how we learn.

  73. dikranmarsupial says:

    Clive Best wrote “Tidal forcing certainly seems to provoke some strong opinions!”

    No, posting unsupported (and somewhat hubristic) assertions provokes responses. I have no strong views on tidal forcing, but assertions of incredulity (“it must have an effect”) are not at all convincing, in the way that observational evidence and/or physics are.

  74. Griff says:

    Thanks for the reply ATTP MarkR
    I set out to find more known unknowns I did not know and now know I know even less.

    https://sepm.org/CM_Files/NSF%20items/NRC2011DeepTimePrepubfinal.pdf

    This suggests that ice sheet dynamics are significant for perturbing ECS over longer periods.
    It also suggests this process is not linear; instead it can reach thresholds then change rapidly to new states. Sans ice, the albedo feedback would be interesting, as would the geological effects.
    Over geological time our uncontrolled experiment in atmospherics will change the shape of continents, depending on the path we take.

    Without doubt there are other feedbacks we can expect that will not necessarily be linear or resolved in decades:
    ocean currents, biota, atmospheric circulation, cloud formation, the hydrological cycle, etc.

    I think I do understand a little why looking at the recent past will not be capable of constraining the ECS to such a narrow range.

  75. Griff,
    This is a useful illustration.

  76. verytallguy says:

    Griff,

    what AT posted, and my amateur understanding:

    It’s not widely understood but is important that ECS is *not* the ultimate equilibrium of the climate system.

    It’s rather a *definition* of what the equilibrium might be if only a certain set of “fast” feedbacks operate. Vegetation and ice sheets are excluded.

    It may be better seen as a way to quantitatively characterise the system than an actual prediction of response.

  77. There are actually some strong opinions regarding tidal forces in the geophysics literature. Last year, a USGS group found a correlation in the triggering of earthquakes with tides.

    van der Elst, Nicholas J., et al. “Fortnightly modulation of San Andreas tremor and low-frequency earthquakes.” Proceedings of the National Academy of Sciences (2016): 201524316.

    Delorey, Andrew A., Nicholas J. van der Elst, and Paul A. Johnson. “Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault.” Earth and Planetary Science Letters 460 (2017): 164-170.

    and also this Japanese group

    Ide, Satoshi, Suguru Yabe, and Yoshiyuki Tanaka. “Earthquake potential revealed by tidal influence on earthquake size-frequency statistics.” Nature Geoscience 9.11 (2016): 834-837.

    Yet within the last week a scientist at USGS said flatly that there is no correlation between large (>= 8 magnitude) earthquakes and the phases of the moon. I haven’t read the paper yet but it is being quoted online:
    https://gizmodo.com/study-with-one-word-abstract-finds-moon-phases-dont-pre-1822190714

    The funny part of her paper is the nature of her abstract. All the abstract says is “No.”!
    That’s being awfully dismissive.

    What Hough doesn’t acknowledge is the triggering effect. When an earthquake is triggered, it may be influenced by a small nudge due to the vicinity of the lunisolar path. This isn’t just the phase of the moon but the perigee effect, the nodal declination effect, etc. Thus, she is contradicting the recent research by her colleagues at the USGS.

  78. The Very Reverend Jebediah Hypotenuse says:

    Clive Best,

    Incidentally, there is some important physics that climate models don’t include – namely tidal mixing in deep oceans. This must stabilise equilibrium temperature faster than envisaged.

    I’ll see your “tidal mixing in deep oceans”, and raise you a “baroclinic eddies that maintain ocean stratification”.*

    Once we get to invoke the majick science-beans of “important physics that climate models don’t include”, I can easily find a “must stabilise equilibrium temperature SLOWER than envisaged” for any of your “must stabilise equilibrium temperature faster than envisaged”.

    Deriving a “must” from an important physical non-model may be sub-optimal.

    Marshall et al. (2002) suggests that baroclinic eddies (baroclinity) may be an important factor in maintaining ocean stratification.

  79. Ragnaar says:

    The first attribute is what it can do in the short term and the second is its sustain in the long run.

    1) Atmosphere: Agile, Low
    2) Upper Ocean: Agile, Medium
    3) Oceans: Slow, High

    We compare now to 1950. Since then transfers from 1) to 2) have been significant and from 2) to 3) are important but less so than from 1) to 2). 2) Now does what? About the same as the last 65 years. What is driving 1) to 2) will only increase. What can counter that is the Pacific trade winds. 3) Will draw for a long time. 2)’s equilibrium if reached will see 3)’s current temperature and react. Data on 3) to date suggests it has drawn.

    Latitudinal changes. 2) being warmer emits more to 1) in lower humidity regions in higher latitudes. And lower humidity higher latitude regions show warming. So with an increased down force in equatorial regions there is an increased up force towards the poles. There is a shift using the upper ocean medium. With this, the system is pursuing equilibrium.

  80. MarkR says:

    angech,

    “The graphs all show that the models do a much quicker initial rise in temp, then go below Clive’s simple model for quite a while and then slowly accumulate temperature and obviously must continue doing so for a lot longer than the 150 years shown on the graphs….Would it hurt to show a longer time frame so the true increase of these models and the time to near true ECS for them could be shown? [say 1000 years.]…”

    The Held et al. (2010) paper is a good starting point; it discusses how a two-box model does much better. 150 years is a “standard” CMIP5 abrupt4xCO2 run, although some do more. There are longer runs elsewhere & the results are interesting, e.g. Knutti & Rugenstein (2015, doi: 10.1098/rsta.2015.0146). See also LongRunMIP: http://www.longrunmip.org/

    “…No matter what reality ECS you put in in a 1.5 to 4.5 range it would tend to match the historical graph. Arguing that 3.8 from a model is more real than 1.7 from a model [when you yourself said the model average is 3.2 C] is not a goer.
    I could as easily say in the graph of 1.7 we could read a blog saying it was 3.8 when in the reality of the 1.7C graph world it was 1.7C.”

    The idea is that if we lived in the world of ACCESS1-0 the temperature record would look like the top left. *We* know the true ACCESS1-0 ECS is 3.8 °C but if we lived in it then we wouldn’t have that information, and Clive Best’s one-box approach would give 1.7 °C. So that’s what you might read on a blog.

    If we lived in GFDL-ESM2G then Clive’s blog would be closer (2.1 °C vs true 2.4 °C) but if we lived in HadGEM2-ES then it would be further (1.6 °C versus true 4.6 °C).

    This is just evidence that a one-box model can’t yet narrow the ECS range so long as the oceans below the mixed layer take longer than 12 years to respond (they do). It hides lots of uncertainty and undershoots the truth. That’s not to say true ECS must be higher than 2.5 °C – uncertainty in radiative forcing might cut the other way for example.
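MarkR’s undershoot can be reproduced in miniature: generate “observations” from a two-box world whose true ECS is known, then fit a Clive-style one-box model with a fixed 12-year response time. All parameter values are illustrative round numbers, not taken from any CMIP5 model:

```python
# Fit a one-box model (12-year time constant) to data from a two-box world.
F2X = 3.7                                  # W/m^2 for doubled CO2

def two_box_ramp(lam=1.2, gamma=0.7, C=8.0, C0=100.0, years=140, dt=0.1):
    """Surface temperature under a linear forcing ramp reaching F2X."""
    T = Td = 0.0
    out = []
    for i in range(round(years / dt)):
        F = F2X * (i * dt) / years
        T += (F - lam * T - gamma * (T - Td)) / C * dt
        Td += gamma * (T - Td) / C0 * dt
        out.append(T)
    return out

def one_box_ramp(lam, tau=12.0, years=140, dt=0.1):
    """One-box response with a fixed response time tau (so C = tau * lam)."""
    C = tau * lam
    T = 0.0
    out = []
    for i in range(round(years / dt)):
        F = F2X * (i * dt) / years
        T += (F - lam * T) / C * dt
        out.append(T)
    return out

obs = two_box_ramp()                       # "truth": ECS = 3.7 / 1.2 ~ 3.1 K

def sse(lam):
    """Sum of squared errors of the one-box fit against the observations."""
    return sum((a - b) ** 2 for a, b in zip(one_box_ramp(lam), obs))

# Brute-force least-squares over the one-box feedback parameter.
lam_fit = min((l / 100 for l in range(80, 300)), key=sse)
ecs_true = F2X / 1.2
ecs_fit = F2X / lam_fit                    # lands well below ecs_true
```

Even though the one-box curve fits the 140-year “record” closely, the recovered ECS comes out several tenths of a degree low, because the slow deep-ocean response is invisible over the fitting period, which is the bias being described in the comment above.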

  81. MarkR says:

    verytallguy:

    “It may be better seen as a way to quantitatively characterise the system than an actual prediction of response.”

    I agree, it’s useful for comparing things and determining which processes are important.

    That said, we expect it to be related to the warming over a century or so for the sorts of changes in CO2, volcanism etc. that are ongoing and seem realistic in the near future. And if we choose to emit enough CO2 that the implied warming destabilises the Greenland or West Antarctic ice sheets, then that’s equivalent to saying we’ve decided that we don’t need southern Florida or Louisiana or lots of other places where, in total, hundreds of millions of people live. They haven’t publicly said it, but it’s clearly the position of some political groups, and they are pushing policies that encourage the destruction of those areas. A more precise ECS value would let us know when it might be too late to save those areas, which is useful. But it doesn’t necessarily tell us how quickly we’d need to evacuate those areas, which is a limit of its usefulness.

    Like most ways of rating real world things like GDP or BMI it has uses and limits.

  82. The Very Reverend Jebediah Hypotenuse says:

    MarkR:

    …that’s equivalent to saying we’ve decided that we don’t need southern Florida or Louisiana or lots of other places where in total hundreds of millions of people live.

    Ah – but Dr Judith Curry, who is currently crowd-sourcing blog-scientist expertise on sea-level rise so that she can package it and sell it to her business clients, says:

    Substantial sea level rise that happens quickly, or over a year, is difficult and costly to address. Sea level rise near the end of the century that affects a tiny percentage of the land area of a country, even with a large percentage of population living there today, can be a relatively minor problem if it is managed appropriately.

    and

    The focus on emissions reductions as some sort of solution to sea level rise (apart from any determination of cause) is distracting from developing better land use policies and coastal engineering practices.

    So there –
    Not only is there not much to worry about, especially in the long term, but your focus on emissions is distracting everyone except Dr Curry from the appropriate forms of non-advocacy.

  83. MarkR says:

    Clive Best:

    “No-one has yet mentioned the new ECS study in Nature so I will…”

    Credit goes to Griff’s first comment, I think you missed it!

    I’m still digesting it, but you can see their Figure 2(b) through the paywall, and it’s the same principle as Figure 3 above. It’s a common-sense test for any proposed obs-based ECS estimate such as a one-box fit, and it’s common in research but often missing on blogs. I think it’s a sign of quality and transparency when this sort of test is done and shown. In my experience, 2 % of science is finding the number and 98 % is testing whether that number is reliable and working out its uncertainties so you can tell if it’s useful. Still, simple tests like my Fig. 3 or their Fig. 2(b) are pretty quick and should become a standard climate blogger tool IMO.

    Their ECS of 2.8 ± 0.4 °C is the “likely” range; pixel counting their CDF gives a 5–95 % range of 1.8–3.7 °C. The “true” uncertainty might be larger depending on how robust their assumptions are. One mystery we have right now is that there’s a cluster of satellite- and physics-based constraints that favour higher ECS: Sherwood et al. (2014), Su et al. (2014), Tian et al. (2015), Zhai et al. (2015), Brient & Schneider (2016), Tan et al. (2016) and Siler et al. (2017) spring to mind.
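    For anyone who wants to skip the pixel counting: once you have digitised a published CDF curve, reading percentiles off it is a one-liner. A toy sketch (the logistic CDF here is invented for illustration, not the paper’s actual curve):

```python
import numpy as np

# Toy stand-in for a published ECS CDF: a logistic curve centred on 2.8 degC.
# The real exercise replaces this with values digitised from the figure.
ecs_grid = np.linspace(1.0, 5.0, 401)
cdf = 1 / (1 + np.exp(-(ecs_grid - 2.8) / 0.35))

# Read percentiles off the curve by interpolation (cdf must be increasing).
ecs_05 = np.interp(0.05, cdf, ecs_grid)
ecs_95 = np.interp(0.95, cdf, ecs_grid)
print(f"5-95 % range: {ecs_05:.1f}-{ecs_95:.1f} degC")
```

    With the paper’s real curve you would read their 5th and 95th percentiles off in exactly the same way.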

    Until the methods are reconciled, the uncertainty range isn’t going to shrink much. It doesn’t make sense to me to pick your favourite number or range and run with it, just throwing out e.g. the important results you get when you combine CALIPSO lidar measurements of supercooled liquid clouds with Maxwell’s equations.

  84. MarkR says:

    Reverend:
    From Judith Curry:
    “Substantial sea level rise that happens quickly, or over a year, is difficult and costly to address. Sea level rise near the end of the century that affects a tiny percentage of the land area of a country, even with a large percentage of population living there today, can be a relatively minor problem if it is managed appropriately.”

    I’d need “substantial”, “minor”, “relatively minor problem” and “managed appropriately” to be well defined, clearly described and costed. I want to hear from the people whose families Judith Curry seems so willing to uproot and throw out of their homes to be sure that this is a “minor problem”. I’d like a clear explanation of how you’re going to get investors to act rationally for the common good over decades and slowly roll down and abandon a ~$1 trillion southern Florida housing market while doing the same thing across the rest of the world.

    If I remember right, the last thing I saw Curry explaining clearly was something about how “greater than 50 %”, “more than half” and “most” are somehow different in a super important way, so we don’t know enough to act on climate change. Did I dream that?

  85. Everett F Sargent says:

    CB sez,

    “Tidal forcing certainly seems to provoke some strong opinions!”

    I HAD a strong opinion about your own opinion. I HAD some a priori knowledge that you clearly did not have. I found your link myself and immediately said BS based entirely on my rather naive (hindsight is 20/20) a priori knowledge. Turns out that I was WRONG! This is not the first time that I have been WRONG with respect to climate science. Go figure.

    So, as is my nature, I looked, or turned to Google Scholar …
    “tidal mixing in the deep ocean”
    https://scholar.google.com/scholar?hl=en&as_sdt=0%2C25&q=tidal+mixing+in+the+deep+ocean&btnG=
    Fifth link was my 1st link here (2007, so somewhat recent, and one of my go-to journals no less).
    411 references, hmm, so let’s see those …
    https://scholar.google.com/scholar?cites=5528163024992671207&as_sdt=5,25&sciodt=0,25&hl=en
    1st link there has 323 cites (2009 my 3rd link here), 2nd link there has (2008 my 2nd link here)
    2nd link references …
    https://scholar.google.com/scholar?cites=1474006214736411071&as_sdt=5,25&sciodt=0,25&hl=en
    4th link there has 71 references (2014, so I’m getting closer to the current state-of-the-art, but not quite there yet, circa 2017)
    Using a standard Google search for a *.pdf copy of my 4th link here …
    https://www.google.com/search?q=%22Internal+Wave+Breaking+and+Dissipation+Mechanisms+on+the+Continental+Slope%2FShelf%22&oq=%22Internal+wave+breaking+and+dissipation+mechanisms+on+the+continental+slope%2Fshelf%22&aqs=chrome.0.69i59j69i60j0.4767j0j7&sourceid=chrome&ie=UTF-8
    No 4th link *.pdf there, but I found a *.pdf copy of my 5th link there, circa 2017 even (it is down that list a ways, but, for some reason, that title looked interesting).

    Moral of this ‘so called’ story? Well, 1st I learned something new, something that I had not a clue about beforehand, and it only took a little over an hour to gain that new knowledge. Second, it turns out that I was mostly right, but for the WRONG reason.

    Throughout this quick and dirty search, I was looking for buoyancy flux or mixing from the abyssal ocean to the highly stratified upper ocean, a number if you will. A number that could be put in perspective against all other currently known oceanic mixing processes. A number that cascades UPWARDS or “punches” through the buoyancy-stabilizing thermocline and into the upper ocean (for example, see the SOM flowchart from my 3rd link here).

    This appears to happen only somewhat at the continental shelves, and is quite analogous to surface water waves shoaling (or moving into shallower bathymetry) and then breaking as they approach the shoreline.

    I never did find that number, but it would appear to be a rather small number (from the review literature I’ve cited above) with respect to all other known mixing processes in the UPPER oceans (meaning abyssal tidally forced waters moving and mixing into the upper highly stratified ocean). I now understand that there is mixing WITHIN the abyssal waters due to tidal forcing, though that mixing needs some perspective with respect to all other mixing processes; for now, with respect to the abyssal waters alone, what mixing down there does occur appears to be primarily tidally driven. It would appear, circa 2017 literature reviews, that the overall oceanic mixing cascade is still dominated by DOWNWARDS (and LATERAL) mixing processes in the UPPER oceans.

  86. Willard says:

    > If I remember right, the last thing I saw Curry explaining clearly was something about how “greater than 50 %”, “more than half” and “most” are somehow different in a super important way, so we don’t know enough to act on climate change. Did I dream that?

    I’m afraid not:

    A while ago Judith Curry wrote a rather confusing post about the IPCC’s attribution statement (that more than 50% of the warming since 1950 was anthropogenic). Gavin Schmidt responded on RealClimate, and Judith Curry has been promising to respond to Gavin’s article for quite some time. Well, the response is now here and it is a classic.

    https://andthentheresphysics.wordpress.com/2015/01/20/more-than-half-is-the-same-as-50/

    My favorite bit:

    Help from the dictionary

    https://judithcurry.com/2015/01/19/most-versus-more-than-half-versus-50/

  87. Everett F Sargent says:

    MarkR,

    Please don’t mention JC with respect to SLR, she does not have a lick of domain expertise with respect to sea level rise. Same goes for James R Houston (director emeritus USACE ERDC); he has published, but it is mostly crappy JCR-type stuff IMHO (that plus ASBPA and FSBPA conflicts of interest). He used to be my boss, we have certain issues, disclaimer notice: I’m biased, but for good first-hand research work experience related reasons. I used to be a coastal engineer at CHL. I know this person too, a good friend (at least I’d like to think so) …

    USACE honors researcher for a career reducing coastal risk
    http://www.erdc.usace.army.mil/Media/News-Stories/Article/1383691/usace-honors-researcher-for-a-career-reducing-coastal-risk/

    I started out here, for a year (1983) …
    Corps facility serves as epicenter of coastal research
    http://www.erdc.usace.army.mil/Media/News-Stories/Article/1342193/corps-facility-serves-as-epicenter-of-coastal-research/

    RE: Louisiana, blame oil and gas and groundwater interests combined with nearly a century of USACE river engineering.

    RE: Florida, blame real estate interests and people like James R Houston (and Dr. Dean while he was alive). Blame Miami, FL for spending like ~$500M to hide nuisance flooding (see also real estate interests). See also gentrification of Miami, FL highstands.

    RE: Coastal planning, blame the coastal real estate industry as well as state interests (e. g. NC and FL).

    RE: Sea Level Rise (SLR), blame, well blame SLR! It is currently rather linear or very weakly nonlinear (1993-2017). Of all the things that will kick in with respect to AGW, SLR has got to be the one with the longest lag time IMHO.

    Sorry for the USA(CE) centric messaging.

    We now return you to your regular scheduled programming. 😉

  88. Ragnaar says:

    Yes it looks like Curry said,

    “The focus on emissions reductions as some sort of solution to sea level rise (apart from any determination of cause) is distracting from developing better land use policies and coastal engineering practices.”

    For the efficient use of dollars, for specific regions down to individual parcels of land, such things as mentioned above are the correct answers. If we could answer the question “How much does each local act impact a specific location near the ocean?”, emission reductions would fare poorly.

    Fairly or not, emission reductions are spread to the whole world. (So local benefits are diluted.) And we can consider how the question is framed: I want to get the most out of this piece of land. Or, I feel guilty for harming most of the planet and its people and animals too. This compares nicely to the conservative/liberal divide. So I want to protect my stuff. But we need to save the whole world, plus, look what we did to the whole world. Then corporations come into the game. They should protect their stuff. But they get persuaded to protect the whole world. The recent stock market rally in part signals the perception that it’s OK for a corporation to protect its stuff and to care less about saving the planet.

  89. Ragnaar says:

    It looks like world GDP is about $80 trillion a year, now.

    “Without adaptation, 0.2–4.6% of global population is expected to be flooded annually in 2100 under 25–123 cm of global mean sea-level rise, with expected annual losses of 0.3–9.3% of global gross domestic product. Damages of this magnitude are very unlikely to be tolerated by society and adaptation will be widespread. The global costs of protecting the coast with dikes are significant with annual investment and maintenance costs of US$ 12–71 billion in 2100, but much smaller than the global cost of avoided damages even without accounting for indirect costs of damage to regional production supply.”

    http://www.pnas.org/content/111/9/3292

    Huh? If my math is right, that’s under 0.1% of GDP annually for coastal defenses. Seems minor to me. In the United States, there is the possibility of inland financial support. The United States is one of the best positioned countries to deal with sea level rise. Now ask, who are Curry’s SLR clients? People with money in a country with money.
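    The arithmetic behind that, for anyone checking the quoted dike costs against world GDP:

```python
# Back-of-envelope check of the quoted dike-cost figures against world GDP.
gdp = 80e12             # ~$80 trillion world GDP
low, high = 12e9, 71e9  # US$ 12-71 billion/yr protection costs from the paper
pct_low = 100 * low / gdp
pct_high = 100 * high / gdp
print(f"{pct_low:.3f}% to {pct_high:.3f}% of world GDP per year")
```

    So roughly 0.02–0.09 % of GDP, i.e. “about 0.1 %” at the top end.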

  90. Everett F Sargent says:

    Ragnaar sez …

    “Now ask, who are Curry’s SLR clients?”

    Certainly, no one on the Left Coast (WA/OR/CA), and no one north of NC (I certainly hope that those mostly blue states are not that stupid, plus they are loaded with tons of SLR subject matter experts already). Think coastal real estate interests in NC/SC/GA/FL/AL/MS/LA/TX, but mostly SC (Charleston), GA (Savannah) and all of FL. People who know nothing want to be reassured that they are not buying present/future swampland, with input from a strongly biased academic nobody with respect to SLR.

    That’s the beauty of being a coastal consultant, there is a very high probability that they will already be dead long before it hits the fan. JC has maybe 20 years to put people in harms way. 😦

    Who needs a ‘so called’ red team when you have red team clients trying to sell present/future swampland to present/future red state idiots (that includes the present/future northern snowbird hordes).

    I would think that whoever is unethical enough to trust JC’s uninformed lack of boots-on-the-ground expertise would all be better served by at least going to the Coastal Resilience Center …
    http://coastalresiliencecenter.unc.edu/

  91. jacksmith4tx says:

    The US just had the most expensive natural disaster year on record ($200 billion+) and the economic result was 3%+ GDP growth and record stock valuations. Let’s all hope next year is even worse! But isn’t this just another example of thinking fast and slow (Daniel Kahneman)?
    Maybe what’s actually important is what is happening to the biosphere. Worrying about a few tenths of a degree decades in the future seems so trivial when the stuff that really matters is happening right now and we seem blind to the cumulative damage we are doing to the waters of the planet.
    https://www.theatlantic.com/science/archive/2018/01/suffocating-oceans/550415/
    “The ocean is losing its oxygen. Over the past 50 years, the volume of the ocean with no oxygen at all has quadrupled, while oxygen-deprived swaths of the open seas have expanded by the size of the European Union. The culprits are familiar: global warming and pollution. Warmer seawater both holds less oxygen and turbocharges the worldwide consumption of oxygen by microorganisms. Meanwhile, agricultural runoff and sewage drive suffocating algae blooms. Declining oxygen in the oceans like we see today is a feature of many of the worst mass extinctions in Earth’s history.”

  92. angech says:

    MarkR says: January 19, 2018 at 7:50 pm
    Thank you for your considered reply.
    I feel a bit out of my depth at the moment, as I feel the tenor of the arguments re ECS above is in tune with the expectations of impending problems and I have no role in that discussion.
    The explanation you have given re model ECS calculation is scientific and ties together well with your premise.
    Clive’s model gives a simple ball park figure for ECS calculation. The degree to which it could and should be improved is the debate of course but it is nice to have a quick reckoner even if it turns out to be a bit under the odds.
    It was interesting that ATTP’s graph showing all the confounding influences had none that were negative, even the aerosols and clouds only went one way.
    Both his graph and Dikran’s added a lot to the overview of the situation.

  93. angech says:

    JCH says: “Well, OHC to the end of 2017 is out for 0 to 700 meters, but for whatever reason, 0 to 2000 meters is not yet updated. Anyway, OHC is not down:”
    At another site which shows recent global ocean temps (NH, SH, Tropics and overall) there seemed to be a downward trend in recent times, which is at odds with your graph. If correct, it would imply that the next few months may show some cooldown in temperatures.

  94. BBD says:

    At another site

    NODC OHC is a reference data set. What ‘other site’ and which data are used?

  95. BBD says:

    It was interesting that ATTP’s graph showing all the confounding influences had none that were negative, even the aerosols and clouds only went one way.

    If we burn less coal, anthropogenic aerosol loading will decrease. Explosive volcanism could counter that for a few years, discontinuously, but probably not in the longer term. Atmospheric dust is associated with cold climate states and decreases during warm climate states. If clouds were a net negative feedback, then it’s difficult to explain hyperthermal events like the PETM without recourse to improbably large releases of carbon.

  96. As I understand it, the graph I presented only shows the feedbacks, not the forcings. As BBD says, the aerosols/dust refer to natural aerosols/dust, not anthropogenic.

  97. angech says:

    SST monthly anomalies as reported in HadSST3 starting in 2015 through December 2017.
    Global SSTs are claimed to be the lowest since 3/2013.
    It is probably not important yet but might lead to a cooler 2018.
    The graph of feedbacks has vegetation (a potential negative feedback), clouds (which act as a dampener on runaway warming), and aerosols and dust (which sometimes have a negative effect).
    I just thought the arrows might run both ways in the graph.

  98. John Hartz says:

    ATTP: When is the last time that a comment thread of one of your OPs did not contain a reference to Judith Curry?

    The context of the above question:

    Back in the days when I crossed swords with climate denier drones on the comment threads of MSM articles about manmade climate change, I all too frequently chastised opponents for demonizing James Hansen. Now we seem to be employing the same tactic against the likes of Curry.

  99. JCH says:

    I have no idea what angech means. Nobody ever does. Anyway, there is cooling, and then there is cooling. 2018 is very likely going to end up top 5. That is not cooling; it’s warming. A lot of it.

  100. angech wrote “At another site which shows recent global ocean temps with NH,SH,Tropics and overall there seemed to be a downward trend in recent times which is at odds with your graph.”

    … of course if you actually gave a link to the relevant page on that site we could go there and see if what you say has any validity.

  101. BBD says:

    I’m guessing WUWT, but I’m not going to go and look because I need to shop around for a new mouse and keyboard. One must keep one’s priorities straight.

  102. JCH says:

    There is a La Niña. Some places might get chilly and that’s exciting angech. But the La Niña is disintegrating, so soon those places will get warm again. Because OHC is high and the weakling trade winds are subsiding.

    I’m getting lazy. Has Mass published a blog yet proving AGW was not the cause of Stormy Daniels?

  103. Everett F Sargent says:

    Try UAH, that one has GLOBAL,NH,SH,TRPC (monthly).

  104. Joshua says:

    I doubt WUWT. More likely the lukewarmer’s gambit: Throw some (WUWT) “skeptics” under the bus while employing (Judith’s crib) selective reasoning w/r/t uncertainty.

  105. Ragnaar says:

    Curry had a recent post at her blog on SLR:

    “To put a 1-2 foot increase in global sea level by 2100 into context, consider the following natural variations in sea level…”

    “Storm surges from hurricanes or intense midlatitude storms can reach 30 ft [link]”

    The first sentence is what I come up with more or less using the two middle emission scenarios of the IPCC.

    The second sentence, I’ll just cut that number in half. So we have 1.5 feet by 2100 and 15 feet by maybe next year. The 1.5 feet has more certainty, and the 15 feet might never happen if the storms miss each time. And surges are temporary while SLR is not. It is true that the surge number, whatever it is, will grow slowly with SLR.

    A person or entity will decide which problems to work on.

  106. Everett F Sargent says:

    “intense midlatitude storms”

    Otherwise known as extratropical storms. And AFAIK those have never produced 30 ft of storm surge, not ever and not anywhere (but I have a friend who has a database, so don’t quote me just yet, mkay). 😦

    Water Levels = Tides + Waves + Surge + SLR (for inland waterways see hydrology or HEC or NOAA or USGS or USBR or FEMA or …) …
    http://www.hec.usace.army.mil/

    But please, do let us all know when JC removes her training wheels, mkay.

  107. JCH says:

    One of the JC bullet points:

    Storm surges from hurricanes or intense midlatitude storms can reach 30 ft [link]

    What percentage of the shoreline exposed to that sort of storm surge possibility has ever actually experienced a storm surge that size, or even 1/3rd that size? What is the point of the bullet point? Is it an honest point?

  108. Everett F Sargent says:

    For someone with my formal and informal training (self taught) and experience, when someone else sez …

    “The focus on emissions reductions as some sort of solution to sea level rise (apart from any determination of cause) is distracting from developing better land use policies and coastal engineering practices.”

    I go: you are joking, right? I mean, we should listen to you, why? Because book learning in 2018 is something that kids do in college; then they get a real job working in an A&E and get a PE, or go into academia or government and gain the necessary hands-on experience.

    Even the USACE consults with Wallingford (UK), or Delft (NL), or Japan, or Australia, or …

    Maybe JC should go here …
    36th International Conference on Coastal Engineering (Baltimore, MD even this year)
    http://www.icce2018.com/

    That would at least be a good start.

    She should also know that anything in the coastal zone must be approved by the USACE (federal mission statement) (EIR/EIS), then get state approval (another EIR/EIS), then get local approval (another EIR/EIS and/or zoning requirements), then build something (or not in GC’s case) that almost always requires a PE licence!

    Or go to one of these (“This meeting is open to the public.”) …
    https://www.federalregister.gov/documents/2017/05/12/2017-09648/board-on-coastal-engineering-research

    At some point, this becomes just s-o-o-o-o-o-o-o-o-o-o sad, that it does become COMIC GOLD! 🙂

    But can’t do that … because of the d-e-e-e-e-e-e-e-e-e-p state … can’t trust those gooberment infernal reeveeneeueers, dontcha know.

    Citizen Watts to the rescue!

  109. Ragnaar says:

    I get a United States GDP of $18 trillion for 2016. It depends who does the counting for damages. If it was $270 billion of them, that’s about 1.5% of GDP.

    The economic news is good. Wow my IRAs have never been worth this much before.

  110. Everett F Sargent says:

    ECS? Sorry for going so OT.

    JCH, I think your italics are a question (not a comment on JC’s blog).

    “What percentage of the shoreline exposed to that sort of storm surge possibility has ever actually experienced a storm surge that size, or even 1/3rd that size? What is the point of the bullet point? Is it an honest point?”

    Currently for extreme events, it is THE most important factor. You do joint PDFs though, thousands and thousands of them along the entire coastline: GOM (including all USA possessions like PR), Atlantic Coast, Pacific Coast (including HI/AK). You do a shipload of high-fidelity numerical models. Then you calculate the risk (my friend knows at least one to two orders of magnitude more about risk than I do). Most of the time you will end up plotting return periods (but it is actually frequency of occurrence, not time of occurrence), now using Peaks Over Threshold (again, see friend). For SLR you can usually get away with linear superposition of the frequency of occurrence (but be careful, because you are superposing into the frequency-of-occurrence domain something that ought to be convolved in the time-of-occurrence domain, or vice versa; e.g. add a meter to a return period of one femtosecond and you immediately get one meter, today, not 100 years from now. I’ve seen that done in the climate science literature ALL THE TIME, bad climate scientists, bad).

    If you do do SLR with joint PDFs, then you include a probabilistic-in-time SLR curve; the other stuff is IID (or assumed to be AFAIK, except with tides and numerical models going forwards, but then after say a typical 19-year epoch everything is out of phase, so to speak. NOAA does 19-year epochs due primarily to the 18.6-year nodal tide, but only carries out their predicted tides to the annual harmonic, S1 I believe, but don’t quote me on that one, mkay). I actually don’t know what the current state-of-the-art is in coastal engineering (I sort of left the building in 2002, came back in 2004-8 and again in 2012); all I know is that I am not, have never been, and will never be, a Sand Engineer. 😉
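    A toy example of the femtosecond point, with invented Gumbel parameters (not from any real site): shifting a fitted annual-maximum surge distribution by a fixed SLR offset collapses the return period of today’s design level, which is exactly why the offset has to be convolved with the SLR time history rather than applied as an instant constant.

```python
import numpy as np

# Invented Gumbel fit for annual-maximum storm surge at a hypothetical site.
mu, beta = 2.0, 0.5   # location and scale, metres

def return_level(T):
    # Gumbel quantile: the level exceeded once every T years on average
    return mu - beta * np.log(-np.log(1 - 1 / T))

def return_period(z):
    # Average years between exceedances of level z under the same fit
    return 1 / (1 - np.exp(-np.exp(-(z - mu) / beta)))

z100 = return_level(100.0)   # today's "100-year" water level
slr = 0.5                    # assumed future sea-level rise, metres
print(f"100-yr level today: {z100:.2f} m")
print(f"after {slr} m of SLR it recurs every {return_period(z100 - slr):.0f} yr")
```

    Applied naively as an instant offset, the 0.5 m turns the 100-year level into roughly a 37-year level today; in reality the offset grows along the SLR curve over decades, so the shortening arrives gradually.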

  111. Ragnaar says:

    Godzilla super storm surges: https://en.wikipedia.org/wiki/Minimax

    You aren’t getting me this time. A number like 30 feet says do something to a number of people. It recommends a transition. In a contest between more seawalls and more wind turbines, I hope to see seawalls gain and wind turbines flatten their growth. I think that overall it is the better answer with 100s of factors under consideration.

  112. JCH says:

    Sorry EFS, I should have read your prior comment more carefully, as you covered it.

    The message in her link is that the shoreline can withstand 30 ft storm surges, and that the puny little 1.5′ addition would barely change that. And that smaller surges are surely harmless against a shoreline that can defend against 30 feet.

    My memory is that the Hurricane Ike surge was forecast at 20+ feet and came in at under 20 feet. In Galveston it hit a seawall, one of the few areas of the USA coast that is actually armored. It did enormous damage, and would have been far worse if landfall had been down the coast a bit.

    They have been talking about seawalls and barriers in the Houston region for decades. Nothing of substance has been built. Why? It’s just way too expensive. Rice University has a pragmatic plan that will protect the ship channel and historic Galveston. The rest, I guess, is left to the swamps and the waves.

  113. JCH says:

    I don’t vote by my 401K, which Obama saved from disaster.

  114. Everett F Sargent says:

    Ragnaar,

    There are certainly 30 ft tropically sourced storm surges (e. g. typhoons/cyclones/hurricanes), but 30 ft EXTRATROPICAL storm surges (storms sourced outside the tropics), I’ll need to see some data, I can see maybe 20 ft, but maybe RT will show up and tell me something I don’t know.

    But, I really don’t know what you are trying to say, to be honest with you.

  115. Ragnaar says:

    JCH:

    President Obama takes office.
    The S & P 500 index is at about 800.
    President Obama leaves office.
    The S & P 500 index is at about 2200.
    Thank you President Obama.

  116. Everett F Sargent says:

    “The message in her link is that the shoreline can withstand 30 ft storm surges, and that the puny little 1.5′ addition would barely change that. And that smaller surges are surely harmless against a shoreline that can defend against 30 feet.”

    The only coastline I know of that is safe from a 30 ft storm surge is a coastline above say 60′ MHHW! Waves break at ~0.8 of the water depth (depth-limited wave breaking, H/d ~ 0.8); add runup and overtopping, and you go even higher.
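    Rough numbers behind that ~60′ figure (the runup allowance below is a made-up placeholder for illustration, not a design value):

```python
# Back-of-envelope for the ~60 ft "safe" elevation. The runup allowance is a
# placeholder for illustration, not a design value.
surge = 30.0            # ft of storm surge over the shoreline
breaker_ratio = 0.8     # depth-limited breaking, H/d ~ 0.8
wave_height = breaker_ratio * surge   # ft of breaking wave on top of the surge
runup_allowance = 6.0   # ft, purely illustrative
total = surge + wave_height + runup_allowance
print(f"surge + breaking wave + runup ~ {total:.0f} ft")  # → ~60 ft
```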

    The East and Gulf coasts are mostly ~10′ MHHW barrier islands, with inlets even (what doesn’t hit you in the face will come back around and hit you in the ass).

    After the Corps Jettified the NJ shorelines and made deep water navigation ports possible in every state, shoreline erosion went bigtime, the Corps now does not normally advocate for hardened coastal structures, so they are left with … wait for it … SAND! I don’t do sand, but the Corps and states do do sand bigtime (see ASBPA and/or FSBPA).

    All it takes is one storm surge and you can kiss that sand goodbye, as it will move offshore and longshore. But hey, in the meantime, you get to flip burgers in a coastal location near you until ole Mother Nature comes a calling.

    I have never advocated building, or even living, in the coastal zone (although I did live there for about one year in 1983; one must pay the bills and put food on said table, but otherwise you are never going to catch me within the 10,000-year floodplain if I can help it). But that’s just me being me. That non-advocacy predates my knowledge of SLR as it is likely to play out in the coming decades/centuries.

    SLR is but one component; wherever you are, whatever the planform is, be aware and be cautious. I thought all civil engineers were to err on the side of less risk, not more risk. Where did I go wrong? 😉

  117. JCH says:

    How is all this talk about warming patterns – as far as I know, always in the Eastern Pacific – any different from what I have been saying? I mean, I know I don’t know what I’m talking about, but I have been hammering on it for around 6 years, maybe longer.

  118. Ragnaar says:

    Minimax:
    “…minimizing the possible loss for a worst case (maximum loss) scenario.”

    That’s worrying about storm surges. So if an entity solves its storm surge problem, they can say we protected our stuff. If they worry about 2.3 inches of SLR per decade only, what can they say? Not much if a storm surge occurs on their property. Or they could say, we’re helping to solve the world’s SLR problem with wind turbines. At which point the shareholders will consider dismissing the board. This area might be argued not to be climate science. But I’d say one’s applying some science. Projections of SLR. Minimax is just something they thought I should know.
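    The linked minimax idea in code form, with invented losses: pick the action whose worst-case outcome is least bad.

```python
# Invented loss table, $M: rows are actions, columns are scenarios.
losses = {
    "do nothing":    {"no surge": 0,  "30 ft surge": 1000},
    "build seawall": {"no surge": 50, "30 ft surge": 100},
}

# Minimax: choose the action whose worst-case loss is smallest.
minimax_action = min(losses, key=lambda a: max(losses[a].values()))
print(minimax_action)  # the seawall caps the worst case at $100M
```

    With these made-up numbers the seawall wins under minimax even though “do nothing” is cheaper if the surge never comes, which is the storm-surge-first logic of the comment.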

  119. Everett F Sargent says:

    Ragnaar.

    Most entities are not now providing adequate storm surge protection other than sand and/or sand dunes.

    Cost benefit analysis is the usual metric. In the USA, at the federal level, it is a 50-year economic life, but designed for a 100-year event.
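    Worth spelling out what that criterion implies, assuming independent years: a structure with a 50-year economic life designed for the 100-year event still has roughly a 2-in-5 chance of seeing that event.

```python
# Chance that a 1-in-100-year event occurs at least once during a 50-year
# economic life, assuming independent years.
p_annual = 1 / 100
life_years = 50
p_at_least_once = 1 - (1 - p_annual) ** life_years
print(f"{p_at_least_once:.0%}")  # about 39%
```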

    The thing right now that has most people bothered is nuisance flooding. Completely different design criteria: not a storm design but a downtime design (and you really need virtually zero downtime for flooding and subsequent damage purposes). Additional height allowances are required for any future building, which is rather easy to do up front rather than after the fact.

    But that is stuff that is usually left to coastal zone planners/managers, after they receive the necessary design guidance. I am not a coastal zone planner but neither is JC. I know that JC can’t do the necessary heavy lifting, because she has zero subject matter expertise, in either planning or design.

    Right now, I don’t even know what JC is, other than someone that criticizes others. That’s the only real reason anyone would bother with or visit her blog: she criticizes others.

    It is enough for me to point out the silliness of whatever it is that she is going on about, simply because she does not have a lick of domain expertise with respect to the coastal zone. (She should post her cv and give a copy to each of her ‘so called’ clients; then they can compare that cv with others who already have 20-50 years of experience working in the coastal zone.)
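The “50-year economic life, designed for a 100-year event” criterion mentioned above implies a perhaps surprisingly high chance of actually seeing the design event within the structure’s life. A minimal sketch of the textbook binomial calculation (assuming independent years; this is not any agency’s actual design method):

```python
# Chance of at least one exceedance of the design event during the
# economic life, assuming independent years (textbook approximation).
p_annual = 0.01   # 100-year event: 1 % annual exceedance probability
life = 50         # 50-year economic life

p_at_least_one = 1.0 - (1.0 - p_annual) ** life
print(f"chance of >= 1 design-level event in {life} years: {p_at_least_one:.0%}")
```

Under these assumptions there is roughly a 40 % chance that the 100-year event is equalled or exceeded at least once during the 50-year life, which is why the design event is chosen to be much rarer than the economic life is long.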

  120. Everett F Sargent says:

    TL;DR …

    Real short now. Very simple question.

    Do you want a complete rookie building your house or do you want a complete professional building your house? They both charge the exact same hourly rate or provide the exact same bid.

  121. angech says:

    JCH says: ” there is cooling, and then there is cooling. 2018 is very likely going to end up top 5. That is not cooling; it’s warming. A lot of it.”
    When El Niño was coming there was a lot of build-up for 18 months and two failed attempts before it fully kicked in. It lasted a long time and there was a lag effect into 2017 and a near re-forming that no-one picked.
    So.
    I know that currently it is cooler in the Pacific and has been for months.
    I expect that the lag effect from this should give some cooling for the next 5 months.
    You are more savvy at getting the latest data and have said that this La Niña [not yet 5 months, but they keep changing the definition] is weak and will blow out.
    I accept that for as far as we can go into the future with the currents etc which is only 3-4 months.
    At that stage the die is cast.
    If the La Nina conditions consolidate and increase who knows?
    If they die and the warming resumes you will be right re likely in top 5 [don’t care which set is used].
    I will go with the colder forecast of course, with this kicker: If the year starts cold it usually ends up cold and USA and now Europe suggest a cold January.

  122. angech,
    I really don’t have much of a clue as to what you’re suggesting. While we sustain a planetary energy imbalance, which is largely a consequence of us emitting GHGs into the atmosphere, we will – on average – continue to warm. Variability about this long-term trend does not present some kind of challenge to this reality.

  123. JCH says:

    He’s suggesting a miracle.

    The one-off event has ended, and, unless it comes back, La Niña events are going to be meek. That’s why I tell the cultists at CargoCult Etc. to pray for the Kimikamikaze: Kim’s Divine Wind. Whether they know it or not, that is what they believe in. Kim is the high priest of Koolaid. Professor Curry’s prediction that 2018 will drop out of the top 5 is the head cultist’s prayer. So they are all on their knees praying.

    What has them nervous is it is looking very possible the .2 ℃ per decade prediction of the crummy models over the first two decades of the 21st century is now easily within reach. Defeat snatched from the jaws of the hiatus victory. It’s hard for the believers to accept. On to 2020.

    2001 to end of data, .19 ℃:

    https://imgur.com/kb2NgX5

  124. JCH says:

    2001 to end of data, .19 ℃ decadal:

  125. Joshua says:

    If the year starts cold it usually ends up cold and USA and now Europe suggest a cold January.

    Maybe Trump can help attack the deficit by simply reducing funding for climate analysis to the month of January and then just extrapolating?

  126. izen says:

    @-angtech
    ” If the year starts cold it usually ends up cold and USA and now Europe suggest a cold January.”

    Australia suggest a much hotter one.
    https://www.accuweather.com/en/au/sydney/22889/january-weather/22889

  127. Hyperactive Hydrologist says:

    I am still a bit baffled by the obsession with climate sensitivity. When you have an understanding of risk management it sort of makes the issue irrelevant. Let me give you an example from hydrology. In general there is not much difference in terms of peak flow or rainfall depth between a 2% and a 1% Annual Exceedance Probability (AEP) flood or storm, maybe 10-15%. Therefore, if rainfall increases by 10-15% it doubles the risk. All basins in the UK are expected to see this level of change in peak flow in the 2020s. By the 2080s the increase is expected to be in the range of 20-35%, so your 1% AEP event today would be a 3-4% AEP event in the 2080s.

    There is also the issue of unrealised natural variability. We have very short rainfall and flood records which are used to derive design flood events. It is very unlikely that these records capture the full range of uncertainty and therefore you can add another 10-20% to the 1%AEP event for the yet unrealised natural variability.

    Adapting to this level of risk and magnitude of event is incredibly challenging. The 2015/16 floods saw flood defences designed for 0.5% AEP floods overtopped by 0.5 m. Increasing the height of flood walls is often not practical and can increase flood risk downstream, and upstream storage would need to be huge. You get to the point where the only viable solution is managed retreat.
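Hyperactive Hydrologist’s point that a modest rainfall uplift roughly doubles the exceedance probability can be illustrated with a toy extreme-value calculation. This is a hedged sketch with invented Gumbel parameters, not UK design values:

```python
import math

# Toy Gumbel rainfall frequency curve. mu and beta are invented example
# parameters chosen so the 1 % and 2 % AEP depths differ by ~10 %.
mu, beta = 62.0, 20.0   # Gumbel location and scale, mm

def depth(aep):
    """Rainfall depth (mm) exceeded with annual probability `aep`."""
    return mu - beta * math.log(-math.log(1.0 - aep))

def aep(x):
    """Annual exceedance probability of depth x (inverse of `depth`)."""
    return 1.0 - math.exp(-math.exp(-(x - mu) / beta))

x1 = depth(0.01)             # today's 1 % AEP depth
uplift = 1.10                # assumed 10 % increase in all rainfall depths
new_aep = aep(x1 / uplift)   # AEP of today's 1 % depth after the uplift
print(f"1% AEP depth: {x1:.0f} mm -> AEP after 10% uplift: {new_aep:.1%}")
```

With these example parameters the 1 % and 2 % AEP depths differ by about 10 %, so a 10 % rainfall uplift turns today’s 1 % AEP depth into roughly a 2 % AEP one, i.e. it doubles the risk.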

  128. HH,
    I’m not really baffled by it, because I can see that it is a useful metric. However, it can confuse things (intentionally and unintentionally). We have the Lukewarmers who promote any suggestion that it might be small and who seem to suggest that because it might be small we should behave as if it actually will be. The other confusion is that it is sometimes presented as if it indicates how much we will actually warm, rather than as a metric representing how sensitive our climate is to external perturbations. How much we will actually warm depends on both what climate sensitivity is and on how much we end up emitting.

    In some sense, this is why I quite like the carbon budget framework, as it folds everything (climate sensitivity and the carbon cycle) into a single metric.

  129. Hyperactive Hydrologist says:

    aTTP,

    You really need to consider the impacts that go with the metric. Whether we warm by 2 °C or 4 °C the increase in risk is still severe. Developed countries in northern latitudes may be able to cope, but developing countries and countries with Mediterranean climates are pretty much screwed even with moderate levels of warming.

  130. HH,
    Yes, I agree that we should consider the impacts that go with the metric. I was mainly suggesting that something like a carbon budget can possibly overcome some of the confusion of focusing primarily on climate sensitivity.

  131. John Hartz says:

    ATTP: Recommendation:

    Append the IPCC/WMO definition (below) of Climate Sensitivity to each and every article you post on the topic.

    Climate Sensitivity

    In IPCC reports, equilibrium climate sensitivity refers to the equilibrium change in the annual mean global surface temperature following a doubling of the atmospheric equivalent carbon dioxide concentration. Due to computational constraints, the equilibrium climate sensitivity in a climate model is usually estimated by running an atmospheric general circulation model coupled to a mixed-layer ocean model, because equilibrium climate sensitivity is largely determined by atmospheric processes. Efficient models can be run to equilibrium with a dynamic ocean.

    The effective climate sensitivity is a related measure that circumvents the requirement of equilibrium. It is evaluated from model output for evolving non-equilibrium conditions. It is a measure of the strengths of the climate feedbacks at a particular time and may vary with forcing history and climate state. The climate sensitivity parameter (units: °C (W m–2)–1) refers to the equilibrium change in the annual mean global surface temperature following a unit change in radiative forcing.

    The transient climate response is the change in the global surface temperature, averaged over a 20-year period, centred at the time of atmospheric carbon dioxide doubling, that is, at year 70 in a 1% yr–1 compound carbon dioxide increase experiment with a global coupled climate model. It is a measure of the strength and rapidity of the surface temperature response to greenhouse gas forcing.

    Definition courtesy of IPCC AR4.

    All IPCC definitions taken from Climate Change 2007: The Physical Science Basis. Working Group I Contribution to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, Annex I, Glossary, pp. 941-954. Cambridge University Press.
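The glossary terms above connect through standard textbook relations; a minimal sketch using the common logarithmic forcing approximation (the coefficient and the sensitivity parameter below are conventional round-number assumptions, not exact IPCC values):

```python
import math

# Common approximation for CO2 radiative forcing: F = 5.35 * ln(C / C0) W m^-2,
# so a doubling (C / C0 = 2) gives roughly 3.7 W m^-2.
F2x = 5.35 * math.log(2.0)

# The climate sensitivity parameter S (degC per W m^-2) converts an
# equilibrium forcing into an equilibrium warming: ECS = S * F2x.
S = 0.81   # assumed value, chosen to give an ECS near 3 degC
ecs = S * F2x
print(f"F_2x = {F2x:.2f} W m^-2, ECS = {ecs:.1f} degC")
```

This is just the bookkeeping that links the glossary entries: the forcing sets the size of the push, and the sensitivity parameter sets how much equilibrium warming each W m^-2 eventually produces.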

  132. MarkR says:

    angech:
    “Clive’s model gives a simple ball park figure for ECS calculation. The degree to which it could and should be improved is the debate of course but it is nice to have a quick reckoner even if it turns out to be a bit under the odds.”

    This model is pretty old & Held et al. (2010) showed that we’ve understood its major problem for a while.

    Testing versus CMIP5, its median ECS is 0.9 °C too low, with a full sample range of 0.1-3.0 °C too low. So this method puts a lower bound on ECS but doesn’t really narrow the range. It’s physically wrong in a way that matters for the result, so I’d say that nowadays it’s at best a simple sanity check on ECS. Clive’s calculations combined with the model validation test say that odds are ECS is bigger than 2.5 °C (the corrected best estimate would be 3.4 °C) with a big range. Not really that interesting in the context of the studies published over the last decade or so.

    The interest and usefulness comes from interpreting it and working out why its results are different, and we’ve learned lots from this approach.
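For reference, the “one-box energy balance model” under discussion can be sketched in a few lines. The parameter values below are illustrative assumptions (chosen to give a roughly 12-year lag and an ECS near 3.2 °C), not Clive Best’s actual fit:

```python
import numpy as np

# One-box energy balance model: C * dT/dt = F(t) - lam * T
lam = 1.16     # feedback parameter (W m^-2 K^-1); ECS = F2x / lam
tau = 12.0     # assumed ocean lag timescale (years)
C = lam * tau  # effective heat capacity (W yr m^-2 K^-1)
F2x = 3.7      # radiative forcing from doubled CO2 (W m^-2)

# A 1 %/yr CO2 increase gives approximately linear forcing, doubling at year 70.
dt = 0.05
t = np.arange(0.0, 70.0 + dt, dt)
F = F2x * t / 70.0

# Forward-Euler integration of the box temperature, starting from equilibrium.
T = 0.0
for Fi in F[:-1]:
    T += dt * (Fi - lam * T) / C

ecs = F2x / lam
print(f"transient warming at CO2 doubling: {T:.2f} K")
print(f"equilibrium warming (ECS):         {ecs:.2f} K")
```

The transient warming at the time of doubling sits well below the equilibrium value because the box is still catching up with the forcing. Held et al. (2010) showed that a single ocean timescale like this misrepresents the real multi-timescale ocean uptake, which is why fitting such a model to observations biases the inferred ECS low.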

  133. John Hartz says:

    MarkR: The following caught my eye:

    WMO uses datasets (based on monthly climatological data from observing sites) from the United States National Oceanic and Atmospheric Administration, NASA’s Goddard Institute for Space Studies, and the United Kingdom’s Met Office Hadley Centre and the University of East Anglia’s Climatic Research Unit in the United Kingdom.

    It also uses reanalysis datasets from the European Centre for Medium Range Weather Forecasts and its Copernicus Climate Change Service, and the Japan Meteorological Agency. This method combines millions of meteorological and marine observations, including from satellites, with models to produce a complete reanalysis of the atmosphere. The combination of observations with models makes it possible to estimate temperatures at any time and in any place across the globe, even in data-sparse areas such as the polar regions.

    WMO confirms 2017 among the three warmest years on record, WMO Press Release, Jan 18, 2018

    My question: Do GCM’s ever use the WMO’s “blend” of temperature datasets in their projections? If not, why not?

  134. Ragnaar says:

    Everett F Sargent:

    The following is from a Forbes article:

    Mind the say-do gap. This is all about trust, which is the bedrock of effective leadership.

    Make the complex simple. Your employees and customers are being bombarded 24/7 by information…

    Find your own voice. Use language that’s distinctly your own.

    Be visible. Visibility is about letting your key stakeholders get a feel for who you are and what you care about.

    Listen with your eyes as well as your ears. Stop, look and listen.
    ————————————————————————

    I’ve seen some comments where the question is raised, What does Curry know?

    Teachers aren’t necessarily domain experts. Your high school physics teacher was not one. And teachers are not necessarily practicing engineers. And there is this thing, seen at least weekly in the news, called climate science. A teacher conveys that to the rest of us.

    From my reading of her blog, I think she has some of each of the above five qualities. There is a phrase, effective communication, that can probably be defined a number of ways. I’ll try by saying it is when you’ve added value to your target audience. No matter what she says about SLR, it still requires that the decision-maker audience end up adding value to whatever they rule over. People in Germany, say, were convinced to add a bunch of renewables. Was value added in total? It is possible that a more conservative, business-orientated path would have resulted in a higher total value today. And the people of Germany had people communicating to them about climate science.

    So we have: Science, communication and value. I think Curry has indicated an approach more favorable to adaptation. I am there too, but it’s not so hard given the challenges that renewables face combined with the grid system already in place. So the goal is to provide Value given the Science, by Communicating. And in that case, knowledge and competency would seem to be needed in each of these three areas.

  135. Ragnaar,

    I think Curry has indicated an approach more favorable to adaptation.

    I agree. The problem that I sometimes have is that Judith will claim that she is not advocating for anything, and yet it seems clear that her approach is indeed more favourable to adaptation. I don’t have a specific problem with the latter (I disagree with it, but I can at least appreciate that some might favour that). My own view, FWIW, is that people should be willing to acknowledge the potential policy implications of what they say publicly.

  136. BBD says:

    People were convinced say in Germany to add a bunch of renewables.

    It’s important to remember that Germany’s energy policy is a bit schizophrenic. The decision to begin a rapid phase-out of nuclear was driven by a long-established anti-nuclear political lobby, not by a coherent plan to decarbonise. Decarbonisation as a policy goal actually post-dates the anti-nuclear advocacy. This means that Germany isn’t a representative example of coherent decarbonisation policy.

  137. angech says:

    Izen. “Australia suggest a much hotter one.
    https://www.accuweather.com/en/au/sydney/22889/january-weather/22889“.
    Thanks.
    What a weird range of values historically.
    Sydney is one set of places, will wait and see what BOM puts up for January later.
    Seas are warm around Australia consistent with La Niña at moment.

  138. JCH says:

    In the forecast, the last week of January is cold, so January could have a lower anomaly than any 2016 month. But January is supposed to be the butt-kicker month for this La Niña, and in terms of that, it’s basically a hot baby. Then we’ll hear about lags. What’s the lag of a hot baby La Niña? A torrid teenager.

  139. John Hartz says:

    Recent articles about the weather in Australia:

    How climate change is driving extreme heatwaves by Michael Lucy, Cosmos, Jan 19, 2018

    ‘Significant’ heatwave roasts south-eastern Australia as global records melt by Peter Hannam, Sydney Morning Herald, Jan 19, 2018

  140. John Hartz says:

    JCH: Forecast for where?

  141. JCH says:

    The surface air temperature of the earth.

  142. John Hartz says:

    JCH: Thanks for the clarification.

  143. angech says:

    John
    More hot news straight off the Australian Press
    “Sydney was the hottest place in the world on Sunday. (? Jan 7th 2018)
    The mercury in the Sydney suburb of Penrith hit 47.3 degrees Celsius, or 117.14 F.”
    Nice to know the rest of the world was cooling down?
    We seem to keep wanting to focus on extremes as a way of proving our particular viewpoint.
    It cuts both ways.
    It is Summer in Australia, it can be hot. It is not a proof or even an indication of anything.
    Sorry.

  144. angech,

    It is Summer in Australia, it can be hot. It is not a proof or even an indication of anything.

    Doesn’t change that the overall trend is that we are warming.

  145. angech says:

    MarkR,
    “The interest and usefulness comes from interpreting it and working out why its results are different and we’ve learned lots from this approach.”
    More than that, we also gain insight into the way that you feel ECS can and should be, and is, calculated.
    Not up to speed yet on even trying to debate it from my skeptical viewpoint, but happy that you are putting yourself and your ideas out there for discussion and constructive criticism.
    ATTP has put up a number of articles on ECS recently and it appears to be a subject of increasing interest and debate.
    Thanks John H for that summary definition to work from.
    The unknown unknowns, or as JCH calls them, a miracle, still remains my hope of salvation without sweating it.

  146. BBD says:

    The unknown unknowns, or as JCH calls them, a miracle, still remains my hope of salvation without sweating it.

    Not very rational of you, old chap.

    It is Summer in Australia, it can be hot. It is not a proof or even an indication of anything.

    Australian temperature data:

    It’s getting hotter by the decade and it will carry on getting hotter still as CO2 forcing increases.

  147. MarkR says:

    John Hartz:
    “My question: Do GCM’s ever use the WMO’s “blend” of temperature datasets in their projections? If not, why not?”

    Basically no. GCMs just solve the equations of physics and it’s easy to get the model’s true global temperature straight out of the computer.

    Our measurements sample the real world temperature, we don’t have an exact measurement. The WMO are just trying to combine the samples in the best way we can. The error bars on these values are supposed to represent how close they are to the truth.
