Friday, November 21, 2014

The Climate Scam’s Meltdown: Corruption is now systemic in climate science

A superb essay by Australian marine biologist Walter Starck published at Quadrant Online exposes how
"The rent-seekers, opportunists, third-rate academics, carbon-market scam artists and peddlers of catastrophic prophecy can see the alarmist bubble deflating, so they're trying harder than ever to sustain the scare. Problem is, Mother Nature isn't cooperating"
The Climate Scam’s Meltdown

By Walter Starck

This doesn’t mean the climate change “debate” will stop, the news media will cease reporting weather as a dire threat, or that the true believers will no longer be obsessed by it. However, the ultimate arbiter, climate itself, has made clear its decision by ceasing to warm for over 18 years. Despite the ongoing use of fossil fuels, a proclaimed 95% certainty of 97% of scientists and the high-powered projections of the world’s most advanced climate models, the climate has refused to pay the slightest heed.

Contrary to all the confidence and predictions of alleged experts, storms are no more intense nor frequent, while droughts, floods and sea levels have declined to confirm alarmists’ barely concealed hopes of disasters. The simple fact is that the alleged experts and their high-powered models were wrong. The climate has ceased to warm and, with little or no greenhouse warming, the entire theory of Catastrophic Anthropogenic Global Warming (CAGW), aka Climate Change (CC), aka Global Warming, aka Extreme Weather, is left with no basis.

The debate over CC has been unique in the history of science in that its proponents have largely abandoned the primacy of evidence and openly declared methodology in favour of self-proclaimed authority backed by their own confidential methods and models. It is also unique in that the alarmists refuse to directly address their critics' arguments, preferring to ignore, censor and personally denigrate them. In a few instances in the early part of the public debate, the proponents attempted direct debate with their critics but came away looking decidedly second-best and they soon refused any further direct discussion. With no convincing answers to the uncertainties and conflicting evidence raised by their opponents they simply chose to ignore them, declare the science “settled” and anoint themselves as the only experts. All who disagreed were deemed to be fools, knaves and/or in the pay and pocket of Big Energy.

With a naive and compliant news media steeped in the same politically correct, left wing academic indoctrination as the researchers, the latter enjoyed a near monopoly on favourable news coverage. Self-serving publicity releases were regurgitated undigested beneath the by-lines of environmental “reporters”, who eagerly reduced themselves to unquestioning stenographers.

Yet even as the alarmists received kid-glove treatment in the mainstream media, the Internet has been a very different story. Not only did the climate alarmists have no advantage online, the thinking public was increasingly looking to the Web as their primary source for news. This digital realm was outside any particular control or influence, open to the airing of opposing argument and evidence. It was also a forum for the exposure of malpractice, regularly producing exposés which would shatter the façade of scientific expertise and propriety the alarmists had erected around themselves. Think here of how Wattsupwiththat demolished the charlatan Michael Mann and his infamous hockey stick, and the Climategate emails revealed the lengths professional warmists are prepared to go in order to silence sceptics, not least by debasing the conventions of the peer-review process.

In retreat, climate alarmists are now trying to deny the lack of warming while fiddling the temperature record in an effort to “prove” it is continuing. Their ever-more imaginative explanations — the heat is hiding at the bottom of the ocean; trade winds are skewing sea-temperature readings — increasingly smack of desperation. Making matters worse for the alarmists, there is increasing evidence that the global climate has not only ceased to warm but may actually be starting to cool. Severe, often record-breaking winter weather demands ever more undeclared “adjustments” to the temperature record, and more and more of these are being exposed. Overwhelmingly these serve to reduce past temperatures and increase more recent ones, without which the lack of warming would be more obvious.

When such changes to the record have been discovered and questioned, the response has been to waffle about “homogenization” and “world’s best practice”, or to suggest such corrections are needed because, say, the Japanese bombed Darwin, a weather station was moved and, therefore, the coastal city’s temperature record must be reconstructed by drawing on numbers from Daly Waters, far inland. These explanations, however, are inevitably long on hypothetical context and gallingly deficient in specific justification.

More broadly a similar pattern of response has also been made in various other instances wherein malpractice in climate science has been exposed. At first the problem is denied, then it is dismissed as being of no importance and, finally, the attempt is made to justify it as excusable error. In the more egregious instances, when the scandal is no longer in the news, the miscreant may then be given some prestigious award, thus certifying the unimportance of any misdeeds. To any thinking observer, all this, when combined with a noticeable lack of any disavowal by colleagues, can only confirm the corruption now systemic in climate science.

Currently the recent US/China agreement is being touted as an important breakthrough in the battle against climate change. In reality it amounts to a non-binding agreement to do nothing different before 2030. Until then China is free to continue increasing emissions while the US agrees to continue reducing its own in line with the reductions already achieved and further anticipated through the ongoing switch from coal and oil to natural gas. For China this agreement affords cost-free relief from diplomatic pressure over their increasing emissions. On the US side it provides President Obama the excuse of diplomatic obligations to invoke his executive authority to implement various measures unlikely to receive approval from a Republican-controlled Congress. As ratification by the Congress would be required for any binding climate agreement, and such approval is likely to be long delayed or never granted, the blank cheque for the exercise of executive authority may remain useful for some time.

The reality is that the threat of catastrophic climate change has almost certainly been vastly exaggerated. At pre-industrial levels of CO2 the back-radiated IR energy in the absorption bands of this gas was virtually all absorbed within a few tens of metres of the surface. More CO2 only concentrates the initial absorption a bit closer to the surface but the mixing at that level still quickly distributes the heat energy through a much larger volume of the lower troposphere while, at the same time, also increasing transport of heat away from the surface by enhancing evaporation, which in turn can be expected to increase cloud cover. How much actual warming may result is highly problematic. Empirical evidence is now indicating that any such warming is probably much less than has been estimated by the alarmists — probably so little as to be over-ridden by other natural variables. The only significant effect attributable with any confidence to increased CO2 thus far has been a marked greening of arid regions and an increase in agricultural yields.
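The absorption-depth argument above can be sketched with the Beer-Lambert law. The band-average absorption cross-section used here is a hypothetical round number chosen only to reproduce an absorption depth of a few tens of metres, as the paragraph states; it is not a measured spectroscopic value.

```python
def efolding_depth_m(co2_ppm, k_band=1e-23, air_density_m3=2.5e25):
    """E-folding absorption depth L = 1 / (k * n) from the Beer-Lambert
    law I(z) = I0 * exp(-k * n * z), where n is the CO2 number density.
    k_band is a hypothetical band-average absorption cross-section
    (m^2 per molecule) chosen to give depths of a few tens of metres."""
    n_co2 = air_density_m3 * co2_ppm * 1e-6   # CO2 molecules per m^3
    return 1.0 / (k_band * n_co2)

# Raising CO2 shortens the depth in inverse proportion: the initial
# absorption simply moves closer to the surface.
depth_280 = efolding_depth_m(280)   # pre-industrial
depth_400 = efolding_depth_m(400)   # modern
```

Under these assumed numbers, going from 280 to 400 ppm shortens the e-folding depth from roughly 14 m to 10 m, illustrating the paragraph's point that added CO2 concentrates the initial absorption nearer the surface rather than adding proportional warming.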

With or without any agreement or government initiatives, economics, technological developments and demographic changes will in due course inevitably reduce the demand for fossil fuels and replace them with other and cleaner sources of energy. Thorium- and fusion-reactor developments are showing increasing promise of providing effectively unlimited cheap and clean energy within a few decades. For domestic use, solar voltaic technology is beginning to become competitive with mains electricity, with further gains in cost effectiveness near certain in the near future. Major advances in storage technology are also well underway and expected to become commercially available within a few years. Better cheaper solar technology to power homes and vehicles is likely to drive the beginning of mass uptake within a decade. This will be impelled by cost effectiveness, with subsidies unnecessary. Indeed, such support risks doing more harm than good if it diverts development and uptake from the best and most efficient technologies emerging from a complex, rapidly changing and impossible-to-predict scientific frontier.

An argument is often made that the climate-change threat must be real because a conspiracy involving an overwhelming majority of the world’s scientists is simply not credible. This is disingenuous in that the climate threat, as exemplified by the IPCC’s scare machine, is far from representing a scientific consensus, even a majority. Global scientific opinion on this matter is highly mixed with the alarmist position concentrated in Europe and the Anglosphere. Even here thousands of dissenters exist, including many highly qualified and respected researchers with very relevant expertise.

The core alarmist proponents only comprise a few dozen, mostly third-rate, academics whose scientific reputations are minimal outside of climate alarmism. They co-opted the niche, little known interdisciplinary field of climatology, proclaimed themselves to be the world authorities, declared a global crisis, received lavish funding to research it and gained global attention. They have been aided and abetted by sundry fellow travellers who see advantage for various other agendas. A conspiracy does not require secret planning. It can be implemented just as easily with a wink and a nod when the aims and methods are apparent to all the participants. It is time to recognise the climate scam for what it is: a conspiracy to defraud on a monumental scale.

Although climate itself is presenting its irrefutable opposing argument, failed prophets never willingly concede defeat until their mouths are stopped with the dust of reality. In this instance gob-stopping reality seems likely to take the form of severe winter weather leading to a widespread collapse of electrical power in an overloaded grid suffering from underinvestment, malinvestment, restraints and neglect. All these stem from years of misguided climate policies. Until the crunch comes, the rent-seekers and their useful idiots in the press will rant and rage without pause, their livelihoods and careers hanging on their ability to perpetuate the hoax they foisted on the rest of us.

As so often, Shakespeare said it best: “A tale, told by an idiot, full of sound and fury, signifying nothing.”

A marine biologist, Walter Starck has spent much of his career studying coral reef and marine fishery ecosystems

“Thing of the Past” Departure from Normal, October 2014 Edition

...just wait until the November data is complete

h/t Willie Soon

Thursday, November 20, 2014

Scientist proved 214 years ago that a cold body cannot make a hot body hotter; in fact, the opposite

A paper published in the American Journal of Physics describes the simple experiment of Pictet in 1800, demonstrating that a cold body cannot make a hot body hotter; in fact, quite the opposite. It wasn't until 51 years later, in 1851, that Lord Kelvin and Clausius introduced the 2nd law of thermodynamics to explain the theory behind this phenomenon: heat transfers one way only, from hot to cold, as necessary to maximize entropy production.

Subsequently, quantum theory also explained why a lower temperature/frequency/energy body cannot increase the temperature of a higher temperature/frequency/energy body, because all of the higher frequency quantum microstates of the hotter body are already filled and cannot be raised higher by lower energy photons from the lower temperature/frequency/energy body. 

Although radiation between a hot and cold body is bidirectional, heat transfer is always one way only from hot to cold, as transfer of heat from cold to hot requires an impossible reduction of entropy (without work input) forbidden by the 2nd law. 
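The one-way character of net heat transfer described above follows directly from the Stefan-Boltzmann law. A minimal sketch for two ideal facing surfaces (the temperatures below are illustrative values for a room-temperature thermoscope and a flask of ice):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_flux(t_hot_k, t_cold_k, emissivity=1.0):
    """Net radiative flux between two ideal facing surfaces.
    Radiation is bidirectional, but the net flux sigma * (Th^4 - Tc^4)
    is always directed from hot to cold, consistent with the 2nd law."""
    return emissivity * SIGMA * (t_hot_k**4 - t_cold_k**4)

# Thermoscope at room temperature (~293 K) facing a flask of ice (~255 K):
q = net_flux(293.0, 255.0)   # positive: the warmer body loses energy
```

Swapping the arguments flips the sign, never the direction of net heat flow: the cooler body always receives, and the warmer body always loses.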

Pictet's experimental setup was simple as shown in Fig 1 below and described in the following comment by Rosco:

Two polished concave metal mirrors A & B focus radiation from a cold body C to a warmer body D [a thermometer], which causes a decrease in temperature of D below the starting ambient room temperature.
Rosco says: 2014/11/20 at 1:50 PM 
The really funny thing is that centuries ago Pictet conducted an experiment in radiation using polished concave metal mirrors. Each mirror was placed far enough from each other such that conduction/convection played little part in the experiment. 
He initially placed a hot object at one focus and a thermoscope at the other focus.
The thermoscope indicated a temperature increase due to IR. He allowed for equilibrium with room temperature to reestablish.
He then placed a cold object – a container of snow/ice – at the focus and – voilà – the thermoscope indicated a temperature decrease. 
He introduced a “new” energy source – the flask of snow/ice – emitting say 235 W/sqm at say minus 18 C so remote from the thermoscope that the low conductivity of air played no part in either the hot or cold energy transfer BUT he PROVED that cold does not heat hot – quite the reverse occurred. 
Modern physics explains this phenomenon by claiming the mirror screens the thermoscope from say up to 50% of the radiation from other objects in the room and the thermoscope is then radiating more to the flask than it gets back from the flask and hence decreases in temperature. 
This explanation sounds reasonable but fails to explain why the air around the thermoscope was unable to maintain it in equilibrium with room temperature. 
BUT any way you care to spin this the radiation from a cold object always results in a decrease in temperature of a hotter object. 
This has been known for centuries until climate science invented their fantasies.

Full paper

Indeed, Pictet's experiment also explains why solar cookers are used today in the 3rd world as coolers/refrigerators: simply pointing a solar cooker away from the Sun at clear sky, day or night, causes the focal-point temperature to decrease by increasing heat loss from the hotter focal point to the colder [ave -18C] atmosphere. Experiments show a solar cooker can even be used to produce ice when the ambient air temperature is +6°C:

“If at night the temperature was within 6 °C or 10°F of freezing, nighttime cooling could be used to create ice. Previous tests at BYU (in the autumn and with less water) achieved ice formation by 8 a.m. when the minimum ambient night-time temperature was about 48 °F.”
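The sky-cooling effect behind these ice-making tests can be estimated with the same radiative balance. The effective clear-sky temperature of -18 °C matches the figure quoted in the post; the emissivity is an assumed illustrative value.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiative_loss(t_surface_c, t_sky_c, emissivity=0.95):
    """Net upward radiative flux (W/m^2) from a surface exposed to a
    clear sky with effective radiating temperature t_sky_c.  A positive
    value means the surface cools; a mirror pointed at the sky
    concentrates this cold-sky view at its focus."""
    th, tc = t_surface_c + 273.15, t_sky_c + 273.15
    return emissivity * SIGMA * (th**4 - tc**4)

# Water near freezing under a clear sky with an assumed effective sky
# temperature of -18 C still loses heat radiatively:
loss = radiative_loss(0.0, -18.0)   # positive, so sub-ambient cooling
```

The net loss of several tens of W/m² is what allows water to freeze at night even when the surrounding air stays a few degrees above 0 °C, as in the BYU tests quoted above.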

Flashback: Climate scientists said warming decreases lake-effect snows

...but have conveniently changed their tune with the recent record-breaking lake-effect snow in the Buffalo area to claim global warming increases lake-effect snowfall. Four papers below collected by commenter Kenneth Richard find otherwise:

Global warming causes ~~less~~ more snow
Kenneth Richard November 19, 2014 at 11:33 PM

The record lake-effect snow in the Buffalo area (November 18-20, 2014) has been said to be caused by global warming...because less ice cover due to warming means more precipitation events in the form of snow...or so it's claimed.

But scientists have, in the past, concluded that global warming causes reduced lake-effect snow, not increases in lake-effect snow:

1) A general increase in LCS [lake-contribution snowfall] from the early 1920s to the 1950–80 period [during the 1970's ice age scare] at locations typically downwind of the lake was found. Thereafter, LCS decreased through the early 2000s, indicating a distinct trend reversal that is not reported by earlier studies. The reasons for this reversal are unclear. The reversal is consistent with observed increasing minimum temperatures during winter months after the 1970s, however.
2) Thus, there may be little change in the frequency of heavy lake-effect snow in the Lake Superior snowbelt and a substantial decrease in the southern Lake Michigan and Lake Erie snowbelts. Air-temperature [warming] was found to be the primary determining factor in reducing the frequency of heavy lake-effect events in this study...Anticipated regional impacts of climate change on lake-effect snow patterns – suggest almost no change [in lake-effect snowfall] in the northernmost belts but approximately a 50% decrease in southernmost belts.
3) Assessment of Potential Effects of Climate Change on Heavy Lake-Effect Snowstorms Near Lake Erie
...Surface conditions favorable for heavy lake-effect snow decreased in frequency by 50% and 90% for the HadCM2 and CGCM1 [models], respectively, by the late 21st Century. This reduction was due almost entirely to a decrease in the number of occurrences of surface air temperature in the range of −10 to 0°C, which in turn was the result of an increase in average winter air temperatures.
4) Another paper, from 1971, saying that global cooling (during the 1970's ice age scare) contributed to increased lake-effect snowfall from the 1940s to the 1970s, and that global warming (during the 1920s and 1930s) contributed to decreased lake-effect snowfall.

Lake effect snowfall to the lee of the Great Lakes, its role in Michigan
Evidence suggests that lake effect snowfall has significantly increased during the past several decades, particularly in Southern Michigan and Northern Indiana. While the observed changes cannot be definitively ascribed to any single factor, it seems likely that a general cooling of winter temperatures may be partially responsible for this climatic change. [M]any of the snowfall time-series curves for the lake stations show downward trends during the 1920’s and 1930’s, at the height of the recent warm period, and the more recent snowfall increase has coincided with a general world-wide cooling which has occurred in the last several decades [1940s-1970s]. Recent evidence derived from [isotope] analysis of ice core samples on the Greenland ice cap indicates a continuance of this cooling trend for another 20 or 30 years [through the 1990s].

New paper finds another non-hockey-stick in Tibet

A paper published today in Quaternary Research finds another non-hockey-stick in Tibet, and also finds an inverse relationship between temperature and lake levels on the Tibet plateau. 

One of many simplistic memes is that "warmer means wetter," since warmer air can hold more water vapor, but this is only one side of the "equation." If warming causes both more evaporation and more precipitation, either of these could dominate to cause a net wetter or drier climate. In the case of this lake in Tibet, the authors find that lake levels declined during warmer periods because increased evaporation during warm summers dominated over precipitation. 

The paper joins hundreds of others demonstrating that the hydrological cycle is far more complex than assumed by model simulations, which are unable to directly or skillfully simulate the hydrological cycle/convection/evaporation/clouds/etc and instead rely upon fudge factor "parameterizations" for these critical aspects necessary to skillfully project climate change. 

Looking at only one side of the equation and ignoring or not fully considering the opposing negative-feedbacks is unfortunately rampant in climate science, including the false assumption that radiative forcing dominates over negative-feedback convection in the troposphere, which in-turn has led to a gross exaggeration of greenhouse warming. 
Top graph shows another non-hockey-stick temperature reconstruction over the past 17,200 years, with temperatures higher during the Medieval Warm Period ~1100 years ago, the Roman Warm Period ~2000 years ago, and the Holocene Climate Optimum ~6000–4000 years ago, as compared to the end of the record [left side of graph]. Horizontal axis is thousands of years before the present.
Bottom graph shows an inverse relationship between temperature and lake levels over the past 4000 years.

We investigate the distribution of archaeal lipids in a 5.8-m-long sedimentary core recovered from Lake Qinghai to extract regional hydroclimate and temperature signals since the last deglaciation for this important region. The paleohydrology was reconstructed from the relative abundance of thaumarchaeol (%thaum) and the archaeol and caldarchaeol ecometric (ACE) index. The %thaum-inferred lake-level record was extended to deglaciation, showing three periods (11.9–13.0, 14.1–14.7 and 15.1–17.2 cal ka BP) with relatively higher lake levels than those during the early Holocene. The ACE record demonstrates three periods (10.6–11.2, 13.2–13.4 and 17.4–17.6 cal ka BP) of elevated salinity when the lake was shallow. Filtered TEX86 record based on archaeal lipid distributions corresponded to relatively higher lake levels, implying that a certain lake size is required for using the TEX86 paleothermometer. At 1–4 cal ka BP, the reconstructed temperature fluctuated significantly and correlated negatively with inferred lake level, indicating that lake temperature and hydrological change might be coupled during this period. We attribute this co-variance to the importance of summer temperature in controlling evaporation for this arid/semi-arid region. Moreover, our results indicate that archaeal lipids have potential in reconstructing paleoclimate patterns from lacustrine sedimentary cores, but the data should be interpreted with care.

New paper finds large surface solar radiation increase of 4% per decade & UV increase 7% per decade

A paper published today in Atmospheric Chemistry and Physics finds global solar radiation at the surface in Belgium has significantly and substantially increased by 4% per decade from 1991-2013, and solar UV radiation at the surface has increased even more by 7% per decade. 

According to the authors, the findings corroborate others for Europe as well as the well-known global brightening phenomenon, which followed the global dimming period from ~1970-1985 that was responsible for the ice age scare of the 1970's.  

The authors find a (statistically insignificant) aerosol optical depth trend of -8%/decade from 1991-2013, which could be due to a decrease in cloud cover and/or other aerosols. As noted by Dr. Roy Spencer, a mere 1-2% change in cloud cover can alone account for global warming or cooling. 

The authors also find total column ozone, which is primarily generated by solar UV and can act as a solar amplification mechanism, has increased by 3%/decade. 

The effects of solar dimming and brightening on climate are far greater than those attributed to greenhouse gases, yet they have not been simulated by climate models. These observed trends of solar surface radiation dimming and brightening correspond well to the observed global temperature changes over the past 50 years, and to a far greater extent than do CO2 levels.  

Findings from the paper:

erythemal ultraviolet (UV) dose (Sery):  +7%/decade
global solar radiation (Sg):             +4%/decade
total ozone column:                      +3%/decade
aerosol optical depth:                   -8%/decade (insignificant)
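For illustration, a trend expressed in percent per decade, as in the table above, could be computed from a monthly series with an ordinary least-squares fit. The series below is synthetic, not the Uccle data, and a real analysis would use deseasonalized anomalies and significance tests:

```python
def trend_percent_per_decade(monthly_values):
    """Ordinary least-squares slope of a monthly series, expressed as a
    percentage of the series mean per decade -- a minimal stand-in for
    the linear-trend fits reported in the paper."""
    n = len(monthly_values)
    t = [i / 12.0 for i in range(n)]   # time in years
    mt = sum(t) / n
    mv = sum(monthly_values) / n
    slope = (sum((ti - mt) * (vi - mv) for ti, vi in zip(t, monthly_values))
             / sum((ti - mt) ** 2 for ti in t))   # units per year
    return 100.0 * slope * 10.0 / mv              # percent per decade

# Synthetic 1991-2013 monthly series rising 0.4% of its base value
# per year (276 months = 23 years):
series = [100.0 * (1 + 0.004 * (i / 12.0)) for i in range(276)]
```

Because the trend is normalized by the series mean rather than its starting value, the fitted result for this series comes out slightly under the nominal 4% per decade.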


Global solar radiation

Concerning the global solar radiation, many publications agree on the existence of a solar dimming period between 1970 and 1985 and a subsequent solar brightening period (Norris and Wild, 2007; Solomon et al., 2007; Makowski et al., 2009; Stjern et al., 2009; Wild et al., 2009; Sanchez-Lorenzo and Wild, 2012). Different studies have calculated the trend in Sg after 1985. The trend in Sg [global solar radiation] from GEBA (Global Energy Balance Archive) between 1987 and 2002 is equal to +1.4 (±3.4) W m-2 per decade according to Norris and Wild (2007). Stjern et al. (2009) found a total change in the mean surface solar radiation trend over 11 stations in northern Europe of +4.4% between 1983 and 2003. In the Fourth Assessment Report of the IPCC (Solomon et al., 2007), 421 sites were analyzed; between 1992 and 2002, the change of all-sky surface solar radiation was equal to 0.66 W m-2 per year. Wild et al. (2009) investigated the global solar radiation from 133 stations from the GEBA/World Radiation Data Centre belonging to different regions in Europe. All series showed an increase over the entire period, with a pronounced upward tendency since 2000. For the Benelux region, the linear change between 1985 and 2005 is equal to +0.42 W m-2 per year, compared to the pan-European average trend of +0.33 W m-2 per year (or +0.24 W m-2 if the anomaly of the 2003 heat wave is excluded) (Wild et al., 2009). Our trend at Uccle of +0.5 (±0.2) W m-2 per year (or +4% per decade) agrees within the error bars with the results from Wild et al. (2009).

Relations between erythemal UV dose, global solar radiation, total ozone column and aerosol optical depth at Uccle, Belgium

Atmospheric Chemistry and Physics, 14, 12251-12270, 2014

Author(s): V. De Bock, H. De Backer, R. Van Malderen, A. Mangold, and A. Delcloo

At Uccle, Belgium, a long time series (1991–2013) of simultaneous measurements of erythemal ultraviolet (UV) dose (Sery), global solar radiation (Sg), total ozone column (QO3) and aerosol optical depth (τaer) (at 320.1 nm) is available, which allows for an extensive study of the changes in the variables over time. Linear trends were determined for the different monthly anomalies time series. Sery, Sg and QO3 all increase by respectively 7, 4 and 3% per decade. τaer shows an insignificant negative trend of −8% per decade. These trends agree with results found in the literature for sites with comparable latitudes. A change-point analysis, which determines whether there is a significant change in the mean of the time series, is applied to the monthly anomalies time series of the variables. Only for Sery and QO3 was a significant change point present in the time series, around February 1998 and March 1998, respectively. The change point in QO3 corresponds with results found in the literature, where the change in ozone levels around 1997 is attributed to the recovery of ozone. A multiple linear regression (MLR) analysis is applied to the data in order to study the influence of Sg, QO3 and τaer on Sery. Together these parameters are able to explain 94% of the variation in Sery. Most of the variation (56%) in Sery is explained by Sg. The regression model performs well, with a slight tendency to underestimate the measured Sery values and with a mean absolute bias error (MABE) of 18%. However, in winter, negative Sery values are modeled. Applying the MLR to the individual seasons solves this issue. The seasonal models have an adjusted R2 value higher than 0.8 and the correlation between modeled and measured Sery values is higher than 0.9 for each season. The summer model gives the best performance, with an absolute mean error of only 6%. 
However, the seasonal regression models do not always represent reality, where an increase in Sery is accompanied by an increase in QO3 and a decrease in τaer. In all seasonal models, Sg is the factor that contributes the most to the variation in Sery, so there is no doubt about the necessity to include this factor in the regression models. The individual contribution of τaer to Sery is very low, and for this reason it seems unnecessary to include τaer in the MLR analysis. Including QO3, however, is justified to increase the adjusted R2 and to decrease the MABE of the model.
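The abstract's multiple linear regression could be sketched as below. The data are synthetic stand-ins with an effect structure chosen so that, by construction, most of the variance comes from Sg, loosely mirroring the paper's setup; the coefficients and variable names are illustrative, not the paper's fitted values.

```python
import numpy as np

def fit_mlr(s_g, q_o3, tau, s_ery):
    """Least-squares fit of s_ery ~ a + b*s_g + c*q_o3 + d*tau,
    returning the coefficients and R^2 -- a minimal analogue of the
    paper's MLR on monthly anomalies."""
    X = np.column_stack([np.ones_like(s_g), s_g, q_o3, tau])
    coef, *_ = np.linalg.lstsq(X, s_ery, rcond=None)
    pred = X @ coef
    ss_res = np.sum((s_ery - pred) ** 2)
    ss_tot = np.sum((s_ery - s_ery.mean()) ** 2)
    return coef, 1.0 - ss_res / ss_tot

# Synthetic monthly anomalies: UV dose driven mostly by global radiation
rng = np.random.default_rng(0)
s_g = rng.normal(size=240)
q_o3 = rng.normal(size=240)
tau = rng.normal(size=240)
s_ery = 0.8 * s_g - 0.2 * q_o3 - 0.05 * tau + rng.normal(scale=0.2, size=240)
coef, r2 = fit_mlr(s_g, q_o3, tau, s_ery)   # r2 is high by construction
```

With the signal dominated by s_g and only a small noise term, the recovered coefficient on s_g sits near the 0.8 used to generate the data, and R² lands above 0.9.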

Wednesday, November 19, 2014

Nature article: $250 million should be spent on climate models able to skillfully simulate clouds & convection

An article published today in Nature notes multiple and substantial uncertainties and deficiencies of climate models which are "crucial for predicting global warming," due primarily to the low-resolution of today's models which is insufficient to skillfully simulate essential climate aspects such as 
  • clouds, 
  • ocean eddies, 
  • convection, 
  • water cycle, 
  • thunderstorms, 
  • "crucial components of the oceans" such as "the Gulf Stream, and the Antarctic Circumpolar Current" [and ocean oscillations]
  • etc
As the article mentions, typical climate models use a low resolution of 100 km, but much finer resolutions of 1 km or better are required to skillfully model convection and clouds, far beyond the capability of current supercomputers. The author recommends that a quarter of a billion dollars be spent to create international supercomputing centres for climate models before the world spends trillions on mitigation, justified by the Precautionary Principle, that may or may not be necessary.

As climate scientist Dr. Roger Pielke Sr. has pointed out, and contrary to popular belief, climate models are not based on "basic physics"; rather, they are almost entirely composed of parameterizations/fudge factors for most critical aspects of climate, including convection and clouds. As the article below notes,

"simulations of climate change are very sensitive to some of the parameters [fudge factors] associated with these approximate representations of convective cloud systems"
However, even if supercomputers capable of handling such high resolution are developed over the next decade, substantial doubt remains about the benefits for climate prediction due to the inherent limitations of chaos theory, multiple flawed assumptions in the model code, and inadequate observations with which to initialize such numeric models. These are some of the reasons why two recent papers instead call for a new stochastic approach to climate modeling.
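The chaos-theory limitation mentioned above is classically illustrated with the Lorenz-63 system, in which two nearly identical initial states diverge onto completely different trajectories. A toy sketch, using forward-Euler integration and the standard textbook parameter values:

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system, the classic toy
    illustration of sensitive dependence on initial conditions."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

# Two initial states differing by one part in a million:
a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-6, 1.0, 1.0)
for _ in range(2500):          # integrate ~25 model time units
    a, b = lorenz_step(a), lorenz_step(b)

# The microscopic initial difference has grown by many orders of magnitude
divergence = sum(abs(ai - bi) for ai, bi in zip(a, b))
```

The same exponential error growth is why, whatever the grid resolution, individual trajectories of the climate system cannot be predicted far ahead; only statistical properties can be, and even those depend on model assumptions.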

Climate forecasting: Build high-resolution global climate models

International supercomputing centres dedicated to climate prediction are needed to reduce uncertainties in global warming, says Tim Palmer.

Local effects such as thunderstorms, crucial for predicting global warming, could be simulated by fine-scale global climate models.

The drive to decarbonize the global economy is usually justified by appealing to the precautionary principle: reducing emissions is warranted because the risk of doing nothing is unacceptably high. By emphasizing the idea of risk, this framing recognizes uncertainty in the magnitude and timing of global warming.
This uncertainty is substantial. If warming occurs at the upper end of the range projected in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report [1], then unmitigated climate change will probably prove disastrous worldwide, and rapid global decarbonization is paramount. If warming occurs at the lower end of this range, then decarbonization could proceed more slowly and some societies' resources may be better focused on local adaptation measures.
Reducing these uncertainties substantially will take a new generation of global climate simulators capable of resolving finer details, including cloud systems and ocean eddies. The technical challenges will be great, requiring dedicated supercomputers faster than the best today. Greater international collaboration will be needed to pool skills and funds.
Against the cost of mitigating climate change — conceivably trillions of dollars — investing, say, one quarter of the cost of the Large Hadron Collider (whose annual budget is just under US$1 billion) to reduce uncertainty in climate-change projections is surely warranted. Such an investment will also improve regional estimates of climate change — needed for adaptation strategies — and our ability to forecast extreme weather.

Grand challenges

The greatest uncertainty in climate projections is the role of the water cycle — cloud formation in particular — in amplifying or damping the warming effect of CO2 in the atmosphere [2]. Clouds are influenced strongly by two types of circulation in the atmosphere: mid-latitude, low-pressure weather systems that transport heat from the tropics to the poles; and convection, which conveys heat and moisture vertically.
Global climate simulators calculate the evolution of variables such as temperature, humidity, wind and ocean currents over a grid of cells. The horizontal size of cells in current global climate models is roughly 100 kilometres. This resolution is fine enough to simulate mid-latitude weather systems, which stretch for thousands of kilometres. But it is insufficiently fine to describe convective cloud systems that rarely extend beyond a few tens of kilometres.
Simplified formulae known as 'parameterizations' [i.e. fudge factors] are used to approximate the average effects of convective clouds or other small-scale processes within a cell. These approximations are the main source of errors and uncertainties in climate simulations [3]. Many of the parameters used in these formulae are impossible to determine precisely from observations of the real world. This matters, because simulations of climate change are very sensitive to some of the parameters [fudge factors] associated with these approximate representations of convective cloud systems [4].
Decreasing the size of grid cells to 1 kilometre or less would allow major convective cloud systems to be resolved. It would also allow crucial components of the oceans to be modelled more directly. For example, ocean eddies, which are important for maintaining the strength of larger-scale currents such as the Gulf Stream and the Antarctic Circumpolar Current, would be resolved.

Simulation of convective cloud systems in a limited-area high-resolution climate model.

The goal of creating a global simulator with kilometre resolution was mooted at a climate-modelling summit in 2009 [5]. But no institute has had the resources to pursue it. And, in any case, current computers are not up to the task. Modelling efforts have instead focused on developing better representations of ice sheets and biological and chemical processes (needed, for example, to represent the carbon cycle) as well as quantifying climate uncertainties by running simulators multiple times with a range of parameter values.
Running a climate simulator with 1-kilometre cells over a timescale of a century will require 'exascale' computers capable of handling more than 10^18 calculations per second. Such computers should become available within the present decade, but may not become affordable for individual institutes for another decade or more.
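The jump from 100 km to 1 km cells implies a steep rise in compute cost, which can be sketched with back-of-envelope arithmetic. The only assumptions here are the factor-of-100 refinement in each horizontal direction and a CFL-style shortening of the time step with cell size; real models would face further costs (vertical resolution, I/O) not counted:

```python
# Back-of-envelope scaling for refining a climate model grid from
# 100 km to 1 km cells (illustrative assumptions, not a real benchmark).

refine = 100               # 100 km -> 1 km: 100x finer in each horizontal direction

horizontal = refine ** 2   # 100x more cells in x and 100x more in y
timestep = refine          # CFL condition: time step shrinks in step with cell size
total = horizontal * timestep

print(f"Horizontal cells: {horizontal:,}x more")   # 10,000x
print(f"Time steps:       {timestep:,}x more")     # 100x
print(f"Total cost:      ~{total:,}x more")        # ~1,000,000x
```

A roughly million-fold increase over today's runs is broadly consistent with the article's call for exascale machines.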

Climate facilities

The number of low-resolution climate simulators has grown: 22 global models contributed to the IPCC Fourth Assessment Report in 2007; 59 to the Fifth Assessment Report in 2014. European climate institutes alone contributed 19 different climate model integrations to the Fifth Assessment database. Meanwhile, systematic biases and errors in climate models have been only modestly reduced in the past ten years [6]...
Even with 1-kilometre cells, unresolved cloud processes such as turbulence and the effects of droplets and ice crystals will have to be parameterized [fudge-factored] (using stochastic modelling to represent uncertainty in these parameterizations [9]). How, therefore, can one be certain that global-warming uncertainty can be reduced? The answer lies in the use of 'data assimilation' software — computationally demanding optimization algorithms that use meteorological observations to create accurate initial conditions for weather forecasts. Such software will allow detailed comparisons between cloud-scale variables in the high-resolution climate models and corresponding observations of real clouds, thus reducing uncertainty and error in the climate models [10].
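At its simplest, data assimilation blends a model "background" estimate with an observation, weighted by their error variances. This scalar sketch (all numbers invented for illustration) shows the optimal-interpolation update that operational assimilation systems apply to millions of variables at once:

```python
# Scalar optimal-interpolation (Kalman-style) update: blend a model
# background value with an observation, weighting by error variances.
# All numbers below are invented for illustration.

def assimilate(background, obs, var_b, var_o):
    """Return the analysis value and its (reduced) error variance."""
    gain = var_b / (var_b + var_o)          # Kalman gain in the scalar case
    analysis = background + gain * (obs - background)
    var_a = (1.0 - gain) * var_b            # analysis variance < either input
    return analysis, var_a

# Model first guess: 288.0 K with 4.0 K^2 error variance;
# observation: 290.0 K with 1.0 K^2 error variance.
analysis, var_a = assimilate(288.0, 290.0, 4.0, 1.0)
print(round(analysis, 2), round(var_a, 2))  # analysis sits nearer the more certain observation
```

The analysis lands at 289.6 K: closer to the observation because the observation's error variance is smaller, with an analysis variance below both inputs.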
High-resolution climate simulations will have many benefits beyond guiding mitigation policy. They will help regional adaptation, improve forecasts of extreme weather, minimize the unforeseen consequences of climate geoengineering, and be key to attributing current weather events to climate change.
High-energy physicists and astronomers have long appreciated that international cooperation is crucial for realizing the infrastructure they need to do cutting-edge science. It is time to recognize that climate prediction is 'big science' of a similar league.

Why water vapor slows cooling at night & slows warming during the day

Recent comments by physicist Daniel Sweger explain why water vapor/humidity slows cooling at night, and why it has the opposite effect during the day, slowing warming. The comments relate to recent posts including "Modeling of the Earth’s Planetary Heat Balance with an Electrical Circuit Analogy," as well as posts from a few years ago (reposted below) discussing Dr. Sweger's analysis, which finds that the correlation of humidity and temperature is clearly negative and that the effect of water vapor on climate is a net negative-feedback cooling effect:

Dr. Sweger: In this paper I compiled 2 years of data showing the relationship of both relative and absolute humidity and daily temperature in two different locations on the continent of Africa. In this paper I demonstrated that the correlation of humidity and temperature is clearly negative.

Reply: Humidity increases nighttime temperatures

Dr. Sweger: I agree, and for the same reason that during the day high humidity slows the increase in temperature. The reason is that heat energy only flows in one direction: from a higher to a lower temperature. It is the same as in an electrical circuit. Electricity only flows from a higher potential to a lower one. During the day the energy from the sun is flowing from the sun towards the surface of the earth. The high specific heat of water impedes the flow of heat in the same way as a higher value of a resistor impedes the flow of electrons. At night the situation is reversed. Now the heat energy is flowing away from the earth into a very cold space, and the presence of water vapor impedes the flow of heat energy away from the earth’s surface. Thus the surface temperature decreases less on a humid night than a dry night.

The purpose of the paper was to demonstrate that any feedback loop between the effect of “greenhouse gases” and the earth’s surface must be negative, since warming would result in an increase in water vapor, and thus an increase in the impedance of heat flow.
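Dr. Sweger's resistor analogy can be sketched as a lumped thermal circuit: at night the surface relaxes exponentially toward the effective sky temperature through a thermal resistance, and humid air corresponds to a larger resistance (longer time constant). The time constants and temperatures below are invented purely to illustrate the qualitative point, not taken from his paper:

```python
import math

# Lumped-circuit sketch of nighttime cooling: the surface relaxes toward
# the sky temperature with time constant tau = R*C, where R is a thermal
# "resistance" (larger for humid air in this analogy). Values are illustrative.

def surface_temp(t_hours, t_start, t_sky, tau_hours):
    """Exponential relaxation of surface temperature toward t_sky."""
    return t_sky + (t_start - t_sky) * math.exp(-t_hours / tau_hours)

t_start, t_sky = 25.0, -5.0   # deg C at dusk; effective clear-sky temperature
dry = surface_temp(10, t_start, t_sky, tau_hours=6.0)     # low "resistance"
humid = surface_temp(10, t_start, t_sky, tau_hours=18.0)  # high "resistance"
print(f"after 10 h: dry {dry:.1f} C, humid {humid:.1f} C")
```

With these made-up numbers the humid surface is still over ten degrees warmer after ten hours, matching the qualitative claim that surface temperature decreases less on a humid night than on a dry one.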

Reposts of two prior HS posts on Dr. Sweger's analysis:

Tuesday, March 13, 2012 [REPOST]

Water vapor, not CO2, controls climate and acts as a negative feedback

Physicist Daniel Sweger refutes the catastrophic AGW hypothesis in his paper The Climate Engine, showing that CO2 has a negligible effect upon climate and that water vapor acts as a negative feedback to global warming. Dr. Sweger uses data from 3 locales to show an inverse relationship between humidity and temperature. He notes,
"In the positive feedback mechanism as proposed by the global warming proponents this behavior would be reversed. Then the data would show a positive relationship between moisture content and temperature. But it does not. As suggested before, data is the language of science, not mathematical models."

The data clearly shows that the relationship between the amount of water vapor in the air and temperature is negative 

From the conclusion of The Climate Engine:

The role of water vapor in determining surface temperatures is ultimately a dominant one. During daylight hours it moderates the sun’s energy, at night it acts like a blanket to slow the loss of heat, and it carries energy from the warm parts of the earth to the cold. Compared to that, if carbon dioxide has an effect, it must be negligible.

It is also clear from the data presented above that water vapor acts with a negative feedback. The data clearly shows that the relationship between the amount of water vapor in the air and temperature is negative; that is, the higher the amount of water vapor in the atmosphere the lower the surface temperature. In that regard, it almost acts as a thermostat.

As the air cools as a result of increasing moisture content in the atmosphere, evaporation slows and less water vapor is introduced into the atmosphere. Eventually this results in a decrease in moisture content. At that point more sunlight reaches the earth’s surface, producing higher temperatures and increased evaporation.

In the positive feedback mechanism as proposed by the global warming proponents this behavior would be reversed. Then the data would show a positive relationship between moisture content and temperature. But it does not.

As suggested before, data is the language of science, not mathematical models.
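The core of the analysis above is a simple Pearson correlation between daily humidity and temperature. A minimal sketch of that computation, on made-up numbers chosen only to show the mechanics (a negative value means the two series move in opposite directions):

```python
# Pearson correlation between two daily series, computed from scratch.
# The humidity and temperature values below are fabricated for illustration;
# they are NOT Dr. Sweger's data.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

humidity = [30, 45, 60, 75, 90]   # % relative humidity (made up)
temp     = [38, 35, 33, 30, 28]   # daily max, deg C (made up)
print(f"r = {pearson(humidity, temp):.3f}")   # negative: humid days are cooler here
```

A strongly negative r on real station data is what the paper presents as evidence of negative feedback; a positive-feedback world would show the sign reversed.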

About the Author:
Dr. Daniel M. Sweger, AB (Physics, Duke University, 1965) and Ph.D. (Solid State Physics, American University, 1974) has been a research scientist at NIST, where he was active in a variety of research areas, including cryogenic thermometry, solid state and nuclear physics, and molecular spectroscopy. He also operated a computer software business and performed consulting for the US Army. He is now semi-retired and is an adjunct instructor at the National College of Business and Technology, where, among other subjects, he teaches Environmental Science.

Saturday, April 2, 2011 [REPOST]

Physicist: Carbon dioxide has negligible effect on climate
Semi-retired physicist Dr. Daniel M. Sweger has been a research scientist at the National Institute of Standards and Technology, where he was active in a variety of research areas, including cryogenic thermometry, solid state and nuclear physics, and molecular spectroscopy.

Dr. Sweger's new paper, Earth’s Climate Engine (PDF), finds that if carbon dioxide has an effect on climate, it must be negligible. Instead, he finds on the basis of data and theory that water vapor is the dominant influence on climate, and its influence is the opposite of that assumed by the IPCC climate computer models.

From the Summary:
… While models can be useful, the results must be compared to actual measurements, i.e. data. Data is the language of science, but little has been done in that regard with the climate change models.

It is the premise of the author that water vapor is the dominant influence in determining and understanding global climate. Water vapor is much more abundant in the atmosphere than carbon dioxide, and its physical properties make it more important as well. During daylight hours it moderates the sun’s energy, at night it acts like a blanket to slow the loss of heat, and it carries energy from the warm parts of the earth to the cold. Compared to that, if carbon dioxide has any effect it must be negligible. Thus, the purpose of this paper is to explore the effect of water vapor on climate.

Detailed calculations and analysis of data from several locations clearly demonstrate that the effect of water vapor on temperature dominates any proposed effect of carbon dioxide. Furthermore, it is clear from the data presented that water vapor acts with a negative feedback on temperature, not a positive one. That is, the data demonstrate that increasing the level of water vapor in the atmosphere results in a decrease of temperature, not an increase as predicted by the climate models. In essence, atmospheric water vapor acts as a thermostat.

These results call into question the validity of using the results of the current general climate change models, particularly as the basis for policy decision making.

Related: Greenhouse Gases DO Have A Profound Effect On The Climate

New paper finds excuse #66 for the "pause": There's no pause if you look only at the warmest & coldest day of the year

A new paper published in Environmental Research Letters finds excuse #66 for the 18+ year "pause" of global warming: there's no "pause" if you look only at the single warmest and coldest day of the year.

According to the authors and the accompanying editorial, if you use a dataset of extreme temperatures "which is not publicly available" and has data from "areas that don't have many observations" (with extrapolated (modeled) temperatures) and pick out the one single day per year with the highest and lowest temperatures, those sparse one day per year observations are within one standard deviation of the climate model warming predictions. 

[Except the "coherent cooling pattern across the Northern Hemisphere mid-latitudes that has emerged in the recent 15 years and is not reproduced by the models"]
All's good with the climate models since the single warmest and coldest days of the year are within 1 standard deviation of model predictions, except that inconvenient cooling trend in the NH mid-latitudes shown by the blue arrow. 
All's good with the climate models since the single warmest and coldest days of the year are within 1 standard deviation of model predictions, except that inconvenient cooling trend in SE Asia shown by the blue arrow. 
Of course, if you instead look at observations from the other 364 days per year, the climate models are overheated by a factor of 3-4 times and falsified at confidence levels exceeding 98%, but that shouldn't stop anyone from grasping at straws and believing in the "faux pause," or that this paper allegedly "results in increased confidence in projections of future changes in extreme temperature."
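The consistency test the paper applies — does an observed trend fall inside the 5–95% range of the multi-model ensemble? — can be sketched in a few lines. Every trend value here is invented; only the percentile logic mirrors the method described:

```python
# Check whether an observed trend falls inside the 5-95% range of an
# ensemble of simulated trends. All numbers are invented for illustration.

def percentile(sorted_vals, q):
    """Simple linear-interpolation percentile, q in [0, 1]."""
    idx = q * (len(sorted_vals) - 1)
    lo, hi = int(idx), min(int(idx) + 1, len(sorted_vals) - 1)
    frac = idx - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

# Hypothetical per-model trends, deg C per decade:
simulated = sorted([0.05, 0.10, 0.12, 0.15, 0.18, 0.20, 0.22, 0.25, 0.28, 0.35])
low, high = percentile(simulated, 0.05), percentile(simulated, 0.95)

observed = 0.02   # hypothetical observed trend, deg C per decade
consistent = low <= observed <= high
print(f"5-95% range: [{low:.3f}, {high:.3f}] -> consistent: {consistent}")
```

Note how much the verdict depends on the spread of the ensemble: a wide enough range of simulated trends will declare almost any observation "consistent," which is the crux of the disagreement over what such tests prove.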

A temporary hiatus in warming of extreme temperatures is not unusual, nor inconsistent with model simulations of human-induced climate change

Michael Wehner examines trends in extreme temperatures during the warming hiatus.
Sillmann et al. (2014) find that observed trends of extremely hot days and cold nights are consistent with the current generation of climate models. Short periods of localized decreases in these extreme temperatures are not unusual and the Sillmann et al. results increase confidence in projections of future changes in extreme temperature.
Recent short periods of reduced rates of the increase or even a decrease in observed global average surface air temperatures have been used by those who refuse to accept the reality of anthropogenic climate change to argue that computational and theoretical models of the climate system are invalid and not useful tools for projections of the future. David Easterling and I first showed in 2009 that such arguments are a specious cherry picking of the natural variability inherent in the climate system (Easterling and Wehner 2009). Since then, a great deal of attention has been focused on the period since the very high temperatures of the El Niño year of 1998. Often referred to now as a "hiatus" in warming, the search for the causes of this recent reduction in warming has increased our knowledge of short-term fluctuations in global mean temperature. While the issue is not fully settled, some combination of natural variations in the climate as well as unforeseen changes in external forcing factors, including volcanic and human aerosols (Solomon et al. 2011, 2012, Santer et al. 2013) are the likely mechanisms behind the hiatus.
It follows that if there is a change in the rate of global average warming, there are likely some alterations to the rate of change of other aspects of the climate system. The present study by Jana Sillmann and co-authors (Sillmann et al. 2014) examines recent changes in the temperature of both extremely hot days and extremely cold nights. Both of these measures of extreme weather have warmed from 1971 to 2010 over most of the land regions where high quality observations exist. Increases are larger and more likely to be significantly different from zero for cold extremes. For hot extremes, they find that significant increases are confined to the Eurasian land mass. However, examination of changes in extreme temperatures over the much shorter period of 1996 to 2010 reveals a different pattern of a mix of coherent regions of increases and decreases for both measures of extreme temperature. Few of these observed changes are significantly different from zero over such a short time period, indicating that natural fluctuations at this time scale are of the same order of magnitude or larger than anthropogenically forced changes over this period. As with mean temperatures (Santer et al. 2011), longer periods are required for the human signal in extreme temperatures to arise from the natural noise of the climate system.
This interplay of the time scales and magnitudes of natural noise and externally forced signal makes the evaluation of climate models and their projections of the future difficult, particularly on decadal time scales. The multi-model average change in both types of temperature extremes is positive at nearly all locations of quality observations over both the long and short periods considered. Such an apparent discrepancy is often incorrectly used to discredit the accuracy and usefulness of climate model projections of the future. However, such an interpretation of model simulations is fundamentally flawed, as the observed climate system has followed a single path in a complex space of many different synoptic possibilities, whereas a multi-model average is an integral (although somewhat incomplete) over these possibilities. Sillmann and co-authors have very carefully examined 74 individual simulations from 27 of the most recent climate models and conclude that the observed globally averaged temperature changes in hot days and cold nights over both the long and short periods are consistent with a 5–95% range of the multi-model ensemble. They go on to analyze seven sub-continental regions and find that, with one exception, the observations and models are also consistent by this same measure of confidence. The one exception is an observed large decrease in south Asia in cold night temperatures over the period 1996–2010. While most model realizations simulated increases for this extreme temperature metric in this region from 1996–2010, at least four model runs from four different models simulated extremely cold night temperature decreases. Remarkably, one of these simulations produces cold night temperatures of similar magnitude as observed despite the finding that "the recent 15-year period largely represents a highly unusual (extreme) realization of climate as part of internal variability".
Unlike the recent hiatus in the warming of average temperatures, where an appeal to unforeseen external forcings appears to be necessary to fully reconcile observed regional cooling with model simulations (Fyfe et al. 2013), most, if not all, observed recent downward regional trends in the temperatures of hot days and cold nights can be explained by the larger natural variability of these metrics of extreme weather. The CMIP5 models are far from perfect, particularly in their characterization of regional natural variability on decadal scales and their response to short-term variations in external forcing. And the hiatus in average global temperatures has necessitated a reduction in the lower bound on climate sensitivity (Otto et al. 2013). Nonetheless, the consistency between the multi-model ensemble and the actual climate, demonstrated by Sillmann and co-authors, enhances confidence in suitably scaled multi-model projections of future long-term increases in expected temperature extremes.


This work was supported by the Regional and Global Climate Modeling Program of the Office of Biological and Environmental Research in the US Department of Energy Office of Science under contract number DE-AC02-05CH11231. The author wishes to thank David Easterling and Dáithí Stone for their useful comments.

About the author

Michael F Wehner, Lawrence Berkeley National Laboratory, Berkeley, California

Observed and simulated temperature extremes during the recent warming hiatus


Jana Sillmann, Markus G Donat, John C Fyfe and Francis W Zwiers

The discrepancy between recent observed and simulated trends in global mean surface temperature has provoked a debate about possible causes and implications for future climate change projections. However, little has been said in this discussion about observed and simulated trends in global temperature extremes. Here we assess trend patterns in temperature extremes and evaluate the consistency between observed and simulated temperature extremes over the past four decades (1971–2010) in comparison to the recent 15 years (1996–2010). We consider the coldest night and warmest day in a year in the observational dataset HadEX2 and in the current generation of global climate models (CMIP5). In general, the observed trends fall within the simulated range of trends, with better consistency for the longer period. Spatial trend patterns differ for the warm and cold extremes, with the warm extremes showing continuous positive trends across the globe and the cold extremes exhibiting a coherent cooling pattern across the Northern Hemisphere mid-latitudes that has emerged in the recent 15 years and is not reproduced by the models. This regional inconsistency between models and observations might be a key to understanding the recent hiatus in global mean temperature warming.