It’s More Like Global Luke-warming And It’s not so Bad

While CO2 generated by human activity may play some role in affecting climate, a straightforward comparison of the computer-modeled temperature predictions with the observed temperatures demonstrates that there is much we don’t understand about CO2 and climate, or even about climate more generally.  Further, relatively superficial investigation reveals that the so-called “settled science” underlying the prevailing politically correct climate change narrative is neither settled nor unassailable.

Contrary to what we have been told in the media, there is no consensus within the broad climate science community that industrial CO2 production will inexorably lead to environmental and economic catastrophe. The 2014 report of the Nongovernmental International Panel on Climate Change (NIPCC) is an 880-page comprehensive critique of the IPCC’s positions. The appendix of this report lists the names of 31,478 American scientists who have endorsed the following statement:

“there is no convincing scientific evidence that human release of carbon dioxide, methane, or other greenhouse gases is causing or will, in the foreseeable future, cause catastrophic heating of the Earth’s atmosphere and disruption of the Earth’s climate.”

Before examining the temperature data, a few words on scientific discovery methodology.  In science there are two general methods of investigation – deductive and inductive.

Deduction


Deduction works from the general to the more specific. It starts with a theory about a particular phenomenon of interest, which in turn generates a hypothesis (prediction) to be tested.  Data are then collected (observed) to test the hypothesis and analyzed to confirm or reject it.

Induction


Inductive investigation is more bottom-up. It begins with specific observations, looks for patterns from which to formulate a tentative hypothesis that can be explored with further observation, and has the ultimate goal of developing a general conclusion or theory.

Inherently, inductive investigation is more open-ended and exploratory, while deductive investigation is narrower, testing a particular hypothesis – bottom-up thinking vs. top-down thinking, if you will.

It needs to be pointed out that most of our understanding in the natural sciences has arisen from observation of phenomena and data.  The underlying science is only figured out after observation and, not infrequently, after unexpected observation.  As an example, the antibiotic property of penicillin was not predicted but rather it was observed that a contaminant fungus on a culture plate killed bacteria.  The antibacterial properties were worked out after the observation.

Deductive investigation starts with an assumption (hypothesis). If the assumption is mistaken, that assumption can lead to erroneous and harmful conclusions until the data demonstrates the fallacy of the assumption.  As an example: Because cholesterol was noted in clogged arteries, it was hypothesized that dietary cholesterol must contribute to blood vessel disease. Government guidelines subsequently recommended eliminating fats from our diets.  The decrease in dietary fat resulted in increased carbohydrate intake.  Now the empirical evidence has accumulated that increased carbohydrate intake has promoted diabetes, obesity, unhealthy blood lipid levels, and more vascular disease.

Turning now to the investigation of climate change: What do we know about climate?

Climate has always changed, is changing, and will always change. There were times when the earth was much colder or warmer than it is now, and in both circumstances CO2 levels were at times higher or lower than today – as demonstrated by ice core studies. Solar cycles, atmospheric particulate matter (e.g. volcanic activity), greenhouse gasses, ocean currents, and macro weather patterns such as El Niño/La Niña all appear to affect climate. Our “understanding” of climate in fact suggests there is much we don’t understand about it.  It would therefore stand to reason that any investigation of human influence on climate should begin with a broadly exploratory study of climate and the factors influencing it.

However, that has not been the case.  The “settled science of climate change” is the product of the Intergovernmental Panel on Climate Change (IPCC).  The IPCC was established in 1988 by the United Nations Environment Programme and the World Meteorological Organization. Its membership comprises governments, not scientists. The IPCC was not given the mandate to broadly study climate change or to look at natural as well as man-made influences on climate. It was specifically tasked to find and report on the human impact on climate, and to then make a scientific case for the adoption of national and international policies to reduce that impact.

The IPCC’s temperature modeling is based on the following deductive reasoning: temperature increases from human-generated CO2 will warm the oceans and generate additional water vapor – a greenhouse gas – and thereby amplify the CO2 greenhouse temperature effect.  This water vapor positive feedback is central to all IPCC modeling, which incorporates a 2-3X or greater amplification of the predicted CO2 temperature increase; yet this critical assumption is not established fact.

For one thing, current climate models poorly simulate clouds, and clouds have significant effects on temperature.  In some cases, clouds can cool the atmosphere by blocking incoming radiation or producing precipitation; in other cases, they can warm the atmosphere by blocking outgoing heat.

Reasoned arguments can be made in support of significant water vapor positive feedback or against such feedback. In either case however, those arguments are just postulates. Ultimately, the proof is in the observed data.

So how well has this deductive reasoning – modeling assuming water vapor positive feedback – predicted the observed reality?

John Christy, a climate expert from the University of Alabama, gave the following report on climate change to a joint meeting of Senate and House committees on December 8, 2015.

He first compared the observed temperature data to the IPCC computer-modeled temperatures for the middle troposphere.   The troposphere is the earth’s active weather zone, extending from the surface to around 40,000 feet.  The observed temperature record was a product of two different measurement systems – balloon data and satellite data.

The balloon data is the compilation of four separate data sets from weather balloons launched twice a day simultaneously across the world so as to get a snapshot of the physical properties of that day’s atmosphere.  These balloon launches have occurred twice daily since 1979.  The satellite temperature record goes back 35 years and is derived from measuring the microwave emissions of diatomic oxygen in the lower atmosphere, which turns out to produce a much more accurate temperature measurement than standard mercury-in-glass instruments.

The data demonstrate that for the 36-year period from 1979 to 2015 the observed tropospheric warming was less than that predicted by the mean of the 102 computer models, at times significantly so.  Over that period, the observed warming has been roughly one-third of that predicted by the models.  The data also show that the observed tropospheric temperature increase over the last 10 years has been less than 0.05 degrees C.

Dr. Christy also compared the most recent revision of each of the five observed global surface temperature records to the average of the temperatures predicted by the 108 IPCC climate models.

Unlike the tropospheric temperature data sets, the surface temperature records have been reconfigured multiple times by the researchers responsible for the three principal surface temperature histories.  (A cynic might postulate that these data sets have been reconfigured repeatedly because the observed temperatures haven’t supported the significantly greater warming predicted by the computer modeling – and, conversely, that had the data fit the desired outcome, no reconfiguration would have been done.)

Of the major surface temperature records, the IPCC primarily has used HadCRU temperature data – a combination of the data sets compiled by the Hadley Centre of the UK Met Office and the Climatic Research Unit (CRU) at the University of East Anglia.

In any case, his analysis of the surface temperature records demonstrates that for all periods from 10 years (2006-2015) to 65 years (1951-2015) in length, the observed temperature trend was in the lower half of the climate model predictions, and for several periods the observed trend lay very close to (or even below) the 2.5th percentile of those predictions. The comparison further reveals that the observed warming rate has been beneath the model mean expectation for all periods extending back 60+ years.
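Comparisons like this amount to fitting a least-squares line to each temperature series and asking where the observed slope falls within the distribution of model slopes. The sketch below illustrates the mechanics with synthetic data; the slopes, noise levels, and ensemble size are illustrative assumptions, not the actual observed or model values.

```python
import random

def trend_per_decade(years, temps):
    """Ordinary least-squares slope of temps vs. years, in degrees C per decade."""
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps) / n
    cov = sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
    var = sum((y - mean_y) ** 2 for y in years)
    return (cov / var) * 10.0

rng = random.Random(42)
years = list(range(1951, 2016))  # the 65-year window discussed above

# Hypothetical "observed" series: ~0.12 C/decade warming plus noise.
observed = [0.012 * (y - 1951) + rng.gauss(0.0, 0.05) for y in years]

# Hypothetical ensemble of 100 model runs warming roughly twice as fast.
ensemble = [
    [0.025 * (y - 1951) + rng.gauss(0.0, 0.05) for y in years]
    for _ in range(100)
]

obs_trend = trend_per_decade(years, observed)
model_trends = sorted(trend_per_decade(years, run) for run in ensemble)

# Percentile rank of the observed trend within the model distribution.
rank = sum(1 for m in model_trends if m < obs_trend)
percentile = 100.0 * rank / len(model_trends)

print(f"observed trend:   {obs_trend:.3f} C/decade")
print(f"model mean trend: {sum(model_trends) / len(model_trends):.3f} C/decade")
print(f"observed trend percentile among models: {percentile:.0f}")
```

An observed trend at or below the 2.5th percentile means fewer than 3 of 100 model runs warmed as slowly as the observations – the kind of divergence the analysis above describes.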

This empirical data also demonstrates that a “pause” or “slowdown” in the rate of global warming has taken place over the past 15 years – a period during which more than 100 billion tons of carbon dioxide have been pumped into the atmosphere.  This pause has only recently been acknowledged in the AGW scientific journals.

 

Making sense of the early-2000s warming slowdown

Nature Climate Change 6, 224–228, 24 February 2016

John C. Fyfe, Gerald A. Meehl, Matthew H. England, Michael E. Mann, Benjamin D. Santer, Gregory M. Flato, Ed Hawkins, Nathan P. Gillett, Shang-Ping Xie, Yu Kosaka & Neil C. Swart 

This article was recently published in a mainline climate change journal.  Notice that the list of authors includes Michael Mann, the Penn State climatologist accused of fudging data to create the famed hockey-stick-shaped global warming graph – the onetime icon of alarmist global warming projections. The authors write:

“It has been claimed that the early-2000s global warming slowdown or hiatus, characterized by a reduced rate of global surface warming, has been overstated, lacks sound scientific basis, or is unsupported by observations. The evidence presented here contradicts these claims.”

“In all three observational datasets the most recent 15-year trend (ending in 2014) is lower than both the latest 30-year and 50-year trends. This divergence occurs at a time of rapid increase in greenhouse gasses (GHGs). A warming slowdown is thus clear in observations; it is also clear that it has been a ‘slowdown’ and not a stop.”

Whether the hiatus is a slowdown, a pause or a stop is debatable. In any case, the observed recent temperatures demonstrate significantly less warming than IPCC modeling predicted.

Climate scientists have proposed over 40 explanations for the warming hiatus, including particulate matter from small volcanoes and pollution, ocean movements, data-gathering problems, and natural variability, among others.

A favored explanation attributes the pause to strong trade winds in the Pacific Ocean. This theory proposes that for the last 30 years of the 20th century, ocean currents had been boosting temperatures by bringing heat to the surface, and then for the past 15 years the currents had been lowering temperature by pushing heat into the deep ocean.  This hypothesis was based on relatively few measurements that demonstrated a temperature change of a few hundredths of a degree at ocean depths of up to 200 meters.

Acknowledgement of the pause is good for science.  The 40+ explanations can’t all be right, but all potentially provide insight into better understanding climate change. The pause tells us that there is significant underlying natural climate variability. It tells us that our knowledge of climate change is limited and incomplete.  It tells us that the science is not settled.

Going forward

Given that the observed rate of warming in the satellite-sensed and balloon data is barely a third of that predicted by global climate models, it is both reasonable and prudent to cut the modeled temperature forecasts for the rest of this century by 50%. Doing so would mean that the world – without imposition of any further global warming policy – won’t warm by the dreaded 3.6 degrees Fahrenheit (2 degrees Celsius) by 2100 that the United Nations regards as the climate apocalypse.

In fact, most experts believe that warming of less than 2 degrees Celsius above preindustrial levels would result in no net economic or ecological damage, and that up to two degrees of total warming the benefits will generally outweigh the harmful effects.

Warming of up to 1.2 degrees Celsius over the next 70 years (0.8 degrees have already occurred), most of which is predicted to happen in cold areas in winter and at night, would extend the range of farming further north, improve crop yields, slightly increase rainfall (especially in arid areas), enhance forest growth and cut winter deaths (which far exceed summer deaths in most places). Increased carbon dioxide levels also have caused and will continue to cause an increase in the growth rates of crops and the greening of the Earth—because plants grow faster and need less water when carbon dioxide concentrations are higher.

Climate Change Takeaways

The science is not settled. Models are not evidence – this bears repeating: Models are not evidence!  Our knowledge of the climate and climate change remains limited and incomplete. Finally, given the huge political and economic implications of climate policy, climate change study merits a vigorous, broad, and open-ended investigation – not research to confirm a pre-ordained conclusion.
