Mathematical modeling illusions

The global climate scare – and policies resulting from it – are based on models that do not work

Dr. Jay Lehr and Tom Harris
Model comparison chart by JR Christy, University of Alabama in Huntsville.

For the past three decades, human-caused global warming alarmists have tried to frighten the public with stories of doom and gloom. They tell us the end of the world as we know it is nigh because of carbon dioxide emitted into the air by burning fossil fuels.

They are exercising precisely what journalist H. L. Mencken described early in the last century: “The whole point of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.”

The dangerous human-caused climate change scare may well be the best hobgoblin ever conceived. It has half the world clamoring to be led to safety from a threat for which there is no meaningful physical evidence – no evidence that the climate fluctuations and weather events we experience today are different from, or worse than, those our near and distant ancestors had to deal with, and no evidence that they are human-caused.

Many of the statements issued to support these fear-mongering claims are presented in the U.S. Fourth National Climate Assessment, a 1,656-page report released in late November. But none of these claims has any basis in real-world observations. All that supports them are mathematical equations presented as accurate, reliable models of Earth’s climate.

It is important to properly understand these models, since they are the only basis for the climate scare.

Before we construct buildings or airplanes, we make physical, small-scale models and test them against stresses and performance that will be required of them when they are actually built. When dealing with systems that are largely (or entirely) beyond our control – such as climate – we try to describe them with mathematical equations. By altering the values of the variables in these equations, we can see how the outcomes are affected. This is called sensitivity testing, the very best use of mathematical models.
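As a concrete illustration of sensitivity testing, here is a minimal sketch in Python. The model below is a deliberately simple, hypothetical energy-balance toy – not any actual climate model – chosen only to show the method: hold everything constant, perturb one variable at a time, and observe how the output responds.

```python
# Sensitivity testing with a toy model (illustration only, not a real climate model):
# vary one input at a time and observe how the output changes.

def toy_model(solar_input, albedo):
    """Energy absorbed, given incoming solar energy and reflectivity (albedo)."""
    return solar_input * (1.0 - albedo)

baseline = toy_model(solar_input=340.0, albedo=0.30)

# Perturb each variable by +5% in turn and record the response.
experiments = [
    ("solar_input +5%", dict(solar_input=340.0 * 1.05, albedo=0.30)),
    ("albedo +5%",      dict(solar_input=340.0, albedo=0.30 * 1.05)),
]
for name, kwargs in experiments:
    delta = toy_model(**kwargs) - baseline
    print(f"{name}: output changes by {delta:+.1f}")
```

The point of the exercise is not the numbers themselves but the direction and relative size of each response – which is exactly what such models are best suited to reveal.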

However, today’s climate models account for only a handful of the hundreds of variables that are known to affect Earth’s climate, and many of the values inserted for the variables they do use are little more than guesses. 

Dr. Willie Soon of the Harvard-Smithsonian Center for Astrophysics lists the six most important variables in any climate model:

1) Sun-Earth orbital dynamics and their relative positions and motions with respect to other planets in the solar system;

2) Charged-particle output from the Sun (the solar wind) and its modulation of incoming cosmic rays from the galaxy at large;

3) How clouds influence climate, both blocking some incoming rays/heat and trapping some of the warmth;

4) Distribution of sunlight intercepted in the atmosphere and near the Earth’s surface;

5) The way in which the oceans and land masses store, affect and distribute incoming solar energy;

6) How the biosphere reacts to all these various climate drivers.

Soon concludes that, even if the equations describing these interactive systems were known and properly included in computer models (they are not), it would still not be possible to compute future climate states in any meaningful way, because even the world’s most advanced supercomputers would take longer to calculate the future climate than the climate itself takes to unfold in the real world.

So we could, in principle, compute the climate (or Earth’s multiple sub-climates) 40 years from now – but the models would need more than 40 years to finish that computation.

Although governments have funded more than one hundred efforts to model the climate for the better part of three decades, not one accurately “predicted” (hindcasted) the known past – with the exception of one Russian model, which had been fully “tuned” to the observational data it then matched. The models’ average prediction now runs a full 1 degree F above what satellites and weather balloons actually measured.

In his February 2, 2016 testimony before the U.S. House of Representatives Committee on Science, Space & Technology, University of Alabama-Huntsville climatologist Dr. John Christy compared the results of atmospheric temperatures as depicted by the average of 102 climate models with observations from satellites and balloon measurements. He concluded: “These models failed at the simple test of telling us ‘what’ has already happened, and thus would not be in a position to give us a confident answer to ‘what’ may happen in the future and ‘why.’ As such, they would be of highly questionable value in determining policy that should depend on a very confident understanding of how the climate system works.”

Similarly, when Christopher Monckton tested the IPCC approach in a paper published in the Science Bulletin of the Chinese Academy of Sciences in 2015, he convincingly demonstrated that official predictions of global warming had been overstated threefold. (Monckton holds several awards for his climate work.)

The paper has been downloaded 12 times more often than any other paper in the entire 60-year archive of that distinguished journal. Monckton’s team of eminent climate scientists is now putting the final touches on a paper proving definitively that – instead of the officially predicted 3.3 degrees Celsius (5.5 F) warming for every doubling of CO2 levels – there will be only 1.1 degrees C of warming. At a vital point in their calculations, climatologists had neglected to take account of the fact that the Sun is shining!
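The arithmetic behind “degrees per doubling” can be sketched with the standard textbook relation ΔT = S · log2(C/C0), where S is the assumed sensitivity per doubling of CO2. The sketch below simply plugs in the two figures quoted above (3.3 C official, 1.1 C claimed by Monckton’s team); it takes no position on which, if either, is correct.

```python
import math

def warming(sensitivity_per_doubling, co2_now_ppm, co2_then_ppm):
    """Warming implied by a per-doubling sensitivity S and a CO2 change,
    via the standard logarithmic relation dT = S * log2(C / C0)."""
    return sensitivity_per_doubling * math.log2(co2_now_ppm / co2_then_ppm)

# One full doubling (e.g. 280 -> 560 ppm) yields exactly the assumed sensitivity:
print(warming(3.3, 560, 280))  # official figure cited above: 3.3 C
print(warming(1.1, 560, 280))  # figure claimed by Monckton's team: 1.1 C
```

Because the relation is logarithmic, a partial CO2 increase produces proportionally less warming than a full doubling – which is why the choice of S dominates the entire dispute.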

All problems can be viewed as having five stages: observation, modeling, prediction, verification and validation. Apollo team meteorologist Tom Wysmuller explains: “Verification involves seeing if predictions actually happen, and validation checks to see if the prediction is something other than random correlation. Recent CO2 rise correlating with industrial age warming is an example on point that came to mind.”

As Science and Environmental Policy Project president Ken Haapala notes, “the global climate models relied upon by the IPCC [the United Nations Intergovernmental Panel on Climate Change] and the USGCRP [United States Global Change Research Program] have not been verified and validated.”

An important reason to discount climate models is their lack of testing against historical data. If one enters the correct data for a 1930 Model A, the automotive modeling software used to develop a 2020 Ferrari should predict the performance of that 1930 Model A with reasonable accuracy. And it will.

But no climate model relied on by the IPCC (or any other model, for that matter) has applied the initial conditions of 1900 and forecast the Dust Bowl of the 1930s – never mind an accurate prediction of the climate in 2000 or 2015. Given the complete lack of testable results, we must conclude that these models have more in common with the “Magic 8 Ball” game than with any scientifically based process.

While one of the most active areas for mathematical modeling is the stock market, no one has ever predicted it accurately. For many years, the Wall Street Journal chose five eminent economic analysts to select a stock they were sure would rise in the following month. The Journal then had a chimpanzee throw five darts at a wall covered with that day’s stock market results. A month later, they determined who performed better at choosing winners: the analysts or the chimpanzee. The chimp usually won.

For these and other reasons, until recently, most people were never foolish enough to make decisions based on predictions derived from equations that supposedly describe how nature or the economy works.

Yet today’s computer modelers claim they can model the climate – which involves far more variables than the economy or stock market – and do so decades or even a century into the future. They then tell governments to make trillion-dollar policy decisions that will impact every aspect of our lives, based on the outputs of their models. Incredibly, the United Nations and governments around the world are complying with this demand. We are crazy to continue letting them get away with it.

A Critique of the Fourth National Climate Assessment

By Robert W. Endlich

In describing the errors in the Fourth National Climate Assessment, ‘NCA4’, I’ll use the words from the Executive Summary which purport to link climate changes in the USA to global climate change.

Photo by Pixabay

The first claim, “The last few years have also seen record-breaking, climate-related weather extremes,” is shown to be false simply by examining climate records, some from the National Climatic Data Center.

Tornadoes have been decreasing over the past six decades as temperatures moderate from the significant cooling of the 1940s to 1970s. As basic meteorology teaches, it is the pole-to-equator temperature difference that drives the intensity of cold …

Drought, Climate, Elephant Butte Water Storage

and the future of water storage for the lower Rio Grande Valley of New Mexico.
By Robert W. Endlich
Elephant Butte Dam and Landscape in New Mexico. Photo by U.S. Army Corps

Laura Paskus’ 3-part series on the current drought, its effects on farmers and residents, and the coming US Supreme Court decision, starts with a question: “Elephant Butte is at 3 percent capacity; what happens next?” Let me introduce measurements missing from Paskus’ series: Elephant Butte Lake levels, temperature, rainfall, and climate patterns. My analysis: nothing in the current meteorological/climatological situation is worse than the past century. History and study show that either water availability must increase, or water costs will increase.

Paskus’ sense of alarm at Elephant Butte Reservoir capacity recently falling to 3% implies impending catastrophe, but historic data show frequent episodes where the reservoir capacity in the 1950s, ’60s and …

Q&A Following Dr. David Gutzler’s Water Conservation Workshop

by Bob Endlich

[Dr. Gutzler conducted the 1 March 2018 “Lush and Lean” water conservation workshop in Las Cruces, in the Roadrunner Room of the Branigan Library, from 5:30 to 7:30 PM: about an hour-long lecture of 21 slides, followed by questions and answers. The nominal topic was “Learn about projecting future water supplies in a rapidly changing climate.” I prepared this memo the next day, 2 March 2018, and have edited it a bit in the time since. I provide it in this post as a small part of the overall climate debate.]

Bob Endlich at News NM Radio Set

Overall, Gutzler did only a fair job explaining the development of the present La Nina and the present and impending drought conditions. I rate it “only fair” because he mentioned neither the 2016 El Nino nor the features of the El Nino-Southern Oscillation, and although he mentioned the Pacific Decadal Oscillation, he explained neither it nor its 60-year periodicity.

[This is perhaps professional one-upmanship on my part, because I thought at the time, and still think, that my presentations on the subject are better than his; my most recent example on this subject is at the web site, Slides 34 to 122.]

Dr. Gutzler devoted perhaps only 5 minutes to the claim that “increasing greenhouse gases are causing anthropogenic climate change,” but this was the last of the three points he emphasized in his concluding slide.

There were a couple of people who approached Gutzler after the talk was over; I was the last and introduced myself; we have …