A Climate Change Primer: Computer Models and the Need for More Research
July 1, 2003

In Part One of this three-part series, Lehr and Bennett defined the “greenhouse effect,” summarized temperature observations showing that average global temperature has increased roughly 0.6º C over the past 120 years, and described the methods used to measure temperature. In Part Two, they explained how the Earth’s movement around the sun causes climate to change.

In this third and final part, Lehr and Bennett describe the shortcomings of computer models and emphasize the need for continued climate research.


Computer simulations of the Earth’s climate are not accurate, because they do not take into account all of the variables that affect climate.

We know from simple physics, for example, that a doubling of atmospheric carbon dioxide (CO2) adds about 4 watts per square meter (W/m2) of radiative forcing to the climate system. CO2, of course, is the “greenhouse gas” targeted by global warming adherents as the principal cause of man-made climate change.
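For readers who want to check the “about 4 W/m2” figure, a minimal sketch using the standard logarithmic approximation for CO2 radiative forcing (dF = 5.35 ln(C/C0), a relation not quoted in the article itself) gives roughly 3.7 W/m2 for a doubling:

```python
import math

def co2_forcing(c_new, c_ref):
    """Approximate radiative forcing (W/m2) from a change in CO2
    concentration, using the common logarithmic fit
    dF = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_new / c_ref)

# A doubling of CO2, e.g. from a pre-industrial 280 ppm to 560 ppm:
forcing = co2_forcing(560.0, 280.0)
print(f"{forcing:.2f} W/m2")  # roughly 3.7 W/m2
```

The exact concentrations do not matter here: because the fit is logarithmic, any doubling yields the same forcing.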

But the 4 W/m2 of energy that a doubling of atmospheric CO2 contributes is dwarfed by the average solar radiation reaching the top of the atmosphere: about 342 W/m2.

And 4 W/m2 is also small compared to the uncertainties in the climate change calculations.

For example, our knowledge of the amount of energy flowing from the equator to the poles is uncertain by an amount equivalent to 25-30 W/m2. The amount of sunlight absorbed by the atmosphere or reflected by the surface is also uncertain by as much as 25 W/m2. Some computer models include adjustments to the energy flows of as much as 100 W/m2. Imprecise treatment of clouds may introduce another 25 W/m2 of uncertainty into the basic computations. (1)

These uncertainties in modeling climate processes are many times larger than the 4 W/m2 input of energy resulting from a doubling of CO2 concentration in the atmosphere. It is difficult to see how the climate impact of the 4 W/m2 can be accurately calculated in the face of such huge uncertainties. As a consequence, forecasts based on computer simulations of climate may not even be meaningful at this time.
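The magnitude comparison the authors make can be laid out as simple arithmetic, using only the figures quoted above (the 25-30 W/m2 range is taken at its midpoint):

```python
SIGNAL = 4.0  # W/m2 from a doubling of CO2 (the article's figure)

# Uncertainty estimates quoted in the text, in W/m2
uncertainties = {
    "equator-to-pole energy flow": 27.5,   # midpoint of 25-30
    "absorbed/reflected sunlight": 25.0,
    "model flux adjustments": 100.0,
    "treatment of clouds": 25.0,
}

# Each uncertainty expressed as a multiple of the CO2 signal
for source, value in uncertainties.items():
    print(f"{source}: {value / SIGNAL:.1f}x the CO2 signal")
```

Even the smallest of these uncertainties is more than six times the 4 W/m2 signal, which is the point of the paragraph above.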

A comparison of nearly all of the most sophisticated climate models with actual measurements of current climate conditions found the models in error by about 100 percent in cloud cover, 50 percent in precipitation, and 30 percent in temperature change. Even the best models give temperature change results differing from each other by a factor of two or more. (2)

2001 NAS Findings Overstated

Considerable confusion has been added to the debate by the National Academy of Sciences (NAS) Report on Global Climate Change, issued in June 2001. (3) The report, which the news media trumpeted as having confirmed global warming, in fact resolved very little. What it reported as fact--that global temperatures as measured by land-based thermometers have risen in the past 20 years--was already known and not a point of contention.

What the NAS report did not conclude was that global warming was the result of human activity. In a passage not reported by the press, the report actually concluded: “A causal linkage between the buildup of greenhouse gases in the atmosphere and the observed climate changes during the 20th century cannot be unequivocally established.”

The NAS report also pointed out that current warming could be completely natural. The scientist-authors state that due to inadequacies in the global warming computer simulation models, they cannot tell for sure whether this warming is anything more than natural climate variation: “The fact that the magnitude of observed warming is large compared to natural variability as simulated in climate models does not constitute proof of a linkage [to increases in greenhouse gases] because the model simulations could be deficient.”

The report also noted that land-based measurements of global temperatures over the past 20 years are inconsistent with satellite readings, which have shown only a 0.04º C temperature increase per decade. The report noted, “satellite measurements beginning in 1979 show little warming of air temperature in the troposphere.” This is significant because the global warming hypothesis predicts satellite measurements of temperatures in the troposphere will show warming before it is evident at the surface.

The NAS authors noted they have no explanation for this contradiction, stating, “the finding that surface and troposphere temperature trends have been as different as observed over intervals as long as a decade or two is difficult to reconcile with our current understanding of the processes that control the vertical distribution of temperature in the atmosphere.”

The evidence appears to directly contradict the global warming hypothesis, suggesting that observed surface warming is not part of a pattern of human-induced global warming. Of the three measurements of global temperatures--satellite, land-based, and weather balloon--only the land-based data show any significant warming.

The Case for Research

The predictions of severe warming over the coming century are based on climate models that have so far failed to accurately simulate history, let alone predict the future. The satellite record since 1979 shows almost no warming in the lower troposphere, where the models predicted the fastest warming. The models also fail to reproduce the transitions into and out of the Ice Ages over the past few hundred thousand years.

We should continue to invest in climate science, to unlock the mysteries of what really makes climate tick. There are many benefits to understanding and predicting our weather.

We should also plan for changes in climate, as climate naturally changes. We need to invest in agricultural adaptability and in water management. But we should also recognize that many scientific studies conclude high CO2 concentrations are better for agriculture.

Even under the worst-case global warming scenario, delaying for 25 years any substantial cuts in CO2 emissions would produce an additional global temperature rise of no more than a few tenths of a degree C by the year 2100. (4) That means we have at least 25 years in which to sharpen our understanding of climate and seek valid predictions, without contributing to serious climate change.

Policies developed in haste, or based on poor information, are likely to have a destructive impact on the U.S. and world economies and the well-being of all Earth’s citizens. By contrast, an incremental warming of a few tenths of a degree, spread over decades, constitutes no hazard while we seek important, additional information on which to build national and worldwide energy policy.


Jay Lehr is science director for The Heartland Institute. Richard Bennett is president of The Society of Environmental Truth in Corpus Christi, Texas.


References

(1) R.D. Cess, M.H. Zhang, P. Minnis, L. Corsetti, and E.G. Dutton, “Absorption of Solar Radiation by Clouds: Observations Versus Models,” Science 267, 496-499 (1995).

(2) T.P. Barnett, “Comparison of Near Surface Air Temperature Variability in 11 Coupled Global Climate Models,” Journal of Climate 12, 511-515 (1999).

(3) Report on Global Climate Change (Washington DC: National Academy of Sciences, June 2001).

(4) T.M.L. Wigley, R. Richels and J.A. Edmonds, “Economic and Environmental Choices in the Stabilization of Atmospheric CO2 Concentrations,” Nature 379, 240-243 (1996).