A Climate Change Primer: It’s the Sun!
The sun supplies the energy that warms the Earth. The atmosphere, which is mostly transparent to incoming sunlight, absorbs outgoing thermal (infrared) radiation emitted by the surface, keeping the Earth warmer than it otherwise would be.
This absorbing property of the atmosphere is the “greenhouse effect.” Gases in the atmosphere that absorb infrared radiation, thereby preventing some of the outgoing energy from returning to space, are called greenhouse gases.
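The size of this warming can be illustrated with the standard zero-dimensional energy balance, in which absorbed sunlight equals emitted thermal radiation. The numbers below are textbook values (solar constant S, planetary albedo A, Stefan-Boltzmann constant sigma), not figures from this article:

```latex
\sigma T_e^4 = \frac{S(1 - A)}{4}
\quad\Longrightarrow\quad
T_e = \left[\frac{1361\,\mathrm{W\,m^{-2}} \times (1 - 0.3)}
{4 \times 5.67\times 10^{-8}\,\mathrm{W\,m^{-2}\,K^{-4}}}\right]^{1/4}
\approx 255\ \mathrm{K}
```

Without an infrared-absorbing atmosphere, the Earth would radiate to space at about 255 K; the observed mean surface temperature is near 288 K. The roughly 33 K difference is the greenhouse effect described above.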
Not all gases in the atmosphere absorb outgoing infrared radiation. Nitrogen and oxygen, which make up most of the Earth’s atmosphere, have no blocking effect. The gases that absorb the infrared radiation and create the greenhouse effect are mainly water vapor, carbon dioxide, methane, and nitrous oxide.
Water vapor and water in clouds absorb nearly 90 percent of the infrared radiation, while carbon dioxide, methane, and the other minor greenhouse gases together absorb little more than 10 percent of the infrared radiation. (1) Therefore, most of the greenhouse effect is natural and caused by the different forms of water in the atmosphere.
Over the past 100 years, however, human activities--such as burning wood, coal, oil, and natural gas--have increased the concentration of greenhouse gases in the atmosphere by an amount equivalent to a 50 percent increase in carbon dioxide alone, according to the George Marshall Institute, which has supported global climate change research for many years. Many studies project the level of greenhouse gases in the atmosphere from human activities will be effectively equal to a doubling of CO2 in the next 100 years.
The average global temperature of the Earth has increased roughly 0.6°C over the past 120 years. Much of the observed temperature rise occurred before 1940, whereas most of the additional carbon dioxide (more than 80 percent) entered the atmosphere after 1940. Increased greenhouse gas levels cannot explain a temperature rise that occurred before the major increases in these gases occurred in the atmosphere.
Between 1940 and 1970, carbon dioxide built up rapidly in the atmosphere. According to computer projections of climate, the temperature of the Earth should also have risen rapidly. Instead, the temperature dropped.
The increase in greenhouse gas levels cannot explain the rapid rise in temperature before 1940, and it cannot explain the drop in temperature from 1940 to 1970. The climate record over the past 100 years provides no support for the idea that human activities, such as burning coal and oil for energy, caused the early 20th century global warming. Natural factors must have caused most of that warming.
When scientists analyzed the relationship between atmospheric CO2 levels and temperatures dating back 250,000 years--data obtained from infrared analysis of ice cores drilled in Greenland and the Antarctic--they found that sometimes the concentration of CO2 was high when the temperature was low, and sometimes the CO2 was low when the temperature was high. (2)
Moreover, a careful analysis showed that some of the atmospheric CO2 changes did not precede the temperature changes, as the greenhouse warming theory would predict. Instead, changes in atmospheric carbon dioxide followed the temperature changes. The atmospheric CO2 changes were not the cause of the temperature changes. The CO2 changes were likely driven by changes in vegetation, in response to natural variations in air- and sea-surface temperatures.
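A lead-lag relationship of this kind is typically identified by cross-correlating the two series at a range of offsets and finding the offset that maximizes the correlation. The sketch below uses synthetic data with a built-in lag, not the actual ice-core record, purely to illustrate the method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration: a "CO2" series that echoes "temperature"
# after a fixed delay (arbitrary choice of 8 steps).
n = 500
lag = 8
temp = np.cumsum(rng.normal(size=n))   # random-walk stand-in for temperature
co2 = np.empty(n)
co2[lag:] = temp[:-lag]                # CO2 repeats earlier temperature values
co2[:lag] = temp[0]                    # pad the start

def corr_at(k):
    """Correlation of temp[t] with co2[t + k]."""
    if k >= 0:
        a, b = temp[:n - k], co2[k:]
    else:
        a, b = temp[-k:], co2[:n + k]
    return np.corrcoef(a, b)[0, 1]

# The offset with the highest correlation reveals which series leads.
best = max(range(-20, 21), key=corr_at)
print(best)  # a positive offset means CO2 follows temperature
```

With real, noisy paleoclimate data the peak is broader and dating uncertainty matters, but the logic is the same: if the correlation peaks at a positive offset, the CO2 changes trail the temperature changes.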
During the past 10,000 years, the climate has remained relatively warm and stable, allowing humans to advance and prosper. But even during this generally warm period the temperature has fluctuated significantly.
The climate was warmer than it is today about 6,500 years ago, during the Holocene Climate Optimum. There is evidence that roughly 1,000 years ago, regions of the Earth again were substantially warmer than they are today, during a period called the Medieval Climate Optimum. By the fourteenth century, a cold period called the Little Ice Age had begun. The warming that began in the late nineteenth and early twentieth centuries seems to be a natural recovery from the Little Ice Age. (3)
Closer to the present, some researchers believe the 1980s were the hottest decade in 100 years, and that some years in the 1990s may have been hotter still. But in fact, more local U.S. temperature records were set in the 1930s than in any other decade of the past century.
The more recent surface temperature trend is a warming of one- to two-tenths of a degree per decade over the past decade or two. Such a change is well within the range of the climate’s natural variations, whose mechanisms are not all understood.
It is safe to conclude that the natural variability of climate adds confusion to the effort to diagnose human-induced climate change. Apparent long-term trends can be artificially amplified or damped by the contaminating effects of undiagnosed natural variations. (4)
Methods of Measurement
Until satellite measurements began in 1979, there was no accurate method of measuring temperatures from all parts of the globe. Readings were taken largely at convenient land-based points; ocean, rainforest, and mountain surface temperatures were simply not available. Weather stations were established for the most part in cities and, starting in the 1920s, at airports.
Unfortunately, as cities grew and expanded, and air conditioning and paved areas became more common, a “heat island” effect became evident: urban temperature readings rose, while nearby rural areas did not experience the same changes. The effect became especially pronounced in cities like Phoenix, where the heat exhausted by efforts to cool commercial, industrial, and residential buildings helped push airport temperature readings to historic highs.
Weather balloons provided the first real advance over surface stations in temperature measurement technology. But balloons were quite expensive; there was no way to control their height or direction of flight; and their instruments were often destroyed by storms or landings at sea.
The TIROS-N satellite, whose temperature record begins in 1979, changed all that. For the first time, temperature readings could be taken around the globe. The readings could be adjusted for altitude, from a few feet above sea level over the oceans to the top of the Himalayas.
While critics have charged that satellite readings are not accurate, they correspond so closely to weather balloon observations that this argument has been discarded. Critics then complained that the satellite readings did not correct for orbital decay; that, too, was easily corrected.
Satellite temperature readings have now been available for more than 21 years. Although that is too short a period to establish any long-term climate trend, the readings to date show that the Earth’s temperature changed less during that time than it did over the previous 100 years--an increase of 0.04°C per decade. There is general agreement among climatologists that the temperature of the Earth has increased by 0.6°C since 1880, and that most of this increase occurred before 1940.
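A per-decade figure like the one quoted above is simply the least-squares slope of the temperature series, rescaled from years to decades. The sketch below fits such a trend to hypothetical annual anomalies with a known built-in rate; the data are illustrative, not the actual satellite record:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical annual anomalies over a 21-year span (1979-1999),
# constructed with a known 0.04 deg C/decade trend plus small noise.
years = np.arange(1979, 2000)
true_trend_per_decade = 0.04
anomalies = (true_trend_per_decade / 10.0) * (years - years[0]) \
            + rng.normal(scale=0.02, size=years.size)

# Least-squares slope in deg C per year, rescaled to deg C per decade.
slope_per_year = np.polyfit(years, anomalies, 1)[0]
trend_per_decade = slope_per_year * 10.0
print(round(trend_per_decade, 3))
```

Note the scale of such a trend: at 0.04°C per decade, the 2.1 decades of satellite data amount to a total change of only about 0.08°C.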
In the May issue of Environment & Climate News: How the Earth’s movement around the sun affects climate.
Jay Lehr is science director for The Heartland Institute. Richard Bennett is president of The Society of Environmental Truth in Corpus Christi, Texas.
(1) A Guide to Global Warming (Washington, DC: George C. Marshall Institute, January 2000).
(2) H. Fischer and M. Wahlen, “Ice Core Records of Atmospheric CO2 around the Last Three Glacial Terminations,” Science 283, 1712-1714 (1999).
(3) H.H. Lamb, Climate History and the Modern World (New York, NY: Methuen, 1985).
(4) J.D. Mahlman, “Uncertainties in Projections of Human Caused Climate Warming,” Science 278, 1416-1417 (1997).