A few months ago, a study came out showing that global temperatures have leveled off. But instead of admitting that this whole global warming thing might be a farce, a group of British scientists concluded that the real global warming won't start until 2009. Between 2009 and 2014, they predict, temperatures will soar past the record warmth of 1998.


Except guess what? A report released yesterday revealed that NOAA has been using incorrect data! NOAA has admitted the error, and the corrected results show the 1930s to be the hottest decade on record. The bottom line is that it hasn't been nearly as hot as the alarmists thought.


But it doesn’t end there. The National Climatic Data Center (NCDC) is in the middle of a scandal. Its surface observing network, the heart and soul of surface weather measurement, is a disaster. Urbanization has placed many sites in unsuitable locations: on hot black asphalt, next to trash burn barrels, beside heat exhaust vents, even attached to hot chimneys and above outdoor grills!


The data and approach taken by many global warming alarmists are seriously flawed. If the global data were properly adjusted for urbanization and station siting, and if land-use change issues were addressed, what would emerge is a cyclical pattern of rises and falls with much less of an underlying trend.


Weather observations have been taken around the world for centuries. Until the early 1980s, the majority of temperature observations were taken with liquid-in-glass (LIG) mercury thermometers. Special LIG thermometers, known as minimum and maximum thermometers, were used to record the daily high and low temperature. These thermometers worked very simply and were quite accurate: the mercury rose (or fell) and marked the high (or low) temperature for the day. The temperature recorded by this method was the absolute maximum or minimum; no averaging or sampling was involved.

In the 1980s, technology allowed sensors to become automated and computerized. Instead of recording the absolute maximum or minimum, the new digital thermometers use an algorithm to calculate the high and low temperature: readings are sampled every 60 seconds, and a running 5-minute average is computed from those samples.
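To see why the two methods can disagree, here is a minimal sketch of the difference. The functions, the made-up minute-by-minute readings, and the 5-sample window size are all illustrative assumptions, not actual NCDC code; the point is only that averaging smooths out a brief temperature spike that an LIG thermometer would have captured at full value.

```python
import statistics

def daily_extremes_absolute(samples):
    # LIG-style: the thermometer marks the absolute highest and lowest readings.
    return max(samples), min(samples)

def daily_extremes_averaged(samples, window=5):
    # Automated-style: take the max/min of a running 5-sample average
    # (5 minutes at one reading per minute).
    averages = [
        statistics.mean(samples[i:i + window])
        for i in range(len(samples) - window + 1)
    ]
    return max(averages), min(averages)

# Hypothetical minute-by-minute readings (degrees C) with a brief 3-minute spike.
day = [20.0] * 60 + [25.0, 26.0, 25.0] + [20.0] * 60

abs_hi, abs_lo = daily_extremes_absolute(day)   # absolute high is 26.0
avg_hi, avg_lo = daily_extremes_averaged(day)   # averaged high is lower
print(abs_hi, avg_hi)
```

In this toy case the averaged daily maximum comes out well below the absolute one, because the spike lasts less than the averaging window; a short warm gust that an LIG maximum thermometer would record in full is partly smoothed away.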

So how does this affect temperatures? A joint study conducted by the University of Nebraska and the National Climatic Data Center demonstrated that the difference between the automated sensors and LIG thermometers was between 0.15 and 0.5 degrees C.

I have yet to read any global warming research that accounts for this variation. The reason is simple: far too many environmental scientists, geologists, ecologists, and others with no expertise in meteorology are conducting global warming research. Without doubt, there is plenty of research funding to go around for global warming. Unfortunately, it too often winds up in the hands of amateurs with no background in meteorology.

A recent study conducted by ClimatePolice.com indicates that the climate has begun to cool over the last nine years. 1998 typically serves as the benchmark for the warmest year on record for global temperatures. Since 1998, temperatures have decreased by 0.34 degrees F, as the Climate Trend graph below shows.

[Graph: Climate Trend]
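The kind of trend line behind a figure like that is just an ordinary least-squares slope fit to annual temperatures. A minimal sketch follows; the anomaly values are invented for illustration (the actual ClimatePolice.com data are not reproduced here), so only the method, not the numbers, should be taken from it.

```python
def linear_trend(years, temps):
    # Ordinary least-squares slope: covariance(x, y) / variance(x),
    # in degrees per year.
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(temps) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps))
    var = sum((x - mean_x) ** 2 for x in years)
    return cov / var

# Hypothetical annual anomalies (degrees F) starting from a warm 1998.
years = list(range(1998, 2007))
anomalies = [1.2, 0.9, 0.8, 1.0, 1.1, 1.1, 1.0, 1.1, 0.9]

slope = linear_trend(years, anomalies)
print(f"{slope:.4f} degrees F per year")
```

Note how sensitive such a short-period trend is to the choice of start year: beginning the fit at an unusually warm year like 1998 pulls the slope downward, which is one reason nine-year trends are treated cautiously.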

This conclusion agrees with a previous study suggesting that no new warming has occurred since 1998 (Lindzen, 2006). It also agrees with a report predicting that global temperatures will begin decreasing (Gray, 2006).

At a minimum, as the graph indicates, global temperatures have stabilized and appear to be trending negative. This is also reflected in the 2007 year-to-date average temperatures, where most of the country is near normal, with some slight warming in the Northwest.