Until recently, even hardened climate skeptics, when asked about the science, would say: “Well, we think it is warming, but how much is caused by humans is uncertain”. Now a rash of revelations is emerging that challenges even this bedrock claim. For example:
It seems that when we leave out the great number of weather stations introduced in the last 50 years or so, the tendency is not a rise in temperature at all (see Global Warming Vs Clojure!).
Andrew Bolt recently reported that climate scientists have lost base data, so no one can check the actual measurements anymore.
The ‘adjustments’ to the record have also come under fire, with The Smoking Gun At Darwin Zero documenting large and apparently unmotivated boosts to the trends at Darwin and other places.
While a single region does not a global temperature record make, as more data is revealed, more regions are being analysed and more discrepancies are being found. The problem is that over a long period, weather stations have moved, stopped collecting, or started up, and the additional noise introduced by these interferences confounds any real trends that might be there.
The approach of the climate scientists has been to adjust each individual station record based on the measurements of the similar stations around it, and then to weight the records according to area. But it's not clear whether this approach yields a better aggregate temperature record than a number of other possible approaches.
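To make the area-weighting half of that approach concrete, here is a minimal sketch: stations are binned into latitude/longitude grid cells, each cell is averaged, and cells are weighted by the cosine of latitude (proportional to cell area on a sphere). The grid size and station data are illustrative assumptions, and the neighbour-based adjustment step is omitted entirely.

```python
import math

def area_weighted_mean(stations, cell_deg=5.0):
    """stations: list of (lat, lon, temperature) tuples.
    Returns an area-weighted mean temperature."""
    cells = {}  # (lat_bin, lon_bin) -> list of temperatures in that cell
    for lat, lon, temp in stations:
        key = (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
        cells.setdefault(key, []).append(temp)
    num = den = 0.0
    for (lat_bin, _), temps in cells.items():
        # weight each cell by the cosine of its centre latitude
        centre_lat = (lat_bin + 0.5) * cell_deg
        w = math.cos(math.radians(centre_lat))
        num += w * sum(temps) / len(temps)
        den += w
    return num / den

# Hypothetical stations: two near the equator, one at 60N.
stations = [(0.0, 0.0, 30.0), (0.1, 0.2, 30.0), (60.0, 10.0, 10.0)]
result = area_weighted_mean(stations)  # roughly 23.7: pulled toward the equatorial cell
```

Note that the two equatorial stations fall in one cell and count once, so a cluster of stations in one region does not dominate the average.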
One approach to obtaining a less biased trend is first to take the differences of each series. Differencing produces the temperature change in each year, so effects such as station moves that might have extended over many years are removed. All the differenced series are then averaged to give the average change in temperature each year, and the changes are accumulated to recover the overall global temperature.
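The difference-average-accumulate steps above can be sketched as follows. The station records here are synthetic illustrations; a real analysis would read them from a weather archive.

```python
import numpy as np

def first_difference_trend(series_list):
    """Combine overlapping station records by differencing.

    series_list: list of dicts mapping year -> mean temperature (degC).
    Returns (years, accumulated), where accumulated[i] is the cumulative
    temperature change relative to the first differenced year.
    """
    diffs = {}  # year -> list of year-on-year changes seen at that year
    for series in series_list:
        years = sorted(series)
        for prev, cur in zip(years, years[1:]):
            if cur == prev + 1:  # only use consecutive years
                diffs.setdefault(cur, []).append(series[cur] - series[prev])
    years = sorted(diffs)
    mean_changes = [np.mean(diffs[y]) for y in years]   # average change per year
    accumulated = np.cumsum(mean_changes)               # accumulate to a trend
    return years, accumulated

# Two hypothetical stations warming at 0.01 degC/year; the second sits
# 5 degC warmer and only starts reporting in 1954.
station_a = {y: 15.0 + 0.01 * (y - 1950) for y in range(1950, 1960)}
station_b = {y: 20.0 + 0.01 * (y - 1950) for y in range(1954, 1960)}

years, trend = first_difference_trend([station_a, station_b])
# trend[-1] is ~0.09 degC: the warm station's entry adds no spurious jump.

# By contrast, naively averaging the raw values jumps ~2.5 degC in 1954
# purely because the warmer station enters the network.
naive = [np.mean([s[y] for s in (station_a, station_b) if y in s])
         for y in range(1950, 1960)]
```

The contrast between `trend` and `naive` is the point of the method: station additions and removals shift a raw average but contribute nothing to the averaged differences.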
Using this approach on all Australian station data, temperatures do not seem to have increased greatly in Australia in the 20th century.
Another approach is to select a subset of station records that might be less biased, such as long records that started early in the century and continue to the present day. This reduces the effect of stations being added to or removed from other geographic areas, which can artificially warm or cool the average.
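A sketch of that selection step might look like this. The cutoff years and record structure are illustrative assumptions, not part of any published method.

```python
def long_record_subset(records, start_by=1910, end_after=1990):
    """records: dict of station name -> dict of year -> temperature.
    Keep only stations whose records begin by `start_by` and run
    through at least `end_after`."""
    return {name: series for name, series in records.items()
            if min(series) <= start_by and max(series) >= end_after}

# Hypothetical stations with different coverage periods.
records = {
    "long":  {y: 15.0 for y in range(1905, 1996)},  # full century
    "late":  {y: 18.0 for y in range(1960, 1996)},  # starts too late
    "short": {y: 16.0 for y in range(1905, 1950)},  # ends too early
}
subset = long_record_subset(records)  # only "long" survives the filter
```

Averaging only the surviving stations avoids the artificial warming or cooling that the entry and exit of short records can introduce.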
Geoff alerted me to an analysis of all global records done this way, called Global Warming Vs Clojure!. Written by a computer consultant, the article provides details of the analysis, with code and results, in a very interesting language called Clojure. It addresses the primary difficulty of analyzing large amounts of data on a small computer, as well as ensuring computational efficiency on the dual-core chipsets found in most PCs these days. Lau B. Jensen knows what he is doing.
The moral is to SAVE YOUR PRIMARY DATA! It's unfortunate that uncertainty should be cast on global temperature in the 20th century at this stage, but if the data were available we could at least go back and redo the analysis. Now, because basic data archival was not practiced, we may never know!