Paz pointed out that the normalization used previously might not remove geographic biases introduced by fugitive weather stations, so here is another approach. I have differenced each of the records, averaged the differences, and then cumulatively summed the result.
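A minimal sketch of that procedure in Python/NumPy, assuming the station records are aligned columns of an array with no missing years (the two stations and their values below are hypothetical, just to show the arithmetic):

```python
import numpy as np

# Hypothetical example: each column is one station's annual temperature record.
records = np.array([
    [14.1, 14.3, 14.2, 14.6, 14.5],   # station A
    [10.5, 10.6, 10.4, 10.9, 10.8],   # station B
]).T  # shape (years, stations)

# 1. Difference each record: removes each station's absolute level, so a
#    station entering or leaving the network does not shift the composite.
diffs = np.diff(records, axis=0)

# 2. Average the year-to-year differences across stations.
mean_diff = diffs.mean(axis=1)

# 3. Cumulatively sum the averaged differences to recover an anomaly series.
composite = np.concatenate([[0.0], np.cumsum(mean_diff)])

print(composite)
```

Because only differences are averaged, stations with different absolute temperatures contribute on equal footing, which is the point of the approach.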
The issue with the NZ and Nordic data is that the raw temperature data for weather stations do not show the temperature increases indicated by the IPCC, raising the question of how the data have been adjusted.
As Prof. Karlen states in the ClimateGate email #1221683947, temperature at many stations has not exceeded early 20th century temperatures:
.. data sets show an increase after the 1970s to the same level as in the late 1930s or lower. None demonstrates the distinct increase IPCC indicates.
Here is the plot of means of Australian raw data for 103 temperature stations, based on the file Aus.tab downloaded from the Australian BoM web site and collated by Steve McIntyre.
A department is going to hire a new climate scientist and summons a series of candidates. The selection panel asks each applicant, “What is two plus two?” The first two candidates answer, “Four.” They don’t get the job. The third responds, “What do you want it to be?” He gets hired.
Is it too soon to say “We wus right!”?
A heartfelt thanks to all those sceptics who have toiled, lost sleep, suffered derision, rejection, loss of family time, and in some cases, career, fighting the exaggerations, collusion in the journals, cynical manipulation of public institutions, dishonesty and corruption in the climate science cartel.
Some say this has nothing to do with AGW, but the main message of the FOI2009 emails is of a weakly explanatory theory that is only maintainable by vigorous suppression of opposing viewpoints.
I have started a list of action items:
. Disband the entire Federal Department of Climate Change along with all the individual State Departments of Climate Change. Leave climate prediction to private enterprise.
. Vote down the Emissions Trading Scheme Legislation
. Cancel Copenhagen
. More suggestions.
Below are the results of applying the EMD algorithm (Empirical Mode Decomposition) to Australian Rainfall, and predicting the future rainfall with a VAR model (Vector Autoregression).
First, EMD splits the rainfall into IMFs (intrinsic mode functions), which are cyclical but variable in amplitude and frequency.
The one message I’d like to convey is that we do not need to rush on this. This will be around to examine and feed our discussions for a long time to come. If we start right, it will go better for us. There seem to be some indications of possible unethical behaviour, if these are true representations of email communications. It isn’t right to tell people to delete emails that may be the subject of FOI requests, at the very least. But we don’t need to pile onto this right now.
Fascinating … a taste: ./mail/0933255789.txt
From: Adam Markham
To: email@example.com, firstname.lastname@example.org
Subject: WWF Australia
Date: Thu, 29 Jul 1999 09:43:09 -0400
I’m sure you will get some comments direct from Mike Rae in WWF
Australia, but I wanted to pass on the gist of what they’ve said to me so
They are worried that this may present a slightly more conservative
approach to the risks than they are hearing from CSIRO. In particular,
they would like to see the section on variability and extreme events
beefed up if possible. They regard an increased likelihood of even 50%
of drought or extreme weather as a significant risk. Drought is also a
particularly important issue for Australia, as are tropical storms.
I guess the bottom line is that if they are going to go with a big public
splash on this they need something that will get good support from
CSIRO scientists (who will certainly be asked to comment by the press).
One paper they referred me to, which you probably know well is:
“The Question of Significance” by Barrie in Nature Vol 397, 25 Feb 1999,
Let me know what you think. Adam
Another good one … ./mail/0926947295.txt
A researcher asks (naively) why the rates of CO2 accumulation being used are so unrealistically high:
From: franci [mailto:email@example.com]
> Sent: Saturday, May 15, 1999 3:58 PM
> To: Benjamin Felzer
> Cc: Mike Hulme; firstname.lastname@example.org; email@example.com; firstname.lastname@example.org;
> email@example.com; firstname.lastname@example.org; Mike MacCracken
> Subject: Re: CO2
> dear ben,
> You just showed that the Hadley transient run we are supposed to use for the
> national assessment is too high, forcing-wise, because it assumes an overall
> 1.2% increase in total forcing.
> My question is then the following:
> -why are we using a 1% annual increase in GHG forcing (corresponding to the
> 1.2% increase) as a criteria for GCM simulations to then be used for the
> national assessment? Is it because of the possible confusion you refer to
> below? If so, that criteria needs to be revised.
To be told off in no uncertain terms:
From: Dave Schimel
To: Shrikant Jagtap
Subject: RE: CO2
Date: Mon, 17 May 1999 09:21:35 -0600 (MDT)
Cc: franci , Benjamin Felzer , Mike Hulme , email@example.com, wigl
firstname.lastname@example.org, email@example.com, firstname.lastname@example.org, Mike MacCracken
I want to make one thing really clear. We ARE NOT supposed to be working with the assumption that these scenarios are realistic. They are scenarios-internally consistent (or so we thought) what-if storylines. You are in fact out of line to assume that these are in some sense realistic-this is in direct contradiction to the guidance on scenarios provided by the synthesis team.
UPDATE: The Mish Shedlock post Hackers Prove Global Warming Is A Scam summarizes the revelations so far, giving a finance-community perspective on the (mis)behavior of Phil Jones in particular.
Have you noticed a distinct change in the rhetoric around global warming? It seems like revisionism is going on in the mainstream media, in the form of a shift in focus to the most likely values, or expectations, of global warming, rather than an emphasis on the low-probability, worst-case scenarios.
For example, this one on sea level from nature.com.
Sea level rise – not so fast.
In the latest salvo of the scientific debate over future sea level rise, a new report counters claims that rapidly swelling seas will soak estimates published by the UN climate panel in 2007.
A major “it’s worse than we thought” story out of March’s Copenhagen Climate Congress, for example, was that sea level could climb more than a metre by 2100 – seemingly far worse than the rise of up to 59 centimetres indicated in the 2007 report from the Intergovernmental Panel on Climate Change (IPCC). This was in fact something of a straw-man comparison, since the IPCC total explicitly excluded the impacts of accelerated glacier melt, and the new studies were attempting to add these impacts in.
How to predict with EMD? Because the EMD algorithm decomposes a time series into a number of periodics of different frequency (IMFs) and a residual trend, prediction with EMD is done by extrapolating each of the IMFs separately (a VAR model is recommended) and fitting a cubic polynomial to the residue (example code is at the end of this post). The predictions are then added together.
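A sketch of this recipe in Python/NumPy. For brevity I substitute a simple least-squares AR(2) fit for the recommended VAR model (the function names and the AR(2) stand-in are my own, not from the post); the residue gets the cubic polynomial fit described above:

```python
import numpy as np

def ar2_forecast(x, steps):
    """Forecast an oscillatory IMF with a least-squares AR(2) fit.
    A simple stand-in for the VAR model recommended in the text."""
    # Regress x[t] on its two lags: x[t] = a*x[t-1] + b*x[t-2]
    lags = np.column_stack([x[1:-1], x[:-2]])
    a, b = np.linalg.lstsq(lags, x[2:], rcond=None)[0]
    out = list(x[-2:])
    for _ in range(steps):
        out.append(a * out[-1] + b * out[-2])
    return np.array(out[2:])

def emd_forecast(imfs, residue, steps):
    """Extrapolate each IMF, fit a cubic to the residue, and add them up."""
    t = np.arange(len(residue))
    coeffs = np.polyfit(t, residue, 3)                  # cubic trend fit
    future_t = np.arange(len(residue), len(residue) + steps)
    pred = np.polyval(coeffs, future_t)                 # residue extrapolation
    for imf in imfs:
        pred = pred + ar2_forecast(imf, steps)
    return pred
```

A fixed-frequency sinusoid satisfies an exact AR(2) recurrence, so for such an IMF the fit continues the cycle cleanly; real IMFs, with drifting amplitude and frequency, forecast less well, which is why the fuller VAR treatment is preferred.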
Below are a couple of examples of EMD predictions on familiar data sets, the HadCRU global surface temperature, and the TLT series from the satellite MSU RSS data. In both I have also applied the recommended prediction techniques to extend the result into the future.
The first is the satellite TLT, and I have used up to IMF 5 so that variation up to annual quasi-periodicity is represented. The residue in this case, the red arch, has peaked and is projected to decline, along with the overall temperature, in the next few years. The amplitude of variability is also declining.
Our approach so far has been to model natural climate variation of global temperature with sinusoidal curves, and potential AGW as increasing trends. A new algorithm called EMD (Empirical Mode Decomposition) promises to more robustly identify cyclical natural variation (NV), showing the contribution of NV and AGW to global temperature, and testing the IPCC claim that most of the recent warming is due to AGW.
Underestimation of natural variation (NV) is a crucial flaw in the IPCC’s logic, according to Dr Roy Spencer:
They ignore the effect of natural cloud variations when trying to diagnose feedback, which then leads to overestimates of climate sensitivity. … By ignoring natural variability, they can end up claiming that natural variability does not exist. Admittedly, their position is internally consistent. But then, so is all circular reasoning.
The relative contribution of AGW to temperature increase in the late 20th century underpins the IPCC global warming claims, according to the Wiki page on Scientific Opinion on Climate Change:
National and international science academies and scientific societies have assessed the current scientific opinion, in particular on recent global warming. These assessments have largely followed or endorsed the Intergovernmental Panel on Climate Change (IPCC) position of January 2001 that states:
An increasing body of observations gives a collective picture of a warming world and other changes in the climate system… There is new and stronger evidence that most of the warming observed over the last 50 years is attributable to human activities.
Since 2007, no scientific body of national or international standing has maintained a dissenting opinion.
So estimating the relative proportion of natural variation vs. trend is very important. While widely used in other fields, EMD is relatively little used in climate science.
As an example, Lin and Wang (2004) used EMD for analysis of solar insolation. They claim that the solar eccentricity signal is much larger than previously estimated, more than 1% of solar irradiance, and adequate for controlling the formation and maintenance of Quaternary ice sheets. This is a potential resolution of the 100,000-year problem, which has also been used to justify the necessity of CO2 feedback in producing ice ages.
Conventional spectral methods assume strict periodicity: constant frequency and constant amplitude. EMD relaxes these assumptions, allowing quasi-periodicity, which may be why it can explain more of the variation. The EMD algorithm proceeds by first extracting the highest-frequency component, called an intrinsic mode function (IMF), leaving a residual. It then does the same for the next highest frequency, and so on, until only a trend is left.
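A bare-bones sketch of that extract-and-subtract loop in Python/NumPy. This toy version uses linear rather than the usual cubic-spline envelopes, a fixed number of sifting iterations, and crude endpoint handling, so treat it only as an illustration of the algorithm's shape, not a faithful implementation of Huang et al.'s method:

```python
import numpy as np

def sift(x, n_iter=10):
    """Extract one IMF by repeatedly subtracting the mean of the upper and
    lower envelopes (linear interpolation through local extrema)."""
    h = x.copy()
    t = np.arange(len(x))
    for _ in range(n_iter):
        # Interior local maxima and minima, with endpoints tacked on.
        maxi = np.r_[0, np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1, len(h) - 1]
        mini = np.r_[0, np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1, len(h) - 1]
        if len(maxi) < 4 or len(mini) < 4:
            break  # too few extrema to form envelopes
        upper = np.interp(t, maxi, h[maxi])
        lower = np.interp(t, mini, h[mini])
        h = h - (upper + lower) / 2.0
    return h

def emd(x, max_imfs=5):
    """Peel off IMFs from highest to lowest frequency; return (imfs, residual)."""
    imfs, resid = [], x.copy()
    for _ in range(max_imfs):
        imf = sift(resid)
        imfs.append(imf)
        resid = resid - imf
        # Stop once the residual has too few extrema left to be cyclical.
        n_ext = np.sum((resid[1:-1] - resid[:-2]) * (resid[2:] - resid[1:-1]) < 0)
        if n_ext < 3:
            break
    return imfs, resid
```

By construction the IMFs and residual add back to the original series exactly, which is what lets the decomposition partition variance between cycles and trend.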
While it is possible the residual is also part of a cycle (it is always possible to model a trend with a sinusoid of long enough period), we treat it as the AGW trend in order to estimate the maximum possible contribution of AGW to global warming.
Here are the results of applying EMD to the CRU global temperature series. Figure 1 below shows each of the 5 IMF’s and the residual, the remainder after subtracting out the periodics.
Each of the IMF’s is shown, with mean periods of 4.0, 6.6, 11.9, 23.4, and 55.1 years respectively. Most readers would be well aware of the similarity of these periods to major solar and oceanic cycles.