Using theory, applications, and examples of inferences, Niche Modeling: Predictions from Statistical Distributions demonstrates how to conduct and evaluate niche modeling projects in any area of application. It features a series of theoretical and practical exercises for developing and evaluating niche models.
Yesterday’s post noted the appearance of station summaries at the BoM adjustment page attempting to defend their adjustments to the temperature record at several stations. Some of these stations I have also examined. Today’s post compares and contrasts their approach with mine.
The figures below compare the minimum temperatures at Deniliquin with neighbouring stations. On the left, the BoM compares Deniliquin with minimum temperatures at Kerang (95 km west of Deniliquin) in the years around 1971. The figure on the right, from my Deniliquin report, shows the relative trend of daily temperature data from 26 neighbouring stations (i.e. ACORN-SAT minus neighbour). The rising trends mean that the ACORN-SAT site is warming faster than the neighbours.
The BoM’s caption:
Deniliquin is consistently warmer than Kerang prior to 1971, with similar or cooler temperatures after 1971. This, combined with similar results when Deniliquin’s data are compared with other sites in the region, provides a very clear demonstration of the need to adjust the temperature data.
Problems: note the cherry-picking of a single site for comparison and the hand-waving about “similar results” with other sites.
In my analysis, the ACORN-SAT version warms at 0.13C/decade faster than the neighbours. As the spread of temperature trends at weather stations in Australia is about 0.1C/decade at the 95% confidence level, this puts the ACORN-SAT version outside the limit. Therefore the adjustments have made the trend of the official long term series for Deniliquin significantly warmer than the regional neighbours. I find that the residual trend of the raw data (before adjustment) for Deniliquin is -0.02C/decade which is not significant and so consistent with its neighbours.
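The residual-trend check described above can be sketched in a few lines. This is a minimal illustration with synthetic numbers, not the author’s code; only the 0.13 C/decade residual and the 0.1 C/decade spread are taken from the text:

```python
import numpy as np

# Synthetic annual mean minima for an ACORN-SAT series and the mean of its
# raw neighbours (values made up for illustration).
rng = np.random.default_rng(42)
years = np.arange(1913, 2014)
neighbour = 10.0 + 0.05 * (years - 1913) / 10 + rng.normal(0, 0.3, years.size)
acorn = neighbour + 0.13 * (years - 1913) / 10   # warms 0.13 C/decade faster

# Residual trend = trend of the (ACORN-SAT minus neighbour) difference series.
diff = acorn - neighbour
slope_per_year = np.polyfit(years, diff, 1)[0]
residual_trend = slope_per_year * 10             # degrees C per decade

# ~0.1 C/decade is quoted as the 95% spread of Australian station trends.
print(f"residual trend: {residual_trend:.2f} C/decade")
print("outside 95% spread" if abs(residual_trend) > 0.1
      else "consistent with neighbours")
```

Differencing against the neighbour mean removes the shared regional signal, so whatever trend remains is attributable to the station (or its adjustments) alone.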
Now look at the comparison of minimum temperatures for Rutherglen with neighbouring stations. On the left, the BoM compares Rutherglen with the adjusted data from three other ACORN-SAT stations in the region. The figure on the right, from my Rutherglen report, shows the relative trend of daily temperature in 24 neighbouring stations (i.e. ACORN-SAT minus neighbour). As at Deniliquin, the rising trends mean that the ACORN-SAT site is warming faster than the neighbours.
The BoM’s caption:
While the situation is complicated by the large amount of missing data at Rutherglen in the 1960s, it is clear that, relative to the other sites, Rutherglen’s raw minimum temperatures are very much cooler after 1974, whereas they were only slightly cooler before the 1960s.
Problems: note the cherry-picking of only three sites but, more seriously, the versions chosen are from the adjusted ACORN-SAT. That is, already adjusted data are used to justify an adjustment: a classic circularity! This is not stated in the other BoM reports, but probably applies to the other station comparisons too. Loss of data due to aggregation to annual values is also clear.
In my analysis, the ACORN-SAT version warms at 0.14C/decade faster than the neighbours. As the spread of temperature trends at weather stations in Australia is about 0.1C/decade at the 95% confidence level, this puts the ACORN-SAT version outside the limit. Once again, the adjustments have made the trend of the official long term series for Rutherglen significantly warmer than the regional neighbours. As with Deniliquin, the residual trend of the raw data (before adjustment) is not significant and so consistent with its neighbours.
The raw data is not always more consistent, as Amberley shows. On the left, the BoM compares Amberley with Gatton (38 km west of Amberley) in the years around 1980. On the right, from my Amberley report, is the relative trend of daily temperature to 19 neighbouring stations (i.e. ACORN-SAT minus neighbour). In contrast to Rutherglen and Deniliquin, the mostly flat trends mean that the ACORN-SAT site is not warming faster than the raw neighbours.
The BoM’s caption:
Amberley is consistently warmer than Gatton prior to 1980 and consistently cooler after 1980. This, combined with similar results when Amberley’s data are compared with other sites in the region, provides a very clear demonstration of the need to adjust the temperature data.
Problems: note the cherry-picking and hand-waving.
In my analysis, the ACORN-SAT version warms at 0.09C/decade faster than the neighbours. As the spread of temperature trends at weather stations in Australia is about 0.1C/decade at the 95% confidence level, I class the ACORN-SAT version as borderline. The residual trend of the raw data (before adjustment) is -0.32C/decade which is very significant and so there is clearly a problem with the raw station record.
More cherry-picking, circularity, and hand-waving from the BoM: excellent examples of the inadequacy of the adjusted ACORN-SAT reference network, and justification for a full audit of the Bureau’s climate change division.
Last night George Christensen MP gave a speech accusing the Bureau of Meteorology of “fudging figures”. He waved a 28-page list of adjustments around and called for a review. These adjustments can be found here. While I don’t agree that adjusting to account for station moves can necessarily be regarded as fudging figures, I am finding issues with the ACORN-SAT data set.
The problem is that most of the adjustments are not supported by known station moves, and many may be wrong or exaggerated. It also means that if the adjustment decreases temperatures in the past, claims of current record temperatures become tenuous. A maximum daily temperature of 50C written in 1890 in black and white is higher than a temperature of 48C in 2014, regardless of any post-hoc statistical manipulation.
But I do take issue with the set of summaries that has been released, which is blatant “cherry-picking”.
Scroll down to the bottom of the BoM adjustment page. Listed are station summaries justifying the adjustments to Amberley, Deniliquin, Mackay, Orbost, Rutherglen and Thargomindah. The overlaps with the ones I have evaluated are Deniliquin, Rutherglen and Amberley (see previous posts). While the BoM finds the adjustments to these stations justified, my quality control check finds problems with the minimum temperature at Deniliquin and Rutherglen. I think the Amberley raw data may have needed adjusting.
WRT Rutherglen, BoM defends the adjustments with Chart 3 (my emphasis):
Chart 3 shows a comparison of the raw minimum temperatures at Rutherglen with the adjusted data from three other ACORN-SAT stations in the region. While the situation is complicated by the large amount of missing data at Rutherglen in the 1960s, it is clear that, relative to the other sites, Rutherglen’s raw minimum temperatures are very much cooler after 1974, whereas they were only slightly cooler before the 1960s.
WRT Deniliquin, BoM defends the adjustments on Chart 3 (my emphasis):
Chart 3 shows a comparison of minimum temperatures at Kerang (95 km west of Deniliquin) and Deniliquin in the years around 1971. Deniliquin is consistently warmer than Kerang prior to 1971, with similar or cooler temperatures after 1971. This, combined with similar results when Deniliquin’s data are compared with other sites in the region, provides a very clear demonstration of the need to adjust the temperature data.
My analysis is superior to the BoM’s flawed analysis in three important ways:
1. I compare the trend in Rutherglen and Deniliquin with 23 and 27 stations respectively, not with 3 and 1 neighbouring stations (aka cherry-picking).
2. I also use a rigorous statistical panel test to show that the trend of the Rutherglen minimum exceeds that of the neighbouring group by 0.1C per decade, which is outside the 95% confidence interval for Australian station trends, rather than a visual assessment of a chart (aka eyeballing).
3. I use the trends of daily data and not annual aggregates, which are very sensitive to missing data.
Been looking forward to doing Kerang as I knew it was another dud series from ACORN-SAT. The report is here:
The first thing to notice in plotting up the time series data for the raw CDO and ACORN-SAT is that while the ACORN-SAT data goes back to 1910, the CDO data is truncated at 1962.
The monthly data, however, goes back almost to 1900. This is inexplicable, as the monthly data is derived from the daily data! Here is proof that, contrary to some opinion pieces, not all of the data needed to check the record is available at the Bureau of Meteorology website, Climate Data Online.
The residual trends of ACORN-SAT are at the benchmark for maximum and greatly exceed the benchmark for minimum.
While on the subject of opinion pieces, the statement from “No, the Bureau of Meteorology is not fiddling its weather data”:
Anyone who thinks they have found fault with the Bureau’s methods should document them thoroughly and reproducibly in the peer-reviewed scientific literature. This allows others to test, evaluate, find errors or produce new methods.
So you think skeptics haven’t tried? A couple of peer-reviewed papers of mine on quality control problems in the Bureau of Meteorology’s use of models have had no response from the Bureau in over two years. The sound of crickets chirping is all. Talk is cheap in climate science, I guess. Here they are:
Three more quality tests of stations in the ACORN-SAT series have been completed:
The test measures the deviation of the trend of the series from its neighbours since 1913 (the residual trend). A deviation within plus or minus 0.05 degrees per decade is within tolerance (green), 0.05 to 0.1 is borderline (amber), and greater than that is regarded as a fail (red); a failed series should not be used.
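The tolerance bands translate directly into a small classifier. The band edges come from the text; treating the boundary values as inclusive is my assumption:

```python
def classify_residual_trend(trend_per_decade: float) -> str:
    """Classify a residual trend (deg C/decade) against the benchmark bands."""
    dev = abs(trend_per_decade)
    if dev <= 0.05:
        return "green"   # within tolerance
    if dev <= 0.10:
        return "amber"   # borderline
    return "red"         # fail: series should not be used

# Residual trends quoted in these posts:
print(classify_residual_trend(0.14))   # Rutherglen minimum -> red
print(classify_residual_trend(0.09))   # Amberley minimum   -> amber
print(classify_residual_trend(-0.02))  # Deniliquin raw     -> green
```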
|Cape Otway Lighthouse||-0.17||0.02||-0.05||0.01|
There are more inconsistent stations among the raw CDO data, as would be expected since it is “raw”. However, the standout problems in this small sample are in the ACORN-SAT minimums.
Results so far suggest the Bureau of Meteorology has a quality control problem with its minimum temperatures, with almost all borderline residual trends and very large deviations from the neighbours in Rutherglen and Deniliquin.
Williamtown RAAF is one case where the quality control test indicates that the adjustments were justified, even though the homogenization in ACORN-SAT produced a strong change in trend. My report is here and Ken has posted graphs for Williamtown, along with a number of sites with large changes in trend by ACORN-SAT.
This illustrates neatly that a series is not rejected just because ACORN-SAT increases the warming trend. An ACORN-SAT series should be rejected, however, if its trend is inconsistent with its raw neighbours.
You may have read Ken Stewart’s excellent blog on the official Australian temperature record. With the publication of the “adjustments.xls” file of the official adjustments to the raw data in the ACORN-SAT dataset, as reported on JoNova’s blog, there has been a flurry of work behind the scenes, so to speak.
Jennifer Marohasy has also been leading the charge to audit, or at least review, the practices of the Bureau of Meteorology in adjusting raw temperature to produce the synthetic ACORN-SAT series. Rutherglen in particular has been in the news for a massive warming adjustment to the minimum temperature trend.
The story has gone national in The Australian with articles like Climate records contradict Bureau of Meteorology. Even I have been quoted as saying that the BoM may be “adding mistakes” with their data modelling.
Well, all of this kerfuffle has been enough to get me out of hiding and start working on some stuff. I thought it would be good to have a quality assessment method that could reliably test the ACORN data. The idea I came up with is to test the trend of the ACORN-SAT series against the trends of the raw data neighbours. If the trend of the synthetic series exceeds the overall trend of all the neighbours, then something must be wrong.
The difficulty is that the neighbours all start and stop at different times, so a slightly more complex test is needed than a simple ordinary least squares regression. The answer is a panel test, or pooled OLS (POLS). It’s all explained in the reports on the first three stations below.
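A minimal sketch of the panel idea, assuming the approach is to stack the ACORN-minus-neighbour difference series, give each station its own intercept, and fit one common slope; the data and station windows here are synthetic, not the author’s actual code:

```python
import numpy as np

rng = np.random.default_rng(0)
true_slope = 0.012  # deg C per year (0.12 C/decade), synthetic

# Unbalanced panel: each neighbour's difference series starts and stops at
# different times, which is why a single-series OLS won't do.
rows, targets = [], []
windows = [(1913, 1990), (1930, 2013), (1950, 2005)]
for sid, (y0, y1) in enumerate(windows):
    years = np.arange(y0, y1 + 1)
    diff = sid * 0.5 + true_slope * (years - 1913) + rng.normal(0, 0.2, years.size)
    for year, d in zip(years, diff):
        dummies = [1.0 if k == sid else 0.0 for k in range(len(windows))]
        rows.append(dummies + [float(year - 1913)])
        targets.append(d)

X = np.array(rows)   # per-station intercept dummies + common time column
y = np.array(targets)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
residual_trend = coef[-1] * 10  # common slope, in deg C per decade
print(f"pooled residual trend: {residual_trend:.2f} C/decade")
```

The per-station intercepts absorb level offsets between stations, so the pooled slope estimates the common residual trend even though no two series cover the same years.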
The test seems to work remarkably well. The ACORN-SAT minimum temperatures for Rutherglen and Deniliquin fail the benchmark residual trend of 0.05 degrees C per decade; that is, they warm at a much greater rate than their neighbours. ACORN-SAT at Williamtown, by contrast, is consistent with its neighbours while the raw CDO series is not, indicating that the single large adjustment applied around 1969 was warranted.
I intend to keep working through the stations one by one and uploading them to the viXra archive site as I go. I will also release the data and code soon for those who are interested. It’s almost at the stage where you enter a station number and it spits out the analysis. Wish it would write the reports too, though LaTeX goes a great way towards that end.
As previously covered here, Andrea Rossi appears to have delivered the goods…
Cold fusion reactor verified by third-party researchers, seems to have 1 million times the energy density of gasoline
Andrea Rossi’s E-Cat – the device that purports to use cold fusion to generate massive amounts of cheap, green energy – has been verified by third-party researchers, according to a new 54-page report. The researchers observed a small E-Cat over 32 days, where it produced net energy of 1.5 megawatt-hours, or “far more than can be obtained from any known chemical sources in the small reactor volume.”…
Follow the link to the story in full.
The third-party report is here.
Significant transmutation of lithium and nickel isotopes is broadly consistent with the energy produced. This leaves no doubt that the source of the energy is nuclear. But the authors are perplexed, nay, dumbfounded, nay, flabbergasted at the possible physics involved, as all known nuclear reactions typically have large Coulomb barriers to overcome.
They found it “very hard to comprehend” how these fusion processes could take place at such low energies (1200–1500 degrees C). While the transmutations are remarkable in themselves, they found no trace of radiation during the test, nor residual radiation after the reactor had stopped, which is almost inevitable in a reaction of nuclear origin.
What is the possible reaction(s) then? Speculations from the vortex discussion list:
> Li7 + Ni58 => Ni59 + Li6 + 1.75 MeV
> Li7 + Ni59 => Ni60 + Li6 + 4.14 MeV
> Li7 + Ni60 => Ni61 + Li6 + 0.57 MeV
> Li7 + Ni61 => Ni62 + Li6 + 3.34 MeV
> Li7 + Ni62 => Ni63 + Li6 – 0.41 MeV (Endothermic!)
>
> This series stops at Ni62, hence all isotopes of Ni less than 62 are [depleted] and Ni62 is strongly enriched.
>
> I have only briefly skimmed the report, but the basic reaction appears to be a neutron transfer reaction where a neutron tunnels from Li7 to a Nickel [nucleus]. The excess energy of the reaction appears as kinetic energy of the two nuclei (i.e. Li6 & the new Ni isotope), rather than as gamma rays. Because [there] are two daughter nuclei, momentum can be conserved while dumping the energy as kinetic energy in a reaction that is much faster than gamma ray emission. Because both nuclei are “heavy” and slow moving, very little to no bremsstrahlung is produced. There is effectively no secondary gamma from [Li6] because the first excited state is too high. (I haven’t checked Li7.) There is unlikely to be anything significant from Ni because the high charge on the nucleus combined with the “3” from Lithium tend to keep them apart (minimum distance 31 fm).
>
> It would be nice to know if the total amounts of each of Li & Ni in the […] were conserved (I’ll have to study the report more closely).
>
> Robin van Spaandonk
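The Q-values in the quoted chain can be checked from tabulated atomic mass excesses. The mass-excess values below are standard mass-table figures rounded to the keV; the check itself is my addition, not part of the quoted post:

```python
# Atomic mass excesses in MeV (standard mass-table values, rounded).
mass_excess = {
    "Li6": 14.087, "Li7": 14.907,
    "Ni58": -60.228, "Ni59": -61.156, "Ni60": -64.472,
    "Ni61": -64.221, "Ni62": -66.746, "Ni63": -65.513,
}

def q_value(ni_in: str, ni_out: str) -> float:
    """Q for Li7 + Ni(in) -> Ni(out) + Li6, from mass-excess differences."""
    d = mass_excess
    return (d["Li7"] + d[ni_in]) - (d["Li6"] + d[ni_out])

for a in range(58, 63):
    q = q_value(f"Ni{a}", f"Ni{a + 1}")
    print(f"Li7 + Ni{a} -> Ni{a + 1} + Li6: Q = {q:+.2f} MeV")
```

Running this reproduces the five quoted Q-values to within 0.01 MeV, including the endothermic step at Ni62 that terminates the chain.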
Fascinating new world of materials science opening up.
“these [modern] fluctuations appear to me to preclude any strong conclusions that the relatively modest increase [in ocean heat content] is unprecedented.”
Originally posted on Climate Audit:
The article itself presents a Holocene temperature reconstruction that is very much at odds both with Marcott et al 2013 and Mann et al 2008. And, only a few weeks after IPCC expressed great confidence in the non-worldwideness of the Medieval Warm Period, Rosenthal et al 2013 argued that the Little Ice Age, Medieval Warm Period and Holocene Optimum were all global events.
Although (or perhaps because) the article apparently contradicts heroes of the revolution, Rosenthal et al 2013 included a single sentence of genuflection to CAGW:
The modern rate of Pacific OHC change is, however, the highest in the past 10,000 years (Fig. 4 and table S3).
The Climate Council mini-statement called Bushfires and Climate Change in Australia – The Facts states, in support of its view, that “1. In Australia, climate change is influencing both the frequency and intensity of extreme hot days, as well as prolonged periods of low rainfall. This increases the risk of bushfires.”
Southeast Australia is experiencing a long-term drying trend.
A moment of fact-checking against the BoM’s recorded rainfall for southeastern Australia reveals no trend in rainfall.
Another moment of fact-checking against the BoM’s recorded rainfall for Australia as a whole reveals an increasing rainfall trend.
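Such a fact-check amounts to fitting a linear trend to the annual rainfall series and asking whether the slope is distinguishable from zero. A sketch with synthetic data (the BoM series itself is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1900, 2014)

# Synthetic annual rainfall (mm): no underlying trend, just variability.
rainfall = 600 + rng.normal(0, 80, years.size)

slope, intercept = np.polyfit(years, rainfall, 1)

# Rough significance check: compare the slope with its standard error.
resid = rainfall - (slope * years + intercept)
se = np.sqrt(resid.var(ddof=2) / ((years - years.mean()) ** 2).sum())
print(f"trend: {slope:.2f} mm/yr, |t| = {abs(slope / se):.1f}")
# A |t| below roughly 2 would indicate no significant trend either way.
```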