Monckton’s argument

Monckton’s main argument is that the IPCC has overestimated climate sensitivity to CO2 by around 6-7 times, giving exaggerated projections of warming for a business-as-usual scenario of CO2 emissions. The IPCC range is around 2-6°C of warming by 2100; Monckton’s figure is 0.5°C. While he provides his own calculations, this view is also supported by a measure of respectable scientific literature.

The view that CO2 sensitivity is being grossly exaggerated is one I share with the likes of Spencer, Shaviv, Lindzen, Douglass and others. The most concise illustration is the one produced by Spencer, showing where the various authors lie relative to the IPCC model projections.



IPCC omissions on species extinctions

WUWT reports, in The IPCC: More Sins of Omission – Telling the Truth but Not the Whole Truth, on perhaps the greatest failing of the IPCC, if not of the environmental sciences. The article describes how the effects of climate change on hunger and water storage are misrepresented to exaggerate negative effects. Here I show that the same deception is in play with the statements on species extinctions in AR4.

In Climate Change 2007: Working Group II: Impacts, Adaptation and Vulnerability it is stated that:

up to 30% of known species being committed to extinction (Chapter 4 Section 4.4.11 and Table 4.1; Thomas et al., 2004)

and in other places, with statements to the effect that 20-30% of species are at increased risk of extinction. Revkin at NYT recently drew attention to the sloppiness of expression throughout the IPCC report. No-one seems to know what ‘committed to extinction’ means.

But it’s worse than that. The figure in Thomas is worked out by mapping the change in the aggregate range area of the individual species, using various models under various climate change scenarios, and applying a relationship between extinction risk and the size of the species’ range (smaller ranges carry greater risk).

That is, if the total area of all species’ ranges before warming is S1, and the total area after is S2, then S1 > S2 implies an increased risk of extinction.

Now there are a lot of hairs on this approach, with many sources of potential bias, but the foremost can be easily seen by considering that the expected range area of a typical species does not change under climate change. A range may move north in the NH, or south in the SH, but provided it does not run into the ocean, the expected sizes of species’ ranges should stay constant. How then can you get extinction risks of 20-30%?

What Thomas et al. (2004) do is simply disregard all species whose ranges increased under a given climate change, and sum only the ranges of the species whose ranges decreased! That is, if a species’ risk of extinction increases it is counted; if its risk decreases (due to an enlarged range) it is not. In this way a net expected change in extinction risk of zero is transformed into a massive increase in the risk of extinction.
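The effect of this asymmetric accounting is easy to demonstrate with a small simulation. This is a hypothetical sketch, not Thomas et al.’s actual data or model: the species-area exponent z = 0.25 is an assumed typical value, and range changes are drawn symmetrically so the true net expected change is zero. Counting only the contractions still manufactures a positive aggregate extinction risk:

```python
import numpy as np

rng = np.random.default_rng(0)
z = 0.25                 # species-area exponent (assumed typical value)
n_species = 10_000

# Ranges shift under warming with no net change in expected size:
# relative changes are symmetric around zero.
a1 = rng.uniform(1.0, 100.0, n_species)       # range area before warming
change = rng.normal(0.0, 0.2, n_species)      # symmetric relative change
a2 = a1 * (1.0 + change)                      # range area after warming

# Species-area extinction risk: E = 1 - (S2/S1)**z
risk_all = 1.0 - (a2.sum() / a1.sum()) ** z

# Thomas-style asymmetry: sum only the species whose ranges contracted.
shrunk = a2 < a1
risk_losers = 1.0 - (a2[shrunk].sum() / a1[shrunk].sum()) ** z

print(f"risk counting all species:       {risk_all:.3f}")
print(f"risk counting only contractions: {risk_losers:.3f}")
```

With symmetric changes the all-species risk comes out near zero, while the contractions-only figure is strictly positive; the size of the inflated figure depends only on the assumed variability of the range changes, not on any net loss of habitat.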

It gets worse when you look at the actual data on what happens to species when climate changes. One of the first major reports, by the Idsos in 2003, was a review of all the species response data called The Specter of Species Extinction: Will Global Warming Decimate Earth’s Biosphere? They and others have found that the overwhelming response of species to climate change is to do nothing, or to EXPAND their range. In the NH, the northern limit usually expands because that is where the range is temperature-limited; the southern limit is usually determined by other factors, such as rainfall.

Similar range expansions have been seen in more detailed modelling by Anderson et al. (2009), in which the southern margins (in the NH) stay constant while the northern margins expand under warming. If we are to regard range size as an indicator of extinction risk, then warming should reduce the risk. These results are consistent with the view that climate variability increases biodiversity, not decreases it.

The paper by Anderson is coauthored by Professor Brook, who will incidentally be debating Monckton and Plimer at the Brisbane luncheon on the 29th of January. Prof. Brook should be well aware of the mass of evidence contradicting the massive extinction theory.

This post provides more background on the issue and the ways it has been brought to the attention of the scientists involved. But, as we have known for years, and as the general public is now being informed by the MSM, these results have been deliberately ignored by the IPCC because they would have undermined the case for drastic greenhouse gas emission reductions.

Not only is the IPCC report starting to smell like prawns in the sun, the bias of the contributors themselves is becoming obvious. As Indur M. Goklany concludes in WUWT:

Perhaps they didn’t want to get crossways with their political masters. But if that was the case, what function does the SPM serve other than rubber stamp political leanings? Regardless, they committed sins of omissions and, it seems, with due deliberation.

Anderson, B.J., Akçakaya, H.R., Araujo, M.B., Fordham, D.A., Martinez-Meyer, E., Thuiller, W. & Brook, B.W. (2009) Dynamics of range margins for metapopulations under climate change. Proceedings of the Royal Society of London – Series B, 276, 1415-1420. doi: 10.1098/rspb.2008.1681


Monckton, Plimer Tour Dates

Historic. Not to be missed. I will be at the Brisbane events. Come and say hi.

Wednesday 27 January,
12:15, Luncheon, The Union Club, SOLD OUT
17:30 Public lecture, Sheraton on the Park, Download PDF
contact: John Smeed, phone or SMS 0417 269 216

Thursday 28 January
12:30 Public lecture, Newcastle City Hall – Banquet Room Download PDF
contact: Anthony Cox, 0412 474916,

Friday 29 January
12:00 – 2.00 Brisbane Institute luncheon Panel Debate Brisbane Hilton – Grand Ballroom, Download PDF
Pre-registered and pre-paid event. $130 for lunch and debate. Monckton, Plimer, Barry Brook and Graham Readfearn.
contact: Karyn Brinkley

15:00 Public Lecture Brisbane – Irish Club, Tara Ballroom Download PDF
$20 at the door, no pre-registration or booking required
contact: Tony Gomme

Saturday 30 January
14:00 Public Lecture, “The J”, Noosa Heads, John McRobert, Download PDF
contact: Simon Gamble,

Monday 1 February
12:30 “Sandwich Luncheon”, 6/112 Millswyn, SOLD OUT. Max Rheese (Aust. Climate Science Coalition) / Des Moore (Inst. for Private Enterprise)
17:30 Public Lecture, Sofitel Hotel Grand Ballroom, Download PDF
contact: Case Smit,

Wednesday 3 February,
15:00 Public Lecture , National Press Club, Download PDF
contact: Alix Turner,

Thursday 4 February
19:30 Public lecture Intercontinental Hotel (formerly Hyatt), North Terrace, Download PDF
contact: Damian Wyld, Ph 08 8379 0246
Bookings Essential,

Monday 8 February
Luncheon Parmelia Hilton Hotel,
18:00 Public Lecture, Parmelia Hilton Hotel – Swan Room, Download PDF
contact: Daphne Dimitri for Gina Rinehart,
RSVP by Feb 3, Donation $10 at the door. (Insanely cheap!)

Secrecy – rebuttal

An opinion on The Social Cost of Transparency — a defense of secrecy — was given by an Australian economist on Mish’s blog.

Steve Keen says:

One quick perusal of that article and I could consign it to neoclassical gibberish. The key giveaway is in the first sentence of the abstract:

“I study a class of models commonly used to motivate monetary exchange, extended to include a physical asset whose expected short-run return is subject to exogenous news events, but whose expected long-run return is independent of this information”

The part in bold is the giveaway: this is a model where the future is subject to randomness, but not change — whatever happens in the short term has no impact on the future.


Disproving Global Warming

JoNova noticed a Canberra Times article reporting that the Tasmanian drought may not be due to global warming after all.

Are there any predictions of global warming that have proven true? WUWT has a list of spectacular failures, and is calling on readers to add more. All of the successful ‘predictions’ I have seen have been ex post facto — after the fact.

The string of failed AGW predictions points to the “expert problem” described by Nassim Taleb in economics. Substitute Penny Wong for Mr Takatoshi, and Rudd for Obama, and you have the Australian AGW situation.

I told the audience that the next time someone from the IMF shows you projections for some dates in the future, to show us what they PROJECTED for 2008 and 2009 in 2004, 2005, …, and 2007. They would then verify that Mr. Takatoshi and his colleagues provide a prime illustration to the “expert problem”: they serve as experts while offering the scientific reliability of astrologers. Anyone relying on them is a turkey.

His solution?

This allowed me to show the urgency of my idea of robustness. We cannot get rid of charlatans. My point is that we need to build a society robust to charlatanism and expert-error, one in which Mr. Takatoshi and his staff can be as incompetent as they want without endangering the general public. We need less reliance on these people and the Obama administration has been making us more dependent on the “expert problem”.

Climate experts show a stunning lack of concern with evidence and model validation. The solution is to attend only to methods held to professional engineering standards.

In Praise of Secrecy

Concluding that:

Approaching the problem under the premise that fuller transparency is always desirable may not be the right place to start.

an article On the Social Cost of Transparency in Monetary Economies from the St Louis Fed explains why secrecy and non-disclosure of data may be advantageous.

For an asset economy then, the prescription of “full transparency” is not generally warranted.

After many pages of mathematics, their argument is summarized:

In competitive economies, the disclosure of high-frequency information unrelated to an asset’s “long-run fundamentals” may be detrimental to economic welfare when claims to such assets serve as high-velocity payment instruments. The equilibrium price of a liquid asset is excessively volatile when asset prices capitalize all information.

If we translate this into a more familiar example, such as climate data and code, it would seem to argue that disclosure of all data and code may be detrimental to the value of the research to the author of an article, the university, the journal, coauthors etc.


Rahmstorf Reamed

The Australian reports a major new controversy after Britain’s Met Office denounced research from Stefan Rahmstorf suggesting that sea levels may increase by more than 1.8m by 2100.

Jason Lowe, a leading Met Office climate researcher, said: “We think such a big rise by 2100 is actually incredibly unlikely. The mathematical approach used to calculate the rise is completely unsatisfactory.”

Critic Simon Holgate, a sea-level expert at the Proudman Oceanographic Laboratory, Merseyside, has written to Science magazine, attacking Professor Rahmstorf’s work as “simplistic”.

“Rahmstorf’s real skill seems to be in publishing extreme papers just before big conferences like Copenhagen, when they are guaranteed attention,” Dr Holgate said.

Seems like the concerns about the work of the Clown of Climate Science that the blogosphere has been voicing for years are finally going mainstream. Interested readers might want to read:

Recent Climate Observations: Disagreement with Projections

Warning to the Garnaut Commission about the excessive reliance on the work of Rahmstorf and friends.

Also search Niche Modelling, and see ClimateAudit, the Blackboard, and many posts from these and other statistical blogs for examples of R’s statistical incompetence.

I don’t have a problem with academics being wrong. The problem is with the Federal Government and Penny Wong who are suckers for this charlatan offering the scientific reliability of an astrologer.

The report states one objection:

Based on the 17cm increase that occurred from 1881 to 2001, Professor Rahmstorf calculated that a predicted 5C increase in global temperature would raise sea levels by up to 188cm.

It’s worse than that. It appears that the extrapolation in R’s model is actually based on a non-significant increase in the rate of sea-level rise with respect to temperature, i.e. a tiny derivative of already problematic data.

It uses simple measurements of historic changes in the real world to show a direct relationship between temperature rise and sea level increase and it works stunningly well – Rahmstorf

But R cheerfully admits here:

How do we know that the relationship between temperature rise and sea level rate is linear, also for the several degrees to be expected, when the 20th century has only given us a foretaste of 0.7 degrees? The short answer is: we don’t.

Why then should anyone take any notice of predictions from a model when, as the author admits, the truth of the fundamental assumption is unknown? How do we know the stars affect human behaviour? The short answer is: we don’t.
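The scale of the extrapolation involved can be sketched with Rahmstorf’s semi-empirical relationship, in which the rate of sea-level rise is proportional to the temperature excess over a baseline, dH/dt = a(T - T0). The sketch below uses synthetic numbers, not the actual temperature or tide-gauge series; the slope, intercept and noise level are assumptions chosen only to illustrate the procedure:

```python
import numpy as np

# Semi-empirical sea-level model: rate of rise proportional to the
# temperature excess over a baseline, dH/dt = a*(T - T0).
# Synthetic "20th-century-like" record (illustrative, not real data).
rng = np.random.default_rng(1)
temp = 0.7 * np.arange(121) / 120.0                # ~0.7 C warming, 1881-2001
rate = 1.4 + 2.0 * temp + rng.normal(0, 0.3, 121)  # rise rate in mm/yr

# Ordinary least squares fit of the linear relation: rate = a*T + b
a, b = np.polyfit(temp, rate, 1)

# Extrapolate far outside the fitted range: temperature rising
# linearly to +5 C above the baseline by 2100.
t_future = np.linspace(0.7, 5.0, 99)               # 2002..2100
rise_cm = (a * t_future + b).mean() * 98 / 10.0    # mean rate x 98 years

print(f"fitted slope: {a:.2f} mm/yr per degree C")
print(f"projected rise 2002-2100: {rise_cm:.0f} cm")
```

The fit is anchored on only about 0.7°C of observed warming, yet the projection evaluates the fitted line at up to 5°C, roughly seven times beyond the calibration range; whether the linear relation holds out there is, as Rahmstorf himself admits, unknown.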

Heed the fourth commandment of statistics: When using multivariate models, always get the most for the least.