Not cointegrated, so global warming is not anthropogenic – Beenstock

Cointegration has been mentioned previously and is one of the highest ranking search terms on landshape.

We have also discussed the cointegration manuscript from 2009 by Beenstock and Reingewertz, and I see he has picked up another author and submitted it to an open access journal here.

Here is the abstract.

Polynomial cointegration tests of anthropogenic impact on global warming, by M. Beenstock, Y. Reingewertz, and N. Paldor

Abstract. We use statistical methods for nonstationary time series to test the anthropogenic interpretation of global warming (AGW), according to which an increase in atmospheric greenhouse gas concentrations raised global temperature in the 20th century. Specifically, the methodology of polynomial cointegration is used to test AGW since during the observation period (1880–2007) global temperature and solar irradiance are stationary in 1st differences whereas greenhouse gases and aerosol forcings are stationary in 2nd differences. We show that although these anthropogenic forcings share a common stochastic trend, this trend is empirically independent of the stochastic trend in temperature and solar irradiance. Therefore, greenhouse gas forcing, aerosols, solar irradiance and global temperature are not polynomially cointegrated. This implies that recent global warming is not statistically significantly related to anthropogenic forcing. On the other hand, we find that greenhouse gas forcing might have had a temporary effect on global temperature.
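For readers unfamiliar with the jargon: a series is I(1) if it becomes stationary after differencing once, and I(2) only after differencing twice. Here is a minimal sketch of how an order of integration might be checked with augmented Dickey–Fuller tests, using statsmodels on synthetic series (illustrative only, not the paper's data):

```python
# Sketch: classify a series as I(0), I(1) or I(2) with ADF tests.
# Synthetic data for illustration -- not the Beenstock et al. series.
import numpy as np
from statsmodels.tsa.stattools import adfuller

def integration_order(series, alpha=0.05, max_diff=2):
    """Smallest d such that the d-th difference is stationary by the ADF test."""
    x = np.asarray(series, dtype=float)
    for d in range(max_diff + 1):
        pvalue = adfuller(x)[1]      # adfuller returns (stat, pvalue, ...)
        if pvalue < alpha:
            return d                 # stationary after d differencings -> I(d)
        x = np.diff(x)
    return max_diff + 1              # order higher than max_diff

rng = np.random.default_rng(0)
e = rng.normal(size=500)
i1 = np.cumsum(e)                    # random walk: stationary in 1st differences
i2 = np.cumsum(np.cumsum(e))         # doubly integrated: stationary in 2nd differences

print("order of i1:", integration_order(i1))   # expect 1
print("order of i2:", integration_order(i2))   # expect 2
```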

The bottom line:

Once the I(2) status of anthropogenic forcings is taken into consideration, there is no significant effect of anthropogenic forcing on global temperature.

They do, however, find a possible effect of the CO2 first difference:

The ADF and PP test statistics suggest that there is a causal effect of the change in CO2 forcing on global temperature.

They suggest “… there is no physical theory for this modified theory of AGW”, although I would think the obvious one is that the surface temperature adjusts over time to higher CO2 forcing, such as through intensified heat loss by convection, and so returns to an equilibrium. However, when the revised solar data are used the relationship disappears, so the point is probably moot.

When we use these revised data, Eqs. (11) and (12) remain polynomially uncointegrated. However, Eq. (15) ceases to be cointegrated.
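The kind of test being run here is, at root, an Engle–Granger style check: regress one series on the other and ask whether the residuals are stationary (the paper's polynomial cointegration generalises this to series of mixed integration order). A minimal sketch with statsmodels on synthetic data, not the paper's temperature and forcing series nor its Eqs. (11)–(15):

```python
# Sketch: Engle-Granger style cointegration check between two I(1) series.
# Synthetic data only -- not the temperature or forcing series in the paper.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(1)
n = 500
trend = np.cumsum(rng.normal(size=n))              # shared stochastic trend, I(1)
x = trend + rng.normal(scale=0.5, size=n)          # a forcing-like series
y = 0.8 * trend + rng.normal(scale=0.5, size=n)    # a temperature-like series

tstat, pvalue, _ = coint(y, x)    # regress y on x, then ADF-test the residuals
print(f"p-value = {pvalue:.3f}")  # small p-value -> reject "no cointegration"
```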

Finally:

For physical reasons it might be expected that over the millennia these variables should share the same order of integration; they should all be I(1) or all I(2), otherwise there would be persistent energy imbalance. However, during 150 yr there is no physical reason why these variables should share the same order of integration. However, the fact that they do not share the same order of integration over this period means that scientists who make strong interpretations about the anthropogenic causes of recent global warming should be cautious. Our polynomial cointegration tests challenge their interpretation of the data.

Abandon Government Sponsored Research on Forecasting Climate – Green

Kesten Green, now of the University of South Australia, has a manuscript up called Evidence-based Improvements to Climate Forecasting: Progress and Recommendations. It argues that evidence-based research on climate forecasting finds no support for fear of dangerous man-made global warming, because simple, inexpensive extrapolation models are more accurate than the complex and expensive “General Circulation Models” used by the Intergovernmental Panel on Climate Change (IPCC).

Their rigorous evaluations of the poor accuracy of climate models support the view that there is no trend in global mean temperatures that is relevant for policy makers, and that…

[G]overnment initiatives that are predicated on a fear of dangerous man-made global warming should be abandoned. Among such initiatives we include government sponsored research on forecasting climate, which are unavoidably biased towards alarm (Armstrong, Green, and Soon 2011).

This is what I found also in the evaluation of the CSIRO’s use of the IPCC drought models. In fact, the use of the climate model projections is positively misleading, as they show decreasing rainfall over the last century when rainfall actually increased.
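The comparison Green advocates amounts to scoring a model's forecasts against a simple benchmark such as a no-change extrapolation. A rough sketch of that scoring with mean absolute error, using made-up numbers rather than the actual CSIRO or IPCC figures:

```python
# Sketch: score a model projection against a naive "no change" benchmark
# using mean absolute error.  All numbers are made up for illustration.
import numpy as np

observed = np.array([480., 500., 515., 495., 530., 545.])  # e.g. annual rainfall (mm)
model    = np.array([470., 460., 455., 450., 445., 440.])  # a model projecting decline

benchmark = np.full_like(observed, observed[0])             # naive no-change forecast

mae_model     = np.mean(np.abs(observed - model))
mae_benchmark = np.mean(np.abs(observed - benchmark))

print(f"model MAE:     {mae_model:.1f}")
print(f"benchmark MAE: {mae_benchmark:.1f}")
# If the naive benchmark beats the model, the model adds no forecasting value.
```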

This is not welcome news to the growing climate projection science industry that serves the rapidly growing needs of impact and adaptation assessments. A new paper called Use of Representative Climate Futures in impact and adaptation assessment, by Penny Whetton, Kevin Hennessy and others, proposes another ad-hoc fix to climate model inaccuracy called Representative Climate Futures (or RCFs for short). Apparently the idea is that the wide range of results given by different climate models is classified as “most likely” or “high risk” or whatever, and the researcher is then free to choose whichever set of models he or she wishes to use.
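As best I can tell from that description, the mechanics amount to binning the ensemble spread into labelled futures and letting the researcher pick a bin. A rough sketch of what such a classification might look like (my guess at the mechanics, not Whetton et al.'s actual procedure):

```python
# Rough sketch of binning an ensemble of model projections into labelled
# "climate futures" -- a guess at the mechanics, not Whetton et al.'s method.
import numpy as np

# e.g. projected rainfall change (%) from different climate models (made up)
ensemble = np.array([-15., -9., -4., -1., 0., 2., 5., 8.])

lo, hi = np.percentile(ensemble, [25, 75])
futures = {
    "most likely": ensemble[(ensemble >= lo) & (ensemble <= hi)],  # central cluster
    "high risk":   ensemble[(ensemble < lo) | (ensemble > hi)],    # tails of the spread
}
for label, members in futures.items():
    print(label, members)
```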

Experiment Resources.Com condemns ad hoc-ery in science:

The scientific method dictates that, if a hypothesis is rejected, then that is final. The research needs to be redesigned or refined before the hypothesis can be tested again. Amongst pseudo-scientists, an ad hoc hypothesis is often appended, in an attempt to justify why the expected results were not obtained.

Read “poor accuracy of climate models” for “hypothesis is rejected” and you get the comparison. Models that are unfit for the purpose need to be thrown out. RCF appears to be a desperate attempt to do something, anything, with grossly inaccurate models.

On freedom of choice, Kesten Green says:

So uncertain and poorly understood is the global climate over the long term that the IPCC modelers have relied heavily on unaided judgment in selecting model variables and setting parameter values. In their section on “Simulation model validation in longer-term forecasting” (pp. 969–973), F&K observe of the IPCC modeling procedures: “a major part of the model building is judgmental” (p. 970).

Which is why it’s not scientific.

LENR demonstrated at Austin Trade Show

Renewables, man, get something for nothing. Harness the infinite power of wind, water, wave and tide. Why hasn’t it been done before? But of course it has – two thousand years ago – and it has yet to overcome three basic and insurmountable problems: it is intermittent, it depends on locally weak, dispersed sources of energy, and there is no viable technology to store significant amounts of power. Bummer….

But Low Energy Nuclear Reactions, or LENR, are showing much more promise as an alternative energy source, with a demonstration of a reliable, replicable, nickel-hydrogen gas low-energy nuclear reaction reactor on Monday at the National Instruments trade show at the Austin Convention Center in Texas.

The technique demonstrated by Francesco Celani is described here. He describes the quality that makes a class of nickel/copper alloys, the constantans, desirable for LENR: a high catalytic power for dissociating molecular hydrogen. Wiki indicates the name is given to alloys with a low or negative temperature coefficient of resistance – the resistance stays nearly constant with temperature, hence “constantan”.
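For context, the standard linear model of metallic resistance, R(T) = R0 * (1 + alpha * (T - T0)), shows why that near-zero coefficient matters. A small sketch with approximate textbook coefficients (the numbers are my additions, not from Celani's report):

```python
# Sketch: why a near-zero temperature coefficient of resistance gives
# "constantan" its name.  Linear model R(T) = R0 * (1 + alpha * (T - T0)),
# with approximate textbook coefficients per degree C (not Celani's data).
ALPHA = {"copper": 3.9e-3, "constantan": 1e-5}

def resistance_ratio(alpha, delta_t):
    """R/R0 after a temperature rise of delta_t degrees C."""
    return 1.0 + alpha * delta_t

for metal, alpha in ALPHA.items():
    print(metal, round(resistance_ratio(alpha, 500.0), 4))
# Copper roughly triples in resistance over a 500 C rise, while constantan
# barely changes, so a shift in R/Ro like the 0.808 quoted below presumably
# reflects changes in the wire itself rather than simple heating.
```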

He then describes his experiments measuring the resistance of Constantan wires while generating anomalous heat.

We observed a further increasing of anomalous power that, if there are no mistakes around, was about twice (i.e. absolute value of over 10W) of that detected when the power was applied to inert wire. The R/Ro value, after initial increasing, stabilized to 0.808.

If the consideration at point 11) is correct, we can think that the reaction, apart some temperature threshold, has a positive feedback with increasing temperature.