The test of integration order from the previous post is applied to the major atmospheric forcings used in the GISS global climate models in recent years. These are available for 1880 to 2003 in a file called RadF.txt. The codes for the forcings are self-explanatory: W-M_GHGs, O3, StratH2O, Solar, LandUse, SnowAlb, StratAer, BC, ReflAer, AIE.
Below is the sum of the forcings.
We want to know the integration order of the global surface temperature series as well. The result:
Temp=1, W-M_GHGs=2, O3=2, StratH2O=2, Solar=0, LandUse=2, SnowAlb=2, StratAer=0, BC=2, ReflAer=2, AIE=2.
Temperature has an integration order of I(1), while most of the forcings have an integration order of I(2). The only exceptions are Solar, which comes in at I(0) in this test, and Stratospheric Aerosols (due to ultra-Plinian eruptions), also I(0). It’s pretty clear that the forcings related to anthropogenic factors are I(2), while temperature is I(1).
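As a reminder of what the integration-order test involves, here is a minimal sketch in Python with NumPy (illustrative only, not the script used for the table above): an I(2) series built by cumulatively summing white noise twice returns to the original noise after two differences, which is exactly why one differences a series until a stationarity test rejects.

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.normal(size=500)   # I(0): stationary white noise
walk = np.cumsum(noise)        # I(1): random walk
i2 = np.cumsum(walk)           # I(2): twice-integrated noise

# Differencing inverts integration: two diffs of the I(2) series
# recover the original noise (up to the two dropped leading points).
recovered = np.diff(np.diff(i2))
print(np.allclose(recovered, noise[2:]))  # True
```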
Our results are (mostly) consistent with Beenstock’s in this regard. Remembering that the test is not foolproof, the result for solar is suspicious, as he (and others) find it to be I(1). Solar has a strong 11-year cycle in it, and that might have something to do with it. You need to be very careful with confounding factors in these tests.
And it is (largely) because of this incompatibility between the orders of integration that Beenstock told London’s Cass Business School that the link between rising greenhouse gas emissions and rising temperatures is “spurious”:
“The greenhouse effect is an illusion.”
More choice quotes:
He warned that climatologists have misused statistics, leading them to the mistaken conclusion global warming is evidence of the greenhouse effect.
Professor Beenstock said that just because greenhouse gases and temperatures have risen together does not mean they are linked.
In the next post I’ll look at the basis for his prediction:
“If the sun’s heat continues to remain stable, and if carbon emissions continue to grow with the rate of growth of the world economy, global temperatures will fall by about 0.5C by 2050,” Professor Beenstock said.
I took a quick and dirty look at CO2 and temperature from the Dome C core (links here: http://www.ncdc.noaa.gov/paleo/icecore/antarctica/domec/domec_epica_data.html ) data covering the time range of 9,000 to 22,000 years BP. I converted CO2 to ln(CO2(t)/CO2(t0)) so it would look more like forcing and used the deuterium isotope delta directly rather than converting it to temperature. I haven’t tried adf.test, but both series appeared to be AR(1), which, I think, is indistinguishable from I(1).
Wouldn’t they have to have a regular time step for that analysis?
Well, I said it was quick and dirty. Do you know of a method to generate a time series with a constant time step from data like that?
Also, try testing the sum of all the forcings.
No I don’t. I will test the sum.
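The log-ratio conversion mentioned above is simple to sketch. As an illustration (not the commenter’s actual script), the standard simplified CO2 forcing expression F = 5.35 · ln(C/C0) W/m² differs from the bare log ratio only by a constant factor, so either form works for assessing integration order:

```python
import math

def co2_forcing(c, c0=280.0, alpha=5.35):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. form).

    With alpha=1 this reduces to the bare log ratio ln(C/C0) used in
    the comment above; the scale factor cannot change the order of
    integration of the resulting series.
    """
    return alpha * math.log(c / c0)

print(round(co2_forcing(560.0), 2))  # doubling: 5.35*ln(2) = 3.71 W/m^2
```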
Is there a problem that the “well mixed greenhouse gases” in green might be the ML data, which are a refined set, almost detached from where the interactions happen near ground level, where the GHGs are much more variable in range and time?
To continue your bucket analogy, is the green GHG curve more like a hose that drips continuously with only small variation in drip rate?
Is there any advantage in using daily data, if only for a small subset pilot run?
The Solar Forcing used by GISS is based on the obsolete Lean 2000. GISS acknowledged that it’s outdated yet elected to use it anyway.
There’s a package for R called pastecs that regularizes a time series with irregular time steps. I used the regarea method in that package to produce time series for deltaD and ln(CO2(t)/CO2(t0)) from the Law Dome ice core data for the period from 9,000 BP to 21,700 BP. The ADF test failed to reject the hypothesis that the series were non-stationary, with p-values of 0.9 and 0.46. So can temperature and CO2 forcing change integration order over time? I suspect not.
There is a problem, of course. The lag between deltaD and CO2 means that CO2 is a function of deltaD. That doesn’t mean that CO2 can’t amplify the change in deltaD, though.
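Regularizing an irregularly sampled series can be sketched in a few lines. The following is only an illustrative analogue of pastecs::regarea (which uses area-weighted interpolation, not the plain linear interpolation shown here), with made-up ice-core-style ages and values:

```python
import numpy as np

# Hypothetical irregular ice-core ages (years BP) and measurements
ages = np.array([9000.0, 9400.0, 10100.0, 10300.0, 11200.0])
vals = np.array([2.1, 2.4, 3.0, 2.9, 3.6])

# Regular 200-year grid spanning the record
grid = np.arange(9000.0, 11201.0, 200.0)

# Linear interpolation onto the grid (regarea instead averages the
# area under the curve within each step, which smooths more)
regular = np.interp(grid, ages, vals)
print(grid[:3], regular[:3])
```

Any ADF test would then be run on `regular`, a constant-time-step series.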
I think you need to keep differencing until the ADF test rejects to get the order. See traces in recent post.
Of course. How dumb. Well, by and large you can’t learn without making mistakes.
adf.test p-values for the first difference: lnCO2, 0.053; deltaD, 0.062.
p-values for the second difference: less than 0.01 for both.
I would take that to mean that in fact both series have the same integration order and it’s probably I(1).
Good one. Now I would guess that the regularization affects the AR of the series, and would probably repeat it over a range of resolutions, from the coarsest in the series to the finest.
Increasing the resolution to a 100-year interval from 177 years didn’t make much difference. I did change the width of the window used to determine how many adjacent points are used in the interpolation. The default value is the same as the time step. Increasing the window to 500 years with a time step of 100 years smooths the curve and increases the p-value of the ADF test of the first difference to 0.1. That’s one of the things that led me to look at CO2 forcing calculated from the Mauna Loa data, which has a real annual resolution, as opposed to ice core data, which can be heavily smoothed both during the process of encapsulation and during the processing of the analytical data. Law Dome data, for example, is smoothed over 20 or 75 years. The Dome C data I used above doesn’t seem to have been smoothed after measurement.
Hmm. Write up your analysis if you have the time and I’ll post it.
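The smoothing effect described in that comment (a wider interpolation window pushing the first difference toward non-stationarity) can be illustrated directly: a moving average raises the lag-1 autocorrelation of white noise, which is what makes a unit root harder to reject. A minimal NumPy sketch, with made-up data:

```python
import numpy as np

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

rng = np.random.default_rng(1)
noise = rng.normal(size=2000)

# 5-point moving average: the analogue of widening the interpolation
# window relative to the time step
smoothed = np.convolve(noise, np.ones(5) / 5, mode="valid")

print(round(lag1_autocorr(noise), 2))     # near 0 for white noise
print(round(lag1_autocorr(smoothed), 2))  # near 0.8 after smoothing
```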