How do you predict with EMD? Because the EMD algorithm decomposes a time series into a number of quasi-periodic components of different frequency (the IMFs) and a residual trend, prediction with EMD works by extrapolating each of the IMFs separately (a VAR model is recommended) and fitting a cubic polynomial to the residue (example code at the end of this post). The predictions are then added together.
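As a minimal sketch of that recipe, assuming the IMFs and residue have already been extracted by the sifting algorithm (the hand-made sinusoids below stand in for real IMFs, and a simple per-component AR fit stands in for the recommended VAR):

```python
import numpy as np

def ar_forecast(x, p, h):
    """Fit an AR(p) model by least squares and forecast h steps ahead."""
    # Design matrix of lagged values: x[t] is regressed on x[t-1..t-p]
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    hist = list(x[-p:][::-1])                # most recent value first
    out = []
    for _ in range(h):
        nxt = float(np.dot(coef, hist))
        out.append(nxt)
        hist = [nxt] + hist[:-1]
    return np.array(out)

def emd_style_forecast(imfs, residue, h, p=4):
    """Extrapolate each IMF with an AR model, fit a cubic polynomial to
    the residue, and add the predictions together."""
    t = np.arange(len(residue))
    total = np.zeros(h)
    for imf in imfs:
        total += ar_forecast(imf, p, h)
    cubic = np.polyfit(t, residue, 3)        # cubic trend for the residue
    total += np.polyval(cubic, np.arange(len(residue), len(residue) + h))
    return total

# Toy demonstration with hand-made components; a real run would take the
# IMFs from an EMD implementation (e.g. the PyEMD package).
t = np.arange(120)
imf1 = np.sin(2 * np.pi * t / 12)            # annual-scale cycle
imf2 = 0.5 * np.sin(2 * np.pi * t / 60)      # slow quasi-periodic mode
residue = 0.0001 * t ** 2                    # smooth background trend
forecast = emd_style_forecast([imf1, imf2], residue, h=12)
print(forecast.round(2))
```

The point of the structure is that each component gets the extrapolation method suited to it: a short-memory autoregression for the oscillatory modes, a smooth polynomial for the monotonic residue.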

Below are a couple of examples of EMD predictions on familiar data sets: the HadCRU global surface temperature and the TLT series from the satellite MSU RSS data. In both I have also applied the recommended prediction techniques to extend the result into the future.

The first is the satellite TLT, and I have used up to IMF 5 so that variation up to annual quasi-periodicity is represented. The residue in this case, the red arch, has peaked and is projected to decline, along with the overall temperature, in the next few years. The amplitude of variability is also declining.

Below is the result for CRU, and I have used IMFs up to the 11 year solar cycle (including the 22 year IMF and the ~60 year cycle). The residue trend is increasing (global warming has not stopped!), although the rate of increase is at the extreme low end of IPCC projections. The pop-up predicted for the next few years seems to presume another strong solar cycle (referring to the individual IMFs). If this doesn’t eventuate, as it looks like it won’t, one could omit the 11 and 22 year cycles and simply use the 60 year cycle and residue for prediction.

The periodic IMFs are also plotted along the zero line, to show the degree of variability attributable to natural variation.

The paper by Wu et al., “On the trend, detrending, and variability of nonlinear and nonstationary time series”, is well worth a read. It introduces EMD as a contender for optimal smoothing of climate series. Climate science should be looking for one, after the poster child of global warming was demolished here, and here at ClimateAudit, and ‘worse than we thought’ alarmism abuse of it was exposed here.

After EMD sequentially extracts the frequency modes, Wu claims the residue is ‘intrinsic’ to the data, and not ‘extrinsic’ like a predetermined shape such as a straight line; and unlike a moving average it extends to the end of the series without blowing out the confidence intervals. The approach certainly would seem to handle the end-point problem better than ‘end-pinning’ or other pathological methods often used in climate science (but only when the end points are high!). The benefit of being ‘intrinsic’ is that the resulting trend is more objective. The definition of trend in EMD is also more generic than most: usually monotonic increasing, but the residue can be S-shaped too. It is always a concern if the method introduces biases, and I am not convinced that EMD does not equally contain biases; they are just hidden deeper in the guts of the algorithm for extracting the IMFs.
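The end-point contrast described above can be illustrated with a minimal sketch (the low-order polynomial fit here is only a stand-in for the EMD residue, which is computed quite differently by sifting):

```python
import numpy as np

t = np.arange(100, dtype=float)
series = 0.01 * t + np.sin(2 * np.pi * t / 20)

# A centred 21-point moving average has no estimate for the first and
# last 10 points -- the end-point problem.
w = 21
ma = np.convolve(series, np.ones(w) / w, mode="valid")
print(len(series) - len(ma))                 # 20 end points lost

# An 'intrinsic' trend in the EMD sense is defined right up to the final
# observation; a simple polynomial fit stands in for the residue here.
trend = np.polyval(np.polyfit(t, series, 1), t)
print(len(trend) == len(series))             # True
```

Any smoother that is defined at every observation avoids the temptation to pad or pin the series ends, which is where the pathological methods mentioned above come in.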

David: There appears to be some disagreement between TLT and surface temperature projections. But what I like about the short-term version is that you’d actually be able to track the projections versus observations over the next decade.

Regards

Yes, and you might need to look at all the different data sets to see what’s what. Different contamination with UHI, different cycles, different extents and processing can lead to differences.

David, Wu et al are not at all recommending using their algorithm for prediction. They say:

I read that quote as saying that the standard disclaimers for fitting models apply…

“The assumption is that the predictive models have to be process-based, not data-driven.” Sounds like they mean it.

It would be nice to have some assurance that the HadCRU numbers that are announced from time to time are not in turn derived from a graph like your lower one. After all, your work gives them a century of forward modelling plus a statistical method to show that it fits rather well.

Is there any scope for application of this deconvolution method to the buzz going on about separating non-linear tree ring growth curves in populations of different ages?

Hi Nick. If you have non-stationary input data, how do you measure if your analysis & prediction is correct?

Sherro, there are lots of ways in which processes can be non-stationary. And of course, you can never be sure that any process is truly stationary. If you’re confident it is stationary except for a trend, say, then you can subtract the trend and things are OK. But if you have a prediction with error, and the variance is unstable, it’s hard to test the error for significance. This all gets a lot of focus with stochastic volatility models in finance.

Thanks Nick. This invites a trite answer that the exploitation of non-stationarity is an art form developed in financial circles so that those in the know can better cherry pick and prosper financially. Why, they might even stampede a bit of non-stationarity from time to time.

The relevance is, of course, that we now have the admissions coming out that some parts of dendroclimatology depend on the ability to deal with non-stationarity in trees to prosper financially through more grant monies. Seven standard deviations indeed.

In my opinion, neither process-based nor data-driven models can provide reliable predictions for the future in the long term, except in very simple deterministic systems.

In any case, the split-sample technique is a simple and necessary test that any model should pass before it can be used for any type of prediction (even short term): Split the available sample in two parts, formulate and fit the model based on the one part and then validate the model on the other part, which was not used in the fitting.

It seems the formulation and fitting of the model discussed here (and many others) did not follow this procedure. Am I wrong?
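The split-sample procedure described above can be sketched in a few lines (the linear-trend model here is a hypothetical stand-in for whatever model is being validated):

```python
import numpy as np

def split_sample_rmse(series, frac=0.5):
    """Fit on the first part of the record only, then score the forecast
    against the unseen second part."""
    n = int(len(series) * frac)
    train, hold = series[:n], series[n:]
    # Model formulation and fitting use the training part only
    coefs = np.polyfit(np.arange(n), train, 1)
    pred = np.polyval(coefs, np.arange(n, len(series)))
    return np.sqrt(np.mean((hold - pred) ** 2))

# Synthetic record: linear trend plus noise
rng = np.random.default_rng(0)
t = np.arange(200)
series = 0.02 * t + rng.normal(0, 0.5, 200)
print(round(split_sample_rmse(series), 3))   # close to the noise level
```

The key discipline is that nothing from the hold-out half leaks into the fitting step; if the hold-out error is much larger than the in-sample error, the model is overfit and should not be used for prediction.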

Demetris, no objections from me there; this is not an example of how to do a split-sample validation, which would certainly be a component of prediction for publication.

The interesting point to me at this stage is the contrast between this and an LTP formulation. Your comments on Loehle’s paper would be welcome, as he says here:

http://icecap.us/images/uploads/05-loehleNEW.pdf

I have not read the paper in full but the diagrams do not seem to contrast LTP. It is normal that small samples (and in LTP terms, these are small samples) show some (perhaps artificial) periodicities, which would diffuse in larger samples.

By the way, I too welcome comments on my new paper, which discusses issues such as periodicity, nonstationarity, persistence, antipersistence, and determinism vs. randomness, in http://www.hydrol-earth-syst-sci-discuss.net/6/6611/2009/hessd-6-6611-2009.html or http://www.itia.ntua.gr/en/docinfo/923/

It would also seem to be the case with Wu that white noise was used for contrast. Will check out your paper.

Demetris, yours is a very good paper with a lot of important points. I note where the periodicity of the simple models is discussed on page 6631 and would suggest that this is the kind of quasi-periodic system that EMD attempts to deal with. A lot of other thoughts that I will try to collate.

I agree, David, it is a kind of quasi-periodic system. However, I prefer to use the term “antipersistent” over “quasi-periodic”, because the latter may give a wrong message of a predictable (or quasi-predictable?) system.

I see these predictions above as largely in the ‘low uncertainty’ Fig 8 or ‘better skill at deterministic forecast’ Fig 10 sections, as they involve only a few cycles, providing of course they are significant, which Wu claims is the case with the trend and the 60 year periodic (PDO?). So they are in the predictable part of the plot.

What do you think of the term quasi-harmonic? I think they share many features of a harmonic system, though only approximating the differential definition at the process level. Your toy system would then be an undamped, undriven quasi-harmonic, but the extension to more complex coupled systems is obvious.

Nice remark, David. Indeed, it is in the ‘low uncertainty’ as far as climatic prediction is concerned, because the average becomes flat very quickly. But this does not mean that the exact trajectory is equally predictable as the time average.

Rest of message: Loehle 2009

“This is a length of time at which disagreement with climate models can no longer be attributed to simple LTP. On the other hand, studies cited herein have documented a 50–70 year cycle of climate oscillations overlaid on a simple linear warming trend since the mid-1800s and have used this model to forecast cooling beginning between 2001 and 2010, a prediction that seems to be upheld by the satellite and ocean heat content data.”

He also says:

“Furthermore, simple LTP can not produce a periodic signal over 350 years (since the Little Ice Age) nor would it be so consistently predicted by the various methods discussed here, including those based on patterns of solar activity and ocean mode indices. Thus suggests that the cooling is not merely a random fluctuation, but definitive evidence will take more time.”

Note also Figure 7 in Wu (2007), where the only significant IMF is the 60 year quasi-periodic fluctuation. Loehle’s paper has a number of figures, using a different method, that resemble the results here. Interesting that the 60 year periodic, the natural variation that could be contributing most of the increase in temperature in the last half century, emerges strongly in these analyses.

You have to keep in mind that EMD will decompose a white noise signal into a series of IMFs as well. In that case they are junk and not predictive, so like any model you have to validate that each of the components is a robust signal.

It’s of interest here that each of the frequencies in the EMD decomposition matches cycles that are talked about all the time. If, and only if, they pass validation, the quasi-periodic nature would provide some limited predictive ability, IMHO.

There is then the issue of the residue trend, which is also interesting, but I will leave that for another time.
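A rough sketch of that validation idea, comparing an extracted component against white-noise surrogates Monte-Carlo style. The band extraction below (difference of two running means) is a crude stand-in for EMD's sifting; a real test would decompose the surrogates with EMD itself, as in Wu and Huang's significance test:

```python
import numpy as np

def band_component(x, w):
    """Crude band extraction (difference of two running means), standing
    in for a single EMD mode -- an illustration only."""
    sm = np.convolve(x, np.ones(w) / w, mode="same")
    bg = np.convolve(x, np.ones(4 * w) / (4 * w), mode="same")
    return sm - bg

def beats_noise(series, w, n_surr=200, seed=1):
    """Fraction of white-noise surrogates whose extracted component has
    less variance than the component extracted from the data."""
    rng = np.random.default_rng(seed)
    obs = band_component(series, w).var()
    null = np.array([
        band_component(rng.normal(0, series.std(), len(series)), w).var()
        for _ in range(n_surr)
    ])
    return float(np.mean(obs > null))

t = np.arange(600)
noisy = np.sin(2 * np.pi * t / 60) + np.random.default_rng(2).normal(0, 1, 600)
print(beats_noise(noisy, w=30))    # near 1.0: the 60-sample cycle is real
```

A component that beats nearly all of the noise surrogates carries more energy at its timescale than chance alone would produce; one that does not should be treated as the junk mode described above.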

The Koutsoyiannis 2009 paper is a gem. http://www.itia.ntua.gr/en/docinfo/923/

For years in a very clumsy and disjointed way I have been trying to make similar points, but not being a mathematician, was unable to give them impact.

Some of the exercises I toyed with were similar.

e.g. Take the radioactive decay scheme of Uranium-238 and change the estimate of half-life of U-238 by a tiny amount, to see the effects propagate through the daughters.

e.g. Take a string of related numbers like assays down a drill hole, calculate the semivariogram and then use as a predictor beyond the end of the hole, the value that has least effect on the semivariogram properties. Now seen to be pointless.

e.g. Using a better method to represent uncertainty in past temperature data – so use the tube rather than the string.

e.g. For what length of time can the orbit of the earth WRT the stars be accurately modelled? Are cycles like Milankovich really cyclic, as in having constant periods, or do they wobble around? How do you estimate the uncertainty when all such cyclicities are combined (in a manner opposite to the deconvolution that David did to introduce this thread)?

e.g. Why does the frequency of Australian drought forecast change the year the CSIRO paper was published? A. Because it is wrong.

The toy model that DK used is rather clever because of the simplicity of the starting assumptions, the lack of input perturbation, the evidence that it stays alive and does not die through equilibration, and the complexity of the outcomes. My understanding of autocorrelation and related concepts has been greatly improved and some idea of complexity added.

Thank you both. I have a headache now.

Geoff, you might be interested in this “The modulated annual cycle: an alternative reference frame for climate anomalies”

ftp://grads.iges.org/pub/ctr/ctr_244.pdf

Very kind comment, sherro, thank you. Your examples seem interesting toys. The Milankovich one has intrigued me too. I hope to find some time in the future to play more with it.

I can remember some time ago reading a thread at Open Mind about Hurst scaling, where Tamino selected 2 points and basically infilled data until he achieved a predictive capacity between those 2 points. I’m probably doing the great man an injustice, but when reading it I thought of Asimov’s Foundation, where social prediction occurs through weight of numbers which smooth out the individual perturbations. That is, until an event/individual happens along which is outside the predictive range: the Mule. I don’t think we have a Mule in the AGW debate, but at the risk of being mystical – since we are in a universe where anything can happen – how do predictions about the physical world deal with Mules?

Perhaps the type of phenomenon you are thinking of is like rogue waves: http://en.wikipedia.org/wiki/Rogue_wave. This is something that is predicted, but not predictable. As a number of reinforcing waves, it is a rare event, but understandable from the dynamics. I wonder if the climate system could be like that too; that is, a rapid cooling or warming event could be caused by reinforcing cycles.

David, thank you for 3 excellent references in one day. To return the favour, here are two to fill in your idle hours.

The first has relevance to Steig et al Antarctic, matching ground temps to airborne. It also relates to Bob Tisdale’s post at the start. There are a couple of graphs that speak for themselves. Submitted 2004 for Ph.D.

http://www.phys.unsw.edu.au/astro/research/thesis/TonyTravouillon.pdf

The second is an unusually frank speech by a US politician about future energy forms and dynamics of global economies. Unfortunately Australia will sit in the naughty corner with the dunce hat on.

http://batr.net/cohoctonwindwatch/LAMAR%20ALEXANDER%20on%20NUCLELAR%20vs%20WIND.pdf

These are OT, but one good turn deserves another – especially when I have 2 headaches at once from math overload.

Rogue waves; surf’s up! The rogue seems to be a type of environmental Gestalt:

http://en.wiktionary.org/wiki/gestalt

The Gestalt seems to be behind the AGW idea of ‘tipping points’ that people like Glikson hammer on about all the time; and this seems to be what informs the hysteria of AGW: not accretion by itself, but accretion-based threshold disasters, runaway, tipping points and the like. Can EMD isolate the components for a potential tipping point, or is this just butterflies flapping [with one wing] in the forest? Or, more to the point, a bit of theoretical clap-trap due to MEP and no extra external energy to the system?

In the quasi-mechanical framework I am thinking of, the anti-persistence behavior is similar to a restoring force in a dynamical system (e.g. a spring). Then the main assumption is that the restoring force is conservative; that is, the restoring force is the same if the system moves in a circle back to the same point. If not, then yes, the system could move in a trajectory but equilibrate back at a different temperature, even when the forcings are the same. It could then do that again, essentially spiralling out of control.

I admit this is getting out of my depth, but the question may be answerable on the basis of the conservativeness of the anti-persistence. Is Miskolczi conservative, and Eddington not? I don’t know.
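A toy numerical check of the conservative-force idea: integrate the work done by the force around one closed loop. A pure spring does zero net work, so the system returns to the same state; adding a path-dependent (frictional) term, here standing in hypothetically for non-conservative anti-persistence, loses energy on every cycle, so the state can drift loop after loop:

```python
import numpy as np

# One closed loop x(t) = cos t, traversed over a full period
t = np.linspace(0.0, 2.0 * np.pi, 20001)
dt = t[1] - t[0]
x = np.cos(t)
v = -np.sin(t)

k, c = 1.0, 0.1
# Net work W = integral of F * v dt around the loop (rectangle rule)
w_spring = np.sum(-k * x[:-1] * v[:-1]) * dt                       # conservative
w_hyst = np.sum((-k * x[:-1] - c * np.sign(v[:-1])) * v[:-1]) * dt # path-dependent

print(abs(w_spring) < 1e-6)   # True: no net work, same state after a loop
print(round(w_hyst, 3))       # -0.4: energy lost every cycle
```

The friction term contributes work of -c times the path length travelled (here 4c per loop), which is exactly the kind of path dependence that would let a system re-equilibrate at a different point each cycle.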

Couldn't you do an EMD of historical tipping points; say the Younger Dryas and the PETM or any series of Dansgaard-Oeschger events and see if there is any commonality between them and with the modern equivalent of decomposed factors?

Yes, good idea.

