Downloading Monthly Mean Australian Temperatures from the BoM

This is a funny story about getting monthly mean Australian temperatures from the Bureau of Meteorology. I hope for a happy ending, but we have yet to see one.

About two weeks ago I tried to download these data from the BoM website set up for the purpose.

I have noticed before that temperature is NOT available as a single MONTHLY mean series, but this time I was very perplexed. It is available by season, and by individual month (January, February, and so on), but not all together.

To get the data into a monthly series would require me to download all 12 raw data files (one for each month) and then interleave them to create a continuous list with 12 values for each year. This would be time-consuming and a potential source of error.

Note that all of the major sources of global mean temperature data on the web (GISS, HadCRUT, RSS MSU, etc.) are available as monthly temperature series.

It would seem relatively easy to add a full-series download option to the selection menu.

So I contacted the help desk and, after some time, had a delightfully frustrating conversation along the lines of: “I would like the monthly data series.” “But all the months are there.” Etc., etc. Finally, she agreed to send my enquiry on to what I presume was the technical department, and today I found out why Australian temperature is not available as a monthly series.

Apparently, the reason why stock standard monthly data series are not available is that they don’t look good as a bar chart.

Well, I would never have guessed.

Comments on “Downloading Monthly Mean Australian Temperatures from the BoM”

  1. “To get the data into a monthly series would require me to download all 12 raw data files (one for each month) and then interleave them to create a continuous list with 12 values for each year. This would be time-consuming and a potential source of error.”

    Actually, in Excel at least, this is fairly easy. I would just take each file, turn it into a table:

    year, value
    year, value

    and so on, then put the table in Excel, then move the values column over one, and put a string of ones for January, twos for February, etc. between the values column and the years column.

    I would then take each set of three columns and put each set right under the last month’s set.

    Then I would select what are now three columns (the years repeated for each of the 12 sets, a hundred or so ones, twos, threes, etc., and the values) and click the sort button. Bingo: all the years line up, the months are in order, and you’re done!
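
    For anyone in R rather than Excel, the same stack-and-sort idea might look like this (a sketch only; the file names month01.txt to month12.txt are hypothetical stand-ins for the twelve downloaded year/value files):

    # Read each monthly file, tag it with a month number, stack the twelve
    # tables, then sort by year and month (the "sort button" step).
    files <- sprintf("month%02d.txt", 1:12)
    tabs <- lapply(1:12, function(m) {
      d <- read.table(files[m], col.names = c("year", "value"))
      data.frame(year = d$year, month = m, value = d$value)
    })
    series <- do.call(rbind, tabs)
    series <- series[order(series$year, series$month), ]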

  2. “So the reason why stock standard monthly data series are not available is that they don’t look good as a bar chart.”

    I guess I have to take you at your word.

    HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

    Do these people really understand how bad they look to those of us looking in?

  3. They probably think it’s a world-class interface. It looks very good. But it’s not what this customer wants, which is a simple link to a text file of all the data.

  4. David,
    I found this R scrap that seemed to work, returning a 12-column matrix (one column per month):

    # Build each month's URL by patching the month number into the path:
    # the "01" in .../tmax/01/aus/... occupies characters 61-62 of the string.
    s1 <- "http://www.bom.gov.au/web01/ncc/www/cli_chg/timeseries/tmax/01/aus/latest.txt"
    for (i in 1:12) {
      j <- 62
      if (i > 9) j <- 61
      substring(s1, first = j) <- as.character(i)
      download.file(s1, "temp.txt")
      vv <- matrix(scan("temp.txt", 0), ncol = 2, byrow = TRUE)  # columns: year, value
      j <- dim(vv)[1]                      # number of years in this month's file
      if (i == 1) monthly <- matrix(0, j, 12)
      monthly[1:j, i] <- vv[, 2]
    }
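
    Since the script rewrites the month number at a fixed character position, it would break if BoM ever changed the path; building each URL with sprintf("http://www.bom.gov.au/web01/ncc/www/cli_chg/timeseries/tmax/%02d/aus/latest.txt", i) would be more robust.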

  5. I am not surprised my numerate readers would have no trouble getting the data into the form they require. My point was more along the lines of “Why would I want to walk around the block 12 times to go to the coffee shop when I only have to walk around it once?” And why has nobody asked for it to be in a more convenient and standard form for data analysis before?

  6. If you wish to walk around the block 144 times or more, try finding the Max and Min temps for Cape Grim, as well as the data for their GHG measurements. Some temps are available from 1985 on one BOM disc, and on the BOM online site.

    Cape Grim is BOM 091245 and is a designated B.A.P.S., an acronym for Baseline Air Pollution Station. It has a record of CO2 that might be compared with Mauna Loa. Given its position on the NW tip of Tasmania, I would expect a bit of CO2 to come from exhausted seagulls as they fight the Roaring Forties. (Lat -40.7, Long 144.7). In past years I helped explore the coastline further south, beyond Macquarie Harbour, where “Grim” is a good word for this World Heritage useless wet desert.

    The Australian Weather News coverage goes back to 1870 for temperature. It is difficult indeed for an amateur to draw a graph of monthly or annual CO2 vs temperature for this site.

    Suggestions are welcomed.

  7. While it is annoying to get, they did provide all the data. It is quite likely that they have no HTML coders on site (most people don’t) and are paying a contract webmaster for updates, so fulfilling your request may end up costing them extra money.

    While your request wouldn’t be unreasonable if you were their employer, it isn’t a reasonable demand that they make it available in a way more convenient to you.

    Andrew, the monthly and seasonal data is probably not on the computer before 1950. Do you want to pay someone (even an intern) to type in decades worth of data from handwritten documents (maybe hand-typed), solely benefiting people outside your organization?

    Never attribute to malice anything that has another reasonable explanation. The most common of these are stupidity, laziness, and frugality. This appears to be the last of those three.

    We should be praising them for having the data so open and available (unlike seemingly everyone else), not griping like children that it isn’t in the most convenient format.

    • First, let me say that I agree with the principle you advocate here, also known as Hanlon’s razor. Anybody, however, could drift from such moorings occasionally. But none of this seems an adequate explanation to me. The US government has Nationwide, Regional, Statewide, and a few Local data sets readily available for mean temperature and precipitation by month, season, and annual (as well as YTD) on a single web page, all going back (except for some local records) to 1895, here:

      http://www.ncdc.noaa.gov/oa/climate/research/cag3/cag3.html

      And it strikes me as odd that, although one may wish for a little more transparency on NOAA’s part, BoM can’t be bothered to do even that much work. Suffice it to say that my opinion of the Australian agency is now greatly diminished relative to NOAA here in the States, which I don’t hold in particularly high regard anyway.

  8. Ben: You are missing the point, though perhaps I didn’t explain it clearly. The website is set up for pretty pictures, not with numerate analysis in mind. If you don’t give feedback, then they don’t know they are out of step with every major climate data provider in the presentation format of the data. It’s like no-one else has ever asked for it.

  9. Ben, the daily data for temp as Tmax and Tmin, with year/month/day, are already electronic and digital. The hard yards have already been walked by BOM. At most stations the record goes back to the year of commencement of records, such as the mid-1850s for Melbourne and Sydney.

    David is talking about the presentation of these figures in a way that makes it easy to inform. Sure, if you are a programmer, they are reasonably easy to manipulate. But it is more efficient for the central controller to manipulate them into a presentation form, than for myriads of public users out there to do their own thing.

    As well as data reported on a daily basis, it would be handy to have an official version using infill of minor missing values by an explained algorithm, followed by tabulation of monthly and yearly averages for each acceptable station. Probably, these are the forms most used in studies.

    What is lacking, but is being worked on, is the transcription of metadata such as station changes and other recorded factors that might influence the record.

    On the BOM website, the data records tend to commence later. A decision seems to have been made that uncertainties about the start of use of Stevenson screens require deletion of much data before 1910 or so. Later, there has been a minor trend to report data only after about 1950, for reasons that elude me. One day we might see data only after 2000, or whatever was the actual year when the last of the mercury thermometers was replaced by thermocouple or thermistor devices. It is becoming clearer that Tmax and Tmin from a Hg thermometer are less useful than the Tmax and Tmin from a near-continuous recorder, which can optionally be estimated after smoothing and filtering.

    At present, there is no single “official” version of the Australian temperature record, because it is being changed from time to time. This has a bad knock-on effect when other users such as GISS and HadCRU take the same data and do their own adjustments to it. I simply do not know which record to use for studies, because the average for a given year can differ by several degrees C from one source to another.

    I suspect that this effect alone has already invalidated some studies that attempted to calibrate proxy data in the instrumented era.

    • sherro – all good points. Good metadata and data control are more important than the data themselves. I have worked on this for many years in the museum and zoo communities. Greater appreciation of the needs of rigorous data analysis is one of the causes I want to champion, as I often see effort aligned around flashy interfaces rather than substance.

  10. David, even though BOM says that “the monthly data are not available as a single file because the graphs are generated automatically from the data files and an all-months data file would contain too many data values to fit easily on one graph. We will see what we can do.”

    I can scrape monthly station data from the BoM website. 🙂

    Here are the first few months from Charters, Qld, which I read directly into R:

    1 19620101 19620131 28.5
    2 19620201 19620228 27.6
    3 19620301 19620331 26.1
    4 19620401 19620430 22.8
    5 19620501 19620531 21.4
    6 19620601 19620630 19.4

    If BoM wishes some assistance, I’d be happy to help them.

  11. David, if you want to extract Aus monthly mean data, here’s a script that scrapes it. AusInfo is an information sheet (now at CA) listing the 103 stations in the monthly webpages with id numbers. get.bom scrapes the data into a time series using the id (character). I HATE these Java pages that these guys waste money on.

    source("http://www.climateaudit.org/scripts/station/australia/functions.bom.txt")
    station <- get.bom(id = AusInfo$id[i])
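
    Presumably one could then loop over the whole station list, e.g. (assuming AusInfo loads as a data frame with an id column, as the script above implies):

    stations <- lapply(AusInfo$id, function(id) get.bom(id = id))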

  12. Well I need some stinkin’ bar graphs, but I need to design them.

    There is a problem with monthly data sets being prepared at BOM, and that problem is missing data. It’s almost as if those with something to contribute to method need to get together to resolve an agreed infill method that is transparent and portable (one simple possibility is sketched below). Infilling does not change the sense of the data much, mostly, but missing values make working with the data far more tedious. They also encourage multiple users to devise their own infills, some of which are not reproducible by others. So there are increasingly more versions getting around in the literature, and not all of them can be correct.

    Yes, I’d very much like tabulation of monthly Tmax and Tmin at as many stations as practical.

    • The usual approach is to define level 0, level 1, etc. data according to the degree of manipulation. Creating contiguous data ups the levels.

      What KNMI has done is really great. Need more of it. Cheers.
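
    As an illustration of one transparent infill, a missing month could be replaced by the station’s long-term mean for that calendar month (a sketch only, assuming a data frame tmax with year, month and value columns, NA marking missing values, and all twelve calendar months present in the record):

    # Infill each missing value with the long-term mean of its calendar month,
    # and flag the infilled rows so the adjustment stays reproducible.
    clim <- tapply(tmax$value, tmax$month, mean, na.rm = TRUE)
    miss <- is.na(tmax$value)
    tmax$value[miss] <- clim[tmax$month[miss]]
    tmax$infilled <- miss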
