• dumb method for "current temperature of the earth" likely out by +-8C

    From MrPostingRobot@kymhorsell.com@21:1/5 to All on Wed Aug 25 13:05:19 2021
    XPost: alt.global-warming

    EXECUTIVE SUMMARY:
    - Measure twice and cut once, says the old saw. According to your
    high school math class, by averaging N repeated measurements together
    you can reduce the error by a factor of 1/sqrt(N). Hillbillies
    generally say the error in the average of N measurements
    is still +-1C because each thermometer has (for argument's sake) that error.
    - But the math formula assumes the measurements are "independent" and
    "unbiased".
    - One web page tries to average the available met stations to arrive at
    an average temp of the earth. The page pointedly maintains it uses
    unprocessed temperature data in deg C. Unfortunately the siting of
    met stations is not "random". They mostly cluster together around
    population centers. They are also not very complete. Even (say)
    16000 met stations represent only around 1% of the land surface,
    and far less of the total surface of the earth.
    - We conduct an experiment with 1000s of met stations from the GHCN v3
    dataset. Using larger and larger samples of the TAVG values for a given
    date we determine how far the sample average is from the average of
    all stations for that date. We want to know whether the error is
    reducing according to 1/sqrt(N) or is much worse.
    - It turns out 80% of the results obtained showed an error of +-8C. It
    seems that as samples got larger and larger the error "got stuck",
    and samples larger than ~20 made no improvement to the error bounds
    of the average of the TAVGs.
    - A power law was fitted to the observed trend and showed the error
    (ignoring that it seemed to get stuck at ~8) *seemed* to follow 1/N^0.1 --
    a much slower rate of improvement than 1/sqrt(N). Using that
    formula showed the error for an average of 6000 or 16000 stations
    would still be more than +-5C.
    - It seems the "add it up and average" of clustered met stations that
    in any case directly represent only about 1% of the land area (i.e.
    assuming each met station faithfully represents the temperature of the
    surrounding ~100 km2) is just junk science.
    - By contrast, dividing the oceans into e.g. 2x2 km grid cells, measuring
    the IR from each at nighttime and averaging them all over a 24 hr
    period estimates the average temp of the oceans on the same date as the
    dumb method, but gives a result of 15.85C with a claimed 3 digits of accuracy.
    Not surprisingly that compares favorably with other published results.


    For non-math types it's sometimes hard to understand that scientists
    can calculate the warming trend attributed to AGW without actually
    bothering to calculate an explicit "average temperature of the earth".

    To many people it seems impossible to calculate the rate of increase in X
    without calculating values of X and looking at how they change over time.

    The resolution of the conundrum involves only high school level math.
    Each met station on the earth is liable to see the effect of global
    warming. If you look at a number of them and "average the trends" you tend
    to zero in on a very good estimate of the overall warming trend from AGW.

    It's also true that each met station is an approximation of the
    average temperature of the earth. It's just that it's a very noisy
    estimate. Adding up noisy estimates of something without careful
    handling just adds up the noise and eventually swamps the underlying signal.

    This is a useful trick for bloggers and others employed by PR agencies
    to fuzz up the public's understanding of what's going on with anything
    that might be subject to product liability claims. They add up
    disparate numbers, magnify the noise, and stand back saying --
    "look, nothing to see here".

    In the case of the "average temperature of the earth" -- which some
    climate denier sites tend to point to as evidence the earth is not
    warming and there are no more tornadoes, floods, hurricanes and
    heatwaves like the newspapers and scientists are saying -- the dumb way
    of calculating this magic number is guaranteed to be way off. Usually
    they explicitly say they are using "unprocessed temperature data".

    And it's amazing how bad this method is. It's really, really laughable.
    But as with most science, you have to have an IQ over 90 or have
    graduated high school to get the joke.

    Standard high school math (I met this in year 11) shows you how to
    calculate the uncertainty in the average of some numbers.

    They say if you measure twice you need to cut only once. In some way
    averaging 2 measurements is "better" than just one. But how much better?

    The high school math shows you that if you measure something N times,
    each measurement with an "accuracy" of S, then the error in the average of
    the measurements is S/sqrt(N). If you take 2 measurements and average, then
    the error is reduced by a factor of 1/sqrt(2) -- about .70, i.e. about a 30%
    improvement on 1 measurement. If you measure it 9 times and average,
    then the error is reduced by almost 70%.
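
    A minimal simulation sketch of that formula (not from the original
    program; it just assumes independent, zero-mean Gaussian instrument
    error and made-up numbers):

    import numpy as np

    # N independent measurements, each with error S, averaged together.
    # The spread of the averages should match S/sqrt(N).
    rng = np.random.default_rng(0)
    true_value, S, trials = 20.0, 1.0, 100_000

    for N in (1, 2, 9, 100):
        # each row is one "measure N times and take the average" experiment
        measurements = true_value + rng.normal(0.0, S, size=(trials, N))
        averages = measurements.mean(axis=1)
        print(f"N={N:4d}  spread of averages = {averages.std():.3f}"
              f"   S/sqrt(N) = {S/np.sqrt(N):.3f}")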

    But this formula relies on some assumptions. And the most important is
    that the measurements are independent of one another. I.e. if you're
    cutting a length of timber you really should stand up, walk around,
    come back and measure it again. Not just stand there and look at the
    tape measure twice.

    To measure the temperature of the earth -- if you wanted to do that --
    you would need to measure the temperature of random spots on the
    planet at more or less the same time. If each thermometer had an
    accuracy of (say) 0.1C then taking 100 "independent" measurements would
    reduce the error by a factor of 1/sqrt(100), i.e. to 1/10 of 0.1C = 0.01C.

    This was the basic error in the old climate denier claim that you can't
    measure any temperature better than +-1C. I remember some guy -- let's
    call him Brent -- arguing about this basic high school math for months
    before someone must have taken him aside and shown him a school
    textbook. He then deleted all those posts he'd made, because it seemed
    at the time he was a college tutor.

    But if the temperature measurements are not carefully selected to be
    random then the error does not reduce as expected. It can even grow.

    So this takes us to the exercise of calculating the "average
    temperature of the earth" by summing up the "unprocessed temperature
    measurements" from 1000s of (we assume) almost exclusively land-based
    stations at some point in time.

    We don't expect temp stations to be sited randomly all over the earth.
    They are usually, e.g., near population centers and they -- for sure
    -- cluster around latitude 45N.
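
    To see why clustering matters, here is a toy sketch (nothing to do with
    the real station data; it just uses a made-up temperature field
    T(lat) = 30*cos(lat)): averaging sites bunched near 45N stays biased no
    matter how many you add, while a genuinely random sample does not.

    import numpy as np

    rng = np.random.default_rng(1)

    def mean_temp(lats_deg):
        # toy field: warm at the equator, cold at the poles
        return (30.0 * np.cos(np.radians(lats_deg))).mean()

    # "true" mean over uniformly scattered sites (area weighting ignored
    # to keep the toy simple)
    true_mean = mean_temp(rng.uniform(-90, 90, 1_000_000))

    for n in (10, 100, 10_000):
        clustered = rng.normal(45, 10, n)    # sites bunched near 45N
        scattered = rng.uniform(-90, 90, n)  # idealized random siting
        print(f"n={n:6d}  clustered bias={mean_temp(clustered)-true_mean:+6.2f}C"
              f"  random bias={mean_temp(scattered)-true_mean:+6.2f}C")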

    If you ask only registered voters of one political party their
    voting intentions and average them out, you will not predict the
    outcome of many elections.

    But it is incredible just how bad this average of all stations, without
    any twiddling, is.

    I did an experiment with the GHCN v3 stations. There are 100k of them
    in the database, but at any one time only around 7k of them are
    active. Picking a random 20th century date in August I found about 2k of
    them had a TAVG measurement for that date.

    I made a little program to take averages of random subsets of those
    stations, increasing the sample size each time, and plotted out the
    average absolute difference between the sample average and the average of
    all 2k stations for that date.
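
    Something along these lines (a sketch of the idea, not the original
    program; the placeholder values below stand in for the ~2k real GHCN v3
    TAVG readings, and with iid placeholder data the error falls roughly as
    1/sqrt(n) -- the table that follows comes from the real station values):

    import numpy as np

    rng = np.random.default_rng(42)
    tavg = rng.normal(15, 15, 2000)     # placeholder for the real TAVGs (deg C)
    full_mean = tavg.mean()             # "truth": the average of all stations

    trials = 2000
    print("Samplesize  AvgAbsErr  Stddev(AAE)")
    for n in list(range(1, 22)) + [100]:
        # draw many random samples of size n and see how far each sample
        # average lands from the all-station average
        errs = np.array([abs(rng.choice(tavg, n, replace=False).mean() - full_mean)
                         for _ in range(trials)])
        print(f"{n:10d}  {errs.mean():9.3f}  {errs.std():11.3f}")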

    The results were illuminating:

    Sample size   Avg abs err (C)   Stddev(AAE) (C)
              1         8.16             12.9751
              2        12.4845           11.6322
              3        16.3943           12.3321
              4        15.0626           10.0463
              5        14.1675            8.85287
              6        13.5807            8.45744
              7        13.7353            7.57161
              8        13.3503            7.76353
              9        11.1113            7.04888
             10        12.5444            6.25128
             11        10.924             6.76996
             12        10.908             6.5558
             13        10.6103            5.98467
             14        10.8273            5.85623
             15         9.36202           5.94612
             16         8.82959           5.62549
             17         9.90271           6.28352
             18        10.0032            6.09971
             19         9.3917            5.57505
             20         9.87653           5.40169
             21         8.59656           5.77514
            ...
            100         8.85012           2.7686


    It seems the avg error in the estimate of the global temperature from
    a sample gets stuck at around 8-9 degrees. IOW the average of some sample
    of met stations, without any fancy fiddling with the numbers, is off by
    almost +-9C. IOW the method is truly junk.

    Ignoring the fact that the error seems to reach ~8 and then get stuck, we
    can put a power law through the data and get the approximation
    avgerror = 13.096 * N^-0.100538.

    So for (say) 6000 stations (around the number of active GHCN stations)
    in the sample we get 13.096/6000^0.1 = 5.48686 (i.e. approx +-5.5C
    error). And for 16000 (supposedly the number of stations used in a
    certain web page) 13.096/16000^.1 = 4.97424 (i.e. approx +-5C error).
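
    The fit and the extrapolation can be reproduced along these lines (a
    sketch only; a least-squares fit in log-log space on the table above --
    the exact constants will depend on which rows are included):

    import numpy as np

    n = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15,
                  16, 17, 18, 19, 20, 21, 100])
    err = np.array([8.16, 12.4845, 16.3943, 15.0626, 14.1675, 13.5807,
                    13.7353, 13.3503, 11.1113, 12.5444, 10.924, 10.908,
                    10.6103, 10.8273, 9.36202, 8.82959, 9.90271, 10.0032,
                    9.3917, 9.87653, 8.59656, 8.85012])

    # log(err) = slope*log(n) + intercept  ->  err = a * n^slope
    slope, intercept = np.polyfit(np.log(n), np.log(err), 1)
    a = np.exp(intercept)
    print(f"avgerror ~= {a:.3f} * N^{slope:.4f}")

    for stations in (6000, 16000):
        print(f"N={stations:6d}: predicted error ~ +-{a * stations**slope:.2f} C")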

    Given one web page claimed their calculated average differed by some
    small amount (much less than 1C) from the 30y average, it seems the
    site is just junk science. They can't tell within +-5C, and more likely
    within +-8C, what the average temperature of the earth is.

    On a more careful basis, using the avg SST surveyed from satellites and
    involving millions of more or less equally spaced samples all over the
    earth's oceans, the avg temp for the earth on the same day as the
    experiment above was 15.85C, i.e. 0.66C above the 30y average centered on
    1980. Not surprisingly this corresponds fairly closely with temp anomalies
    against a 1980s baseline as published by Hadley and NOAA for the period.
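
    The averaging step there is just a grid-cell mean. A sketch (illustrative
    only; the satellite product's own processing is not described here, and on
    a ~2x2 km equal-area grid a plain mean of the valid cells would do -- on
    an equal-angle lat-lon grid you weight each cell by cos(latitude)):

    import numpy as np

    def global_mean_sst(sst, lats):
        # sst: 2-D array (lat x lon), NaN over land/ice
        # lats: 1-D cell-center latitudes in degrees
        w = np.cos(np.radians(lats))[:, None] * np.ones_like(sst)
        valid = ~np.isnan(sst)
        return np.nansum(sst * w) / np.sum(w * valid)

    # toy example on a 1-degree grid with a made-up field
    lats = np.arange(-89.5, 90, 1.0)
    lons = np.arange(0.5, 360, 1.0)
    sst = 28.0 * np.cos(np.radians(lats))[:, None] + 0.0 * lons[None, :]
    print(f"area-weighted mean: {global_mean_sst(sst, lats):.2f} C")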

    --
    BOM says 'weather bomb' unlikely to go off over land
    ABC Weather, 24 Aug 2021 20:57Z
    Despite being officially categorised as a "bomb cyclone", the Bureau of Meteorology says the wild weather hitting the New South Wales coast isn't actually an east coast low.

    Tax wealth properly, and Australia's ageing population will enjoy better
    health services, report says
    ABC News, 24 Aug 2021 21:56Z
    There is no reason why Australia's ageing population should put health
    budgets under pressure, given the amount of untaxed wealth in the economy,
    a new paper argues.

    [New Record!]
    NSW records 919 new COVID-19 cases, 2 deaths
    ABC News, 25 Aug 2021 01:05Z
    NSW records 919 new locally transmitted COVID-19 cases, and 2 deaths
    to 8:00pm last night.

    [No-brainer:]
    Report finds federal govt failings likely contributed to Ruby
    Princess COVID-19 disaster
    ABC/7.30, 24 Aug 2021 09:14Z
    A scathing audit has exposed further errors when the virus-plagued ship returned to Sydney last March and passengers were allowed to disembark.
    [The AG found a 2018 report into bio security related to cruise ships
    had been largely ignored by the fed govt. It concluded much of the
    fallout from covid in AUS would have been less if the govt had done a
    better job. The govt rejects the findings].

    BREAKING (25 Aug): Qld Prem Palaszczuk has called a halt to people
    arriving in the state. She said there are more than 5000 people in 22
    quarantine hotels and there is no more room. Aside from special
    exemptions, she's called for a halt for 2 wks. After that time
    people from other states will have to re-apply for a permit to visit
    the Sunshine State.

    Four in 10 immunosuppressed people have a 'low or undetectable' level of immunity after 2 vaccines
    Daily Mail, 24 Aug 2021 20:04Z

    Arkansans calling poison control after taking livestock deworming drug
    4029tv, 24 Aug 2021 18:03Z

    Ivermectin: Horse deworming tablets dangerous for humans and not approved
    COVID-19 treatment
    KXAN.com, 21 Aug 2021 13:21Z

    Maersk Orders 8 Carbon-Neutral Container Ships
    CleanTechnica, 24 Aug 2021 18:59Z

    An airborne disease changed the way we live in the 19th century. Will COVID
    do it again?
    ABC Radio National, 24 Aug 2021 21:00Z
    A deadly airborne disease terrified Australians more than 100 years ago.
    This is how Adelaide town planners helped to bring the outbreak under control.

    Victorians say COVID vaccination portal struggling with high demand
    ABC News, 24 Aug 2021 20:58Z
    The Victorian govt's new vaccination portal is struggling to keep up
    with people eager to book COVID-19 shots, with users reporting delays both online and by phone. Follow live.
    [The website failed. Then the phone line failed].

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Eric Stevens@21:1/5 to All on Wed Aug 25 22:23:17 2021
    XPost: alt.global-warming

    On Wed, 25 Aug 2021 13:05:19 +1000, MrPostingRobot@kymhorsell.com
    wrote:


    EXECUTIVE SUMMARY:
    - Measure twice and cut once says the old saw. According to your
    highschool math class by averaging N repeated measurements together
    you can reduce the error by a factor of 1/sqrt(N).

    Just in case you wondered, that applies using the same device in the
    same place at more or less the same time. It does not apply to a
    myriad of different devices in different places, irrespective of
    whether the measurements are made at the same time or not.

    ...
    --

    Regards,

    Eric Stevens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From R Kym Horsell@21:1/5 to Eric Stevens on Wed Aug 25 11:49:45 2021
    XPost: alt.global-warming

    In alt.global-warming Eric Stevens <eric.stevens@sum.co.nz> wrote:
    On Wed, 25 Aug 2021 13:05:19 +1000, MrPostingRobot@kymhorsell.com
    wrote:


    EXECUTIVE SUMMARY:
    - Measure twice and cut once says the old saw. According to your
    highschool math class by averaging N repeated measurements together
    you can reduce the error by a factor of 1/sqrt(N).

    Just in case you wondered, that applies using the same device in the
    ...

    Sorry. I don't take technical advice from a silly old fraud that
    claims a confidence interval is the same as a correlation.

    --
    [Hillbillies nebba nebba read their own cites because they nebba read anythin:]

    Eric Stevens 2/25/2021 4:27 PM:
    The sky is falling! So is the death rate from climate and weather
    related catstrophes.
    [Unum:]
    No cite as usual.
    [ES:]
    Is this another thing you dont know? See, for example
    https://ourworldindata.org/uploads/2018/04/Global-annual-absolute-deaths-from-natural-disasters-01.png

    You didn't even bother to look at it? Deaths from extreme temperature
    and drought are clearly shown as having increased since the 2000s.
    -- Unum, 26 Feb 2021 10:17

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Eric Stevens@21:1/5 to kym@kymhorsell.com on Sat Aug 28 15:39:32 2021
    XPost: alt.global-warming

    On Wed, 25 Aug 2021 11:49:45 -0000 (UTC), R Kym Horsell
    <kym@kymhorsell.com> wrote:

    In alt.global-warming Eric Stevens <eric.stevens@sum.co.nz> wrote:
    On Wed, 25 Aug 2021 13:05:19 +1000, MrPostingRobot@kymhorsell.com
    wrote:


    EXECUTIVE SUMMARY:
    - Measure twice and cut once says the old saw. According to your
    highschool math class by averaging N repeated measurements together
    you can reduce the error by a factor of 1/sqrt(N).

    Just in case you wondered, that applies using the same device in the
    ...

    Sorry. I dont take technical advice from a silly old fraud that
    claims a confidence interval is the same as a correlation.

    I hope you don't think that is me.

    In any case, you should try any standard work on statistics.
    --

    Regards,

    Eric Stevens

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)