• supercomputer progress

    From jlarkin@highlandsniptechnology.com@21:1/5 to All on Tue Apr 26 08:44:41 2022
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.

    The conclusion from a senior scientist is that "it rains a lot more
    during the worst storms."



    --

    Anybody can count to one.

    - Robert Widlar

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Cydrome Leader@21:1/5 to jlarkin@highlandsniptechnology.com on Tue Apr 26 16:56:33 2022
    jlarkin@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.

    The conclusion from a senior scientist is that "it rains a lot more
    during the worst storms."

    I'm surprised they even noticed that detail. Too bad they never talked to anybody over at the NOAA about how things work.

  • From John Robertson@21:1/5 to jlarkin@highlandsniptechnology.com on Tue Apr 26 12:04:44 2022
    On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.

    https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

    ---------<quote>-----------------
    Lawrence Berkeley National Laboratory scientists are looking to make
    highly detailed, 1 kilometer scale cloud models to improve climate
    predictions. Using current supercomputer designs of combining
    microprocessors used in personal computers, a system capable of making
    such models would cost about $1 billion and use up 200 megawatts of
    energy. A supercomputer using 20 million embedded processors, on the
    other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

    4 megawatts/200 megawatts - do the computers factor in their heat
    generation in the climate models?

    John ;-#)#

  • From John Larkin@21:1/5 to All on Tue Apr 26 13:53:08 2022
    On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com>
    wrote:


    On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.

    https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

    ---------<quote>-----------------
    Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate predictions. Using current supercomputer designs of combining
    microprocessors used in personal computers, a system capable of making
    such models would cost about $1 billion and use up 200 megawatts of
    energy. A supercomputer using 20 million embedded processors, on the
    other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

    4 megawatts/200 megawatts - do the computers factor in their heat
    generation in the climate models?

    John ;-#)#

    Does LBL measure energy in megawatts?

    Do bigger computers predict climate better?

    Oh dear.

    --

If a man will begin with certainties, he shall end with doubts,
but if he will be content to begin with doubts he shall end in certainties.
- Francis Bacon

  • From Anthony William Sloman@21:1/5 to John Larkin on Tue Apr 26 20:43:27 2022
    On Wednesday, April 27, 2022 at 6:53:20 AM UTC+10, John Larkin wrote:
    On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <sp...@flippers.com> wrote:

    On 2022/04/26 8:44 a.m., jla...@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.

    https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

    ---------<quote>-----------------
    Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate predictions. Using current supercomputer designs of combining microprocessors used in personal computers, a system capable of making
    such models would cost about $1 billion and use up 200 megawatts of
    energy. A supercomputer using 20 million embedded processors, on the
    other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their heat generation in the climate models?

    Probably don't have to bother. It's lost in the rounding errors.

    Does LBL measure energy in megawatts?

    No, but the media department won't be staffed with people with degrees in physics (or any hard science).

    Do bigger computers predict climate better?

That remains to be seen, but modelling individual cloud masses at the 1km scale should work better than plugging in average cloud cover for regions broken up into 100km by 100km squares.
IEEE Spectrum published an article on "Cloud computing" a few years ago that addressed this issue.

    Oh dear.

John Larkin doesn't know much, and what he thinks he knows mostly comes from Anthony Watts' climate change denial web site.

    --
    Bill Sloman, Sydney

  • From Jan Panteltje@21:1/5 to jlarkin@highland_atwork_technology. on Wed Apr 27 10:19:06 2022
    On a sunny day (Tue, 26 Apr 2022 13:53:08 -0700) it happened John Larkin <jlarkin@highland_atwork_technology.com> wrote in <fpmg6hhot88ajjqkcb6nv9mkbjm7s9q85k@4ax.com>:

    On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com>
    wrote:


    On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.
    https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

    ---------<quote>-----------------
    Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate predictions. Using current supercomputer designs of combining microprocessors used in personal computers, a system capable of making
    such models would cost about $1 billion and use up 200 megawatts of
    energy. A supercomputer using 20 million embedded processors, on the
    other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their heat generation in the climate models?

    John ;-#)#

    Does LBL measure energy in megawatts?

    Do bigger computers predict climate better?

    Oh dear.

    I have read CERN uses more power than all windmills together deliver in Switzerland.

  • From Jan Panteltje@21:1/5 to Leader on Wed Apr 27 10:17:12 2022
    On a sunny day (Tue, 26 Apr 2022 16:56:33 -0000 (UTC)) it happened Cydrome Leader <presence@MUNGEpanix.com> wrote in <t49881$clq$2@reader1.panix.com>:

    jlarkin@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.

    The conclusion from a senior scientist is that "it rains a lot more
    during the worst storms."

I'm surprised they even noticed that detail. Too bad they never talked to anybody over at the NOAA about how things work.

There is a lot of pressure to publish.
Somebody I knew did a PhD in psychology or something.
He got his doctorate with a paper about the sex life of some group living in the wild.
    I asked him if he went there and experienced it...

    No :)

    if you read sciencedaily.com every day there are papers and things
    discovered that are either too obvious to read
    or too vague to be useful.
Do plants have feelings?
    Do monkeys feel emotions?
    sort of things
    Of course they do.
    Today:
    Prehistoric People Created Art by Firelight
    of course they did, no flashlights back then in a dark cave.

  • From Anthony William Sloman@21:1/5 to Jan Panteltje on Wed Apr 27 05:41:53 2022
    On Wednesday, April 27, 2022 at 8:17:20 PM UTC+10, Jan Panteltje wrote:
On a sunny day (Tue, 26 Apr 2022 16:56:33 -0000 (UTC)) it happened Cydrome Leader <pres...@MUNGEpanix.com> wrote in <t49881$clq$2...@reader1.panix.com>:
jla...@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.

    The conclusion from a senior scientist is that "it rains a lot more
    during the worst storms."

I'm surprised they even noticed that detail. Too bad they never talked to anybody over at the NOAA about how things work.
There is a lot of pressure to publish.
Somebody I knew did a PhD in psychology or something.
He got his doctorate with a paper about the sex life of some group living in the wild.
    I asked him if he went there and experienced it...

    No :)

    if you read sciencedaily.com every day there are papers and things discovered that are either too obvious to read or too vague to be useful?

    In Jan's ever-so-expert opinion.

Anything published in the peer-reviewed literature gets reviewed by people who do know something about the subject - the author's peers - who have to accept it as a useful and meaningful contribution.

Max Planck didn't bother sending out any of Einstein's 1905 papers for review. He had enough confidence in his own judgement not to bother, and he was right.

    Do plants have feeling?

    Depends what you mean by feelings.

    Do monkeys feel emotions?

    Obviously they do.

    sort of things
    Of course they do.
    Today:
    Prehistoric People Created Art by Firelight
    of course they did, no flashlights back then in a dark cave.

    That's all popular science. Peer reviewed science is rather more technical.

    --
    Bill Sloman, Sydney

  • From Jeroen Belleman@21:1/5 to Jan Panteltje on Wed Apr 27 14:30:11 2022
    On 2022-04-27 12:19, Jan Panteltje wrote:
    On a sunny day (Tue, 26 Apr 2022 13:53:08 -0700) it happened John Larkin <jlarkin@highland_atwork_technology.com> wrote in <fpmg6hhot88ajjqkcb6nv9mkbjm7s9q85k@4ax.com>:

    On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com>
    wrote:


    On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.

    https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

    ---------<quote>-----------------
    Lawrence Berkeley National Laboratory scientists are looking to make
    highly detailed, 1 kilometer scale cloud models to improve climate
    predictions. Using current supercomputer designs of combining
    microprocessors used in personal computers, a system capable of making
    such models would cost about $1 billion and use up 200 megawatts of
    energy. A supercomputer using 20 million embedded processors, on the
    other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

    4 megawatts/200 megawatts - do the computers factor in their heat
    generation in the climate models?

    John ;-#)#

    Does LBL measure energy in megawatts?

    Do bigger computers predict climate better?

    Oh dear.

    I have read CERN uses more power than all windmills together deliver in Switzerland.


    Yes, that sounds correct. CERN uses about 200MW when everything is
    running. Switzerland has a little over 70MW of windmills installed.
    Of course, those never actually deliver 70MW. More like 25% of that,
    on average.

    Most of CERN's electricity comes from the Genissiat dam in nearby
    France.

    Jeroen Belleman

  • From DecadentLinuxUserNumeroUno@decadenc@21:1/5 to Anthony William Sloman on Thu Apr 28 01:33:20 2022
    Anthony William Sloman <bill.sloman@ieee.org> wrote in news:cc47cea9-5c27-411a-9793-f83274cfb007n@googlegroups.com:

    On Wednesday, April 27, 2022 at 6:53:20 AM UTC+10, John Larkin
    wrote:
    On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson
    <sp...@flippers.com> wrote:

    On 2022/04/26 8:44 a.m., jla...@highlandsniptechnology.com
    wrote:
    Lawrence Berkeley Lab announced the results from a new
    supercomputer analysis of climate change. They analyzed five
    west coast "extreme storms" from 1982 to 2014.

    https://www.greenbiz.com/article/berkeley-lab-tensilica-collabora
    te-energy-efficient-climate-modeling-supercomputer

    ---------<quote>-----------------
    Lawrence Berkeley National Laboratory scientists are looking to
    make highly detailed, 1 kilometer scale cloud models to improve
    climate predictions. Using current supercomputer designs of
    combining microprocessors used in personal computers, a system
    capable of making such models would cost about $1 billion and
    use up 200 megawatts of energy. A supercomputer using 20 million
    embedded processors, on the other hand, would cost about $75
    million and use less than 4 megawatts of energy, according to
    Lawrence Berkeley National Laboratory researchers.
    -------------<end quote>--------------

    4 megawatts/200 megawatts - do the computers factor in their
    heat generation in the climate models?

    Probably don't have to bother. It's lost in the rounding errors.

    Does LBL measure energy in megawatts?

    No, but the media department won't be staffed with people with
    degrees in physics (or any hard science).

    Do bigger computers predict climate better?

    That remains to be seen, but modelling individual cloud masses at
    the 1km scale should work better than plugging in average cloud
    cover for regions broken up into 100km by 100km squares The IEEE
    Spectum published an article on "Cloud computing" a few years ago
    that addressed this issue.

    Oh dear.

John Larkin doesn't know much, and what he thinks he knows mostly
comes from Anthony Watts' climate change denial web site.


    1nm scale not kilometer.

    I want to marry this woman...

    <https://www.youtube.com/watch?v=0sUQkIyoF8M>

  • From boB@21:1/5 to jlarkin@highland_atwork_technology. on Thu Apr 28 09:26:40 2022
    On Tue, 26 Apr 2022 13:53:08 -0700, John Larkin <jlarkin@highland_atwork_technology.com> wrote:

    On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com>
    wrote:


    On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.
    https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

    ---------<quote>-----------------
    Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate predictions. Using current supercomputer designs of combining microprocessors used in personal computers, a system capable of making
    such models would cost about $1 billion and use up 200 megawatts of
    energy. A supercomputer using 20 million embedded processors, on the
    other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their heat generation in the climate models?

    John ;-#)#

    Does LBL measure energy in megawatts?

    Do bigger computers predict climate better?

    Oh dear.


I think the jury has already returned: there is climate
change/global warming, and it is probably already too late to do much
about it, given the short time countries and people have to react.

Especially with all the global warming denialists who don't care
about it, or about the state of the art and science of generating
non-greenhouse-gas energy.

    I suppose that I won't be around to see how bad it will get which
    could be a good thing.

    I would love to have a super computer to run LTspice.

    boB

  • From jlarkin@highlandsniptechnology.com@21:1/5 to boB on Thu Apr 28 09:37:20 2022
    On Thu, 28 Apr 2022 09:26:40 -0700, boB <boB@K7IQ.com> wrote:

On Tue, 26 Apr 2022 13:53:08 -0700, John Larkin <jlarkin@highland_atwork_technology.com> wrote:

On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com> wrote:


    On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.
    https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

    ---------<quote>-----------------
    Lawrence Berkeley National Laboratory scientists are looking to make >>>highly detailed, 1 kilometer scale cloud models to improve climate >>>predictions. Using current supercomputer designs of combining >>>microprocessors used in personal computers, a system capable of making >>>such models would cost about $1 billion and use up 200 megawatts of >>>energy. A supercomputer using 20 million embedded processors, on the >>>other hand, would cost about $75 million and use less than 4 megawatts
    of energy, according to Lawrence Berkeley National Laboratory researchers. >>>-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their heat generation in the climate models?

    John ;-#)#

    Does LBL measure energy in megawatts?

    Do bigger computers predict climate better?

    Oh dear.


    I think the jury has already returned that there is climate
    change/global warming and it is probably already too late to do much
    about it with the short time needed for countries and people to react.

    At last! We'll all be dead in 8 years. I'd rather be drowned or blown
    away than bored to death.



    --

    Anybody can count to one.

    - Robert Widlar

  • From Dennis@21:1/5 to boB on Thu Apr 28 12:01:59 2022
    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in general)
    performance is that the algorithms don't parallelize very well.

  • From Jeroen Belleman@21:1/5 to boB on Thu Apr 28 19:47:03 2022
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB



    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

    Jeroen Belleman

  • From John Larkin@21:1/5 to Dennis on Thu Apr 28 12:20:46 2022
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in general)
    performance is that the algorithms don't parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice to run
    on an Nvidia card. 100x at least.

    --

If a man will begin with certainties, he shall end with doubts,
but if he will be content to begin with doubts he shall end in certainties.
- Francis Bacon

  • From Ricky@21:1/5 to Jeroen Belleman on Thu Apr 28 21:57:27 2022
    On Thursday, April 28, 2022 at 1:47:11 PM UTC-4, Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB

    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

Even supercomputers from the 80s were not as fast as many of today's computers, and their memory was often 16,000 times smaller than that of a typical laptop today.

    --

    Rick C.

    - Get 1,000 miles of free Supercharging
    - Tesla referral code - https://ts.la/richard11209

  • From Phil Hobbs@21:1/5 to John Larkin on Fri Apr 29 02:09:19 2022
    John Larkin wrote:
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in general)
    performance is that the algorithms don't parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice to run
    on an Nvidia card. 100x at least.


    The "number of threads" setting doesn't do anything very dramatic,
    though, at least last time I tried. Splitting up the calculation
    between cores would require all of them to communicate a couple of times
    per time step, but lots of other simulation codes do that.

    The main trouble is that the matrix defining the connectivity between
    nodes is highly irregular in general.

Parallelizing that efficiently might well need a special-purpose
    compiler, sort of similar to the profile-guided optimizer in the guts of
    the FFTW code for computing DFTs. Probably not at all impossible, but
    not that straightforward to implement.
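The irregular connectivity Phil describes is easy to see in a toy nodal-analysis "stamp". This is a minimal Python sketch, not LTspice's actual internals; the stamp_resistor helper and the ladder circuit are invented for illustration:

```python
# Toy nodal-analysis stamp, showing why the circuit matrix is sparse
# and irregular: each node couples only to the nodes it shares a
# component with, so the nonzero pattern follows the schematic,
# not any regular grid.

def stamp_resistor(G, a, b, g):
    """Stamp a conductance g between nodes a and b into the sparse
    matrix G (a dict keyed by (row, col)). Node 0 is ground and gets
    no row or column."""
    for i, j, sign in ((a, a, +1), (b, b, +1), (a, b, -1), (b, a, -1)):
        if i != 0 and j != 0:
            G[(i, j)] = G.get((i, j), 0.0) + sign * g

def sparsity(G, n):
    """Fraction of the n x n matrix that is nonzero."""
    return len(G) / (n * n)

# A 100-node resistor ladder: 1 kOhm between node k and node k+1.
G, n = {}, 100
for k in range(n):
    stamp_resistor(G, k, k + 1, 1e-3)

# Each row has at most 3 nonzeros, so the matrix is about 97% zeros;
# a dense or regularly-blocked solver would waste nearly all its work.
print(f"nonzeros: {len(G)}, density: {sparsity(G, n):.2%}")
```

Partitioning such a pattern across cores so that each partition does comparable work, with minimal cross-partition coupling, is exactly the part that would want a profile-guided or planner-style step.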

    Cheers

    Phil Hobbs

    --
    Dr Philip C D Hobbs
    Principal Consultant
    ElectroOptical Innovations LLC / Hobbs ElectroOptics
    Optics, Electro-optics, Photonics, Analog Electronics
    Briarcliff Manor NY 10510

    http://electrooptical.net
    http://hobbs-eo.com

  • From Mike Monett@21:1/5 to Phil Hobbs on Fri Apr 29 08:24:23 2022
    Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:

    John Larkin wrote:
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in general)
    performance is that the algorithms don't parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice to run
    on an Nvidia card. 100x at least.


    The "number of threads" setting doesn't do anything very dramatic,
    though, at least last time I tried. Splitting up the calculation
    between cores would require all of them to communicate a couple of times
    per time step, but lots of other simulation codes do that.

    The main trouble is that the matrix defining the connectivity between
    nodes is highly irregular in general.

Parallelizing that efficiently might well need a special-purpose
    compiler, sort of similar to the profile-guided optimizer in the guts of
    the FFTW code for computing DFTs. Probably not at all impossible, but
    not that straightforward to implement.

    Cheers

    Phil Hobbs

    Supercomputers have thousands or hundreds of thousands of cores.

    Quote:

    "Cerebras Systems has unveiled its new Wafer Scale Engine 2 processor with
    a record-setting 2.6 trillion transistors and 850,000 AI-optimized cores.
It's built for supercomputing tasks, and it's the second time since 2019
    that Los Altos, California-based Cerebras has unveiled a chip that is
    basically an entire wafer."

https://venturebeat.com/2021/04/20/cerebras-systems-launches-new-ai-supercomputing-processor-with-2-6-trillion-transistors/


    Man, I wish I were back living in Los Altos again.




    --
    MRM

  • From Martin Brown@21:1/5 to Jeroen Belleman on Fri Apr 29 09:38:57 2022
    On 28/04/2022 18:47, Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB

    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

Indeed. The Cray X-MP in its 4-CPU configuration had a 105MHz clock, a
whopping-for-the-time 128MB of fast core memory, and 40GB of disk. The
one I used had an amazing-for-the-time 1TB tape-cassette backing store.
It did 600 MFLOPS with the right sort of parallel vector code.

    That was back in the day when you needed special permission to use more
    than 4MB of core on the timesharing IBM 3081 (approx 7 MIPS).

Current Intel 12th-gen desktop CPUs run at ~4GHz with 16GB of RAM and >1TB of disk
(and the upper limits are even higher). That combo does ~66,000 MFLOPS.
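Putting the two peak figures quoted in this post side by side:

```python
# Quick arithmetic on the figures quoted above: peak MFLOPS of the
# 4-CPU Cray X-MP vs. a current desktop.
cray_xmp_mflops = 600.0    # "600 MFLOPs with the right sort of parallel vector code"
desktop_mflops = 66_000.0  # "~66,000 MFLOPS"
speedup = desktop_mflops / cray_xmp_mflops
print(f"desktop / X-MP: {speedup:.0f}x")  # 110x
```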

Spice simulation doesn't scale particularly well to large-scale
multiprocessor environments, due to its many long-range interactions.

    --
    Regards,
    Martin Brown

  • From Martin Brown@21:1/5 to Phil Hobbs on Fri Apr 29 09:58:05 2022
    On 29/04/2022 07:09, Phil Hobbs wrote:
    John Larkin wrote:
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in general)
    performance is that the algorithms don't parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice to run
    on an Nvidia card. 100x at least.


    The "number of threads" setting doesn't do anything very dramatic,
    though, at least last time I tried.  Splitting up the calculation
    between cores would require all of them to communicate a couple of times
    per time step, but lots of other simulation codes do that.

    If it is anything like chess problems then the memory bandwidth will
    saturate long before all cores+threads are used to optimum effect. After
    that point the additional threads merely cause it to run hotter.

    I found setting max threads to about 70% of those notionally available
    produced the most computing power with the least heat. After that the performance gain per thread was negligible but the extra heat was not.

    Having everything running full bore was actually slower and much hotter!
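Martin's observation, that throughput saturates while heat keeps climbing, fits a toy model. All the numbers here (per-thread rate, bandwidth ceiling, power costs) are invented for illustration, not measurements of any real machine or of LTspice:

```python
# Toy model of a memory-bound workload: aggregate throughput scales
# linearly with threads until it hits a memory-bandwidth ceiling,
# while power keeps rising with every active thread.

def throughput(threads, per_thread=1.0, bandwidth_cap=10.0):
    """Work per second: linear in thread count until the cap, flat after."""
    return min(threads * per_thread, bandwidth_cap)

def power(threads, idle=20.0, per_thread=5.0):
    """Watts: a fixed idle floor plus a cost per active thread."""
    return idle + per_thread * threads

# Past the knee (10 threads in this model), extra threads add heat
# but no speed, so perf-per-watt peaks at the knee and falls after it.
best = max(range(1, 17), key=lambda t: throughput(t) / power(t))
print(best, throughput(best), power(best))
```

Where the knee sits relative to the nominal core/thread count depends on the machine and the workload; the point is only that running every thread flat out can be strictly worse than stopping at the knee.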

    The main trouble is that the matrix defining the connectivity between
    nodes is highly irregular in general.

Parallelizing that efficiently might well need a special-purpose
    compiler, sort of similar to the profile-guided optimizer in the guts of
    the FFTW code for computing DFTs.  Probably not at all impossible, but
    not that straightforward to implement.

I'm less than impressed with profile-guided optimisers in compilers. The
only time I tried one in anger, the instrumentation code interfered with
the execution of the algorithms to such an extent that the results were meaningless.

One gotcha I have identified in the latest MSC is that when it uses the
wider SSE2, AVX, and AVX-512 types implicitly in its code generation, it
does not align them on the stack properly, so that sometimes they are
split across two cache lines. I see two distinct speeds for each
benchmark code segment, depending on how the cache alignment falls.

Basically the compiler forces stack alignment to 8 bytes and cache lines
are 64 bytes, but the compiler-generated objects in play are 16, 32
or 64 bytes. Alignment failure fractions: 1:4, 2:4 and 3:4.

    If you manually allocate such objects you can use pragmas to force
    optimal alignment but when the code generator chooses to use them
    internally you have no such control. Even so the MS compiler does
    generate blisteringly fast code compared to either Intel or GCC.

    --
    Regards,
    Martin Brown

  • From Ricky@21:1/5 to Martin Brown on Fri Apr 29 06:46:03 2022
    On Friday, April 29, 2022 at 4:39:05 AM UTC-4, Martin Brown wrote:
    On 28/04/2022 18:47, Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB

    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.
    Indeed. The Cray X-MP in its 4 CPU configuration with a 105MHz clock and
    a whopping for the time 128MB of fast core memory with 40GB of disk. The
    one I used had an amazing for the time 1TB tape cassette backing store.
    It did 600 MFLOPs with the right sort of parallel vector code.

    That was back in the day when you needed special permission to use more
    than 4MB of core on the timesharing IBM 3081 (approx 7 MIPS).

    Current Intel 12 gen CPU desktops are ~4GHz, 16GB ram and >1TB of disk.
    (and the upper limits are even higher) That combo does ~66,000 MFLOPS.

    Spice simulation doesn't scale particularly well to large scale multiprocessor environments to many long range interractions.

The Crays were nice if you had a few million dollars to spend. I worked for a startup building more affordable supercomputers in the same ballpark of performance at a fraction of the price. Star Technologies' ST-100 supported 100 MFLOPS and 32 MB of
memory; costing around $200,000 with 256 KB of RAM, it was a fraction of the cost of the only slightly faster Cray X-MP available at the same time.

    --

    Rick C.

    + Get 1,000 miles of free Supercharging
    + Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Phil Hobbs@21:1/5 to Mike Monett on Fri Apr 29 10:03:23 2022
    Mike Monett wrote:
    Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:

    John Larkin wrote:
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in general)
    performance is that the algorithms don't parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice to run
    on an Nvidia card. 100x at least.


    The "number of threads" setting doesn't do anything very dramatic,
    though, at least last time I tried. Splitting up the calculation
    between cores would require all of them to communicate a couple of times
    per time step, but lots of other simulation codes do that.

    The main trouble is that the matrix defining the connectivity between
    nodes is highly irregular in general.

    Parallellizing that efficiently might well need a special-purpose
    compiler, sort of similar to the profile-guided optimizer in the guts of
    the FFTW code for computing DFTs. Probably not at all impossible, but
    not that straightforward to implement.

    Cheers

    Phil Hobbs

    Supercomputers have thousands or hundreds of thousands of cores.

    Quote:

    "Cerebras Systems has unveiled its new Wafer Scale Engine 2 processor with
    a record-setting 2.6 trillion transistors and 850,000 AI-optimized cores. It’s built for supercomputing tasks, and it’s the second time since 2019 that Los Altos, California-based Cerebras has unveiled a chip that is basically an entire wafer."

    https://venturebeat.com/2021/04/20/cerebras-systems-launches-new-ai-supercomputing-processor-with-2-6-trillion-transistors/

    Number of cores isn't the problem. For fairly tightly-coupled tasks
    such as simulations, the issue is interconnect latency between cores,
    and the required bandwidth goes roughly as the cube of Moore's law, so
    it ran out of gas long ago.

    One thing that zillions of cores could do for SPICE is to do all the
    stepped parameter runs simultaneously. At that point all you need is
    infinite bandwidth to disk.
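    The stepped-run idea above is easy to sketch. Below is a toy Python
    illustration; run_sim is a hypothetical stand-in for one simulator
    invocation (a real sweep would launch the SPICE binary once per .step
    value, probably in separate processes), with a thread pool used only to
    keep the sketch self-contained:

```python
from concurrent.futures import ThreadPoolExecutor

def run_sim(r_load):
    # Hypothetical stand-in for one SPICE run at a given .step value:
    # a resistive divider evaluated directly so the sketch is runnable.
    v_in, r_src = 5.0, 100.0
    return r_load / (r_src + r_load) * v_in

# Each stepped-parameter run is independent of the others, so all of
# them can be dispatched at once and collected in order.
params = [100.0, 200.0, 400.0]
with ThreadPoolExecutor() as pool:
    results = list(pool.map(run_sim, params))
```

    The only shared resource is the disk that collects the waveforms, which
    is exactly the "infinite bandwidth to disk" caveat.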

    Man, I wish I were back living in Los Altos again.

    I couldn't get out of there fast enough, and have never looked back.

    Cheers

    Phil Hobbs

    --
    Dr Philip C D Hobbs
    Principal Consultant
    ElectroOptical Innovations LLC / Hobbs ElectroOptics
    Optics, Electro-optics, Photonics, Analog Electronics
    Briarcliff Manor NY 10510

    http://electrooptical.net
    http://hobbs-eo.com

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Phil Hobbs@21:1/5 to Jeroen Belleman on Fri Apr 29 10:12:23 2022
    Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB



    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

    Jeroen Belleman

    In the 1990s meaning of the words, in fact. My 2011-vintage desktop box
    runs 250 Gflops peak (2x 12-core Magny Cours, 64G main memory, RAID5 disks).

    My phone is a supercomputer by 1970s standards. ;)

    Cheers

    Phil Hobbs

    --
    Dr Philip C D Hobbs
    Principal Consultant
    ElectroOptical Innovations LLC / Hobbs ElectroOptics
    Optics, Electro-optics, Photonics, Analog Electronics
    Briarcliff Manor NY 10510

    http://electrooptical.net
    http://hobbs-eo.com

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Phil Hobbs@21:1/5 to Martin Brown on Fri Apr 29 10:10:16 2022
    Martin Brown wrote:
    On 29/04/2022 07:09, Phil Hobbs wrote:
    John Larkin wrote:
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in general)
    performance is that the algorithms don't parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice to run
    on an Nvidia card. 100x at least.


    The "number of threads" setting doesn't do anything very dramatic,
    though, at least last time I tried.  Splitting up the calculation
    between cores would require all of them to communicate a couple of
    times per time step, but lots of other simulation codes do that.

    If it is anything like chess problems then the memory bandwidth will
    saturate long before all cores+threads are used to optimum effect. After
    that point the additional threads merely cause it to run hotter.

    I found setting max threads to about 70% of those notionally available produced the most computing power with the least heat. After that the performance gain per thread was negligible but the extra heat was not.

    Having everything running full bore was actually slower and much hotter!

    The main trouble is that the matrix defining the connectivity between
    nodes is highly irregular in general.

    Parallellizing that efficiently might well need a special-purpose
    compiler, sort of similar to the profile-guided optimizer in the guts
    of the FFTW code for computing DFTs.  Probably not at all impossible,
    but not that straightforward to implement.

    I'm less than impressed with profile guided optimisers in compilers. The
    only time I tried it in anger the instrumentation code interfered with
    the execution of the algorithms to such an extent as to be meaningless.

    It wouldn't need to be as general as that--one could simply sort the
    nodes by connectivity and by weighted graph distance so as to minimize
    the number of connections across the chunks of netlist, then adjust the
    data structures for communication appropriately.

    It also wouldn't parallellize as well as FDTD, say, because there's less computation going on per time step, so the communication overhead is proportionately much greater.
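    The node-sorting idea can be sketched crudely in Python. This is only an
    illustration (production codes use multilevel METIS-style partitioners;
    the function names here are made up): grow each chunk breadth-first from
    a highly connected seed so tightly coupled nodes share a chunk, then
    count the values that would have to be exchanged every time step:

```python
from collections import deque

def partition_netlist(adj, n_parts):
    # Crude greedy partition of a netlist graph {node: [neighbours]}:
    # grow each chunk breadth-first from the most-connected unassigned
    # node so tightly coupled nodes tend to land in the same chunk.
    order = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
    target = -(-len(order) // n_parts)  # ceil(nodes / chunks)
    part, pid = {}, 0
    for seed in order:
        if seed in part:
            continue
        queue, filled = deque([seed]), 0
        while queue and filled < target:
            v = queue.popleft()
            if v in part:
                continue
            part[v] = pid
            filled += 1
            queue.extend(u for u in adj[v] if u not in part)
        pid += 1
    return part

def cut_edges(adj, part):
    # Edges whose endpoints fall in different chunks: each one is a
    # value that must be communicated between cores per time step.
    return sum(1 for v in adj for u in adj[v]
               if v < u and part[v] != part[u])
```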


    One gotcha I have identified in the latest MSC is that when it uses
    higher order SSE2, AVX, and AVX-512 implicitly in its code generation it
    does not align them on the stack properly so that sometimes they are
    split across two cache lines. I see two distinct speeds for each
    benchmark code segment depending on how the cache alignment falls.

    Basically the compiler forces stack alignment to 8 bytes and cache lines
    are 64 bytes, but the compiler-generated objects in play are 16, 32 or 64 bytes, giving alignment failure fractions of 1:4, 2:4 and 3:4 respectively.

    If you manually allocate such objects you can use pragmas to force
    optimal alignment but when the code generator chooses to use them
    internally you have no such control. Even so the MS compiler does
    generate blisteringly fast code compared to either Intel or GCC.
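    For manually allocated data the same end can be reached portably; here
    is a minimal Python/NumPy illustration of the over-allocate-and-slice
    trick (a user-level workaround, not the compiler-internal stack case
    described above):

```python
import numpy as np

def aligned_array(n, dtype=np.float64, align=64):
    # Return an n-element array whose data pointer sits on an
    # align-byte (here, cache-line) boundary: over-allocate a raw
    # byte buffer, then slice at the next aligned address.
    itemsize = np.dtype(dtype).itemsize
    buf = np.empty(n * itemsize + align, dtype=np.uint8)
    offset = (-buf.ctypes.data) % align  # bytes to next boundary
    return buf[offset:offset + n * itemsize].view(dtype)
```

    The same idea underlies posix_memalign and _aligned_malloc; the
    alignment pragmas mentioned above do the equivalent at compile time.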


    The FFTW profiler works pretty well IME, but I agree, doing it with the
    whole program isn't trivial.

    Cheers

    Phil Hobbs

    --
    Dr Philip C D Hobbs
    Principal Consultant
    ElectroOptical Innovations LLC / Hobbs ElectroOptics
    Optics, Electro-optics, Photonics, Analog Electronics
    Briarcliff Manor NY 10510

    http://electrooptical.net
    http://hobbs-eo.com

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to Phil Hobbs on Fri Apr 29 07:18:35 2022
    On Friday, April 29, 2022 at 10:12:30 AM UTC-4, Phil Hobbs wrote:
    Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB



    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

    Jeroen Belleman
    In the 1990s meaning of the words, in fact. My 2011-vintage desktop box
    runs 250 Gflops peak (2x 12-core Magny Cours, 64G main memory, RAID5 disks).

    My phone is a supercomputer by 1970s standards. ;)

    And no more possible to build at that time than in ancient Rome. It's amazing how rapidly technology changes when spurred by the profit motive.

    --

    Rick C.

    -- Get 1,000 miles of free Supercharging
    -- Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From jlarkin@highlandsniptechnology.com@21:1/5 to jeroen@nospam.please on Fri Apr 29 07:31:57 2022
    On Thu, 28 Apr 2022 19:47:03 +0200, Jeroen Belleman
    <jeroen@nospam.please> wrote:

    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB



    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

    Jeroen Belleman

    My phone probably has more compute power than all the computers in the
    world about 1960.



    --

    Anybody can count to one.

    - Robert Widlar

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From jlarkin@highlandsniptechnology.com@21:1/5 to pcdhSpamMeSenseless@electrooptical. on Fri Apr 29 07:30:45 2022
    On Fri, 29 Apr 2022 02:09:19 -0400, Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:

    John Larkin wrote:
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in general)
    performance is that the algorithms don't parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice to run
    on an Nvidia card. 100x at least.


    The "number of threads" setting doesn't do anything very dramatic,
    though, at least last time I tried. Splitting up the calculation
    between cores would require all of them to communicate a couple of times
    per time step, but lots of other simulation codes do that.

    The main trouble is that the matrix defining the connectivity between
    nodes is highly irregular in general.

    Parallellizing that efficiently might well need a special-purpose
    compiler, sort of similar to the profile-guided optimizer in the guts of
    the FFTW code for computing DFTs. Probably not at all impossible, but
    not that straightforward to implement.

    Cheers

    Phil Hobbs

    Climate simulation uses enormous multi-CPU supercomputer rigs.

    OK, I suppose that makes your point.



    --

    Anybody can count to one.

    - Robert Widlar

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to jla...@highlandsniptechnology.com on Fri Apr 29 07:50:04 2022
    On Friday, April 29, 2022 at 10:32:07 AM UTC-4, jla...@highlandsniptechnology.com wrote:
    On Thu, 28 Apr 2022 19:47:03 +0200, Jeroen Belleman
    <jer...@nospam.please> wrote:

    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB



    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

    Jeroen Belleman
    My phone probably has more compute power than all the computers in the
    world about 1960.

    And lets you watch cat videos anywhere you go.

    --

    Rick C.

    -+ Get 1,000 miles of free Supercharging
    -+ Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin Brown@21:1/5 to Ricky on Fri Apr 29 16:38:55 2022
    On 29/04/2022 14:46, Ricky wrote:
    On Friday, April 29, 2022 at 4:39:05 AM UTC-4, Martin Brown wrote:
    On 28/04/2022 18:47, Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote: [...]
    I would love to have a super computer to run LTspice.

    boB

    In fact, what you have on your desk *is* a super computer, in the
    1970's meaning of the words. It's just that it's bogged down
    running bloatware.
    Indeed. The Cray X-MP in its 4-CPU configuration had a 105MHz
    clock, a whopping (for the time) 128MB of fast core memory, and
    40GB of disk. The one I used had an amazing (for the time) 1TB
    tape cassette backing store. It did 600 MFLOPS with the right
    sort of parallel vector code.

    That was back in the day when you needed special permission to use
    more than 4MB of core on the timesharing IBM 3081 (approx 7 MIPS).

    Current Intel 12 gen CPU desktops are ~4GHz, 16GB ram and >1TB of
    disk. (and the upper limits are even higher) That combo does
    ~66,000 MFLOPS.

    Spice simulation doesn't scale particularly well to large-scale
    multiprocessor environments due to many long-range interactions.

    The Crays were nice if you had a few million dollars to spend. I
    worked for a startup building more affordable supercomputers in the
    same ball park of performance at a fraction of the price. The Star
    Technologies ST-100 supported 100 MFLOPS and 32 MB of memory;
    costing around $200,000 with 256 KB of RAM, it was a fraction of
    the cost of the only slightly faster Cray X-MP, available at the
    same time.

    At the time I was doing that stuff the FPS-120 array processor attached
    to a PDP-11 or Vax was the poor man's supercomputer. Provided you had
    the right sort of problem it was very good indeed for price performance.
    (it was still fairly pricey)

    I got to port our code to everything from a humble Z80 (where it could
    only solve trivial toy problems) upwards to the high end Cray. The more expensive the computer the more tolerant of IBM extensions they tended
    to be. The Z80 FORTRAN IV I remember as being a stickler for the rules.


    --
    Regards,
    Martin Brown

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin Brown@21:1/5 to jlarkin@highlandsniptechnology.com on Fri Apr 29 16:43:17 2022
    On 29/04/2022 15:30, jlarkin@highlandsniptechnology.com wrote:
    On Fri, 29 Apr 2022 02:09:19 -0400, Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:

    John Larkin wrote:
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in general)
    performance is that the algorithms don't parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice to run
    on an Nvidia card. 100x at least.


    The "number of threads" setting doesn't do anything very dramatic,
    though, at least last time I tried. Splitting up the calculation
    between cores would require all of them to communicate a couple of times
    per time step, but lots of other simulation codes do that.

    The main trouble is that the matrix defining the connectivity between
    nodes is highly irregular in general.

    Parallellizing that efficiently might well need a special-purpose
    compiler, sort of similar to the profile-guided optimizer in the guts of
    the FFTW code for computing DFTs. Probably not at all impossible, but
    not that straightforward to implement.

    Cheers

    Phil Hobbs

    Climate simulation uses enormous multi-CPU supercomputer rigs.

    They are basically fluid-in-cell models with a fair number of parameters
    per cell but, depending on your exact choice of geometry, only 6 nearest neighbours in a 3D cubic computational grid (worst case 26 cells).

    That is a very regular interconnectivity and lends itself to vector
    processing (which is why we were using them) though for another problem.
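    That regularity is what vectorises so well: every cell update touches
    the same fixed offsets. A toy Python/NumPy illustration of a
    6-nearest-neighbour update on a periodic cubic grid (a generic
    diffusion step, not any particular climate code):

```python
import numpy as np

def diffuse_step(field, alpha=0.1):
    # Sum the 6 nearest neighbours on a periodic 3D cubic grid...
    neigh = sum(np.roll(field, shift, axis)
                for axis in range(3) for shift in (-1, 1))
    # ...then relax each cell toward its neighbourhood mean.
    return field + alpha * (neigh - 6.0 * field)
```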

    A handful of FLIC practitioners used tetrahedral or hexagonal close
    packed grids (4 nearest neighbours or 12 nearest neighbours).
    OK, I suppose that makes your point.

    When I was involved in such codes for relativistic particle beams we
    used its cylindrical symmetry to make the problem more tractable in 2D.
    The results agreed remarkably well with experiments so I see no need to ridicule other FLIC models as used in weather and climate research.

    --
    Regards,
    Martin Brown

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Joe Gwinn@21:1/5 to pcdhSpamMeSenseless@electrooptical. on Fri Apr 29 15:25:05 2022
    On Fri, 29 Apr 2022 10:03:23 -0400, Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:

    Mike Monett wrote:
    Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:

    John Larkin wrote:
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in general)
    performance is that the algorithms don't parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice to run
    on an Nvidia card. 100x at least.


    The "number of threads" setting doesn't do anything very dramatic,
    though, at least last time I tried. Splitting up the calculation
    between cores would require all of them to communicate a couple of times per time step, but lots of other simulation codes do that.

    The main trouble is that the matrix defining the connectivity between
    nodes is highly irregular in general.

    Parallellizing that efficiently might well need a special-purpose
    compiler, sort of similar to the profile-guided optimizer in the guts of the FFTW code for computing DFTs. Probably not at all impossible, but
    not that straightforward to implement.

    Cheers

    Phil Hobbs

    Supercomputers have thousands or hundreds of thousands of cores.

    Quote:

    "Cerebras Systems has unveiled its new Wafer Scale Engine 2 processor
    with a record-setting 2.6 trillion transistors and 850,000 AI-optimized
    cores. It's built for supercomputing tasks, and it's the second time
    since 2019 that Los Altos, California-based Cerebras has unveiled a
    chip that is basically an entire wafer."

    https://venturebeat.com/2021/04/20/cerebras-systems-launches-new-ai-supercomputing-processor-with-2-6-trillion-transistors/

    Number of cores isn't the problem. For fairly tightly-coupled tasks
    such as simulations, the issue is interconnect latency between cores,
    and the required bandwidth goes roughly as the cube of Moore's law, so
    it ran out of gas long ago.

    One thing that zillions of cores could do for SPICE is to do all the
    stepped parameter runs simultaneously. At that point all you need is infinite bandwidth to disk.

    This whole hairball is summarized in Amdahl's Law:

    .<https://en.wikipedia.org/wiki/Amdahl%27s_law#:~:text=In%20computer%20architecture%2C%20Amdahl's%20law,system%20whose%20resources%20are%20improved>
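    In code the law is a one-liner: with serial fraction s and N workers
    the speedup is 1/(s + (1-s)/N), so even 5% unavoidably serial work
    caps the speedup at 20x no matter how many cores are thrown at it:

```python
def amdahl_speedup(serial_fraction, n_workers):
    # Best-case speedup when serial_fraction of the job cannot be
    # parallelised and the remainder is spread over n_workers.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)
```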


    Joe Gwinn

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Phil Hobbs@21:1/5 to Joe Gwinn on Fri Apr 29 20:51:43 2022
    Joe Gwinn wrote:
    On Fri, 29 Apr 2022 10:03:23 -0400, Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:

    Mike Monett wrote:
    Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:

    John Larkin wrote:
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none>
    wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in
    general) performance is that the algorithms don't
    parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice
    to run on an Nvidia card. 100x at least.


    The "number of threads" setting doesn't do anything very
    dramatic, though, at least last time I tried. Splitting up the
    calculation between cores would require all of them to
    communicate a couple of times per time step, but lots of other
    simulation codes do that.

    The main trouble is that the matrix defining the connectivity
    between nodes is highly irregular in general.

    Parallellizing that efficiently might well need a
    special-purpose compiler, sort of similar to the profile-guided
    optimizer in the guts of the FFTW code for computing DFTs.
    Probably not at all impossible, but not that straightforward to
    implement.

    Cheers

    Phil Hobbs

    Supercomputers have thousands or hundreds of thousands of cores.

    Quote:

    "Cerebras Systems has unveiled its new Wafer Scale Engine 2
    processor with a record-setting 2.6 trillion transistors and
    850,000 AI-optimized cores. It’s built for supercomputing tasks,
    and it’s the second time since 2019 that Los Altos,
    California-based Cerebras has unveiled a chip that is basically
    an entire wafer."

    https://venturebeat.com/2021/04/20/cerebras-systems-launches-new-ai-supercomputing-processor-with-2-6-trillion-transistors/

    Number of cores isn't the problem. For fairly tightly-coupled
    tasks such as simulations, the issue is interconnect latency
    between cores, and the required bandwidth goes roughly as the cube
    of Moore's law, so it ran out of gas long ago.

    One thing that zillions of cores could do for SPICE is to do all
    the stepped parameter runs simultaneously. At that point all you
    need is infinite bandwidth to disk.

    This whole hairball is summarized in Amdahl's Law:

    .<https://en.wikipedia.org/wiki/Amdahl%27s_law#:~:text=In%20computer%20architecture%2C%20Amdahl's%20law,system%20whose%20resources%20are%20improved>

    Not exactly. There's very little serial execution required to
    parallellize parameter stepping, or even genetic-algorithm optimization.

    Communications overhead isn't strictly serial either--N processors can
    have several times N communication channels. It's mostly a latency issue.

    Cheers

    Phil Hobbs

    --
    Dr Philip C D Hobbs
    Principal Consultant
    ElectroOptical Innovations LLC / Hobbs ElectroOptics
    Optics, Electro-optics, Photonics, Analog Electronics
    Briarcliff Manor NY 10510

    http://electrooptical.net
    http://hobbs-eo.com

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Joe Gwinn@21:1/5 to pcdhSpamMeSenseless@electrooptical. on Sat Apr 30 09:04:37 2022
    On Fri, 29 Apr 2022 20:51:43 -0400, Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:

    Joe Gwinn wrote:
    On Fri, 29 Apr 2022 10:03:23 -0400, Phil Hobbs
    <pcdhSpamMeSenseless@electrooptical.net> wrote:

    Mike Monett wrote:
    Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:

    John Larkin wrote:
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none>
    wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in
    general) performance is that the algorithms don't
    parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice
    to run on an Nvidia card. 100x at least.


    The "number of threads" setting doesn't do anything very
    dramatic, though, at least last time I tried. Splitting up the
    calculation between cores would require all of them to
    communicate a couple of times per time step, but lots of other
    simulation codes do that.

    The main trouble is that the matrix defining the connectivity
    between nodes is highly irregular in general.

    Parallellizing that efficiently might well need a
    special-purpose compiler, sort of similar to the profile-guided
    optimizer in the guts of the FFTW code for computing DFTs.
    Probably not at all impossible, but not that straightforward to
    implement.

    Cheers

    Phil Hobbs

    Supercomputers have thousands or hundreds of thousands of cores.

    Quote:

    "Cerebras Systems has unveiled its new Wafer Scale Engine 2
    processor with a record-setting 2.6 trillion transistors and
    850,000 AI-optimized cores. It's built for supercomputing tasks,
    and it's the second time since 2019 that Los Altos,
    California-based Cerebras has unveiled a chip that is basically
    an entire wafer."

    https://venturebeat.com/2021/04/20/cerebras-systems-launches-new-ai-supercomputing-processor-with-2-6-trillion-transistors/

    Number of cores isn't the problem. For fairly tightly-coupled
    tasks such as simulations, the issue is interconnect latency
    between cores, and the required bandwidth goes roughly as the cube
    of Moore's law, so it ran out of gas long ago.

    One thing that zillions of cores could do for SPICE is to do all
    the stepped parameter runs simultaneously. At that point all you
    need is infinite bandwidth to disk.

    This whole hairball is summarized in Amdahl's Law:

    .<https://en.wikipedia.org/wiki/Amdahl%27s_law#:~:text=In%20computer%20architecture%2C%20Amdahl's%20law,system%20whose%20resources%20are%20improved>

    Not exactly. There's very little serial execution required to
    parallellize parameter stepping, or even genetic-algorithm optimization.

    Communications overhead isn't strictly serial either--N processors can
    have several times N communication channels. It's mostly a latency issue.

    In general, yes. But far too far down in the weeds.

    Amdahl's Law is easier to explain to a business manager that thinks
    that parallelism solves all performance issues, if only the engineers
    would stop carping and do their jobs.

    And then there are the architectures that would do wondrous things, if
    only light were not so damn slow.

    Joe Gwinn

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to Joe Gwinn on Sat Apr 30 07:02:43 2022
    On Saturday, April 30, 2022 at 9:04:50 AM UTC-4, Joe Gwinn wrote:
    On Fri, 29 Apr 2022 20:51:43 -0400, Phil Hobbs <pcdhSpamM...@electrooptical.net> wrote:

    Joe Gwinn wrote:
    On Fri, 29 Apr 2022 10:03:23 -0400, Phil Hobbs
    <pcdhSpamM...@electrooptical.net> wrote:

    Mike Monett wrote:
    Phil Hobbs <pcdhSpamM...@electrooptical.net> wrote:

    John Larkin wrote:
    On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <den...@none.none>
    wrote:

    On 4/28/22 11:26, boB wrote:

    I would love to have a super computer to run LTspice.

    I thought one of the problems with LTspice (and spice in
    general) performance is that the algorithms don't
    parallelize very well.

    LT runs on multiple cores now. I'd love the next gen LT Spice
    to run on an Nvidia card. 100x at least.


    The "number of threads" setting doesn't do anything very
    dramatic, though, at least last time I tried. Splitting up the
    calculation between cores would require all of them to
    communicate a couple of times per time step, but lots of other
    simulation codes do that.

    The main trouble is that the matrix defining the connectivity
    between nodes is highly irregular in general.

    Parallellizing that efficiently might well need a
    special-purpose compiler, sort of similar to the profile-guided
    optimizer in the guts of the FFTW code for computing DFTs.
    Probably not at all impossible, but not that straightforward to
    implement.

    Cheers

    Phil Hobbs

    Supercomputers have thousands or hundreds of thousands of cores.

    Quote:

    "Cerebras Systems has unveiled its new Wafer Scale Engine 2
    processor with a record-setting 2.6 trillion transistors and
    850,000 AI-optimized cores. It’s built for supercomputing tasks,
    and it’s the second time since 2019 that Los Altos,
    California-based Cerebras has unveiled a chip that is basically
    an entire wafer."

    https://venturebeat.com/2021/04/20/cerebras-systems-launches-new-ai-supercomputing-processor-with-2-6-trillion-transistors/

    Number of cores isn't the problem. For fairly tightly-coupled
    tasks such as simulations, the issue is interconnect latency
    between cores, and the required bandwidth goes roughly as the cube
    of Moore's law, so it ran out of gas long ago.

    One thing that zillions of cores could do for SPICE is to do all
    the stepped parameter runs simultaneously. At that point all you
    need is infinite bandwidth to disk.

    This whole hairball is summarized in Amdahl's Law:

    .<https://en.wikipedia.org/wiki/Amdahl%27s_law#:~:text=In%20computer%20architecture%2C%20Amdahl's%20law,system%20whose%20resources%20are%20improved>

    Not exactly. There's very little serial execution required to
    parallellize parameter stepping, or even genetic-algorithm optimization.

    Communications overhead isn't strictly serial either--N processors can have several times N communication channels. It's mostly a latency issue.
    In general, yes. But far too far down in the weeds.

    Amdahl's Law is easier to explain to a business manager that thinks
    that parallelism solves all performance issues, if only the engineers
    would stop carping and do their jobs.

    And then there are the architectures that would do wondrous things, if
    only light were not so damn slow.

    People often focus on the fact that the size of the chip limits the speed, without considering how the size might be reduced (and the speed increased) using multi-valued logic. I suppose the devil is in the details, but if more information can be carried
    on fewer wires, the routing area of a chip can be reduced, speeding up the entire chip.

    I've only heard of memory-type circuits being implemented with multi-valued logic, since the bulk of the die area is storage and that shrinks considerably. I believe they are up to 16 values per cell, so four bits in place of one, but I only see TLC,
    which stores 8 values (three bits) per cell. Logic chips using multi-valued logic are much harder to find. Obviously there are issues with making them work well.
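    The arithmetic behind those numbers is just a base-2 logarithm: a cell
    that distinguishes L levels stores log2(L) bits, so 16 levels give 4
    bits per cell and TLC's 8 levels give 3:

```python
import math

def bits_per_cell(levels):
    # Information capacity of one multi-level cell, in bits.
    return math.log2(levels)
```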

    --

    Rick C.

    +- Get 1,000 miles of free Supercharging
    +- Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin Brown@21:1/5 to Phil Hobbs on Sun May 1 10:46:58 2022
    On 30/04/2022 01:51, Phil Hobbs wrote:
    Joe Gwinn wrote:
    On Fri, 29 Apr 2022 10:03:23 -0400, Phil Hobbs
    <pcdhSpamMeSenseless@electrooptical.net> wrote:


    Number of cores isn't the problem.  For fairly tightly-coupled
    tasks such as simulations, the issue is interconnect latency
    between cores, and the required bandwidth goes roughly as the cube
    of Moore's law, so it ran out of gas long ago.

    One thing that zillions of cores could do for SPICE is to do all
    the stepped parameter runs simultaneously.  At that point all you
    need is infinite bandwidth to disk.

    Parallelism for exploring a wide range of starting parameters and then
    evolving them based on how well the model fits seems to be in vogue now, e.g.

    https://arxiv.org/abs/1804.04737

    This whole hairball is summarized in Amdahl's Law:

    .<https://en.wikipedia.org/wiki/Amdahl%27s_law#:~:text=In%20computer%20architecture%2C%20Amdahl's%20law,system%20whose%20resources%20are%20improved>


    Not exactly.  There's very little serial execution required to
    parallellize parameter stepping, or even genetic-algorithm optimization.

    Communications overhead isn't strictly serial either--N processors can
    have several times N communication channels.  It's mostly a latency issue.

    Anyone who has ever done it quickly learns that by far the most
    important, highest-priority task is not the computation itself but
    the management required to keep all of the cores doing useful work!

    It is easy to have all cores working flat out, but if most of the
    parallelised work being done so quickly will later be shown to be
    redundant by some higher-level pruning algorithm, all you are doing
    is generating more heat for a minuscule performance gain (if that).

    SIMD has made quite a performance improvement for some problems on the
    Intel and AMD platforms. The compilers still haven't quite caught up
    with the hardware though. Alignment is now a rather annoying issue if
    you care about avoiding unnecessary cache misses and pipeline stalls.

    You can align your own structures correctly but can do nothing about
    virtual structures that the compiler creates and puts on the stack
    misaligned, spanning two cache lines. The result is code which executes
    with two distinct characteristic times depending on where the cache line
    boundaries are in relation to the top of stack when it is called!

    It really only matters in the very deepest levels of computationally
    intensive code which is probably why they don't try quite hard enough.
    Most people probably wouldn't notice ~5% changes unless they were
    benchmarking or monitoring MSRs for cache misses and pipeline stalls.
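
    The alignment problem above can at least be worked around for heap
    data; a minimal sketch, assuming 64-byte cache lines, that
    over-allocates and offsets to a boundary (the helper "aligned_doubles"
    is purely illustrative):

```python
# Sketch: obtaining a 64-byte-aligned buffer from user code by
# over-allocating and offsetting (assumes 64-byte cache lines,
# typical for current Intel/AMD parts).
import ctypes

CACHE_LINE = 64  # bytes

def aligned_doubles(n):
    """Return (backing buffer, aligned ctypes array of n doubles).
    The backing buffer must be kept alive alongside the view."""
    raw = (ctypes.c_byte * (n * 8 + CACHE_LINE))()
    addr = ctypes.addressof(raw)
    offset = (-addr) % CACHE_LINE  # bytes up to the next boundary
    aligned = (ctypes.c_double * n).from_buffer(raw, offset)
    return raw, aligned

raw, vec = aligned_doubles(1000)
assert ctypes.addressof(vec) % CACHE_LINE == 0
```

    As the post says, none of this helps with compiler-generated stack
    temporaries, which is where the two-speed behaviour comes from.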

    --
    Regards,
    Martin Brown

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Phil Hobbs@21:1/5 to Martin Brown on Sun May 1 06:08:49 2022
    Martin Brown wrote:
    On 30/04/2022 01:51, Phil Hobbs wrote:
    Joe Gwinn wrote:
    On Fri, 29 Apr 2022 10:03:23 -0400, Phil Hobbs
    <pcdhSpamMeSenseless@electrooptical.net> wrote:


    Number of cores isn't the problem.  For fairly tightly-coupled
    tasks such as simulations, the issue is interconnect latency
    between cores, and the required bandwidth goes roughly as the cube
    of Moore's law, so it ran out of gas long ago.

    One thing that zillions of cores could do for SPICE is to do all
    the stepped parameter runs simultaneously.  At that point all you
    need is infinite bandwidth to disk.

    Parallelism for exploring a wide range of starting parameters and then
    evolving them based on how well the model fits seems to be in vogue now, e.g.

    https://arxiv.org/abs/1804.04737

    This whole hairball is summarized in Amdahl's Law:

    .<https://en.wikipedia.org/wiki/Amdahl%27s_law>


    Not exactly.  There's very little serial execution required to
    parallelize parameter stepping, or even genetic-algorithm optimization.

    Communications overhead isn't strictly serial either--N processors can
    have several times N communication channels.  It's mostly a latency
    issue.

    Anyone who has ever done it quickly learns that by far the most
    important, highest-priority task is not the computation itself but
    the management required to keep all of the cores doing useful work!

    Yup. In my big EM code, that's handled by The Cluster Script From Hell. ;)


    It is easy to have all cores working flat out, but if most of the
    parallelised work being done so quickly will later be shown to be
    redundant by some higher-level pruning algorithm, all you are doing
    is generating more heat for a minuscule performance gain (if that).

    SIMD has made quite a performance improvement for some problems on the
    Intel and AMD platforms. The compilers still haven't quite caught up
    with the hardware though. Alignment is now a rather annoying issue if
    you care about avoiding unnecessary cache misses and pipeline stalls.

    You can align your own structures correctly but can do nothing about
    virtual structures that the compiler creates and puts on the stack
    misaligned, spanning two cache lines. The result is code which executes
    with two distinct characteristic times depending on where the cache line
    boundaries are in relation to the top of stack when it is called!

    It really only matters in the very deepest levels of computationally
    intensive code which is probably why they don't try quite hard enough.
    Most people probably wouldn't notice ~5% changes unless they were
    benchmarking or monitoring MSRs for cache misses and pipeline stalls.

    Well, your average hardcore numerical guy would probably just buy two
    clusters and pick the one that finished first. ;)

    Fifteen or so years ago, I got about a 3:1 improvement in FDTD speed by precomputing a strategy that let me iterate over a list containing runs
    of voxels with the same material, vs. just putting a big switch
    statement inside a triply-nested loop (the usual approach).
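
    The precomputed run-list trick described above can be sketched in
    miniature; this hypothetical 1-D version groups consecutive voxels with
    the same material so the dispatch happens once per run rather than once
    per voxel:

```python
# Miniature sketch of the precomputed-strategy idea: group consecutive
# voxels with the same material into runs once, then dispatch per run
# instead of per voxel. A hypothetical 1-D illustration, not the
# original FDTD code.
from itertools import groupby

def material_runs(materials):
    """Precompute (material, start, length) runs from a per-voxel list."""
    runs, i = [], 0
    for mat, grp in groupby(materials):
        n = sum(1 for _ in grp)
        runs.append((mat, i, n))
        i += n
    return runs

def update(field, materials, coeff):
    """Field update with one coefficient lookup per run, not per voxel."""
    for mat, start, n in material_runs(materials):
        c = coeff[mat]  # looked up once per run
        for j in range(start, start + n):
            field[j] *= c
    return field
```

    The inner loop then carries no branch on material at all, which is
    where the speedup over the big switch statement comes from.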

    I mentioned it to another EM simulation guy at a conference once, who
    said, "So what? I'd just get a bigger cluster."

    Cheers

    Phil Hobbs


    --
    Dr Philip C D Hobbs
    Principal Consultant
    ElectroOptical Innovations LLC / Hobbs ElectroOptics
    Optics, Electro-optics, Photonics, Analog Electronics
    Briarcliff Manor NY 10510

    http://electrooptical.net
    http://hobbs-eo.com

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Cydrome Leader@21:1/5 to Martin Brown on Tue May 3 21:12:52 2022
    Martin Brown <'''newspam'''@nonad.co.uk> wrote:
    On 28/04/2022 18:47, Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB

    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

    Indeed. The Cray X-MP in its 4 CPU configuration with a 105MHz clock and
    a whopping for the time 128MB of fast core memory with 40GB of disk. The

    what is fast core memory?

    one I used had an amazing for the time 1TB tape cassette backing store.
    It did 600 MFLOPs with the right sort of parallel vector code.

    That was back in the day when you needed special permission to use more
    than 4MB of core on the timesharing IBM 3081 (approx 7 MIPS).

    Current Intel 12 gen CPU desktops are ~4GHz, 16GB ram and >1TB of disk.
    (and the upper limits are even higher) That combo does ~66,000 MFLOPS.

    Spice simulation doesn't scale particularly well to large-scale
    multiprocessor environments: too many long-range interactions.


    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From whit3rd@21:1/5 to jla...@highlandsniptechnology.com on Tue May 3 17:24:47 2022
    On Friday, April 29, 2022 at 7:30:55 AM UTC-7, jla...@highlandsniptechnology.com wrote:

    Climate simulation uses enormous multi-CPU supercomputer rigs.

    Not so; it's WEATHER mapping and prediction that uses the complex data sets
    for a varied bunch of globe locations doing sensing, to make a 3-d map for
    the planet's atmosphere. Climate is a much cruder problem, no details required. Much of the greenhouse gas analysis comes out of models
    that a PC spreadsheet would handle easily.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Cydrome Leader on Tue May 3 20:35:45 2022
    On 05/03/2022 03:12 PM, Cydrome Leader wrote:
    Martin Brown <'''newspam'''@nonad.co.uk> wrote:
    On 28/04/2022 18:47, Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB
    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.
    Indeed. The Cray X-MP in its 4 CPU configuration with a 105MHz clock and
    a whopping for the time 128MB of fast core memory with 40GB of disk. The
    what is fast core memory?


    A very expensive item:

    https://en.wikipedia.org/wiki/Magnetic-core_memory

    Fortunately by the X-MP's time SRAMs had replaced magnetic core.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ricky@21:1/5 to Cydrome Leader on Tue May 3 22:09:18 2022
    On Tuesday, May 3, 2022 at 5:12:59 PM UTC-4, Cydrome Leader wrote:
    Martin Brown <'''newspam'''@nonad.co.uk> wrote:
    On 28/04/2022 18:47, Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB

    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

    Indeed. The Cray X-MP in its 4 CPU configuration with a 105MHz clock and
    a whopping for the time 128MB of fast core memory with 40GB of disk. The
    what is fast core memory?

    An oxymoron.

    --

    Rick C.

    ++ Get 1,000 miles of free Supercharging
    ++ Tesla referral code - https://ts.la/richard11209

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From DecadentLinuxUserNumeroUno@decadenc@21:1/5 to whit3rd@gmail.com on Wed May 4 05:33:04 2022
    whit3rd <whit3rd@gmail.com> wrote in news:a7f5b2f5-3b81-4298-985c-1bbec41ed982n@googlegroups.com:

    On Friday, April 29, 2022 at 7:30:55 AM UTC-7, jla...@highlandsniptechnology.com wrote:

    Climate simulation uses enormous multi-CPU supercomputer rigs.

    Not so; it's WEATHER mapping and prediction that uses the complex
    data sets for a varied bunch of globe locations doing sensing, to
    make a 3-d map for the planet's atmosphere. Climate is a much
    cruder problem, no details required. Much of the greenhouse gas
    analysis comes out of models that a PC spreadsheet would handle
    easily.


    We have real time sat imagery of our weather patterns.

    *I* can see what is coming or not. The forecasting tool does not
    do that great a job, and is it my phone's computer's forecast or coming
    from the site feeding me the weather imagery? Either way it ain't that
    great and hardly the main utilization factor.

    Weather modeling is done on a bigger scale looking at storm systems
    crossing the ocean in our direction (US).

    Our local stuff used to be predicted by individual opinions of
    local meteorologists. Now even they all rely on a nationally
    available data set, which is where my app from a Michigan TV station
    sources its data. The app works fine here, hundreds of miles away.

    My phone is great. I also have an anatomy app on there and I can
    look at an individual piece of cartilage and it will tell me what its
    name is. It looks real cool on my iPad. I have one for the brain as
    well.

    Movies used to take hours and hours of frame rendering time to
    'render' a frame of movie video and all the CGI was in its infancy.

    Now I have a multi-core Xeon and a Quadro graphics card and can do
    3D rendering at 4K resolution.
    And they just came out with Unreal Engine 5. It is friggin'
    amazing how far they've come.

    <https://www.youtube.com/watch?v=7ZLibi6s_ew>

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin Brown@21:1/5 to rbowman on Wed May 4 09:07:13 2022
    On 04/05/2022 03:35, rbowman wrote:
    On 05/03/2022 03:12 PM, Cydrome Leader wrote:
    Martin Brown <'''newspam'''@nonad.co.uk> wrote:
    On 28/04/2022 18:47, Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB
    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

    Indeed. The Cray X-MP in its 4 CPU configuration with a 105MHz clock and
    a whopping for the time 128MB of fast core memory with 40GB of disk. The
    what is fast core memory?

    A very expensive item:

    https://en.wikipedia.org/wiki/Magnetic-core_memory

    Fortunately by the X-MP's time SRAMs had replaced magnetic core.

    But at the time it was still often called core (bulk) memory as opposed
    to faster cache memory. ISTR the memory chips were only 4k bits of SRAM.

    Keeping the thing compact and cool was a major part of the engineering.

    There is a rather nice article about its design online here.

    http://www.chilton-computing.org.uk/ccd/supercomputers/p005.htm

    --
    Regards,
    Martin Brown

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Martin Brown on Wed May 4 08:06:43 2022
    On 05/04/2022 02:07 AM, Martin Brown wrote:
    On 04/05/2022 03:35, rbowman wrote:
    On 05/03/2022 03:12 PM, Cydrome Leader wrote:
    Martin Brown <'''newspam'''@nonad.co.uk> wrote:
    On 28/04/2022 18:47, Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB
    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

    Indeed. The Cray X-MP in its 4 CPU configuration with a 105MHz clock
    and
    a whopping for the time 128MB of fast core memory with 40GB of disk.
    The
    what is fast core memory?

    A very expensive item:

    https://en.wikipedia.org/wiki/Magnetic-core_memory

    Fortunately by the X-MP's time SRAMs had replaced magnetic core.

    But at the time it was still often called core (bulk) memory as opposed
    to faster cache memory. ISTR the memory chips were only 4k bits of SRAM.

    Keeping the thing compact and cool was a major part of the engineering.

    There is a rather nice article about its design online here.

    http://www.chilton-computing.org.uk/ccd/supercomputers/p005.htm


    We still examine core dumps. Compact and cool is still a problem not
    only for server farms but for things like the Intel NUC. Just can't
    escape history.

    Most of the buildings at RPI dated back to the early 20th century and AC
    was not an option. The new building to house the System 360/30 was an
    oasis on hot days. That thing was not compact and used magnetic core. I
    don't think it had enough computing power to run a modern refrigerator.

    iirc during the startup of the new NSA computing center in Bluffdale UT
    they found they had enough power to either run the servers or keep them
    cool. Back to the drawing board. It also boggles my mind that bitcoin
    mining is a major draw on the power grid.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Cydrome Leader@21:1/5 to rbowman on Wed May 4 18:35:37 2022
    rbowman <bowman@montana.com> wrote:
    On 05/03/2022 03:12 PM, Cydrome Leader wrote:
    Martin Brown <'''newspam'''@nonad.co.uk> wrote:
    On 28/04/2022 18:47, Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB
    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.
    Indeed. The Cray X-MP in its 4 CPU configuration with a 105MHz clock and
    a whopping for the time 128MB of fast core memory with 40GB of disk. The
    what is fast core memory?


    A very expensive item:

    https://en.wikipedia.org/wiki/Magnetic-core_memory

    Fortunately by the X-MP's time SRAMs had replaced magnetic core.

    I'm not aware of any cray systems that used core memory. It just makes no
    sense for the speeds they ran at.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Cydrome Leader on Wed May 4 20:51:21 2022
    On 05/04/2022 12:35 PM, Cydrome Leader wrote:
    I'm not aware of any cray systems that used core memory. It just makes no sense for the speeds they ran at.

    I believe the CDC 7600 was the last Cray design to use magnetic core. He
    then left CDC and the Cray-1 was SRAM.

    The CDC 7600 was no slouch for its time.

    https://en.wikipedia.org/wiki/CDC_7600

    Control Data was ahead of its time and started the Committee for Social
    Responsibility shortly before Cray left (no correlation). Like many of
    the early giants time did not treat them well.

    https://www.nytimes.com/1979/01/07/archives/how-control-data-turns-a-profit-on-its-good-works-making-it-work.html

    https://gallery.lib.umn.edu/items/show/5867

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Les Cargill@21:1/5 to Martin Brown on Thu Jun 16 20:23:45 2022
    Martin Brown wrote:
    On 28/04/2022 18:47, Jeroen Belleman wrote:
    On 2022-04-28 18:26, boB wrote:
    [...]
    I would love to have a super computer to run LTspice.

    boB

    In fact, what you have on your desk *is* a super computer,
    in the 1970's meaning of the words. It's just that it's
    bogged down running bloatware.

    Indeed. The Cray X-MP in its 4 CPU configuration with a 105MHz clock and
    a whopping for the time 128MB of fast core memory with 40GB of disk. The
    one I used had an amazing for the time 1TB tape cassette backing store.
    It did 600 MFLOPs with the right sort of parallel vector code.

    That was back in the day when you needed special permission to use more
    than 4MB of core on the timesharing IBM 3081 (approx 7 MIPS).

    Current Intel 12 gen CPU desktops are ~4GHz, 16GB ram and >1TB of disk.
    (and the upper limits are even higher) That combo does ~66,000 MFLOPS.

    Spice simulation doesn't scale particularly well to large-scale
    multiprocessor environments: too many long-range interactions.


    If you search for "circuit sim and CUDA" it's out there. There's a
    GitHub repo called "CUDA SPICE Circuit Simulator".

    No clue if it's worthwhile.

    --
    Les Cargill

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin Brown@21:1/5 to Les Cargill on Mon Jun 20 13:20:57 2022
    On 17/06/2022 02:23, Les Cargill wrote:
    Martin Brown wrote:

    Spice simulation doesn't scale particularly well to large-scale
    multiprocessor environments: too many long-range interactions.

    If you search for "circuit sim and CUDA" it's out there. There's a
    GitHub repo called "CUDA SPICE Circuit Simulator".

    No clue if it's worthwhile.

    My instinct is that it will generate a lot more heat to solve the
    problem a little bit quicker than a conventional system (unless you are
    able to split the problem into a large number of distinct separate
    simulations with different starting parameters).

    That is what happens on the system I am working on (not Spice). My bit
    of it is strictly single threaded but it runs on every CPU. The next
    tier up manages the whole thing to keep them busy doing useful work.

    --
    Regards,
    Martin Brown

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From a a@21:1/5 to jla...@highlandsniptechnology.com on Mon Jun 20 05:47:02 2022
    On Tuesday, 26 April 2022 at 17:44:53 UTC+2, jla...@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer
    analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.

    The conclusion from a senior scientist is that "it rains a lot more
    during the worst storms."



    --

    Anybody can count to one.

    - Robert Widlar
    Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.

    Freon is another fake.

    Climate is clocked by solar activity and by fluctuations in solar activity.

    So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.

    Removing trees within city limits, you can turn any city into a heat island with rising temperatures, since by removing trees and grass, you destroy the rainwater retention mechanism.

    Water absorbs heat from the sun by evaporation.

    So if no water in the ground, no water evaporated and heat accumulates, making local temperatures to rise.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Anthony William Sloman@21:1/5 to a a on Mon Jun 20 06:13:03 2022
    On Monday, June 20, 2022 at 2:47:09 PM UTC+2, a a wrote:
    On Tuesday, 26 April 2022 at 17:44:53 UTC+2, jla...@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer analysis of climate change. They analyzed five west coast "extreme
    storms" from 1982 to 2014.

    The conclusion from a senior scientist is that "it rains a lot more
    during the worst storms."

    Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.

    What a load of nonsense. Al Gore's 1992 book

    https://en.wikipedia.org/wiki/Earth_in_the_Balance

    was a remarkably expert bit of science popularisation. He got the science right, not that the denialist propaganda machine is willing to admit it.
    The book did make money, but not all that much. A decade later it did put Al Gore in a position to make money out of climate change, but that didn't mean that he wrote it with that in mind.

    Freon is another fake.

    In what sense? Chlorofluorocarbons do damage the ozone layer. We know exactly how - and we know that reducing their concentrations in the atmosphere is letting the ozone layer get denser again. The fakery here lies in your lie.

    Climate is clocked by solar activity and by fluctuations in solar activity.

    And the amount of CO2 and other green-house gases in the atmosphere. As Joseph Fourier worked out in 1824, if they weren't there the temperature of the surface of the Earth would be -18C. The difference between ice ages (atmospheric CO2 levels around 180
    ppm) and interglacials (atmospheric CO2 levels around 270 ppm) also depends on the more extensive ice cover during interglacials, but the CO2 levels do account for a lot of the difference.

    So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.

    Except that you can't. Solar activity doesn't explain the ice age to interglacial transitions, and only an ignorant idiot could imagine that they did.

    Removing trees within city limits, you can turn any city in heat island with rising temperatures, since removing trees, grass, you destroy rainwater retention mechanism.

    Total nonsense.

    Water absorbs heat from the sun by evaporation.

    But the water vapour retains the heat at the bottom of the atmosphere.

    So if no water in the ground, no water evaporated and heat accumulates, making local temperatures to rise.

    So what?

    --
    Bill Sloman, Sydney

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From a a@21:1/5 to bill....@ieee.org on Mon Jun 20 06:35:57 2022
    On Monday, 20 June 2022 at 15:13:10 UTC+2, bill....@ieee.org wrote:
    On Monday, June 20, 2022 at 2:47:09 PM UTC+2, a a wrote:
    On Tuesday, 26 April 2022 at 17:44:53 UTC+2, jla...@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer analysis of climate change. They analyzed five west coast "extreme storms" from 1982 to 2014.

    The conclusion from a senior scientist is that "it rains a lot more during the worst storms."

    Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.
    What a load of nonsense. Al Gore's 1992 book

    https://en.wikipedia.org/wiki/Earth_in_the_Balance

    was a remarkably expert bit of science popularisation. He got the science right, not that the denialist propaganda machine is willing to admit it.
    The book did make money, but not all that much. A decade later it did put Al Gore in a position to make money out of climate change, but that didn't mean that he wrote it with that in mind.

    Freon is another fake.

    In what sense? Chlorofluorocarbons do damage the ozone layer. We know exactly how - and we know that reducing their concentrations in the atmosphere is letting the ozone layer get denser again. The fakery here lies in your lie.
    Climate is clocked by solar activity and by fluctuations in solar activity.
    And the amount of CO2 and other green-house gases in the atmosphere. As Joseph Fourier worked out in 1824, if they weren't there the temperature of the surface of the Earth would be -18C. The difference between ice ages (atmospheric CO2 levels around
    180 ppm) and interglacials (atmospheric CO2 levels around 270 ppm) also depends on the more extensive ice cover during interglacials, but the CO2 levels do account for a lot of the difference.
    So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.
    Except that you can't. Solar activity doesn't explain the ice age to interglacial transitions, and only an ignorant idiot could imagine that they did

    Removing trees within city limits, you can turn any city in heat island with rising temperatures, since removing trees, grass, you destroy rainwater retention mechanism.

    Total nonsense.
    Water absorbs heat from the sun by evaporation.
    But the water vapour retains the heat at the bottom of the atmosphere.
    So if no water in the ground, no water evaporated and heat accumulates, making local temperatures to rise.
    So what?

    --
    Bill Sloman, Sydney
    You are exactly "Total nonsense. "

    Sydney is low science city, so we don't care

    Call prof. Mann and tell him, there has been no sea level rise at Pacific islands at all, on the Maledives, for the last 1,000 years

    If Kremlin funds hundreds of so called pseudo scientists world-wide to sell more natural gas,
    so call Greta and ask her, where is she with the Global Warming fake today

    where is UNFCC Bonn agency, where is UN New York SIDS agency today
    (Small Island Developing States)

    If $Bs are pumped into your bank account, so you sell every paranoia as a genuine science

    Global Warming is an old fake funded by Putin to sell more natural gas

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Anthony William Sloman@21:1/5 to a a on Mon Jun 20 07:27:29 2022
    On Monday, June 20, 2022 at 3:36:04 PM UTC+2, a a wrote:
    On Monday, 20 June 2022 at 15:13:10 UTC+2, bill....@ieee.org wrote:
    On Monday, June 20, 2022 at 2:47:09 PM UTC+2, a a wrote:
    On Tuesday, 26 April 2022 at 17:44:53 UTC+2, jla...@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer analysis of climate change. They analyzed five west coast "extreme storms" from 1982 to 2014.

    The conclusion from a senior scientist is that "it rains a lot more during the worst storms."

    Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.
    What a load of nonsense. Al Gore's 1992 book

    https://en.wikipedia.org/wiki/Earth_in_the_Balance

    was a remarkably expert bit of science popularisation. He got the science right, not that the denialist propaganda machine is willing to admit it.
    The book did make money, but not all that much. A decade later it did put Al Gore in a position to make money out of climate change, but that didn't mean that he wrote it with that in mind.

    Freon is another fake.

    In what sense? Chlorofluorocarbons do damage the ozone layer. We know exactly how - and we know that reducing their concentrations in the atmosphere is letting the ozone layer get denser again. The fakery here lies in your lie.
    Climate is clocked by solar activity and by fluctuations in solar activity.
    And the amount of CO2 and other green-house gases in the atmosphere. As Joseph Fourier worked out in 1824, if they weren't there the temperature of the surface of the Earth would be -18C. The difference between ice ages (atmospheric CO2 levels around
    180 ppm) and interglacials (atmospheric CO2 levels around 270 ppm) also depends on the more extensive ice cover during interglacials, but the CO2 levels do account for a lot of the difference.
    So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.
    Except that you can't. Solar activity doesn't explain the ice age to interglacial transitions, and only an ignorant idiot could imagine that they did

    Removing trees within city limits, you can turn any city in heat island with rising temperatures, since removing trees, grass, you destroy rainwater retention mechanism.

    Total nonsense.

    Water absorbs heat from the sun by evaporation.

    But the water vapour retains the heat at the bottom of the atmosphere.

    So if no water in the ground, no water evaporated and heat accumulates, making local temperatures to rise.

    So what?

    You are exactly "Total nonsense. "

    You may like to think so, but you haven't explained why you think that. It's blindingly obvious that you couldn't, even if you were silly enough to try.

    Sydney is low science city, so we don't care.

    https://www.fqt.unsw.edu.au/news/top-physics-prizes-awarded-to-unsw-researchers

    I got in on a tour of that lab. I was impressed by their Raith electron beam microfabricator, which is a pretty impressive kind of lab tool.

    Call prof. Mann and tell him, there has been no sea level rise at Pacific islands at all, on the Maledives, for the last 1,000 years.

    The Maledives are in the Indian Ocean, not too far south of Ceylon. I'd prefer not to get jeered at as an ignorant idiot. And they do seem to be worried about
    sea level rise.

    https://en.wikipedia.org/wiki/Maldives#Sea_level_rise

    If Kremlin funds hundreds of so called pseudo scientists world-wide to sell more natural gas, so call Greta and ask her, where is she with the Global Warming fake today.

    It's not the Kremlin that's funding the lying that is going on. ExxonMobil does a lot of that, but they are funding the climate change denial propaganda that seems to be fooling you.

    where is UNFCC Bonn agency, where is UN New York SIDS agency today (Small Island Developing States)

    Why should I care?

    If $Bs are pumped into your bank account, so you sell every paranoia as a genuine science.

    If only.

    Global Warming is an old fake funded by Putin to sell more natural gas.

    https://history.aip.org/climate/index.htm

    It's been around for rather longer than Putin, and it's not a great way of selling natural gas. We'll have to stop burning that as fuel as well as coal and oil if we are going to stop raising the CO2 level in the atmosphere, which is getting to be urgently
    necessary, even if ignorant idiots like you don't understand why.

    --
    Bill Sloman, Sydney

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From a a@21:1/5 to bill....@ieee.org on Mon Jun 20 08:46:52 2022
    On Monday, 20 June 2022 at 16:27:36 UTC+2, bill....@ieee.org wrote:
    On Monday, June 20, 2022 at 3:36:04 PM UTC+2, a a wrote:
    On Monday, 20 June 2022 at 15:13:10 UTC+2, bill....@ieee.org wrote:
    On Monday, June 20, 2022 at 2:47:09 PM UTC+2, a a wrote:
    On Tuesday, 26 April 2022 at 17:44:53 UTC+2, jla...@highlandsniptechnology.com wrote:
    Lawrence Berkeley Lab announced the results from a new supercomputer analysis of climate change. They analyzed five west coast "extreme storms" from 1982 to 2014.

    The conclusion from a senior scientist is that "it rains a lot more during the worst storms."

    Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.
    What a load of nonsense. Al Gore's 1992 book

    https://en.wikipedia.org/wiki/Earth_in_the_Balance

    was a remarkably expert bit of science popularisation. He got the science right, not that the denialist propaganda machine is willing to admit it.
    The book did make money, but not all that much. A decade later it did put Al Gore in a position to make money out of climate change, but that didn't mean that he wrote it with that in mind.

    Freon is another fake.

    In what sense? Chlorofluorocarbons do damage the ozone layer. We know exactly how - and we know that reducing their concentrations in the atmosphere is letting the ozone layer get denser again. The fakery here lies in your lie.
    Climate is clocked by solar activity and by fluctuations in solar activity.
    And the amount of CO2 and other greenhouse gases in the atmosphere. As Joseph Fourier worked out in 1824, if they weren't there the temperature of the surface of the Earth would be -18C. The difference between ice ages (atmospheric CO2 levels
    around 180 ppm) and interglacials (atmospheric CO2 levels around 270 ppm) also depends on the more extensive ice cover during ice ages, but the CO2 levels do account for a lot of the difference.
    So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.
    Except that you can't. Solar activity doesn't explain the ice age to interglacial transitions, and only an ignorant idiot could imagine that they did

    Removing trees within city limits, you can turn any city in heat island with rising temperatures, since removing trees, grass, you destroy rainwater retention mechanism.

    Total nonsense.

    Water absorbs heat from the sun by evaporation.

    But the water vapour retains the heat at the bottom of the atmosphere.

    So if no water in the ground, no water evaporated and heat accumulates, making local temperatures to rise.

    So what?

    You are exactly "Total nonsense. "
    You may like to think so, but you haven't explained why you think that. It's blindingly obvious that you couldn't, even if you were silly enough to try.

    Sydney is low science city, so we don't care.

    https://www.fqt.unsw.edu.au/news/top-physics-prizes-awarded-to-unsw-researchers

    I got in on a tour of that lab. I was impressed by their Raith electron beam microfabricator, which is a pretty impressive kind of lab tool.

    Call prof. Mann and tell him, there has been no sea level rise at Pacific islands at all, on the Maledives, for the last 1,000 years.

    The Maledives are in the Indian Ocean, not too far south of Ceylon. I'd prefer not to get jeered at as an ignorant idiot. And they do seem to be worried about
    sea level rise.

    https://en.wikipedia.org/wiki/Maldives#Sea_level_rise

    If Kremlin funds hundreds of so called pseudo scientists world-wide to sell more natural gas, so call Greta and ask her, where is she with the Global Warming fake today.

    It's not the Kremlin that's funding the lying that is going on. Exxon-Mobil does a lot of that, but they are funding the climate change denial propaganda that seems to be fooling you.
    where is UNFCCC Bonn agency, where is UN New York SIDS agency today (Small Island Developing States)
    Why should I care?

    If $Bs are pumped into your bank account, so you sell every paranoia as a genuine science.

    If only.

    Global Warming is an old fake funded by Putin to sell more natural gas.

    https://history.aip.org/climate/index.htm

    It's been around for rather longer than Putin, and it's not a great way of selling natural gas. We'll have to stop burning that as fuel as well as coal and oil if we are going to stop raising the CO2 level in the atmosphere, which is getting to be
    urgently necessary, even if ignorant idiots like you don't understand why.

    --
    Bill Sloman, Sydney
    ==. And they do seem to be worried about sea level rise.

    since Global Warming fake is long lasting fake, funded by Kremlin

    It was my excellent long-year job to move UN agencies from Global Warming fake to Climate Change

    Climate Change is pure tautology by Heraclitus
    Everything flows - Panta rhei

    BTW
    Australia, Sydney is low on science due to low population, not attracting foreign scientists, researchers
    and a low AUD exchange rate

  • From a a@21:1/5 to bill....@ieee.org on Mon Jun 20 08:52:03 2022
    On Monday, 20 June 2022 at 16:27:36 UTC+2, bill....@ieee.org wrote:
    On Monday, June 20, 2022 at 3:36:04 PM UTC+2, a a wrote:
    On Monday, 20 June 2022 at 15:13:10 UTC+2, bill....@ieee.org wrote:
    On Monday, June 20, 2022 at 2:47:09 PM UTC+2, a a wrote:
    On Tuesday, 26 April 2022 at 17:44:53 UTC+2, jla...@highlandsniptechnology.com wrote:
    <snip>

    ----Sydney is low science city, so we don't care.

    CO2 is welcome
    CO2 is Plant Food
    Plants are Animal Food
    Animals are Human Food

    More CO2 more Human Food
    to end the world hunger

  • From Anthony William Sloman@21:1/5 to a a on Tue Jun 21 06:26:35 2022
    On Monday, June 20, 2022 at 5:46:59 PM UTC+2, a a wrote:
    On Monday, 20 June 2022 at 16:27:36 UTC+2, bill....@ieee.org wrote:
    On Monday, June 20, 2022 at 3:36:04 PM UTC+2, a a wrote:
    On Monday, 20 June 2022 at 15:13:10 UTC+2, bill....@ieee.org wrote:
    On Monday, June 20, 2022 at 2:47:09 PM UTC+2, a a wrote:
    On Tuesday, 26 April 2022 at 17:44:53 UTC+2, jla...@highlandsniptechnology.com wrote:
    <snip>

    ==. And they do seem to be worried about sea level rise.

    Since Global Warming fake is long lasting fake, funded by Kremlin

    You seem to want to think so, but who cares what an idiot wants to think?

    It was my excellent long-year job to move UN agencies from Global Warming fake to Climate Change

    Climate Change is pure tautology by Heraclitus Everything flows - Panta rhei

    This particular flow is having inconvenient consequences. Happily, it is potentially reversible, even if idiots like you prefer not to recognise the fact.

    BTW
    Australia, Sydney is low on science due to low population, not attracting foreign scientists, researchers and a low AUD exchange rate.

    https://www.macrotrends.net/cities/206167/sydney/population

    5,057,000 people is a respectable population. Sydney University, the University of New South Wales, Macquarie University, UTS and the University of Western Sydney all attract foreign researchers - I've been out to dinner with some of them.

    Sydney University is #28 on at least one world ranking, not that that means very much.

    https://www.usnews.com/education/best-global-universities/rankings

    You seem to be pig-ignorant in a whole range of areas, not just climate science.

    --
    Bill Sloman, Sydney

  • From Anthony William Sloman@21:1/5 to a a on Tue Jun 21 06:38:10 2022
    On Monday, June 20, 2022 at 5:52:09 PM UTC+2, a a wrote:
    On Monday, 20 June 2022 at 16:27:36 UTC+2, bill....@ieee.org wrote:
    On Monday, June 20, 2022 at 3:36:04 PM UTC+2, a a wrote:
    On Monday, 20 June 2022 at 15:13:10 UTC+2, bill....@ieee.org wrote:
    On Monday, June 20, 2022 at 2:47:09 PM UTC+2, a a wrote:
    On Tuesday, 26 April 2022 at 17:44:53 UTC+2, jla...@highlandsniptechnology.com wrote:

    <snip>

    ----Sydney is low science city, so we don't care.

    Or so a a likes to think. He's an idiot, so nobody else cares.

    CO2 is welcome

    Only by people as pig-ignorant as a a.

    CO2 is Plant Food

    It's one of them. There are others, all equally necessary.

    Plants are Animal Food
    Animals are Human Food

    More CO2 more Human Food to end the world hunger.

    If only it were that simple. A a is much too stupid to cope with all the rest of the stuff that is going on, so he misses the point that not all plants are animal food,
    and a superfluity of weeds isn't going to do much towards ending world hunger.

    --
    Bill Sloman, Sydney

  • From whit3rd@21:1/5 to a a on Tue Jun 21 10:47:55 2022
    On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:

    Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.

    False, of course. Al Gore had some election-year lies aimed that way, and as for 'make money',
    all Mann got was some frivolous lawsuits. The court had his legal team's fees reimbursed
    by the jackals that brought the suit against him. The court did that because the suits
    were found to be 'barratry', rather than being serious complaints.

    Freon is another fake.

    Not so; that's a DuPont tradename for fluorocarbon products, under worldwide ban
    due to ozone depletion.

    Climate is clocked by solar activity and by fluctuations in solar activity.

    'clocked by'??? Climate is affected by solar heat (influx of heat dominates during the day) and
    radiative cooling (heat dissipates into space, dominates the heatflow at night). The steady-state
    average temperature isn't proportional to anything the Sun does, but is set by the difference of
    those two heat flows (the difference is zero when steady-state temperature is achieved).
    So, your identification of 'solar activity' is only a half-truth at best, and in most literature,
    'solar activity' only means sunspot fluctuations, not solar heat output. Indeed, the solar
    heat output is set by fusion rates in the sun's center, many thousands of miles away from
    the photosphere where we see sunspots.
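
    whit3rd's steady-state argument (absorbed solar flux balancing radiated thermal flux) is easy to check numerically. The sketch below uses standard round textbook values for the solar constant, albedo and the Stefan-Boltzmann constant, none of them taken from this thread, and recovers the roughly -18C no-greenhouse figure mentioned earlier in the discussion:

```python
# Radiative steady state: absorbed solar flux equals emitted thermal flux.
# Standard round values, for illustration only (not figures from the thread).
SOLAR_CONSTANT = 1361.0  # W/m^2, mean solar irradiance at Earth's orbit
ALBEDO = 0.30            # fraction of sunlight reflected straight back to space
SIGMA = 5.67e-8          # W/m^2/K^4, Stefan-Boltzmann constant

# A sphere intercepts sunlight over pi*r^2 but radiates from 4*pi*r^2,
# hence the factor of 4 in the absorbed flux per unit surface area.
absorbed = SOLAR_CONSTANT * (1.0 - ALBEDO) / 4.0

# Steady state: absorbed = SIGMA * T^4, so solve for T.
t_effective = (absorbed / SIGMA) ** 0.25

print(f"absorbed flux: {absorbed:.1f} W/m^2")    # about 238 W/m^2
print(f"effective temperature: {t_effective:.1f} K "
      f"({t_effective - 273.15:.1f} C)")         # about 255 K, i.e. about -18 C
```

    The roughly 33C gap between this no-greenhouse figure and the observed mean surface temperature (about 15C) is the part that greenhouse gases account for.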

    So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.

    Utterance of nonsense is detected. Agriculture, forestry, water resources, sea life are all being hurt
    by climate change, and gazing at Mr.Sun isn't a rational plan to deal with it.

  • From a a@21:1/5 to All on Tue Jun 21 13:39:39 2022
    On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
    On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:

    <snip>
    you are completely wrong

    We all Love Carbon
    We all Love CO2

    The balance of soil carbon is held in peat and wetlands (150 GtC), and in plant litter at the soil surface (50 GtC). This compares to 780 GtC in the atmosphere, and 600 GtC in all living organisms. The oceanic pool of carbon accounts for 38,200 GtC.
    Soil carbon - Wikipedia
    en.wikipedia.org/wiki/Soil_carbon

    Forest soils

    Forest soils constitute a large pool of carbon. Anthropogenic activities such as deforestation cause releases of carbon from this pool, which may significantly increase the concentration of greenhouse gas (GHG) in the atmosphere.[24] Under the United
    Nations Framework Convention on Climate Change (UNFCCC), countries must estimate and report GHG emissions and removals, including changes in carbon stocks in all five pools (above- and below-ground biomass, dead wood, litter, and soil carbon) and
    associated emissions and removals from land use, land-use change and forestry activities, according to the Intergovernmental Panel on Climate Change's good practice guidance.[25][26] Tropical deforestation represents nearly 25 percent of total
    anthropogenic GHG emissions worldwide.[27] Deforestation, forest degradation, and changes in land management practices can cause releases of carbon from soil to the atmosphere. For these reasons, reliable estimates of soil organic carbon stock and stock
    changes are needed for Reducing emissions from deforestation and forest degradation and GHG reporting under the UNFCCC.

    The government of Tanzania—together with the Food and Agriculture Organization of the United Nations[28] and the financial support of the government of Finland—have implemented a forest soil carbon monitoring program[29] to estimate soil carbon stock,
    using both survey and modelling-based methods.

    West Africa has experienced significant loss of forest that contains high levels of soil organic carbon.[30][31] This is mostly due to expansion of small scale, non-mechanized agriculture using burning as a form of land clearance [32]
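
    For scale, the carbon-pool figures quoted above from the Wikipedia soil-carbon article can be tallied in a few lines. This is just arithmetic on the numbers exactly as quoted, not an independent dataset:

```python
# Carbon pools as quoted from the Wikipedia "Soil carbon" article,
# in gigatonnes of carbon (GtC).
pools_gtc = {
    "peat and wetlands": 150,
    "plant litter": 50,
    "atmosphere": 780,
    "living organisms": 600,
    "ocean": 38_200,
}

total = sum(pools_gtc.values())  # 39,780 GtC across the listed pools

# Print each pool with its share of the listed total, largest first.
for name, gtc in sorted(pools_gtc.items(), key=lambda kv: -kv[1]):
    print(f"{name:>18}: {gtc:>7,} GtC ({100 * gtc / total:5.1f}%)")
# The ocean holds roughly 96% of the carbon in these listed pools;
# the atmosphere's 780 GtC is about 2%.
```

    Which puts the quoted numbers in perspective: the soil and surface pools being argued about are small next to the oceanic reservoir, though that says nothing by itself about how fast carbon moves between them.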

    --
    I am really sorry, you represent low science - no science
    but

    GHG - greenhouse gas emissions is an old fake and political agenda developed by US politicians to kill China economy

    but China mirrored the attack and turned itself into a reduced-emissions Global Factory
    in green technologies like solar panels, wind turbines

    Water H2O in its gaseous state, or water vapor, is the only greenhouse gas because of its high heat of vaporization

    --

    Climate Change is my agenda, developed and injected into UN agencies to let them nicely switch from the Global Warming fake
    into the Climate Change tautology by Heraclitus, everything flows, Panta rhei

    It took me years to contact eco fools, funded by Putin and Kremlin to give up Global Warming fake

    Putin and Kremlin injected $Bs into pockets of eco fools to boost sales of natural gas as green fuel

    Today not a single $ slips into pockets of eco fools, so


    my
    Climate Change is clocked by fluctuations in solar activity Agenda
    is backed today by NASA (Solar Lab) and others with public money, not coming from Russia.

    If you still love Global Warming fake by Al Gore, UNFCCC team, Prof. Mann,
    I have nothing to say

    But call Prof. Mann one day to learn, there is not a single piece of support for Global Warming fake in his Report, published by UNFCCC and awarded the Nobel Prize

  • From whit3rd@21:1/5 to a a on Tue Jun 21 18:58:24 2022
    On Tuesday, June 21, 2022 at 1:39:46 PM UTC-7, a a wrote:
    On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
    On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:

    <snip>

    you are completely wrong

    Okay, where? Zero details? It sounds like a lame excuse for a lack of criticism.

    We all Love Carbon

    That's an odd perversion; diamonds are pretty, though.

    We all Love CO2

    Live with it, yes; also a few other gasses.


    The balance of soil carbon is held in peat and wetlands (150 GtC), and ...

    and temporary surface repositories are not part of the carbon cycle in the atmosphere versus Earth's
    crust, because they can go either up or down (they're in between, available to burn or get buried).

    GHG - greenhouse gas emissions is an old fake and political agenda developed by US politicians to kill China economy

    There's an agenda around it, nowadays, and politicians. The US can develop (and has developed) bits,
    as have other nations; 192 of 'em if I remember the Paris Accords count.

    but China mirorred the attack and turned himself into reduced emissions Global Factory
    in green technologies like solar panels, wind turbines

    It wasn't an attack. China is one of the 192 that joined the Paris Accord, and as you
    say they're developing bits of their own agenda, as well as exporting bits.

    Climate Change is clocked by fluctuations in solar activity Agenda
    is backed today by NASA (Solar Lab) and others with public money, not coming from Russia.

    Not true. Studied by solar scientists, yes; but what does 'clocked' mean? And, where's
    the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
    are you really convinced sunspot activity is relevant?

  • From Anass Luca@21:1/5 to Anthony William Sloman on Wed Jun 22 06:38:44 2022
    Anthony William Sloman <bill.sloman@ieee.org> wrote:
    On Monday, June 20, 2022 at 5:46:59 PM UTC+2, a a wrote:
    BTW
    Australia, Sydney is low on science due low population, not
    attracting foreign scientists, researchers and low AUD exchange
    rate.

    ...

    You seem to be pig-ignorant in a whole range of areas, not just
    climate science.

    Given a a's nature since its first appearance here, and its absolute
    stupidity, I suspect a a is actually one of John Doe's nym-shift names.
    The two have about the same low IQ level and both are off-topic
    disruptive trolls.

  • From a a@21:1/5 to All on Wed Jun 22 02:30:53 2022
    On Wednesday, 22 June 2022 at 03:58:30 UTC+2, whit3rd wrote:
    On Tuesday, June 21, 2022 at 1:39:46 PM UTC-7, a a wrote:
    On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
    On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:

    <snip>

    Climate Change is clocked by fluctuations in solar activity Agenda
    is backed today by NASA (Solar Lab) and others with public money, not coming from Russia.
    Not true. Studied by solar scientists, yes; but what does 'clocked' mean? And, where's
    the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
    are you really convinced sunspot activity is relevant?

    read first
    https://www.nasa.gov/mission_pages/sdo/main/index.html

    We All love Carbon
    We All Love CO2

    CO2 is Plant Food
    Plants are Animal Food
    Animals are Human Food

  • From whit3rd@21:1/5 to a a on Wed Jun 22 02:51:26 2022
    On Wednesday, June 22, 2022 at 2:31:02 AM UTC-7, a a wrote:
    On Wednesday, 22 June 2022 at 03:58:30 UTC+2, whit3rd wrote:

    ...where's
    the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
    are you really convinced sunspot activity is relevant?

    read first

    https://www.nasa.gov/mission_pages/sdo/main/index.html

    No, not going to read an index... you didn't cite a single work, or result, and that makes this an evasion rather than an answer.

    Solar dynamics isn't a significant source of the heat modulation behind global warming. Here's a picture from NASA <https://climate.nasa.gov/faq/14/is-the-sun-causing-global-warming/>

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From a a@21:1/5 to All on Wed Jun 22 02:28:29 2022
    On Wednesday, 22 June 2022 at 03:58:30 UTC+2, whit3rd wrote:
    On Tuesday, June 21, 2022 at 1:39:46 PM UTC-7, a a wrote:
    On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
    On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:

    Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.
    False, of course. Al Gore had some election-year lies aimed that way, and as for 'make money',
    all Mann got was some frivolous lawsuits. The court had his legal team's fees reimbursed
    by the jackals that brought the suit against him. The court did that because the suits
    were found to be 'barratry', rather than being serious complaints.

    Freon is another fake.

    Not so; that's a DuPont tradename for fluorocarbon products, under worldwide ban
    due to ozone depletion.

    Climate is clocked by solar activity and by fluctuations in solar activity.

    'clocked by'??? Climate is affected by solar heat (influx of heat dominates during the day) and
    radiative cooling (heat dissipates into space, dominates the heatflow at night). The steady-state
    average temperature isn't proportional to anything the Sun does, but is set by the difference of
    those two heat flows (the difference is zero when steady-state temperature is achieved).
    So, your identification of 'solar activity' is only a half-truth at best, and in most literature,
    'solar activity' only means sunspot fluctuations, not solar heat output. Indeed, the solar
    heat output is set by fusion rates in the sun's center, many thousands of miles away from
    the photosphere where we see sunspots.
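That balance is easy to sanity-check. A minimal back-of-envelope sketch (assuming the usual round numbers: total solar irradiance of about 1361 W/m², an albedo of 0.3, and black-body emission via the Stefan-Boltzmann law):

```python
# Radiative balance: absorbed sunlight S*(1-a)*pi*R^2 equals emitted
# thermal radiation sigma*T^4 * 4*pi*R^2, so the R^2 factors cancel.
S = 1361.0        # total solar irradiance at Earth, W/m^2 (approx.)
albedo = 0.3      # fraction of incoming sunlight reflected (approx.)
sigma = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

# Solve sigma*T^4 = S*(1 - albedo)/4 for the equilibrium temperature.
T_eq = (S * (1.0 - albedo) / (4.0 * sigma)) ** 0.25
print(f"Equilibrium temperature: {T_eq:.1f} K ({T_eq - 273.15:.1f} C)")
```

That comes out near 255 K, i.e. around -18 C, which is the same figure energy-balance arguments give for the effective radiating surface; the inputs are rough round numbers, not precision values.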

    So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.

    Utterance of nonsense is detected. Agriculture, forestry, water resources, sea life are all being hurt
    by climate change, and gazing at Mr.Sun isn't a rational plan to deal with it.

    you are completely wrong
    Okay, where? Zero details? It sounds like a lame excuse for a lack of criticism.

    We all Love Carbon

    That's an odd perversion; diamonds are pretty, though.

    We all Love CO2

    Live with it, yes; also a few other gasses.


    The balance of soil carbon is held in peat and wetlands (150 GtC), and ...

    and temporary surface repositories are not part of the carbon cycle in the atmosphere versus Earth's
    crust, because they can go either up or down (they're in between, available to burn or get buried).
    GHG - greenhouse gas emissions - is an old fake and political agenda developed by US politicians to kill China's economy
    There's an agenda around it, nowadays, and politicians. The US can (and has) developed bits,
    as have other nations; 192 of 'em if I remember the Paris Accords count.
    but China mirrored the attack and turned itself into a reduced-emissions Global Factory
    in green technologies like solar panels, wind turbines
    It wasn't an attack. China is one of the 192 that joined the Paris Accord, and as you
    say they're developing bits of their own agenda, as well as exporting bits.
    The 'Climate Change is clocked by fluctuations in solar activity' agenda
    is backed today by NASA (Solar Lab) and others with public money, not coming from Russia.
    Not true. Studied by solar scientists, yes; but what does 'clocked' mean? And, where's
    the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
    are you really convinced sunspot activity is relevant?
    every troll is free to contact NASA Solar Lab directly to get data on
    - solar storms
    - solar flares
    - solar radiation
    - sunspots


    What is NASA's Parker Solar Probe?
    NASA’s Parker Solar Probe will be the first-ever mission to "touch" the Sun. The spacecraft, about the size of a small car, will travel directly into the Sun's atmosphere about 4 million miles from the surface. Parker Solar Probe launched aboard a
    Delta IV-Heavy rocket from Cape Canaveral, Aug. 12, 2018 at 3:31 a.m. Eastern time.


    ...we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
    are you really convinced sunspot activity is relevant?


    https://sdo.gsfc.nasa.gov/

    https://science.gsfc.nasa.gov/heliophysics/solar/

    Overview

    The Solar Physics Laboratory works to understand the Sun as a star and as the primary driver of activity throughout the solar system. Our research expands knowledge of the Earth-Sun system and helps to enable robotic and human exploration.

    We develop innovative instruments and mission concepts, theoretical models, and techniques to access and analyze data. The Laboratory provides project scientists for NASA missions, assists with strategic planning and mission definition, and communicates research results to the international scientific community and the public.

    Contact Us

    General inquiries about the scientific programs at NASA's Goddard Space Flight Center may be directed to the Office of Communications at 1.301.286.8955.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From a a@21:1/5 to All on Wed Jun 22 02:37:06 2022
    On Wednesday, 22 June 2022 at 03:58:30 UTC+2, whit3rd wrote:
    On Tuesday, June 21, 2022 at 1:39:46 PM UTC-7, a a wrote:
    On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
    On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:

    Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.
    False, of course. Al Gore had some election-year lies aimed that way, and as for 'make money',
    all Mann got was some frivolous lawsuits. The court had his legal team's fees reimbursed
    by the jackals that brought the suit against him. The court did that because the suits
    were found to be 'barratry', rather than being serious complaints.

    Freon is another fake.

    Not so; that's a DuPont tradename for fluorocarbon products, under worldwide ban
    due to ozone depletion.

    Climate is clocked by solar activity and by fluctuations in solar activity.

    'clocked by'??? Climate is affected by solar heat (influx of heat dominates during the day) and
    radiative cooling (heat dissipates into space, dominates the heatflow at night). The steady-state
    average temperature isn't proportional to anything the Sun does, but is set by the difference of
    those two heat flows (the difference is zero when steady-state temperature is achieved).
    So, your identification of 'solar activity' is only a half-truth at best, and in most literature,
    'solar activity' only means sunspot fluctuations, not solar heat output. Indeed, the solar
    heat output is set by fusion rates in the sun's center, many thousands of miles away from
    the photosphere where we see sunspots.

    So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.

    Utterance of nonsense is detected. Agriculture, forestry, water resources, sea life are all being hurt
    by climate change, and gazing at Mr.Sun isn't a rational plan to deal with it.

    you are completely wrong
    Okay, where? Zero details? It sounds like a lame excuse for a lack of criticism.

    We all Love Carbon

    That's an odd perversion; diamonds are pretty, though.

    We all Love CO2

    Live with it, yes; also a few other gasses.


    The balance of soil carbon is held in peat and wetlands (150 GtC), and ...

    and temporary surface repositories are not part of the carbon cycle in the atmosphere versus Earth's
    crust, because they can go either up or down (they're in between, available to burn or get buried).
    GHG - greenhouse gas emissions - is an old fake and political agenda developed by US politicians to kill China's economy
    There's an agenda around it, nowadays, and politicians. The US can (and has) developed bits,
    as have other nations; 192 of 'em if I remember the Paris Accords count.
    but China mirrored the attack and turned itself into a reduced-emissions Global Factory
    in green technologies like solar panels, wind turbines
    It wasn't an attack. China is one of the 192 that joined the Paris Accord, and as you
    say they're developing bits of their own agenda, as well as exporting bits.
    The 'Climate Change is clocked by fluctuations in solar activity' agenda
    is backed today by NASA (Solar Lab) and others with public money, not coming from Russia.
    Not true. Studied by solar scientists, yes; but what does 'clocked' mean? And, where's
    the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
    are you really convinced sunspot activity is relevant?

    visit NASA’s Solar Dynamics Observatory
    one day

    https://www.nasa.gov/mission_pages/sdo/main/index.html

    https://sdo.gsfc.nasa.gov/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From a a@21:1/5 to All on Wed Jun 22 03:54:46 2022
    On Wednesday, 22 June 2022 at 11:51:32 UTC+2, whit3rd wrote:
    On Wednesday, June 22, 2022 at 2:31:02 AM UTC-7, a a wrote:
    On Wednesday, 22 June 2022 at 03:58:30 UTC+2, whit3rd wrote:
    ...where's
    the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
    are you really convinced sunspot activity is relevant?

    read first

    https://www.nasa.gov/mission_pages/sdo/main/index.html
    No, not going to read an index... you didn't cite a single work, or result, and that makes this an evasion rather than an answer.

    Solar dynamics isn't a significant source of the heat modulation behind global warming. Here's a picture from NASA <https://climate.nasa.gov/faq/14/is-the-sun-causing-global-warming/>
    excellent, excellent
    look again at your chart
    Total Solar Irradiance values are fixed at 1361+/- W/m2 level

    What matters is "Total"

    Temperature increase by 1 degree C over the span of 100 years, as declared by Prof. Mann,
    is exactly within the calculation / data-collection error.


    What matters is Plasma Leaving the Sun and flares, clocking the Climate Changes on the Earth

    http://sdoisgo.blogspot.com/2022/03/an-x13-flare-and-cool-view-of-plasma.html

    https://sdo.gsfc.nasa.gov/


    There is only one scientist at NASA Solar Dynamics who can tell you the truth behind the coronal loops and the plasma-leaving-the-Sun cycles

    BTW
    Ask your friends from NASA Climate
    to ink the image and comments with a full name next time

    https://climate.nasa.gov/faq/14/is-the-sun-causing-global-warming/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Anthony William Sloman@21:1/5 to a a on Wed Jun 22 03:25:03 2022
    On Tuesday, June 21, 2022 at 10:39:46 PM UTC+2, a a wrote:
    On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
    On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:

    <snip>

    you are completely wrong

    a a does like posting claims like that. Since he is an obvious idiot, it is a waste of bandwidth.

    We all Love Carbon
    We all Love CO2

    Only those who are as brain-dead as a a.

    <snipped a large chunk of uncomprehended cut and paste which didn't say anything relevant>

    I am really sorry, you represent low science - no science but

    Your sorrow should be directed at your own abysmal ignorance.

    <snipped more fatuous nonsense >

    Water H2O in its gaseous state, or water vapor, is the only greenhouse gas because of its high heat of vaporization

    Water vapour - like gaseous CO2 - is a greenhouse gas because it absorbs and re-radiates specific near-infrared frequencies. Other molecules - like methane - that are active in the infra-red are also green-house gases.

    https://webbook.nist.gov/cgi/cbook.cgi?ID=C7732185&Mask=800#Electronic-Spec

    You've actually got to take the rotational modes into account to work out what the infra-red spectrum actually looks like, and get reliable greenhouse numbers.

    https://en.wikipedia.org/wiki/Svante_Arrhenius

    had the right idea in 1896, but Knut Ångström in 1900 published low resolution infrared absorption spectra which appeared to show he'd got it wrong. When we finally got spectrometers that could resolve the rotational fine structure, Arrhenius was
    vindicated, but he was dead by then.

    The heat of vapourisation doesn't come into it

    <snipped more incoherent raving>

    --
    Bill Sloman, Sydney

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From a a@21:1/5 to bill....@ieee.org on Wed Jun 22 04:05:12 2022
    On Wednesday, 22 June 2022 at 12:25:11 UTC+2, bill....@ieee.org wrote:
    On Tuesday, June 21, 2022 at 10:39:46 PM UTC+2, a a wrote:
    On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
    On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:
    <snip>

    you are completely wrong

    a a does like posting claims like that. Since he is an obvious idiot, it is a waste of bandwidth.
    We all Love Carbon
    We all Love CO2
    Only those who are as brain-dead as a a.

    <snipped a large chunk of uncomprehended cut and paste which didn't say anything relevant>
    I am really sorry, you represent low science - no science but
    Your sorrow should be directed at your own abysmal ignorance.

    <snipped more fatuous nonsense >
    Water H2O in its gaseous state, or water vapor, is the only greenhouse gas because of its high heat of vaporization
    Water vapour - like gaseous CO2 - is a greenhouse gas because it absorbs and re-radiates specific near-infrared frequencies. Other molecules - like methane - that are active in the infra-red are also green-house gases.

    https://webbook.nist.gov/cgi/cbook.cgi?ID=C7732185&Mask=800#Electronic-Spec

    You've actually got to take the rotational modes into account to work out what the infra-red spectrum actually looks like, and get reliable greenhouse numbers.

    https://en.wikipedia.org/wiki/Svante_Arrhenius

    had the right idea in 1896, but Knut Ångström in 1900 published low resolution infrared absorption spectra which appeared to show he'd got it wrong. When we finally got spectrometers that could resolve the rotational fine structure, Arrhenius was
    vindicated, but he was dead by then.

    The heat of vapourisation doesn't come into it

    <snipped more incoherent raving>

    --
    Bill Sloman, Sydney
    as you can see,
    Sydney is low science, making irrelevant claims

    Water H2O in its gaseous state, or water vapor, is the only greenhouse gas because of its high heat of vaporization

    <<Water vapour - like gaseous CO2 - is a greenhouse gas because it absorbs and re-radiates specific near-infrared <<frequencies. Other molecules - like methane - that are active in the infra-red are also green-house gases.

    it doesn't matter
    what matters is !!!!

    Water’s heat of vaporization is around 540 cal/g at 100 °C, water's boiling point.

    Why does water have a high heat of vaporization?
    That is, water has a high heat of vaporization, the amount of energy needed to change one gram of a liquid substance to a gas at constant temperature. Water’s heat of vaporization is around 540 cal/g at 100 °C, water's boiling point.
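For what it's worth, that figure converts to SI units as follows (assuming the usual 4.184 J/cal conversion factor):

```python
# Convert water's latent heat of vaporization from calories to joules.
h_vap_cal_per_g = 540.0   # cal/g near 100 C (the figure quoted above)
J_PER_CAL = 4.184         # joules per calorie

h_vap_J_per_g = h_vap_cal_per_g * J_PER_CAL
print(f"{h_vap_J_per_g:.0f} J/g, i.e. {h_vap_J_per_g / 1000:.2f} MJ/kg")
# -> 2259 J/g, i.e. 2.26 MJ/kg
```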

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Anthony William Sloman@21:1/5 to a a on Wed Jun 22 04:37:39 2022
    On Wednesday, June 22, 2022 at 1:05:18 PM UTC+2, a a wrote:
    On Wednesday, 22 June 2022 at 12:25:11 UTC+2, bill....@ieee.org wrote:
    On Tuesday, June 21, 2022 at 10:39:46 PM UTC+2, a a wrote:
    On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
    On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:
    <snip>

    you are completely wrong

    a a does like posting claims like that. Since he is an obvious idiot, it is a waste of bandwidth.
    We all Love Carbon
    We all Love CO2
    Only those who are as brain-dead as a a.

    <snipped a large chunk of uncomprehended cut and paste which didn't say anything relevant>

    I am really sorry, you represent low science - no science but
    Your sorrow should be directed at your own abysmal ignorance.

    <snipped more fatuous nonsense >
    Water H2O in its gaseous state, or water vapor, is the only greenhouse gas because of its high heat of vaporization
    Water vapour - like gaseous CO2 - is a greenhouse gas because it absorbs and re-radiates specific near-infrared frequencies. Other molecules - like methane - that are active in the infra-red are also green-house gases.

    https://webbook.nist.gov/cgi/cbook.cgi?ID=C7732185&Mask=800#Electronic-Spec

    You've actually got to take the rotational modes into account to work out what the infra-red spectrum actually looks like, and get reliable greenhouse numbers.

    https://en.wikipedia.org/wiki/Svante_Arrhenius

    had the right idea in 1896, but Knut Ångström in 1900 published low resolution infrared absorption spectra which appeared to show he'd got it wrong. When we finally got spectrometers that could resolve the rotational fine structure, Arrhenius was
    vindicated, but he was dead by then.

    The heat of vapourisation doesn't come into it

    <snipped more incoherent raving>

    as you can see, Sydney is low science, making irrelevant claims

    The science I've got came from Melbourne, where I got a Ph.D. in Physical Chemistry. If you'd got the talent - you clearly don't - you could learn just as much in Sydney.

    Water H2O in its gaseous state, or water vapor, is the only greenhouse gas because of its high heat of vaporization.

    <<Water vapour - like gaseous CO2 - is a greenhouse gas because it absorbs and re-radiates specific near-infrared frequencies. Other molecules - like methane - that are active in the infra-red are also green-house gases.

    it doesn't matter! What matters is !!!!

    According to a a who clearly doesn't know what he is talking about.

    Water’s heat of vaporization is around 540 cal/g at 100 °C, water's boiling point.

    Why does water have a high heat of vaporization?
    That is, water has a high heat of vaporization, the amount of energy needed to change one gram of a liquid substance to a gas at constant temperature. Water’s heat of vaporization is around 540 cal/g at 100 °C, water's boiling point.

    It's all about hydrogen bonding. In liquid water the individual hydrogen atoms are strongly bonded to their particular oxygen atom, but they also bond to adjacent oxygen atoms in adjacent water molecules. This happens to a lesser extent in hydrogen
    sulphide, but the sulphur atom is bigger and the hydrogen bonding to it correspondingly weaker.

    This has absolutely nothing to do with water's effectiveness as a greenhouse gas. I can't imagine what lunatic delusion has caused you to imagine that it has.

    In fact, because water vapour condenses out of the atmosphere at high altitudes where the air is cold, the effective radiating altitude for the water infra-red bands is lower than it is for the CO2 bands: there's more CO2 than water vapour in the
    upper atmosphere.

    The effective radiating altitude for any given infra-red wavelength is the one where a photon of that wavelength has an even chance of getting away into outer space rather than being captured and re-radiated by some molecule or other.
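That "even chance" condition can be written down directly. A hypothetical sketch (assuming simple Beer-Lambert attenuation, where a photon sitting under optical depth tau of absorber escapes to space with probability exp(-tau)):

```python
import math

# Escape probability under Beer-Lambert attenuation: P = exp(-tau),
# where tau is the optical depth between the photon and outer space.
def escape_probability(tau: float) -> float:
    return math.exp(-tau)

# The "even chance" level is where exp(-tau) = 1/2, i.e. tau = ln 2.
tau_half = math.log(2.0)
print(f"tau at the effective radiating level: {tau_half:.3f}")  # -> 0.693
```

So for each wavelength the effective radiating level sits where the remaining overhead optical depth drops to about 0.69; more absorber overhead pushes that level higher.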

    Fourier worked out - back in 1824 - that the average temperature of the re-radiating surface of the Earth had to be about -18C. What took a while to become clear was that for a lot of frequencies that re-radiating surface is quite a long way above the
    surface we stand on, where it's quite a lot colder. Look up "lapse rate" sometime.

    https://en.wikipedia.org/wiki/Lapse_rate
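To put a rough number on "quite a long way above the surface": with a mean surface temperature near 15 C and a typical environmental lapse rate of about 6.5 K/km (both rough textbook values, not measurements), the -18 C level sits around:

```python
# Altitude at which a linear lapse rate brings ~15 C surface air down
# to the ~-18 C effective radiating temperature.
T_surface_C = 15.0     # mean surface temperature, rough value
T_radiating_C = -18.0  # effective radiating temperature, rough value
lapse_K_per_km = 6.5   # typical environmental lapse rate

altitude_km = (T_surface_C - T_radiating_C) / lapse_K_per_km
print(f"Effective radiating altitude: about {altitude_km:.1f} km")
# -> about 5.1 km
```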

    --
    Bill Sloman, Sydney

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Anthony William Sloman@21:1/5 to a a on Wed Jun 22 04:58:36 2022
    On Wednesday, June 22, 2022 at 1:20:42 PM UTC+2, a a wrote:
    On Wednesday, 22 June 2022 at 12:54:53 UTC+2, a a wrote:
    On Wednesday, 22 June 2022 at 11:51:32 UTC+2, whit3rd wrote:
    On Wednesday, June 22, 2022 at 2:31:02 AM UTC-7, a a wrote:
    On Wednesday, 22 June 2022 at 03:58:30 UTC+2, whit3rd wrote:
    ...where's
    the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
    are you really convinced sunspot activity is relevant?

    read first

    https://www.nasa.gov/mission_pages/sdo/main/index.html

    No, not going to read an index... you didn't cite a single work, or result, and that makes this an evasion rather than an answer.

    Solar dynamics isn't a significant source of the heat modulation behind global warming.
    Here's a picture from NASA <https://climate.nasa.gov/faq/14/is-the-sun-causing-global-warming/>

    excellent, excellent
    look again at your chart
    Total Solar Irradiance values are fixed at 1361+/- W/m2 level

    What matters is "Total"

    Temperature increase by 1 degree C over the span of 100 years, as declared by Prof. Mann

    Michael Mann's "hockeystick" covered the past 1000 years, not just the past century.

    https://en.wikipedia.org/wiki/Hockey_stick_graph

    It's probably one of the best replicated sets of results ever, and has been extended back over the past 20 million years by exploiting other climate proxies.

    is exactly within the calculation / data-collection error.

    Rubbish.

    What matters is Plasma Leaving the Sun and flares, clocking the Climate Changes on the Earth

    http://sdoisgo.blogspot.com/2022/03/an-x13-flare-and-cool-view-of-plasma.html

    https://sdo.gsfc.nasa.gov/

    There is only one scientist at NASA Solar Dynamics who can tell you the truth behind the coronal loops and the plasma-leaving-the-Sun cycles.

    So the climate change denial propaganda machine has managed to bribe one of the staff. If you named him, he'd probably get fired.

    BTW
    Ask your friends from NASA Climate
    to ink the image and comments with a full name next time.

    You need to ask them how they explain the ice-age inter-glacial alternation.

    https://en.wikipedia.org/wiki/Milankovitch_cycles

    It's all about subtle changes in the earth's orientation, and has nothing to do with sun-spot cycles and solar flares.

    https://climate.nasa.gov/faq/14/is-the-sun-causing-global-warming/

    in reply to low-science Sydney:

    SDO is designed to help us understand the Sun's influence on Earth and Near-Earth space by studying the solar atmosphere on small scales of space and time and in many wavelengths simultaneously.

    https://sdo.gsfc.nasa.gov/

    https://www.blogger.com/profile/16479620366654056823

    http://sdoisgo.blogspot.com/

    http://sdoisgo.blogspot.com/2022/03/an-x13-flare-and-cool-view-of-plasma.html

    Blog Description
    This is the Solar Dynamics Observatory Mission blog. It will consist of mission status, news, and event updates.

    That's not any kind of reply. Nothing in there that I can see suggests that changes in solar dynamics could explain current global warming or the last couple of million years of ice-age inter-glacial alternation. It's just an ill-informed smoke screen.

    --
    Bill Sloman, Sydney

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From a a@21:1/5 to a a on Wed Jun 22 04:20:36 2022
    On Wednesday, 22 June 2022 at 12:54:53 UTC+2, a a wrote:
    On Wednesday, 22 June 2022 at 11:51:32 UTC+2, whit3rd wrote:
    On Wednesday, June 22, 2022 at 2:31:02 AM UTC-7, a a wrote:
    On Wednesday, 22 June 2022 at 03:58:30 UTC+2, whit3rd wrote:
    ...where's
    the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
    are you really convinced sunspot activity is relevant?

    read first

    https://www.nasa.gov/mission_pages/sdo/main/index.html
    No, not going to read an index... you didn't cite a single work, or result, and that makes this an evasion rather than an answer.

    Solar dynamics isn't a significant source of the heat modulation behind global warming. Here's a picture from NASA <https://climate.nasa.gov/faq/14/is-the-sun-causing-global-warming/>
    excellent, excellent
    look again at your chart
    Total Solar Irradiance values are fixed at 1361+/- W/m2 level

    What matters is "Total"

    Temperature increase by 1 degree C over the span of 100 years, as declared by Prof. Mann,
    is exactly within the calculation / data-collection error.


    What matters is Plasma Leaving the Sun and flares, clocking the Climate Changes on the Earth

    http://sdoisgo.blogspot.com/2022/03/an-x13-flare-and-cool-view-of-plasma.html

    https://sdo.gsfc.nasa.gov/


    There is only one scientist at NASA Solar Dynamics who can tell you the truth behind the coronal loops and the plasma-leaving-the-Sun cycles

    BTW
    Ask your friends from NASA Climate
    to ink the image and comments with a full name next time

    https://climate.nasa.gov/faq/14/is-the-sun-causing-global-warming/
    in reply
    to low-science Sydney:

    SDO is designed to help us understand the Sun's influence on Earth and Near-Earth space by studying the solar atmosphere on small scales of space and time and in many wavelengths simultaneously.

    https://sdo.gsfc.nasa.gov/

    https://www.blogger.com/profile/16479620366654056823

    http://sdoisgo.blogspot.com/

    http://sdoisgo.blogspot.com/2022/03/an-x13-flare-and-cool-view-of-plasma.html

    Blog Description
    This is the Solar Dynamics Observatory Mission blog. It will consist of mission status, news, and event updates.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)