Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.
The conclusion from a senior scientist is that "it rains a lot more
during the worst storms."
On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.
https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer
---------<quote>-----------------
Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate
predictions. Using current supercomputer designs of combining
microprocessors used in personal computers, a system capable of making
such models would cost about $1 billion and use up 200 megawatts of
energy. A supercomputer using 20 million embedded processors, on the
other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------
4 megawatts/200 megawatts - do the computers factor in their heat
generation in the climate models?
John ;-#)#
On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <sp...@flippers.com> wrote:
On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
[snip]
4 megawatts/200 megawatts - do the computers factor in their heat
generation in the climate models?
Does LBL measure energy in megawatts?
Do bigger computers predict climate better?
Oh dear.
jlarkin@highlandsniptechnology.com wrote:
[snip]
The conclusion from a senior scientist is that "it rains a lot more
during the worst storms."
I'm surprised they even noticed that detail. Too bad they never talked to anybody over at the NOAA about how things work.
On a sunny day (Tue, 26 Apr 2022 16:56:33 -0000 (UTC)) it happened Cydrome Leader <pres...@MUNGEpanix.com> wrote in <t49881$clq$2...@reader1.panix.com>:
jla...@highlandsniptechnology.com wrote:
[snip]
The conclusion from a senior scientist is that "it rains a lot more
during the worst storms."
I'm surprised they even noticed that detail. Too bad they never talked to anybody over at the NOAA about how things work.
A lot of this comes down to the need to publish.
Somebody I knew did a PhD in psychology or something.
He earned his doctorate with a paper about the sex life of some group living in the wild.
I asked him if he went there and experienced it...
No :)
If you read sciencedaily.com every day, there are papers and discoveries that are either too obvious to be worth reading or too vague to be useful.
Do plants have feelings?
Do monkeys feel emotions?
That sort of thing.
Of course they do.
Today:
Prehistoric People Created Art by Firelight
Of course they did; there were no flashlights back then in a dark cave.
On a sunny day (Tue, 26 Apr 2022 13:53:08 -0700) it happened John Larkin <jlarkin@highland_atwork_technology.com> wrote in <fpmg6hhot88ajjqkcb6nv9mkbjm7s9q85k@4ax.com>:
On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com>
wrote:
On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
[snip]
4 megawatts/200 megawatts - do the computers factor in their heat
generation in the climate models?
John ;-#)#
Does LBL measure energy in megawatts?
Do bigger computers predict climate better?
Oh dear.
I have read that CERN uses more power than all the windmills in Switzerland deliver combined.
On Wednesday, April 27, 2022 at 6:53:20 AM UTC+10, John Larkin
wrote:
On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson
<sp...@flippers.com> wrote:
On 2022/04/26 8:44 a.m., jla...@highlandsniptechnology.com wrote:
[snip]
4 megawatts/200 megawatts - do the computers factor in their
heat generation in the climate models?
Probably don't have to bother. It's lost in the rounding errors.
Does LBL measure energy in megawatts?
No, but the media department won't be staffed with people with
degrees in physics (or any hard science).
Do bigger computers predict climate better?
That remains to be seen, but modelling individual cloud masses at
the 1km scale should work better than plugging in average cloud
cover for regions broken up into 100km by 100km squares. The IEEE
Spectrum published an article on "Cloud computing" a few years ago
that addressed this issue.
Oh dear.
John Larkin doesn't know much, and what he thinks he know mostly
comes from Anthony Watts' climate change denial web site.
On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com>
wrote:
[snip]
I think the jury has already returned on climate change/global warming,
and it is probably already too late to do much about it, given the short
time left for countries and people to react.
I would love to have a super computer to run LTspice.
boB
On 4/28/22 11:26, boB wrote:
I would love to have a super computer to run LTspice.
I thought one of the problems with LTspice (and SPICE in general)
performance is that the algorithms don't parallelize very well.
On 2022-04-28 18:26, boB wrote:
[...]
I would love to have a super computer to run LTspice.
boB
In fact, what you have on your desk *is* a super computer,
in the 1970's meaning of the words. It's just that it's
bogged down running bloatware.
On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:
On 4/28/22 11:26, boB wrote:
I would love to have a super computer to run LTspice.
I thought one of the problems with LTspice (and SPICE in general)
performance is that the algorithms don't parallelize very well.
LT runs on multiple cores now. I'd love the next gen LT Spice to run
on an Nvidia card. 100x at least.
John Larkin wrote:
On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:
On 4/28/22 11:26, boB wrote:
I would love to have a super computer to run LTspice.
I thought one of the problems with LTspice (and SPICE in general)
performance is that the algorithms don't parallelize very well.
LT runs on multiple cores now. I'd love the next gen LT Spice to run
on an Nvidia card. 100x at least.
The "number of threads" setting doesn't do anything very dramatic,
though, at least last time I tried. Splitting up the calculation
between cores would require all of them to communicate a couple of times
per time step, but lots of other simulation codes do that.
The main trouble is that the matrix defining the connectivity between
nodes is highly irregular in general.
Parallelizing that efficiently might well need a special-purpose
compiler, sort of similar to the profile-guided optimizer in the guts of
the FFTW code for computing DFTs. Probably not at all impossible, but
not that straightforward to implement.
Cheers
Phil Hobbs
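Phil's point about the irregular connectivity matrix can be seen in miniature with a nodal-analysis sketch. The circuit below is made up for illustration (a 3-node resistor ladder driven by a 1 mA current source), and plain numpy stands in for the sparse LU machinery a real SPICE uses:

```python
import numpy as np

# Each resistor "stamps" its conductance into G, so the nonzero pattern
# of G mirrors the circuit's connectivity. A ladder gives a tidy
# tridiagonal matrix; a real netlist gives an irregular pattern, which
# is what makes the per-timestep solve hard to parallelize.
resistors = [(1, 0, 1e3), (1, 2, 1e3), (2, 3, 1e3), (3, 0, 1e3)]  # (node a, node b, ohms); 0 = ground
n = 3
G = np.zeros((n, n))
for a, b, r in resistors:
    g = 1.0 / r
    for x, y in ((a, a), (b, b)):
        if x:
            G[x - 1, y - 1] += g      # diagonal stamps
    for x, y in ((a, b), (b, a)):
        if x and y:
            G[x - 1, y - 1] -= g      # off-diagonal stamps
I = np.array([1e-3, 0.0, 0.0])        # 1 mA injected into node 1
V = np.linalg.solve(G, I)             # node voltages: [0.75, 0.5, 0.25] V
print(V)
```

In a transient analysis this solve (with G also carrying linearized device conductances) repeats every time step, which is where the inter-core communication Phil describes comes in.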
On 28/04/2022 18:47, Jeroen Belleman wrote:
On 2022-04-28 18:26, boB wrote:
[...]
I would love to have a super computer to run LTspice.
boB
In fact, what you have on your desk *is* a super computer,
in the 1970's meaning of the words. It's just that it's
bogged down running bloatware.
Indeed. The Cray X-MP in its 4-CPU configuration ran a 105MHz clock,
with a whopping-for-the-time 128MB of fast core memory and 40GB of disk.
The one I used had an amazing-for-the-time 1TB tape cassette backing store.
It did 600 MFLOPs with the right sort of parallel vector code.
That was back in the day when you needed special permission to use more
than 4MB of core on the timesharing IBM 3081 (approx 7 MIPS).
Current Intel 12th-gen CPU desktops are ~4GHz, 16GB RAM and >1TB of disk
(and the upper limits are even higher). That combo does ~66,000 MFLOPS.
SPICE simulation doesn't scale particularly well to large-scale
multiprocessor environments: too many long-range interactions.
Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:
[snip]
Supercomputers have thousands or hundreds of thousands of cores.
Quote:
"Cerebras Systems has unveiled its new Wafer Scale Engine 2 processor with
a record-setting 2.6 trillion transistors and 850,000 AI-optimized cores. It’s built for supercomputing tasks, and it’s the second time since 2019 that Los Altos, California-based Cerebras has unveiled a chip that is basically an entire wafer."
https://venturebeat.com/2021/04/20/cerebras-systems-launches-new-ai-supercomputing-processor-with-2-6-trillion-transistors/
Man, I wish I were back living in Los Altos again.
On 29/04/2022 07:09, Phil Hobbs wrote:
John Larkin wrote:
[snip]
The "number of threads" setting doesn't do anything very dramatic,
though, at least last time I tried. Splitting up the calculation
between cores would require all of them to communicate a couple of
times per time step, but lots of other simulation codes do that.
If it is anything like chess problems then the memory bandwidth will
saturate long before all cores+threads are used to optimum effect. After
that point the additional threads merely cause it to run hotter.
I found setting max threads to about 70% of those notionally available
produced the most computing power with the least heat. After that the
performance gain per thread was negligible but the extra heat was not.
Having everything running full bore was actually slower and much hotter!
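Martin's tuning procedure (sweep the thread cap and look for the knee where time stops improving) can be sketched like this. The busy-loop workload and pool sizes are made up, and note that in CPython plain threads only overlap when the work releases the GIL, unlike the native threads in LTspice or a chess engine:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def busy(n):
    # Stand-in CPU-bound task; a real benchmark would run the solver itself.
    s = 0
    for i in range(n):
        s += i * i
    return s

def sweep(pool_sizes, work=100_000, chunks=8):
    # Run the same fixed workload under each thread cap and record wall
    # time; pick the smallest pool past which time no longer improves.
    timings = {}
    for w in pool_sizes:
        t0 = time.perf_counter()
        with ThreadPoolExecutor(max_workers=w) as ex:
            list(ex.map(busy, [work] * chunks))
        timings[w] = time.perf_counter() - t0
    return timings

print(sweep([1, 2, 4, 8]))
```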
The main trouble is that the matrix defining the connectivity between
nodes is highly irregular in general.
Parallelizing that efficiently might well need a special-purpose
compiler, sort of similar to the profile-guided optimizer in the guts
of the FFTW code for computing DFTs. Probably not at all impossible,
but not that straightforward to implement.
I'm less than impressed with profile guided optimisers in compilers. The
only time I tried it in anger the instrumentation code interfered with
the execution of the algorithms to such an extent as to be meaningless.
One gotcha I have identified in the latest MSC is that when it uses
higher order SSE2, AVX, and AVX-512 implicitly in its code generation it
does not align them on the stack properly so that sometimes they are
split across two cache lines. I see two distinct speeds for each
benchmark code segment depending on how the cache alignment falls.
Basically the compiler forces stack alignment to 8 bytes while cache
lines are 64 bytes, but the compiler-generated objects in play are 16,
32 or 64 bytes, giving alignment-failure fractions of 1:4, 2:4 and 3:4.
If you manually allocate such objects you can use pragmas to force
optimal alignment but when the code generator chooses to use them
internally you have no such control. Even so the MS compiler does
generate blisteringly fast code compared to either Intel or GCC.
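The manual-allocation workaround Martin alludes to (forcing objects onto a cache-line boundary) can be imitated even from Python: over-allocate, then slice forward so the buffer starts on a 64-byte boundary. The 64-byte line size matches his example; numpy itself only guarantees the allocator's weaker alignment:

```python
import numpy as np

CACHE_LINE = 64  # bytes

def aligned_zeros(n, dtype=np.float64):
    # Over-allocate by one cache line, then slice so the data pointer
    # lands on a 64-byte boundary. Without this, a buffer can straddle
    # two cache lines, much like the compiler-placed stack temporaries
    # described above.
    itemsize = np.dtype(dtype).itemsize
    buf = np.zeros(n + CACHE_LINE // itemsize, dtype=dtype)
    offset = (-buf.ctypes.data) % CACHE_LINE // itemsize
    return buf[offset:offset + n]

a = aligned_zeros(1024)
assert a.ctypes.data % CACHE_LINE == 0
```

This is the same trick `aligned_alloc`/`alignas` gives you in C for heap and declared objects; the stack placement Martin complains about is the one case user code can't reach.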
Jeroen Belleman wrote:
On 2022-04-28 18:26, boB wrote:
[...]
I would love to have a super computer to run LTspice.
boB
In fact, what you have on your desk *is* a super computer,
in the 1970's meaning of the words. It's just that it's
bogged down running bloatware.
Jeroen Belleman
In the 1990s meaning of the words, in fact. My 2011-vintage desktop box
runs 250 Gflops peak (2x 12-core Magny Cours, 64G main memory, RAID5 disks).
My phone is a supercomputer by 1970s standards. ;)
On Thu, 28 Apr 2022 19:47:03 +0200, Jeroen Belleman
<jer...@nospam.please> wrote:
On 2022-04-28 18:26, boB wrote:
[...]
I would love to have a super computer to run LTspice.
boB
In fact, what you have on your desk *is* a super computer,
in the 1970's meaning of the words. It's just that it's
bogged down running bloatware.
Jeroen Belleman
My phone probably has more compute power than all the computers in the
world about 1960.
On Friday, April 29, 2022 at 4:39:05 AM UTC-4, Martin Brown wrote:
On 28/04/2022 18:47, Jeroen Belleman wrote:
[snip]
The Crays were nice if you had a few million dollars to spend. I
worked for a startup building more affordable supercomputers in the
same ballpark of performance at a fraction of the price. The Star
Technologies ST-100 supported 100 MFLOPS and 32 MB of memory; at
around $200,000 (with 256 KB of RAM) it was a fraction of the cost
of the only slightly faster Cray X-MP available at the same time.
On Fri, 29 Apr 2022 02:09:19 -0400, Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:
[snip]
Climate simulation uses enormous multi-CPU supercomputer rigs.
OK, I suppose that makes your point.
Mike Monett wrote:
Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:
[snip]
Supercomputers have thousands or hundreds of thousands of cores.
Quote:
"Cerebras Systems has unveiled its new Wafer Scale Engine 2 processor with >> a record-setting 2.6 trillion transistors and 850,000 AI-optimized cores.
It’s built for supercomputing tasks, and it’s the second time since 2019
that Los Altos, California-based Cerebras has unveiled a chip that is
basically an entire wafer."
https://venturebeat.com/2021/04/20/cerebras-systems-launches-new-ai-supercomputing-processor-with-2-6-trillion-transistors/
Number of cores isn't the problem. For fairly tightly-coupled tasks
such as simulations, the issue is interconnect latency between cores,
and the required bandwidth goes roughly as the cube or Moore's law, so
it ran out of gas long ago.
One thing that zillions of cores could do for SPICE is to do all the
stepped parameter runs simultaneously. At that point all you need is
infinite bandwidth to disk.
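Phil's observation that stepped-parameter runs are embarrassingly parallel can be sketched as below. The `simulate` function is a made-up stand-in for one complete SPICE run (a thread pool here for brevity; a real farm would hand each run to a separate process or cluster node):

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(r_load):
    # Stand-in for one whole SPICE run: DC output of a hypothetical
    # 10 V source with 1 kohm output resistance driving r_load.
    return 10.0 * r_load / (1000.0 + r_load)

def step_param(values, workers=4):
    # Each stepped-parameter run is independent of the others, so they
    # can all be dispatched at once; only collecting the results (the
    # "bandwidth to disk") is shared.
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(simulate, values))

print(step_param([100.0, 1000.0, 3000.0]))
```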
On Fri, 29 Apr 2022 10:03:23 -0400, Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:
[snip]
One thing that zillions of cores could do for SPICE is to do all
the stepped parameter runs simultaneously. At that point all you
need is infinite bandwidth to disk.
This whole hairball is summarized in Amdahl's Law:
<https://en.wikipedia.org/wiki/Amdahl%27s_law>
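For reference, the law itself is a one-liner: with parallel fraction p and n processors the speedup is S = 1/((1-p) + p/n), so even a 95%-parallel job tops out at 1/0.05 = 20x:

```python
def amdahl_speedup(p, n):
    # p: fraction of the runtime that parallelizes; n: processor count.
    return 1.0 / ((1.0 - p) + p / n)

# The serial 5% of a 95%-parallel job caps the speedup at 20x,
# no matter how many cores you throw at it.
for n in (4, 64, 1_000_000):
    print(n, round(amdahl_speedup(0.95, n), 2))
```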
Joe Gwinn wrote:
On Fri, 29 Apr 2022 10:03:23 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:
[snip]
One thing that zillions of cores could do for SPICE is to do all
the stepped parameter runs simultaneously. At that point all you
need is infinite bandwidth to disk.
This whole hairball is summarized in Amdahl's Law:
<https://en.wikipedia.org/wiki/Amdahl%27s_law>
Not exactly. There's very little serial execution required to
parallelize parameter stepping, or even genetic-algorithm optimization.
Communications overhead isn't strictly serial either--N processors can
have several times N communication channels. It's mostly a latency issue.
On Fri, 29 Apr 2022 20:51:43 -0400, Phil Hobbs <pcdhSpamM...@electrooptical.net> wrote:
Communications overhead isn't strictly serial either--N processors can
have several times N communication channels. It's mostly a latency issue.
In general, yes. But far too far down in the weeds.
Amdahl's Law is easier to explain to a business manager who thinks
that parallelism solves all performance issues, if only the engineers
would stop carping and do their jobs.
And then there are the architectures that would do wondrous things, if
only light were not so damn slow.
On 30/04/2022 01:51, Phil Hobbs wrote:
Parallelism for exploring a wide range starting parameters and then
evolving them based on how well the model fits seems to be in vogue now. eg
https://arxiv.org/abs/1804.04737
Anyone who has ever done it quickly learns that by far the most
important, highest-priority task is not the computation itself but
the management required to keep all of the cores doing useful work!
It is easy to have all cores working flat out, but if most of the
parallelised work will later be shown redundant by some higher-level
pruning algorithm, all you are doing is generating more heat for a
minuscule performance gain (if that).
SIMD has made quite a performance improvement for some problems on the
Intel and AMD platforms. The compilers still haven't quite caught up
with the hardware, though. Alignment is now a rather annoying issue if
you care about avoiding unnecessary cache misses and pipeline stalls.
You can align your own structures correctly, but you can do nothing
about temporary structures that the compiler creates and puts on the
stack misaligned, spanning two cache lines. The result is code which
executes with two distinct characteristic times, depending on where the
cache line boundaries fall in relation to the top of stack when it is
called! It really only matters in the very deepest levels of
computationally intensive code, which is probably why they don't try
quite hard enough. Most people wouldn't notice ~5% changes unless they
were benchmarking or monitoring MSRs for cache misses and pipeline stalls.
On 28/04/2022 18:47, Jeroen Belleman wrote:
On 2022-04-28 18:26, boB wrote:
[...]
I would love to have a super computer to run LTspice.
boB
In fact, what you have on your desk *is* a super computer,
in the 1970's meaning of the words. It's just that it's
bogged down running bloatware.
Indeed. The Cray X-MP in its 4 CPU configuration with a 105MHz clock and
a whopping for the time 128MB of fast core memory with 40GB of disk. The
one I used had an amazing for the time 1TB tape cassette backing store.
It did 600 MFLOPs with the right sort of parallel vector code.
That was back in the day when you needed special permission to use more
than 4MB of core on the timesharing IBM 3081 (approx 7 MIPS).
Current Intel 12 gen CPU desktops are ~4GHz, 16GB ram and >1TB of disk.
(and the upper limits are even higher) That combo does ~66,000 MFLOPS.
SPICE simulation doesn't scale particularly well to large-scale multiprocessor environments: too many long-range interactions.
Climate simulation uses enormous multi-CPU supercomputer rigs.
Martin Brown <'''newspam'''@nonad.co.uk> wrote:
Indeed. The Cray X-MP in its 4 CPU configuration with a 105MHz clock and
a whopping for the time 128MB of fast core memory with 40GB of disk.
What is fast core memory?
On Friday, April 29, 2022 at 7:30:55 AM UTC-7, jla...@highlandsniptechnology.com wrote:
Climate simulation uses enormous multi-CPU supercomputer rigs.
Not so; it's WEATHER mapping and prediction that uses the complex
data sets for a varied bunch of globe locations doing sensing, to
make a 3-d map for the planet's atmosphere. Climate is a much
cruder problem, no details required. Much of the greenhouse gas
analysis comes out of models that a PC spreadsheet would handle
easily.
On 05/03/2022 03:12 PM, Cydrome Leader wrote:
what is fast core memory?
A very expensive item:
https://en.wikipedia.org/wiki/Magnetic-core_memory
Fortunately by the X-MP's time SRAMs had replaced magnetic core.
On 04/05/2022 03:35, rbowman wrote:
But at the time it was still often called core (bulk) memory as opposed
to faster cache memory. ISTR the memory chips were only 4k bits of SRAM.
Keeping the thing compact and cool was a major part of the engineering.
There is a rather nice article about its design online here.
http://www.chilton-computing.org.uk/ccd/supercomputers/p005.htm
Fortunately by the X-MP's time SRAMs had replaced magnetic core.
I'm not aware of any Cray systems that used core memory. It just makes
no sense for the speeds they ran at.
Martin Brown wrote:
SPICE simulation doesn't scale particularly well to large-scale
multiprocessor environments: too many long-range interactions.
If you search for "circuit sim and CUDA" it's out there. There's a
Github of "CUDA SPICE Circuit Simulator" .
No clue if it's worthwhile.
On Tuesday, 26 April 2022 at 17:44:53 UTC+2, jla...@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.
The conclusion from a senior scientist is that "it rains a lot more
during the worst storms."
Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.
Freon is another fake.
Climate is clocked by solar activity and by fluctuations in solar activity.
So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.
Removing trees within city limits, you can turn any city into a heat
island with rising temperatures, since by removing trees and grass you
destroy the rainwater retention mechanism.
Water absorbs heat from the sun by evaporation.
So if there is no water in the ground, no water is evaporated, heat
accumulates, and local temperatures rise.
On Monday, June 20, 2022 at 2:47:09 PM UTC+2, a a wrote:
Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.
What a load of nonsense. Al Gore's 1992 book
https://en.wikipedia.org/wiki/Earth_in_the_Balance
was a remarkably expert bit of science popularisation. He got the science right, not that the denialist propaganda machine is willing to admit it.
The book did make money, but not all that much. A decade later it did put Al Gore in a position to make money out of climate change, but that didn't mean that he wrote it with that in mind.
Freon is another fake.
In what sense? Chlorofluorocarbons do damage the ozone layer. We know exactly how - and we know that reducing their concentrations in the atmosphere is letting the ozone layer get denser again. The fakery here lies in your lie.
Climate is clocked by solar activity and by fluctuations in solar activity.
And the amount of CO2 and other greenhouse gases in the atmosphere. As Joseph Fourier worked out in 1824, if they weren't there the temperature of the surface of the Earth would be -18C. The difference between ice ages (atmospheric CO2 levels around 180 ppm) and interglacials (atmospheric CO2 levels around 270 ppm) also depends on the more extensive ice cover, but the CO2 levels do account for a lot of the difference.
So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.
Except that you can't. Solar activity doesn't explain the ice age to interglacial transitions, and only an ignorant idiot could imagine that it did.
Removing trees within city limits, you can turn any city in heat island with rising temperatures, since removing trees, grass, you destroy rainwater retention mechanism.
Total nonsense.
Water absorbs heat from the sun by evaporation.
But the water vapour retains the heat at the bottom of the atmosphere.
So if no water in the ground, no water evaporated and heat accumulates, making local temperatures to rise.
So what?
--
Bill Sloman, Sydney
On Monday, 20 June 2022 at 15:13:10 UTC+2, bill....@ieee.org wrote:
You are exactly "Total nonsense. "
Sydney is a low-science city, so we don't care.
Call prof. Mann and tell him there has been no sea level rise at the Pacific islands at all, or on the Maldives, for the last 1,000 years.
If the Kremlin funds hundreds of so-called pseudo-scientists world-wide to sell more natural gas, then call Greta and ask her where she is with the Global Warming fake today.
Where is the UNFCCC Bonn agency, where is the UN New York SIDS agency (Small Island Developing States) today?
If $Bs are pumped into your bank account, you will sell every paranoia as genuine science.
Global Warming is an old fake funded by Putin to sell more natural gas.
On Monday, June 20, 2022 at 3:36:04 PM UTC+2, a a wrote:
You are exactly "Total nonsense."
You may like to think so, but you haven't explained why you think that. It's blindingly obvious that you couldn't, even if you were silly enough to try.
Sydney is low science city, so we don't care.
https://www.fqt.unsw.edu.au/news/top-physics-prizes-awarded-to-unsw-researchers
I got in on a tour of that lab. I was impressed by their Raith electron beam microfabricator, which is a pretty impressive kind of lab tool.
Call prof. Mann and tell him, there has been no sea level rise at Pacific islands at all, on the Maledives, for the last 1,000 years.
The Maldives are in the Indian Ocean, not too far south of Ceylon. I'd prefer not to get jeered at as an ignorant idiot. And they do seem to be worried about sea level rise.
https://en.wikipedia.org/wiki/Maldives#Sea_level_rise
If Kremlin funds hundreds of so called pseudo scientists world-wide to sell more natural gas, so call Greta and ask her, where is she with the Global Warming fake today.
It's not the Kremlin that's funding the lying that is going on. Exxon-Mobil does a lot of that, but they are funding the climate change denial propaganda that seems to be fooling you.
where is UNFCC Bonn agency, where is UN New York SIDS agency today (Small Island Developing States)
Why should I care?
If $Bs are pumped into your bank account, so you sell every paranoia as a genuine science.
If only.
Global Warming is an old fake funded by Putin to sell more natural gas.
https://history.aip.org/climate/index.htm
It's been around for rather longer than Putin, and it's not a great way of selling natural gas. We'll have to stop burning that as fuel as well as coal and oil if we are going to stop raising the CO2 level in the atmosphere, which is getting to be urgently necessary, even if ignorant idiots like you don't understand why.
--
Bill Sloman, Sydney
On Monday, 20 June 2022 at 16:27:36 UTC+2, bill....@ieee.org wrote:around 180 ppm) and interglacials (atmospheric CO2 levels around 270 ppm) also depends on the more extensive ice cover during interglacials, but the CO2 levels do account for a lot of the difference.
On Monday, June 20, 2022 at 3:36:04 PM UTC+2, a a wrote:
On Monday, 20 June 2022 at 15:13:10 UTC+2, bill....@ieee.org wrote:
On Monday, June 20, 2022 at 2:47:09 PM UTC+2, a a wrote:
On Tuesday, 26 April 2022 at 17:44:53 UTC+2, jla...@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme storms" from 1982 to 2014.
The conclusion from a senior scientist is that "it rains a lot more
during the worst storms."
Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.What a load of nonsense. Al Gore's 1992 book
https://en.wikipedia.org/wiki/Earth_in_the_Balance
was a remarkably expert bit of science popularisation. He got the science right, not that the denialist propaganda machine is willing to admit it.
The book did make money, but not all that much. A decade later it did put Al Gore in a position to make money out of climate change, but that didn't mean that he wrote it with that in mind.
Freon is another fake.
In what sense? Chlorofluorocarbons do damage the ozone layer. We know exactly how - and we know that reducing their concentrations in the atmosphere is letting the ozone layer get denser again. The fakery here lies in your lie.
Climate is clocked by solar activity and by fluctuations in solar activity.And the amount of CO2 and other green-house gases in the atmosphere. As Joseph Fourier worked out in 1824, if they weren't there the temperature of the surface of the Earth would be -18C. The difference between ice ages (atmospheric CO2 levels
urgently necessary, even if ignorant idiots like you don't understand why.So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.Except that you can't. Solar activity doesn't explain the ice age to interglacial transitions, and only an ignorant idiot could imagine that they did
Removing trees within city limits, you can turn any city in heat island with rising temperatures, since removing trees, grass, you destroy rainwater retention mechanism.
Total nonsense.
Water absorbs heat from the sun by evaporation.
But the water vapour retains the heat at the bottom of the atmosphere.
So if no water in the ground, no water evaporated and heat accumulates, making local temperatures to rise.
So what?
You are exactly "Total nonsense. "
You may like to think so, but you haven't explained why you think that. It's blindingly obvious that you couldn't, even if you were silly enough to try.
Sydney is low science city, so we don't care.
https://www.fqt.unsw.edu.au/news/top-physics-prizes-awarded-to-unsw-researchers
I got in on a tour of that lab. I was impressed by their Raith electron beam microfabricator, which is a pretty impressive kind of lab tool.
Call prof. Mann and tell him, there has been no sea level rise at Pacific islands at all, on the Maledives, for the last 1,000 years.
The Maledives are in the Indian Ocean, not too far south of Ceylon. I'd prefer not to get jeered at as an ignorant idiot. And they do seem to be worried about
sea level rise.
https://en.wikipedia.org/wiki/Maldives#Sea_level_rise
If Kremlin funds hundreds of so called pseudo scientists world-wide to sell more natural gas, so call Greta and ask her, where is she with the Global Warming fake today.
It's not the Kremlin that's funding the lying that is going on. Exxon-Mobile does a lot of that, but they are funding the climate change denial propaganda that seems to be fooling you.
where is UNFCC Bonn agency, where is UN New York SIDS agency today (Small Island Developing States)
Why should I care?
If $Bs are pumped into your bank account, so you sell every paranoia as a genuine science.
If only.
Global Warming is an old fake funded by Putin to sell more natural gas.
https://history.aip.org/climate/index.htm
It's been around for rather longer than Putin, and it's not a great way of selling natural gas. We'll have to stop burning that as fuel as well as coal and oil if we are going to stop raising the CO2 level in the atmosphere, which is getting to be urgently necessary.
Since Global Warming fake is long lasting fake, funded by Kremlin
It was my excellent long-year job to move UN agencies from Global Warming fake to Climate Change
Climate Change is pure tautology by Heraclitus Everything flows - Panta rhei
BTW
Australia, Sydney is low on science due to low population, not attracting foreign scientists, researchers and low AUD exchange rate.
On Monday, 20 June 2022 at 16:27:36 UTC+2, bill....@ieee.org wrote:
On Monday, June 20, 2022 at 3:36:04 PM UTC+2, a a wrote:
On Monday, 20 June 2022 at 15:13:10 UTC+2, bill....@ieee.org wrote:
On Monday, June 20, 2022 at 2:47:09 PM UTC+2, a a wrote:
On Tuesday, 26 April 2022 at 17:44:53 UTC+2, jla...@highlandsniptechnology.com wrote:
Sydney is low science city, so we don't care.
CO2 is welcome
CO2 is Plant Food
Plants are Animal Food
Animals are Human Food
More CO2 more Human Food to end the world hunger.
Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.
Freon is another fake.
Climate is clocked by solar activity and by fluctuations in solar activity.
So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.
On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:
you are completely wrong
Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.
False, of course. Al Gore had some election-year lies aimed that way, and as for 'make money',
all Mann got was some frivolous lawsuits. The court had his legal team's fees reimbursed
by the jackals that brought the suit against him. The court did that because the suits
were found to be 'barratry', rather than being serious complaints.
Freon is another fake.
Not so; that's a DuPont tradename for fluorocarbon products, under worldwide ban
due to ozone depletion.
Climate is clocked by solar activity and by fluctuations in solar activity.
'clocked by'??? Climate is affected by solar heat (influx of heat dominates during the day) and
radiative cooling (heat dissipates into space, dominates the heat flow at night). The steady-state
average temperature isn't proportional to anything the Sun does, but is set by the difference of
those two heat flows (the difference is zero when steady-state temperature is achieved).
So, your identification of 'solar activity' is only a half-truth at best, and in most literature,
'solar activity' only means sunspot fluctuations, not solar heat output. Indeed, the solar
heat output is set by fusion rates in the sun's center, many thousands of miles away from
the photosphere where we see sunspots.
So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.
Utterance of nonsense is detected. Agriculture, forestry, water resources, sea life are all being hurt
by climate change, and gazing at Mr. Sun isn't a rational plan to deal with it.
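[whit3rd's two-heat-flows argument can be put in numbers with the standard zero-dimensional energy balance: at steady state, absorbed sunlight equals radiated heat. A minimal Python sketch; the 1361 W/m^2 irradiance and 0.30 albedo are textbook approximations, not figures taken from this thread.]

```python
# Zero-dimensional energy-balance sketch: steady-state temperature is where
# absorbed solar flux equals outgoing thermal (Stefan-Boltzmann) radiation.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # total solar irradiance at Earth, W m^-2 (approximate)
ALBEDO = 0.30      # fraction of sunlight reflected (textbook approximation)

def equilibrium_temp(solar_const=S0, albedo=ALBEDO, emissivity=1.0):
    """Temperature (K) at which absorbed flux equals radiated flux."""
    absorbed = solar_const * (1.0 - albedo) / 4.0  # averaged over the sphere
    return (absorbed / (emissivity * SIGMA)) ** 0.25

print(round(equilibrium_temp(), 1))  # ~254.6 K with emissivity 1
```

[With emissivity 1 this gives roughly 255 K; the ~33 K gap up to the observed ~288 K surface average is what the greenhouse effect accounts for, which is the point of the "difference of two heat flows" remark above.]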
On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:
Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.
False, of course. Al Gore had some election-year lies aimed that way, and as for 'make money',
all Mann got was some frivolous lawsuits. The court had his legal team's fees reimbursed
by the jackals that brought the suit against him. The court did that because the suits
were found to be 'barratry', rather than being serious complaints.
Freon is another fake.
Not so; that's a DuPont tradename for fluorocarbon products, under worldwide ban
due to ozone depletion.
Climate is clocked by solar activity and by fluctuations in solar activity.
'clocked by'??? Climate is affected by solar heat (influx of heat dominates during the day) and
radiative cooling (heat dissipates into space, dominates the heatflow at night). The steady-state
average temperature isn't proportional to anything the Sun does, but is set by the difference of
those two heat flows (the difference is zero when steady-state temperature is achieved).
So, your identification of 'solar activity' is only a half-truth at best, and in most literature,
'solar activity' only means sunspot fluctuations, not solar heat output. Indeed, the solar
heat output is set by fusion rates in the sun's center, many thousands of miles away from
the photosphere where we see sunspots.
So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.
Utterance of nonsense is detected. Agriculture, forestry, water resources, sea life are all being hurt
by climate change, and gazing at Mr.Sun isn't a rational plan to deal with it.
you are completely wrong
We all Love Carbon
We all Love CO2
The balance of soil carbon is held in peat and wetlands (150 GtC), and ...
GHG - greenhouse gas emissions - is an old fake and political agenda developed by US politicians to kill the China economy
but China mirrored the attack and turned itself into a reduced-emissions Global Factory
in green technologies like solar panels, wind turbines
The "Climate Change is clocked by fluctuations in solar activity" Agenda
is backed today by NASA (Solar Lab) and others with public money, not coming from Russia.
On Monday, June 20, 2022 at 5:46:59 PM UTC+2, a a wrote:
BTW
Australia, Sydney is low on science due to low population, not
attracting foreign scientists, researchers and low AUD exchange
rate.
...
You seem to be pig-ignorant in a whole range of areas, not just
climate science.
On Tuesday, June 21, 2022 at 1:39:46 PM UTC-7, a a wrote:
On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:
Climate Change is an old fake by Al Gore, Prof. Mann and their team to make money fast.
False, of course. Al Gore had some election-year lies aimed that way, and as for 'make money',
all Mann got was some frivolous lawsuits. The court had his legal team's fees reimbursed
by the jackals that brought the suit against him. The court did that because the suits
were found to be 'barratry', rather than being serious complaints.
Freon is another fake.
Not so; that's a DuPont tradename for fluorocarbon products, under worldwide ban
due to ozone depletion.
Climate is clocked by solar activity and by fluctuations in solar activity.
'clocked by'??? Climate is affected by solar heat (influx of heat dominates during the day) and
radiative cooling (heat dissipates into space, dominates the heatflow at night). The steady-state
average temperature isn't proportional to anything the Sun does, but is set by the difference of
those two heat flows (the difference is zero when steady-state temperature is achieved).
So, your identification of 'solar activity' is only a half-truth at best, and in most literature,
'solar activity' only means sunspot fluctuations, not solar heat output. Indeed, the solar
heat output is set by fusion rates in the sun's center, many thousands of miles away from
the photosphere where we see sunspots.
So it's a waste of time and money to study Climate Change, living on the Earth, if you can easily study fluctuations in solar activity to get science on what really controls the Climate.
Utterance of nonsense is detected. Agriculture, forestry, water resources, sea life are all being hurt
by climate change, and gazing at Mr.Sun isn't a rational plan to deal with it.
you are completely wrong
Okay, where? Zero details? It sounds like a lame excuse for a lack of criticism.
We all Love Carbon
That's an odd perversion; diamonds are pretty, though.
We all Love CO2
Live with it, yes; also a few other gasses.
The balance of soil carbon is held in peat and wetlands (150 GtC), and ...
and temporary surface repositories are not part of the carbon cycle in the atmosphere versus Earth's
crust, because they can go either up or down (they're in between, available to burn or get buried).
GHG - greenhouse gas emissions is an old fake and political agenda developed by US politicians to kill China economy
There's an agenda around it, nowadays, and politicians. The US can (and has) developed bits,
as have other nations; 192 of 'em if I remember the Paris Accords count.
but China mirorred the attack and turned himself into reduced emissions Global Factory
in green technologies like solar panels, wind turbines
It wasn't an attack. China is one of the 192 that joined the Paris Accord, and as you
say they're developing bits of their own agenda, as well as exporting bits.
Climate Change is clocked by fluctuations in solar activity Agenda
is backed today by NASA (Solar Lab) and others with public money, not coming from Russia.
Not true. Studied by solar scientists, yes; but what does 'clocked' mean? And, where's
the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
are you really convinced sunspot activity is relevant?
On Wednesday, 22 June 2022 at 03:58:30 UTC+2, whit3rd wrote:
...where's
the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
are you really convinced sunspot activity is relevant?
read first
https://www.nasa.gov/mission_pages/sdo/main/index.html
On Tuesday, June 21, 2022 at 1:39:46 PM UTC-7, a a wrote:
<snipped verbatim repeat of the preceding exchange>
every troll is free to contact NASA Solar Lab directly to get data on
<snip>
On Wednesday, June 22, 2022 at 2:31:02 AM UTC-7, a a wrote:
On Wednesday, 22 June 2022 at 03:58:30 UTC+2, whit3rd wrote:
...where's
the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
are you really convinced sunspot activity is relevant?
read first
https://www.nasa.gov/mission_pages/sdo/main/index.html
No, not going to read an index... you didn't cite a single work, or result, and that makes this an evasion rather than an answer.
Solar dynamics isn't significant heat modulation related to global warming. Here's a picture from NASA <https://climate.nasa.gov/faq/14/is-the-sun-causing-global-warming/>
On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:
you are completely wrong
We all Love Carbon
We all Love CO2
I am really sorry, you represent low science - no science but
Water H2O in its gaseous state, or water vapor, is the only greenhouse gas because of its high heat of vaporization
On Tuesday, June 21, 2022 at 10:39:46 PM UTC+2, a a wrote:
On Tuesday, 21 June 2022 at 19:48:01 UTC+2, whit3rd wrote:
<snip>
On Monday, June 20, 2022 at 5:47:09 AM UTC-7, a a wrote:
you are completely wrong
a a does like posting claims like that. Since he is an obvious idiot, it is a waste of bandwidth.
We all Love Carbon
Only those who are as brain-dead as a a.
We all Love CO2
<snipped a large chunk of uncomprehended cut and paste which didn't say anything relevant>
I am really sorry, you represent low science - no science but
Your sorrow should be directed at your own abysmal ignorance.
<snipped more fatuous nonsense>
Water H2O in its gaseous state, or water vapor, is the only greenhouse gas because of its high heat of vaporization
Water vapour - like gaseous CO2 - is a greenhouse gas because it absorbs and re-radiates specific near-infrared frequencies. Other molecules - like methane - that are active in the infra-red are also greenhouse gases.
https://webbook.nist.gov/cgi/cbook.cgi?ID=C7732185&Mask=800#Electronic-Spec
You've actually got to take the rotational modes into account to work out what the infra-red spectrum actually looks like, and get reliable greenhouse numbers.
https://en.wikipedia.org/wiki/Svante_Arrhenius
had the right idea in 1896, but Knut Ångström in 1900 published low-resolution infrared absorption spectra which appeared to show he'd got it wrong. When we finally got spectrometers that could resolve the rotational fine structure, Arrhenius was vindicated, but he was dead by then.
The heat of vapourisation doesn't come into it.
<snipped more incoherent raving>
--
Bill Sloman, Sydney
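[The Arrhenius result referred to above is usually summarised by a logarithmic forcing approximation. A minimal sketch; the 5.35 W/m^2 coefficient and the 0.8 K per W/m^2 sensitivity are rough, commonly quoted literature values, not numbers taken from this thread.]

```python
import math

# Arrhenius-style logarithmic CO2 forcing: doubling the concentration adds
# a roughly constant increment of radiative forcing.
def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing (W/m^2) relative to a pre-industrial baseline."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def warming(c_ppm, sensitivity=0.8):
    """Equilibrium temperature change (K) for that forcing, assuming a
    linear climate sensitivity in K per (W/m^2)."""
    return sensitivity * co2_forcing(c_ppm)

print(round(warming(420), 2))  # ~1.74 K for ~420 ppm under these assumptions
```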
On Wednesday, 22 June 2022 at 12:25:11 UTC+2, bill....@ieee.org wrote:
<snipped full quote of the preceding post>
as you can see, Sydney is low science, making irrelevant claims
Water H2O in its gaseous state, or water vapor, is the only greenhouse gas because of its high heat of vaporization.
<<Water vapour - like gaseous CO2 - is a greenhouse gas because it absorbs and re-radiates specific near-infrared frequencies. Other molecules - like methane - that are active in the infra-red are also green-house gases.
it doesn't matter! What matters is !!!!
Water’s heat of vaporization is around 540 cal/g at 100 °C, water's boiling point.
Why does water have a high heat of vaporization?
That is, water has a high heat of vaporization, the amount of energy needed to change one gram of a liquid substance to a gas at constant temperature. Water’s heat of vaporization is around 540 cal/g at 100 °C, water's boiling point.
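[The 540 cal/g figure quoted above is easy to sanity-check by unit conversion; a quick sketch using standard conversion factors (nothing here comes from the thread itself):]

```python
# Unit check on water's heat of vaporization, quoted as ~540 cal/g at 100 C.
CAL_TO_J = 4.184        # thermochemical calorie to joule
M_WATER = 18.015        # molar mass of water, g/mol

hv_cal_per_g = 540.0
hv_j_per_g = hv_cal_per_g * CAL_TO_J           # ~2259 J/g
hv_kj_per_mol = hv_j_per_g * M_WATER / 1000.0  # ~40.7 kJ/mol

print(round(hv_j_per_g), round(hv_kj_per_mol, 1))  # 2259 J/g, 40.7 kJ/mol
```

[The number checks out, but note what it measures: the energy carried when liquid water evaporates. It says nothing about whether water vapour absorbs infrared, which is the property that makes something a greenhouse gas, as the rebuttal above points out.]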
On Wednesday, 22 June 2022 at 12:54:53 UTC+2, a a wrote:
On Wednesday, 22 June 2022 at 11:51:32 UTC+2, whit3rd wrote:
On Wednesday, June 22, 2022 at 2:31:02 AM UTC-7, a a wrote:
On Wednesday, 22 June 2022 at 03:58:30 UTC+2, whit3rd wrote:
...where's
the citation we'd expect of a result from the NASA Solar Physics laboratory, and (for that matter)
are you really convinced sunspot activity is relevant?
read first
https://www.nasa.gov/mission_pages/sdo/main/index.html
No, not going to read an index... you didn't cite a single work, or result, and that makes this an evasion rather than an answer.
Solar dynamics isn't significant heat modulation related to global warming.
Here's a picture from NASA <https://climate.nasa.gov/faq/14/is-the-sun-causing-global-warming/>
excellent, excellent
look again at your chart
Total Solar Irradiance values are fixed at 1361+/- W/m2 level
What matters is "Total"
Temperature increase by 1 degree C over the span of 100 years, as declared by Prof. Mann,
is exactly within calculation/data-collection error.
What matters is Plasma Leaving the Sun and flares, clocking the Climate Changes on the Earth
http://sdoisgo.blogspot.com/2022/03/an-x13-flare-and-cool-view-of-plasma.html
https://sdo.gsfc.nasa.gov/
There is only one scientist at NASA, Solar Dynamics, who can tell you the truth behind the coronal loops and plasma leaving the Sun cycles.
BTW
Ask your friends from NASA Climate
to ink image and comments with full name next time.
https://climate.nasa.gov/faq/14/is-the-sun-causing-global-warming/
in reply to low-science Sydney:
SDO is designed to help us understand the Sun's influence on Earth and Near-Earth space by studying the solar atmosphere on small scales of space and time and in many wavelengths simultaneously.
https://sdo.gsfc.nasa.gov/
https://www.blogger.com/profile/16479620366654056823
http://sdoisgo.blogspot.com/
http://sdoisgo.blogspot.com/2022/03/an-x13-flare-and-cool-view-of-plasma.html
Blog Description
This is the Solar Dynamics Observatory Mission blog. It will consist of mission status, news, and event updates.
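[Since equilibrium temperature scales as the fourth root of irradiance in the steady-state balance discussed earlier in the thread, one can sketch how much a solar-cycle swing in total solar irradiance could move the temperature. The ~0.1% cycle amplitude used here is a commonly quoted approximation, not a figure from the thread.]

```python
# If T scales as S^(1/4), then to first order dT/T = (1/4) * dS/S.
T_EFF = 255.0       # effective radiating temperature of Earth, K (approximate)
DS_OVER_S = 0.001   # ~0.1% peak-to-trough solar-cycle variation in TSI

dT = T_EFF * DS_OVER_S / 4.0
print(round(dT, 3))  # ~0.064 K -- far smaller than the ~1 K observed warming
```

[Under these assumptions the solar-cycle fluctuation in the "Total" irradiance moves the equilibrium temperature by only a few hundredths of a kelvin, which is why the NASA chart linked above treats it as a minor term.]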