• Cosmological Problems

    From Richard D. Saam@21:1/5 to All on Wed Oct 24 20:42:11 2018
    There have been recent cosmological experimental disclosures
    represented in part by the following:

    The 7Be(n,p)7Li reaction and the Cosmological Lithium Problem:
    measurement of the cross section in a wide energy range at n_TOF (CERN) https://arxiv.org/abs/1806.03050
    "The new estimate of the 7Be destruction rate based on the new results
    yields a decrease of the predicted cosmological Lithium abundance of
    ∼10%, insufficient to provide a viable solution to the Cosmological
    Lithium Problem."
    What is the resolution to the Lithium problem?

    No WIMPS have been found in reference to the dark matter problem.
    What are the dark matter alternatives?

    Collective Effects in Nuclear Collisions: Experimental Overview https://arxiv.org/abs/1810.06978
    Viscosity plays an important role in measured LHC and RHIC nuclear dynamics.
    Does this viscosity experimental result
    influence BBN gas-phase mechanisms?

    MILKY WAY CEPHEID STANDARDS FOR MEASURING COSMIC DISTANCES AND
    APPLICATION TO Gaia DR2:
    IMPLICATIONS FOR THE HUBBLE CONSTANT
    https://arxiv.org/abs/1804.10655
    The Planck H0 = 67.4 km/s/Mpc is based on CMB.
    The reported H0 = 73.24 km/s/Mpc is based on photometric parallaxes.
    What mechanism explains the difference?

    The universe's increasing expansion rate
    is an expression of dark energy measured by supernovae type II events.
    What is dark energy?

    The cosmological constant problem or the vacuum catastrophe
    indicates a vacuum energy theory differing from experiment
    by 120 orders of magnitude.
    What is the vacuum energy?

    Is there a common theoretical mechanistic thread
    connecting these experimental dots?

    Richard D Saam

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Phillip Helbig (undress to reply)@21:1/5 to Richard D. Saam on Thu Oct 25 10:15:44 2018
    In article <qZidnZewGob5elHGnZ2dnUU7-YnNnZ2d@giganews.com>,
    "Richard D. Saam" <rdsaam@att.net> writes:

    There have been recent cosmological experimental disclosures
    represented in part by the following:

    The 7Be(n,p)7Li reaction and the Cosmological Lithium Problem:
    measurement of the cross section in a wide energy range at n_TOF (CERN) https://arxiv.org/abs/1806.03050
    "The new estimate of the 7Be destruction rate based on the new results
    yields a decrease of the predicted cosmological Lithium abundance of
    ~10%, insufficient to provide a viable solution to the Cosmological
    Lithium Problem."
    What is the resolution to the Lithium problem?

    I, personally, don't know. Note, however, that the observations are not easy.

    No WIMPS have been found in reference to the dark matter problem.
    What are the dark matter alternatives?

    Primordial black holes are still viable. Also, absence of evidence is
    not evidence of absence. Even though we knew the sources and how many
    were produced, it still took a long time before neutrinos were
    discovered. Since practically nothing is known about WIMPs, there are
    no robust predictions for cross sections and hence reaction rates.

    Collective Effects in Nuclear Collisions: Experimental Overview https://arxiv.org/abs/1810.06978
    Viscosity plays an important role in measured LHC and RHIC nuclear dynamics.
    Does this viscosity experimental result
    influence BBN gas-phase mechanisms?

    Do you have reason to think so? BBN seems reasonably successful.

    MILKY WAY CEPHEID STANDARDS FOR MEASURING COSMIC DISTANCES AND
    APPLICATION TO Gaia DR2:
    IMPLICATIONS FOR THE HUBBLE CONSTANT
    https://arxiv.org/abs/1804.10655
    The Planck H0 = 67.4 km/s/Mpc is based on CMB.
    The reported H0 = 73.24 km/s/Mpc is based on photometric parallaxes.
    What mechanism explains the difference?

    Add the error bars and you have a three-sigma difference. Most would
    consider it irresponsible to base a detection on three sigma, so why
    base a tension on it?

    The universe's increasing expansion rate
    is an expression of dark energy measured by supernovae type II events.
    What is dark energy?

    Observationally, it is indistinguishable from a cosmological constant.
    Theoretically, there is no reason it is not the cosmological constant.
    There is no problem.

    The cosmological constant problem or the vacuum catastrophe
    indicates a vacuum energy theory differing from experiment
    by 120 orders of magnitude.
    What is the vacuum energy?

    It is clear from quantum field theory what it is. Why the observed
    cosmological constant is much smaller is not completely clear, but 40
    years ago Weinberg came up with an anthropic explanation, which no-one
    has refuted. Also, look for the paper by Bianchi and Rovelli. This is
    probably another non-problem.

    Is there a common theoretical mechanistic thread
    connecting these experimental dots?

    Probably not. To prove that there is, one would have to construct such
    a theory.

  • From Richard D. Saam@21:1/5 to All on Sat Oct 27 09:49:48 2018
    On 10/25/18 12:15 PM, Phillip Helbig (undress to reply) wrote:
    In article <qZidnZewGob5elHGnZ2dnUU7-YnNnZ2d@giganews.com>,
    "Richard D. Saam" <rdsaam@att.net> writes:

    There have been recent cosmological experimental disclosures
    represented in part by the following:

    The 7Be(n,p)7Li reaction and the Cosmological Lithium Problem:
    measurement of the cross section in a wide energy range at n_TOF (CERN)
    https://arxiv.org/abs/1806.03050
    "The new estimate of the 7Be destruction rate based on the new results
    yields a decrease of the predicted cosmological Lithium abundance of
    ~10%, insufficient to provide a viable solution to the Cosmological
    Lithium Problem."
    What is the resolution to the Lithium problem?

    I, personally, don't know. Note, however, that the observations are not easy.
    Something amiss with BBN?

    No WIMPS have been found in reference to the dark matter problem.
    What are the dark matter alternatives?

    Primordial black holes are still viable.
    but there are arguments against black holes as dark matter https://arxiv.org/abs/1808.05910
    Also, absence of evidence is
    not evidence of absence. Even though we knew the sources and how many
    were produced, it still took a long time before neutrinos were
    discovered. Since practically nothing is known about WIMPs, there are
    no robust predictions for cross sections and hence reaction rates.

    Collective Effects in Nuclear Collisions: Experimental Overview
    https://arxiv.org/abs/1810.06978
    Viscosity plays an important role in measured LHC and RHIC nuclear dynamics.
    Does this viscosity experimental result
    influence BBN gas-phase mechanisms?

    Do you have reason to think so? BBN seems reasonably successful.

    Yes, observed BBN expressions are reasonably successful
    (except for the Lithium problem),
    but the realm of fluid viscosity is different from gas kinematics.
    It implies that we are looking at gas-phase BBN
    while there also exists a viscous BBN,
    something like looking at steam
    while knowing water is present somewhere.

    MILKY WAY CEPHEID STANDARDS FOR MEASURING COSMIC DISTANCES AND
    APPLICATION TO Gaia DR2:
    IMPLICATIONS FOR THE HUBBLE CONSTANT
    https://arxiv.org/abs/1804.10655
    The Planck H0 = 67.4 km/s/Mpc is based on CMB.
    The reported H0 = 73.24 km/s/Mpc is based on photometric parallaxes.
    What mechanism explains the difference?

    Add the error bars and you have a three-sigma difference. Most would consider it irresponsible to base a detection on three sigma, so why
    base a tension on it.

    The paper reports a 96.5% confidence level
    for the difference between the H0 values 67.4 and 73.24 km/s/Mpc:
    'The best-fit distance scale is 1.006 ± 0.033, relative to the scale
    from Riess et al. (2016) with H0 = 73.24 km s^-1 Mpc^-1 used to predict
    the parallaxes photometrically, and is inconsistent with the scale
    needed to match the Planck 2016 CMB data combined with ΛCDM at the 2.9σ
    confidence level (99.6%). At 96.5% confidence we find that the formal
    DR2 errors may be underestimated as indicated.'

    The universe's increasing expansion rate
    is an expression of dark energy measured by supernovae type II events.
    What is dark energy?

    Observationally, it is indistinguishable from a cosmological constant. Theoretically, there is no reason it is not the cosmological constant.
    There is no problem.

    Here are some of Weinberg's thoughts on the problem: http://supernova.lbl.gov/~evlinder/weinberg.pdf

    'The problem of the dark energy is also central to today's physics.
    Our best attempts at a fundamental theory
    suggest the presence of a cosmological constant
    that is many (perhaps as many as 120) orders of magnitude greater
    than the upper bound set by astronomical observations. Until it is
    solved, the problem of the dark energy
    will be a roadblock on our path
    to a comprehensive fundamental physical theory.'

    The cosmological constant problem or the vacuum catastrophe
    indicates a vacuum energy theory differing from experiment
    by 120 orders of magnitude.
    What is the vacuum energy?

    It is clear from quantum field theory what it is. Why the observed
    cosmological constant is much smaller is not completely clear, but 40
    years ago Weinberg came up with an anthropic explanation, which no-one
    has refuted. Also, look for the paper by Bianchi and Rovelli. This is probably another non-problem.

    ditto Weinberg's thoughts from above

    Is there a common theoretical mechanistic thread
    connecting these experimental dots?

    Probably not. To prove that there is, one would have to construct such
    a theory.

    Maybe the mechanism is as simple as recognizing an entirely different
    phase, analogous to liquid and gas relationships.
    Then observational cosmology and associated theories
    are not eliminated but complemented.

  • From Phillip Helbig (undress to reply)@21:1/5 to Saam" on Sun Oct 28 12:41:27 2018
    In article <i8-dnai1ztWWwU7GnZ2dnUU7-RXNnZ2d@giganews.com>, "Richard D.
    Saam" <rdsaam@att.net> writes:

    No WIMPS have been found in reference to the dark matter problem.
    What are the dark matter alternatives?

    Primordial black holes are still viable.

    but there are arguments against black holes as dark matter https://arxiv.org/abs/1808.05910

    First, note that the title contains "clustered". So it is addressing
    only the question of clustered primordial black holes. Second,
    apparently this hasn't been accepted by any journal, perhaps not even submitted, so there might have been no external check. Note also that
    they write "Throughout this letter we have assumed a monochromatic
    initial PBH mass distribution." They do cite Carr, Kühnel, and
    Sandstad. Read that paper. It has almost 300 references. Read them as
    well. Those folks have done their homework.

    As luck would have it, just a few days ago I heard a talk on this topic
    by Florian Kühnel. He pointed out that many constraints are too strong
    because they are derived based on wrong or flawed assumptions. At

    https://indico.cern.ch/event/736594/timetable/?view=nicecompact

    you can download his presentation (first talk after lunch on Tuesday 23 October).

    [[Mod. note -- The presentation in question seems to be in Apple
    "keynote" format. At least on my computer, Libreoffice (v6.0.2.2.1)
    was unable to read it.
    -- jt]]

    Collective Effects in Nuclear Collisions: Experimental Overview
    https://arxiv.org/abs/1810.06978
    Viscosity plays an important role in measured LHC and RHIC nuclear dynamics.
    Does this viscosity experimental result
    influence BBN gas-phase mechanisms?

    Do you have reason to think so? BBN seems reasonably successful.

    Yes, observed BBN expressions are reasonably successful
    (except for the Lithium problem),
    but the realm of fluid viscosity is different from gas kinematics.
    It implies that we are looking at gas-phase BBN
    while there also exists a viscous BBN,
    something like looking at steam
    while knowing water is present somewhere.

    If you can show that some sort of viscous BBN solves the lithium
    problem, fine, but merely juxtaposing various words with one another
    might not even indicate a problem, much less a solution.

    MILKY WAY CEPHEID STANDARDS FOR MEASURING COSMIC DISTANCES AND
    APPLICATION TO Gaia DR2:
    IMPLICATIONS FOR THE HUBBLE CONSTANT
    https://arxiv.org/abs/1804.10655
    The Planck H0 = 67.4 km/s/Mpc is based on CMB.
    The reported H0 = 73.24 km/s/Mpc is based on photometric parallaxes.
    What mechanism explains the difference?

    Add the error bars and you have a three-sigma difference. Most would
    consider it irresponsible to base a detection on three sigma, so why
    base a tension on it?

    The paper reports a 96.5% confidence level
    for the difference between the H0 values 67.4 and 73.24 km/s/Mpc:
    'The best-fit distance scale is 1.006 ± 0.033, relative to the scale
    from Riess et al. (2016) with H0 = 73.24 km/s/Mpc used to predict
    the parallaxes photometrically, and is inconsistent with the scale
    needed to match the Planck 2016 CMB data combined with LambdaCDM at the
    2.9σ confidence level (99.6%). At 96.5% confidence we find that the
    formal DR2 errors may be underestimated as indicated.'

    So only 2.9 sigma. Again, a claimed detection at 2.9 sigma would be
    deemed over-confident, so the same rules should apply for a claimed
    tension. In both cases, there might be something interesting, but one
    is far from being able to claim that something is seriously wrong. Note
    that not long ago there were claims that the Hubble constant was as low
    as 30 or as high as 100, with about 20% errors. Did any "new physics"
    come of that? No. Probably something similar will happen here.

    The universe's increasing expansion rate
    is an expression of dark energy measured by supernovae type II events.
    What is dark energy?

    Observationally, it is indistinguishable from a cosmological constant. Theoretically, there is no reason it is not the cosmological constant. There is no problem.

    Here are some of Weinberg's thoughts on the problem: http://supernova.lbl.gov/~evlinder/weinberg.pdf

    'The problem of the dark energy is also central to today's physics.
    Our best attempts at a fundamental theory
    suggest the presence of a cosmological constant
    that is many (perhaps as many as 120) orders of magnitude greater
    than the upper bound set by astronomical observations. Until it is
    solved, the problem of the dark energy
    will be a roadblock on our path
    to a comprehensive fundamental physical theory.'

    Weinberg presented an anthropic argument for the observed value of the cosmological constant, and many believe that there is no better
    explanation. It is not a FUNDAMENTAL explanation, but then there might
    not be one. There is no FUNDAMENTAL explanation for the distance of the
    Earth from the Sun, but there is an easily understood anthropic effect.

    The cosmological constant problem or the vacuum catastrophe
    indicates a vacuum energy theory differing from experiment
    by 120 orders of magnitude.
    What is the vacuum energy?

    It is clear from quantum field theory what it is. Why the observed
    cosmological constant is much smaller is not completely clear, but 40
    years ago Weinberg came up with an anthropic explanation, which no-one
    has refuted. Also, look for the paper by Bianchi and Rovelli. This is probably another non-problem.

    ditto Weinberg's thoughts from above

    The point is that Bianchi and Rovelli explicitly address the concerns of Weinberg but not vice versa. No progress can be made by just quoting
    someone who supports one's own point of view and ignoring others,
    especially if the latter explicitly address issues raised by the former.

    Is there a common theoretical mechanistic thread
    connecting these experimental dots?

    Probably not. To prove that there is, one would have to construct such
    a theory.

    Maybe the mechanism is as simple as recognizing an entirely different
    phase, analogous to liquid and gas relationships.
    Then observational cosmology and associated theories
    are not eliminated but complemented.

    Without some quantitative results, this is just a collection of
    buzzwords. If you think that this idea can somehow solve some problems,
    then present some quantitative results.

  • From Steve Willner@21:1/5 to Richard D. Saam on Thu Nov 8 23:37:58 2018
    In article <qZidnZewGob5elHGnZ2dnUU7-YnNnZ2d@giganews.com>,
    "Richard D. Saam" <rdsaam@att.net> writes:
    What is the resolution to the Lithium problem?

    Subsequent to the Big Bang, lithium is created by cosmic rays and
    destroyed by stars. Are the yields known well enough to determine
    the BB abundance?

    No WIMPS have been found in reference to the dark matter problem.
    What are the dark matter alternatives?

    Massive particles that don't interact other than by gravitation (or
    have extremely low cross sections) come to mind. Theorists are
    creative, and no doubt there are a vast number of candidates.

    Collective Effects in Nuclear Collisions: Experimental Overview https://arxiv.org/abs/1810.06978
    Viscosity plays an important role in measured LHC and RHIC nuclear dynamics.
    Does this viscosity experimental result
    influence BBN gas-phase mechanisms?

    Someone else will have to answer that one. Only light nuclei matter,
    though.

    MILKY WAY CEPHEID STANDARDS FOR MEASURING COSMIC DISTANCES AND
    APPLICATION TO Gaia DR2:
    IMPLICATIONS FOR THE HUBBLE CONSTANT
    https://arxiv.org/abs/1804.10655
    The Planck H0 = 67.4 km/s/Mpc is based on CMB.
    The reported H0 = 73.24 km/s/Mpc is based on photometric parallaxes.
    What mechanism explains the difference?

    The difference is less than 3 sigma so may not be real. If there is
    a difference, time-variable dark energy would be one possibility.
    The Adam Riess colloquium I posted about earlier https://www.youtube.com/watch?v=eSPCy-IJaPg
    is still relevant and easy to follow.

    The universe's increasing expansion rate
    is an expression of dark energy measured by supernovae type II events.

    I think you mean Type Ia SNe, but there are numerous other
    measurements that agree.

    What is dark energy?

    Nobody knows, but a cosmological constant is consistent with all
    data so far.

    The cosmological constant problem or the vacuum catastrophe
    indicates a vacuum energy theory differing from experiment
    by 120 orders of magnitude.

    So much for theory!

    --
    Help keep our newsgroup healthy; please don't feed the trolls.
    Steve Willner Phone 617-495-7123 swillner@cfa.harvard.edu Cambridge, MA 02138 USA

  • From Richard D. Saam@21:1/5 to Steve Willner on Fri Dec 14 18:40:52 2018
    On 11/9/18 1:37 AM, Steve Willner wrote:
    MILKY WAY CEPHEID STANDARDS FOR MEASURING COSMIC DISTANCES AND
    APPLICATION TO Gaia DR2:
    IMPLICATIONS FOR THE HUBBLE CONSTANT
    https://arxiv.org/abs/1804.10655
    The Planck H0 = 67.4 km/s/Mpc is based on CMB.
    The reported H0 = 73.24 km/s/Mpc is based on photometric parallaxes.
    What mechanism explains the difference?

    The difference is less than 3 sigma so may not be real. If there is
    a difference, time-variable dark energy would be one possibility.
    The Adam Riess colloquium I posted about earlier https://www.youtube.com/watch?v=eSPCy-IJaPg
    is still relevant and easy to follow.
    Referencing a good summary article on the H0 tension:
    Measuring cosmic distances with standard sirens, Physics Today, Dec 2018

    universe expansion using supernovae: H0 = 73.24 ± 1.74 km s^-1 Mpc^-1
    and
    Planck satellite's CMB fluctuations: H0 = 67.74 ± 0.46 km s^-1 Mpc^-1

    The hope is to resolve the H0 tension with standard-siren determinations.
    There is one currently available measurement, GW170817,
    with the hope of many more.
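    The size of the tension follows from simple error propagation. A
    minimal sketch, assuming the two measurements above are independent
    with Gaussian errors (so the uncertainties on the difference add in
    quadrature):

```python
import math

# H0 estimates quoted above (km/s/Mpc), with one-sigma errors
h0_sn, err_sn = 73.24, 1.74    # supernova distance ladder
h0_cmb, err_cmb = 67.74, 0.46  # Planck CMB fit

diff = h0_sn - h0_cmb
sigma = math.sqrt(err_sn**2 + err_cmb**2)  # quadrature sum
print(f"difference = {diff:.2f} +/- {sigma:.2f} km/s/Mpc "
      f"({diff / sigma:.1f} sigma)")
# prints: difference = 5.50 +/- 1.80 km/s/Mpc (3.1 sigma)
```

    With these particular numbers the tension comes out at roughly three
    sigma, in line with the discussion earlier in the thread.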
    rds

  • From Jonathan Thornburg [remove -animal@21:1/5 to All on Sat Dec 15 10:59:03 2018
    A few comments about just how one goes about measuring (estimating)
    the Hubble constant with gravitational-wave "standard sirens"...

    The key physics underlying these measurements is that gravitational-wave
    (GW) observations of the final stages of the orbital decay and coalescence
    of a compact-object binary system allow it to be treated as a GW "standard siren".
    [In this context "compact" means compact enough so
    that we can ignore tidal effects, in practice this
    means the individual objects are black holes (BHs) or
    neutron stars (NSs).]

    [By analogy to the classic phrase "standard candle",
    such GW sources are called "standard sirens" (the
    pre-coalescence GW signal is roughly a sine wave which
    sweeps up in frequency and amplitude as the system gets
    closer to coalescence).]

    That is, assuming that general relativity can accurately model the system,
    from the GW observations alone we can calculate the distance -- more
    precisely, the luminosity distance -- to the source in meters.

    Unfortunately, the current GW observations don't give a very accurate
    sky position -- the error ellipses have areas of tens of square degrees.
    This should decrease to a few square degrees when additional detectors
    come online 5-10 years from now. But that's still a substantial sky
    area, containing many many galaxies.

    In a classic 1986 paper (Nature 323, 310), Bernard Schutz worked out
    that there are two subcases for estimating the Hubble constant:


    The first subcase occurs if the coalescence produces a strong
    electromagnetic (EM) signal, which we use to accurately localize its
    sky position. In practice this (strong EM signal) is true for binary
    NS coalescences (which seem to produce a short gamma ray burst, with
    strong flux everywhere in the EM spectrum from radio to gamma rays).

    Given an accurate sky localization, we can then use standard
    optical/infrared astronomy techniques to measure the redshift of the
    galaxy at that position. (We assume, as seems very likely, that the
    compact binary is in or near a galaxy.)

    Combining the measured redshift with the GW luminosity distance then
    gives an estimate of the Hubble constant.
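    At the low redshifts in question this combination is just Hubble's
    law, H0 ≈ c z / d_L. A minimal sketch with illustrative numbers
    roughly like those of GW170817 (the redshift and distance below are
    assumptions for illustration only; a real analysis must also correct
    the redshift for the host galaxy's peculiar velocity):

```python
C_KM_S = 299_792.458  # speed of light in km/s

def hubble_from_siren(z, d_lum_mpc):
    """Low-redshift Hubble's-law estimate: H0 = c*z / d_L, in km/s/Mpc."""
    return C_KM_S * z / d_lum_mpc

# Illustrative, GW170817-like inputs (assumed for this sketch):
# host-galaxy redshift z ~ 0.01, GW luminosity distance ~ 44 Mpc
print(f"H0 ~ {hubble_from_siren(0.01, 44.0):.0f} km/s/Mpc")  # ~68
```

    The GW signal supplies d_L in physical units with no distance ladder,
    which is what makes the method independent of both the supernova and
    CMB routes.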

    I don't recall the exact numbers, but I think a few dozen binary-NS coalescences observed with the current generation of GW detectors should
    yield an estimate of the Hubble constant good to a few km/sec/Mpc (i.e.,
    good enough to be useful in discriminating between the supernovae and
    CMB estimates).


    The second subcase is if the coalescence does NOT produce a detectable
    EM signal. In practice this second subcase occurs for binary-BH
    coalescences. (We don't yet know whether BH/NS coalescences will
    produce a detectable EM signal -- theoretical models suggest that it
    will, but there are a lot of uncertainties.)

    Schutz described a statistical procedure for estimating the Hubble
    constant from this type of observation (GW "standard siren" but no EM
    signal and no accurate sky localization), but this would require a larger
    number of observations, say on the order of 100 to a few hundred.


    The actual event rates for BH/BH, NS/NS, and BH/NS coalescences are
    rather uncertain. To date 7, 1, and 0 of these (respectively) have been
    observed (https://arxiv.org/abs/1811.12907), and with continued
    GW-detector tweaks and improvements it seems likely that the detection
    rates will increase by a factor of 3-10 over the next decade.

    --
    -- "Jonathan Thornburg [remove -animal to reply]" <jthorn@astro.indiana-zebra.edu>
    Dept of Astronomy & IUCSS, Indiana University, Bloomington, Indiana, USA
    "There was of course no way of knowing whether you were being watched
    at any given moment. How often, or on what system, the Thought Police
    plugged in on any individual wire was guesswork. It was even conceivable
    that they watched everybody all the time." -- George Orwell, "1984"

  • From Phillip Helbig (undress to reply)@21:1/5 to Saam" on Sat Dec 15 10:46:11 2018
    In article <kcKdnSjS7YpyBo_BnZ2dnUU7-R3NnZ2d@giganews.com>, "Richard D.
    Saam" <rdsaam@att.net> writes:

    On 11/9/18 1:37 AM, Steve Willner wrote:
    MILKY WAY CEPHEID STANDARDS FOR MEASURING COSMIC DISTANCES AND
    APPLICATION TO Gaia DR2:
    IMPLICATIONS FOR THE HUBBLE CONSTANT
    https://arxiv.org/abs/1804.10655
    The Planck H0 = 67.4 km/s/Mpc is based on CMB.
    The reported H0 = 73.24 km/s/Mpc is based on photometric parallaxes.
    What mechanism explains the difference?

    The difference is less than 3 sigma so may not be real. If there is
    a difference, time-variable dark energy would be one possibility.
    The Adam Riess colloquium I posted about earlier https://www.youtube.com/watch?v=eSPCy-IJaPg
    is still relevant and easy to follow.
    Referencing a good summary article on the H0 tension:
    Measuring cosmic distances with standard sirens, Physics Today, Dec 2018

    universe expansion using supernovae: H0 = 73.24 ± 1.74 km s^-1 Mpc^-1
    and
    Planck satellite's CMB fluctuations: H0 = 67.74 ± 0.46 km s^-1 Mpc^-1

    The hope is to resolve the H0 tension with standard-siren determinations.
    There is one currently available measurement, GW170817,
    with the hope of many more.

    I guess "standard sirens" mean "black-hole mergers detected by
    gravitational waves". As the song says, "two men say they're Jesus; one
    of them must be wrong". A third determination might disagree with both.
    Even if it agrees with one, that doesn't "resolve the tension". I also
    doubt whether the standard-siren technique will get the uncertainties
    down to a comparable level any time soon.

    My guess is that the errors have been overestimated. IIRC, H_0 is one
    area where WMAP and Planck don't agree well. So Planck is the odd man
    out, considering that most techniques favour a higher value (though with
    larger uncertainties).

  • From Richard D. Saam@21:1/5 to All on Mon Dec 17 05:49:53 2018
    On 12/15/18 12:59 PM, Jonathan Thornburg [remove -animal to reply] wrote:
    The actual event rates for BH/BH, NS/NS, and BH/NS coalescences are
    rather uncertain. To date 7, 1, and 0 of these (respectively) have been
    observed (https://arxiv.org/abs/1811.12907), and with continued
    GW-detector tweaks and improvements it seems likely that the detection
    rates will increase by a factor of 3-10 over the next decade.

    Looking beyond the decade,
    and again referencing the Physics Today Dec 2018 article:
    The ESA Laser Interferometer Space Antenna (LISA), to be launched in 2034,
    will operate in the milliHz range
    (compared with the 10-10,000 Hz range of ground-based GW detectors),
    involving coalescences of black holes of 10^4 - 10^7 solar masses
    at distances corresponding to redshifts as large as 20,
    and will determine the source position
    accurately enough to pin down the galaxy cluster
    or even the galaxy hosting the event.
    A correspondingly narrower H0 resolution is expected.

    Are there any anticipated Planck spacecraft replacements
    to verify its CMB-based
    H0 = 67.74 ± 0.46 km s^-1 Mpc^-1?
    The Webb infrared spacecraft apparently will not do it.
    Richard D Saam

    [Moderator's note: The James Webb Space Telescope is more like a
    traditional telescope in space, the successor to HST in some sense, but
    with more emphasis on the infrared. The CMB is observed at lower
    frequencies, with bolometers (which are also used in the far
    (lower-frequency) infrared) and traditional radio receivers. As far as
    I know there is no CMB satellite in the works, but some ground-based
    stuff such as the Simons Observatory. -P.H.]

  • From Steve Willner@21:1/5 to All on Mon Dec 17 20:54:50 2018
    [Moderator's note: The James Webb Space Telescope is more like a
    traditional telescope in space, the successor to HST in some sense, but
    with more emphasis on the infrared.

    JWST wavelength range is roughly 0.6 to 27 microns

    The CMB is observed at lower frequencies

    much longer wavelengths than JWST.

    As far as I know there is no CMB satellite in the works,

    same here

    but some ground-based stuff such as the Simons Observatory. -P.H.]

    I think there's quite a lot of ground-based work. Most familiar to
    me is South Pole Telescope:
    https://pole.uchicago.edu/

    There's also BLAST, a balloon telescope:
    https://sites.northwestern.edu/blast/

    They've done CMB work in the past, but the upcoming flight seems to
    target Galactic objects.

    All ground-based and balloon telescopes study relatively high
    multipoles, i.e., relatively small angular scales. I think it's only
    the lower multipoles that carry information on H, but I may be wrong.
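    The usual rule of thumb connecting multipole number and angular scale
    is theta ≈ 180°/ell (an approximation, not an exact relation):

```python
def multipole_to_degrees(ell):
    """Approximate angular scale in degrees probed by multipole ell."""
    return 180.0 / ell

# The first acoustic peak near ell ~ 200 is about a degree across;
# high-resolution ground-based maps at ell ~ 3000 probe arcminute scales.
for ell in (10, 200, 3000):
    print(f"ell = {ell:4d}  ->  {multipole_to_degrees(ell):8.3f} deg")
```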


  • From Phillip Helbig (undress to reply@21:1/5 to Willner on Mon Dec 17 21:25:51 2018
    In article <pv9132$omt$1@dont-email.me>, willner@cfa.harvard.edu (Steve Willner) writes:

    [Moderator's note: The James Webb Space Telescope is more like a traditional telescope in space, the successor to HST in some sense, but with more emphasis on the infrared.

    JWST wavelength range is roughly 0.6 to 27 microns

    Visible light is about 0.4 to 0.7 microns (400 to 700 nm (nanometers),
    4000 to 7000 Å (Ångström)), so there is a bit of overlap between HST and
    JWST. JWST is more or less a normal reflecting telescope, with a CCD as detector.

    The CMB is observed at lower frequencies

    much longer wavelengths than JWST.

    You can easily search for "Planck focal plane" on the web and find an
    image showing lots of horns and other radio-astronomy stuff. Planck has
    a wide frequency range, with frequencies from 30 GHz to 857 GHz,
    corresponding to wavelengths between a centimetre and about a third of
    a millimetre, the latter being about 350 microns. Typical traditional
    ground-based radio astronomy is in the GHz range and below, so
    wavelengths from centimetres to metres.
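    The conversion is just lambda = c/nu; checking the endpoints of
    Planck's band:

```python
C_M_S = 299_792_458  # speed of light in m/s

def wavelength_mm(freq_ghz):
    """Wavelength in millimetres for a frequency given in GHz."""
    return C_M_S / (freq_ghz * 1e9) * 1e3

print(f"30 GHz  -> {wavelength_mm(30.0):.1f} mm")   # about a centimetre
print(f"857 GHz -> {wavelength_mm(857.0):.2f} mm")  # about a third of a millimetre
```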

    All ground-based and balloon telescopes study relatively high
    multipoles, i.e., relatively small angular scales.

    To do the large angular scales, one has to observe a large part of the
    sky.

    I think it's only
    the lower multipoles that carry information on H, but I may be wrong.

    Here are some movies where one can get a feel for how changing a
    parameter changes the CMB power spectrum:

    https://space.mit.edu/home/tegmark/movies.html

    In the plots, as usual, larger angular scales are on the left, smaller
    ones on the right.

  • From Steve Willner@21:1/5 to helbig@asclothestro.multivax.de on Wed Jan 2 02:09:56 2019
    In article <pv93gv$pfo$1@gioia.aioe.org>,
    helbig@asclothestro.multivax.de (Phillip Helbig (undress to reply)) writes:
    > Visible light is about 0.4 to 0.7 microns (400 to 700 nm (nanometers),

    That range is what the human eye can see. In practice, the term
    "visible" often refers to light detectable by instrumentation
    suitable for visible light, say from 300 nm (the atmospheric cutoff)
    to 1000 nm (the intrinsic silicon limit).

    > there is a bit of overlap between HST and JWST.

    JWST's short-wavelength limit is 600 nm. Its prime range is roughly
    1000 to 3000 nm, and the long limit is 28000 nm (=28 microns).
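    The "bit of overlap" under discussion can be made concrete with a
    trivial interval intersection, using the band edges quoted in this
    thread (the practical "visible" range of 300-1000 nm and JWST's
    600-28000 nm):

```python
# Overlap between the practical "visible" band and JWST's range, in nm.
visible = (300, 1000)    # atmospheric cutoff to intrinsic silicon limit
jwst = (600, 28000)      # JWST short- and long-wavelength limits

lo = max(visible[0], jwst[0])
hi = min(visible[1], jwst[1])
overlap = (lo, hi) if lo < hi else None
print(overlap)  # the red end of the visible band
```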

    > JWST is more or less a normal reflecting telescope, with a CCD as
    > detector.

    Detectors are actually infrared hybrid arrays, not CCDs. They are
    based on the "HAWAII-2RG" technology: http://www.teledyne-si.com/products-and-services/imaging-sensors/standard-fpa-products

    > You can easily search for "Planck focal plane" on the web and find an
    > image showing lots of horns and other radio-astronomy stuff. Planck has

    I think "had" for that last word above, since Planck's mission has ended.

    > [...]
    > Typical traditional ground-based radio astronomy is in the GHz range
    > and below, so wavelengths from centimetres to metres.

    Frequencies up to 15 GHz (wavelength 2 cm) were pretty common even
    when I was in school. Nowadays, the VLA https://public.nrao.edu/telescopes/vla/
    makes images up to 50 GHz, and ALMA
    https://public.nrao.edu/telescopes/alma/
    goes up to 950 GHz (though I don't think the highest frequencies are
    100% operational yet).

    > https://space.mit.edu/home/tegmark/movies.html
    > In the plots, as usual, larger angular scales are on the left, smaller
    > ones on the right.

    Yes, very nice.

  • From Jos Bergervoet@21:1/5 to Steve Willner on Wed Jan 2 14:35:42 2019
    On 1/2/2019 8:09 AM, Steve Willner wrote:
    > In article <pv93gv$pfo$1@gioia.aioe.org>,
    > helbig@asclothestro.multivax.de (Phillip Helbig (undress to reply)) writes:
    >> Visible light is about 0.4 to 0.7 microns (400 to 700 nm (nanometers),
    >
    > That range is what the human eye can see. In practice, the term
    > "visible" often refers to light detectable by instrumentation
    > suitable for visible light, say from 300 nm (the atmospheric cutoff)
    > to 1000 nm (the intrinsic silicon limit).

    (In the same fashion, this silicon is a 'metal' of course!)

    > ...
    >> Typical traditional
    >> ground-based radio astronomy is in the GHz range and below, so
    >> wavelengths from centimetres to metres.
    >
    > Frequencies up to 15 GHz (wavelength 2 cm) were pretty common even
    > when I was in school. Nowadays, the VLA https://public.nrao.edu/telescopes/vla/
    > makes images up to 50 GHz, and ALMA
    > https://public.nrao.edu/telescopes/alma/
    > goes up to 950 GHz (though I don't think the highest frequencies are
    > 100% operational yet).

    950 GHz sounds like an interesting LNA design problem!

    Is there any pointer to the solutions they use? (Other metals than
    silicon, undoubtedly..)

    --
    Jos

  • From Steve Willner@21:1/5 to Jos Bergervoet on Tue Feb 5 10:44:42 2019
    I wrote:
    > In practice, the term
    > "visible" often refers to light detectable by instrumentation
    > suitable for visible light, say from 300 nm (the atmospheric cutoff)
    > to 1000 nm (the intrinsic silicon limit).

    In article <5c2cab94$0$22362$e4fe514c@news.xs4all.nl>,
    Jos Bergervoet <bergervo@iae.nl> writes:
    > (In the same fashion, this silicon is a 'metal' of course!)

    Heh. (Silicon is a semiconductor, not a metal, for anyone who is
    confused.) I reported how the language is used in practice. Human
    language is not always logical. I don't think I've seen silicon
    described as a metal, but it wouldn't shock me. Actually, come to
    think of it, doesn't silicon become a metal at very high pressure?

    > 950 GHz sounds like an interesting LNA design problem!
    > Is there any pointer to the solutions they use? (Other metals than
    > silicon, undoubtedly..)

    There must be some design documents, but I don't know where. There
    should also be descriptions in the literature. Try an ADS search.

    My courses in radio astronomy were a long time ago, and the
    technology has changed. I don't think there is any amplification at
    the incoming frequency, though. In most radio telescopes, the signal
    is mixed down to an intermediate frequency and amplified there. In
    the old days, they would have used klystrons or something, but I
    doubt they do now.
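    The mixing scheme described above rests on a simple trig identity:
    multiplying the sky signal by a local-oscillator tone produces sum- and
    difference-frequency components, and the difference (the intermediate
    frequency) is what gets amplified. A minimal numerical sketch, with
    purely illustrative frequencies (not any actual receiver design):

```python
import math

# Mixing: cos(w_rf*t) * cos(w_lo*t)
#       = 0.5*cos((w_rf - w_lo)*t) + 0.5*cos((w_rf + w_lo)*t)
# The difference term is the intermediate frequency (IF).

f_rf = 950e9   # hypothetical sky frequency, Hz
f_lo = 945e9   # hypothetical local-oscillator frequency, Hz
f_if = f_rf - f_lo  # 5 GHz intermediate frequency

w_rf = 2 * math.pi * f_rf
w_lo = 2 * math.pi * f_lo

# Verify the identity at a few sample times.
for t in (0.0, 1e-12, 3e-12, 7e-12):
    mixed = math.cos(w_rf * t) * math.cos(w_lo * t)
    expected = (0.5 * math.cos((w_rf - w_lo) * t)
                + 0.5 * math.cos((w_rf + w_lo) * t))
    assert abs(mixed - expected) < 1e-9
```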

  • From Edward Prochak@21:1/5 to Steve Willner on Mon Feb 11 14:11:28 2019
    [[Mod. note -- I apologise for the delay in posting this article
    (the delay was caused by a mistake on my part). This article was
    received by the s.a.r moderation system on 2019-02-06.
    -- jt]]

    On Tuesday, February 5, 2019 at 10:44:44 AM UTC-5, Steve Willner wrote:
    > I wrote:
    >> In practice, the term
    >> "visible" often refers to light detectable by instrumentation
    >> suitable for visible light, say from 300 nm (the atmospheric cutoff)
    >> to 1000 nm (the intrinsic silicon limit).
    >
    > In article <5c2cab94$0$22362$e4fe514c@news.xs4all.nl>,
    > Jos Bergervoet <bergervo@iae.nl> writes:
    >> (In the same fashion, this silicon is a 'metal' of course!)
    >
    > Heh. (Silicon is a semiconductor, not a metal, for anyone who is
    > confused.) I reported how the language is used in practice. Human
    > language is not always logical. I don't think I've seen silicon
    > described as a metal, but it wouldn't shock me. Actually, come to
    > think of it, doesn't silicon become a metal at very high pressure?

    I suspect Jos was hinting at the astro view of metals,
    IOW, there is hydrogen and helium, while the rest are metals.

    Ed
