• Lloyd & Ng on limits of physical measure of a region

    From stargene@21:1/5 to All on Thu Sep 9 11:36:15 2021
    [[Mod. note -- I'm sorry for the delay in processing this article,
    which the author submitted on 2021-Sept-04. -- jt]]

    The following quote is from a sciam article titled "Black Hole
    Computers" by Seth Lloyd and Y. Jack Ng (April 1, 2007). They
    are referring to satellites measuring any region with radius R
    and certain ultimate limits to the possible accuracy which can be
    obtained by even the most advanced civilization imaginable; lp
    is the Planck length:

    "..Mathematically, in the time it takes to map a region of radius R,
    the total number of ticks by all the satellites is R^2/lp^2. If each
    satellite ticks precisely once during the mapping process, the
    satellites are spaced out by an average distance of
    R^(1/3)lp^(2/3). Shorter distances can be measured in one
    subregion but only at the expense of reduced precision in some
    other subregion. The argument applies even if space is
    expanding.

    This formula gives the precision to which distances can be
    determined; it is applicable when the measurement apparatus is
    just on the verge of becoming a black hole. Below the minimum
    scale, spacetime geometry ceases to exist. That level of
    precision is much, much bigger than the Planck length. To be
    sure, it is still very small. The average imprecision in measuring
    the size of the observable universe is about 10^-15 meter.
    Nevertheless, such an imprecision might be detectable by precise
    distance-measuring equipment, such as future gravitational-wave observatories--"
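
    A quick numerical check of these figures, as a minimal sketch in Python
    (the Planck length lp = 1.616e-35 m is standard; R = 4.4e26 m for the
    radius of the observable universe is an assumed round figure, and a
    different choice of R only changes the result by a factor of order unity):

        # Lloyd-Ng limit: accuracy delta ~ (R * lp^2)^(1/3) = R^(1/3) * lp^(2/3)
        lp = 1.616e-35   # Planck length in metres
        R = 4.4e26       # radius of the observable universe in metres (assumed)

        ticks = (R / lp) ** 2              # total number of satellite "ticks", R^2 / lp^2
        delta = (R * lp ** 2) ** (1 / 3)   # best average accuracy over the region

        print(f"total ticks ~ {ticks:.1e}")    # ~ 7e122
        print(f"accuracy    ~ {delta:.1e} m")  # ~ 5e-15 m, about a proton radius

    The result reproduces the "about 10^-15 meter" average imprecision that
    the article quotes for measuring the size of the observable universe.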

    I don't have a concrete grasp of their conclusions. Are they
    saying, as an example, that if we had a system (equivalent to a
    cosmic tape measure), any attempt to measure the entire
    universe would never have an average accuracy finer than
    ~10^-15 meter? Also, the fact that this "fineness" accuracy
    for the measure of the universe, 10^-15 meters, is roughly
    the radius of a proton is fairly astonishing. Also, what
    do Lloyd and Ng mean when they say that below that minimum
    (fineness) scale, spacetime geometry has no meaning? Would
    this actually conform with the notion of spacetime being
    an emergent phenomenon outside of certain defined limits?
    Thanks, Gene

  • From Phillip Helbig (undress to reply)@21:1/5 to stargene on Fri Sep 17 08:57:03 2021
    In article <771609e1-0092-40c9-86de-cee4762cc10fn@googlegroups.com>,
    stargene <stargene@sbcglobal.net> writes:

    The following quote is from a sciam article titled "Black Hole
    Computers" by Seth Lloyd and Y. Jack Ng (April 1, 2007). They
    are referring to satellites measuring any region with radius R
    and certain ultimate limits to the possible accuracy which can be
    obtained by even the most advanced civilization imaginable; lp
    is the Planck length:

    The article (from 2012) is freely available; Google finds it quickly.

    In general, it is concerned with the fascinating union of
    thermodynamics, general relativity, and quantum theory in relation to
    the information content of black holes. In particular, it looks at
    limits on information processing in the universe. About 20 years ago
    (building on earlier work), Freeman Dyson (and Lawrence Krauss, in a
    sort of debate) did some work on this (but more in the context of
    cosmology).

    I don't have a concrete grasp of their conclusions. Are they
    saying, as an example, that if we had a system (equivalent to a
    cosmic tape measure), any attempt to measure the entire
    universe would never have an average accuracy finer than
    ~10^-15 meter?

    Essentially, yes.

    Also, the fact that this "fineness" accuracy for the
    measure of the universe, 10^-15 meters, is roughly the
    radius of a proton is fairly astonishing.

    Do you mean the size or the coincidence (if it is one)? The interesting
    thing is that, if true, their idea might be proved relatively soon.

    Also, what
    do Lloyd and Ng mean when they say that below that minimum
    (fineness) scale, spacetime geometry has no meaning?

    They essentially mean that it can't be measured. Whether that means
    that it doesn't exist is at least a philosophical question.

    Would
    this actually conform with the notion of spacetime being
    an emergent phenomenon outside of certain defined limits?

    The two concepts are probably related, though perhaps not too closely.

  • From stargene@21:1/5 to All on Mon Sep 20 07:32:53 2021
    On Friday, September 17, 2021 at 1:57:07 AM UTC-7, Phillip Helbig (undress to reply) wrote:

    Also, the fact that this "fineness" accuracy for the
    measure of the universe, 10^-15 meters, is roughly the
    radius of a proton is fairly astonishing.

    Do you mean the size or the coincidence (if it is one)? The interesting
    thing is that, if true, their idea might be proved relatively soon.

    Thanks for your feedback. Re: 10^-15 meter as the fineness
    limit on the measure of the universe, it's notable that if one
    takes the universe volume to be roughly (10^26 meters)^3 and
    divides it into (Ru/Rpl)^2 identical cells, those tiny cells also
    have radii of roughly 10^-15 meter. Here Ru is the universe
    radius and Rpl the Planck length.
    Following Ng, each cell can be seen as carrying one degree of
    freedom, i.e. about 10^122 degrees of freedom in all, as the
    universe computes itself from time zero onward.
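
    A quick numerical check of this cell-counting argument, again as a
    sketch in Python (Ru = 1e26 m is the round figure used above and
    Rpl = 1.616e-35 m is the Planck length; both are order-of-magnitude
    inputs):

        Rpl = 1.616e-35   # Planck length in metres
        Ru = 1.0e26       # universe radius in metres (round figure from the post)

        n_cells = (Ru / Rpl) ** 2    # number of identical cells, (Ru/Rpl)^2
        v_cell = Ru ** 3 / n_cells   # volume of each cell, taking the total volume as Ru^3
        r_cell = v_cell ** (1 / 3)   # linear size ("radius") of each cell

        print(f"degrees of freedom ~ {n_cells:.1e}")   # ~ 4e121, i.e. roughly 10^122
        print(f"cell size          ~ {r_cell:.1e} m")  # ~ 3e-15 m, again about 10^-15 m

    With these round numbers the cell size does come out at roughly
    10^-15 meter, and the count of cells (degrees of freedom) at roughly
    10^122.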
