• Re: E = 3/4 mc² or E = mc²? The forgotten Hasenöhrl 1905 work.

    From J. J. Lodder@21:1/5 to ProkaryoticCaspaseHomolog on Wed Dec 4 12:40:04 2024
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    On Tue, 3 Dec 2024 19:02:45 +0000, rhertz wrote:

    And I forgot:

    The settlement of constants BY COLLUSION requires that ALL THE INSTRUMENTATION THAT EXISTS (used in any science) BE RE-CALIBRATED to
    obey.


    Do you get this?

    If you manufacture mass spectrometers, voltmeters, timers, WHATEVER,
    better that you RE-ADJUST the values that come from measurements.

    Example: Your voltmeter measures 1 Volt as 0.9995743 OLD Volts? Then RECALIBRATE THAT MF or you will sell NONE. Is that clear?

    CALIBRATION is an essential part of the design and manufacturing OF ANY INSTRUMENT! But you require MASTER REFERENCES (OR GUIDELINES LIKE THOSE FROM BIPM).

    Your laser-based distance meter measures 1 meter as 1.00493 meters? RECALIBRATE THE INSTRUMENT RIGHT IN THE PRODUCTION LINE.

    Not to mention the instrumentation used to compute Atomic Weight or
    a.m.u.

    ADJUST, COMPLY AND OBEY OR YOU'RE OUT OF THE BUSINESS.

    Did you manufacture a single instrument in a university lab? ADJUST, COMPLY AND OBEY or you are an OUTCAST.

    How dare you measure c = 299,793,294 m/s? ARE YOU CRAZY? Adjust
    the readings to c = 299,792,458 m/s, OR ELSE.

    And this has been happening since the late 19th century. Read the history
    behind the definition of 1 Ohm, mainly commanded by British
    institutions, with the Cavendish lab behind it.

    E ≈ 1.0000000 mc^2 is not a calibration adjustment. It is a
    measurement made with calibrated instrumentation whose consistency
    with other instrumentation has been carefully verified by procedures
    such as you cast aspersion upon above.

    Was, was, was. There is nothing to 'cast upon' anymore.
    With the redefinition of the kilogram in 2018
    those measurements have become irrelevant.

    E = m c^2 now holds exactly,
    by the definition of the kilogram.
    (and the Joule)

    Jan

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From J. J. Lodder@21:1/5 to ProkaryoticCaspaseHomolog on Wed Dec 4 21:17:25 2024
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    On Wed, 4 Dec 2024 11:40:04 +0000, J. J. Lodder wrote:
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    E ≈ 1.0000000 mc^2 is not a calibration adjustment. It is a
    measurement made with calibrated instrumentation whose consistency
    with other instrumentation has been carefully verified by procedures
    such as you cast aspersion upon above.

    Was, was, was. There is nothing to 'cast upon' anymore.
    With the redefinition of the kilogram in 2018
    those measurements have become irrelevant.

    E = m c^2 now holds exactly,
    by the definition of the kilogram.
    (and the Joule)

    Specious argument.

    When the kilogram was defined in terms of a metal artifact held in
    vaults in Paris, it was a legitimate question whether the mass of said artifact varied over time, even though by definition it was _the_
    kilogram. As a matter of fact, that mass was found to vary despite its
    being the basis of the definition of the kilogram.

    The mere fact that E = mc^2 holds exactly according to our present definitions of the kilogram and the Joule does not make irrelevant experiments intended to check whether the assumptions that have led to
    the adoption of our current set of standards are correct.

    The mere fact that theory and over a century of experimental
    validation have led to the speed of light being adopted as a constant
    does not invalidate experiments intended to verify to increasing
    levels of precision the correctness of the assumptions that led to
    its adoption as a constant.

    So you haven't understood what it is all about.
    I rest my case,

    Jan

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From J. J. Lodder@21:1/5 to ProkaryoticCaspaseHomolog on Thu Dec 5 11:57:06 2024
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    On Wed, 4 Dec 2024 20:17:25 +0000, J. J. Lodder wrote:

    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    The mere fact that theory and over a century of experimental
    validation have led to the speed of light being adopted as a constant
    does not invalidate experiments intended to verify to increasing
    levels of precision the correctness of the assumptions that led to
    its adoption as a constant.

    So you haven't understood what it is all about.
    I rest my case,

    You prematurely rest your case.

    OK. Maybe I gave up on you too soon.

    Since 1983, the speed of light in vacuum has been defined as exactly
    equal to 299,792,458 meters per second.

    Correct, almost.
    Conceptually better: the meter is defined as....
    The CGPM is concerned with how measurements are to be done,
    not with theoretical proclamations.

    Given this definition, is there any point to conducting experiments
    to test whether there are anisotropies in the speed of light due to
    Earth's motions in space? Such as these: https://tinyurl.com/8hkry7k3

    The definition of the speed of light is such that there can't be.

    Right?

    That's where you go wrong.
    The agreement to give c a defined value
    is irrelevant to any experiment.

    It is a convention that tells us how to represent
    the outcomes of experiments.
    So the results of an anisotropy of space experiment
    must be presented (under the SI) as the length of meter rods
    depending on their orientation in space.
    (even if it may loosely be called differently)
    It has no bearing at all on the possibility of doing such experiments.

    Jan

    PS Given unexpected outcomes of such experiments
    those in the know may of course rethink the SI.
    No need or use to pre-think such hypotheticalities.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Paul B. Andersen@21:1/5 to All on Thu Dec 5 15:26:08 2024
    Den 04.12.2024 21:17, skrev J. J. Lodder:
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    The mere fact that theory and over a century of experimental
    validation have led to the speed of light being adopted as a constant
    does not invalidate experiments intended to verify to increasing
    levels of precision the correctness of the assumptions that led to
    its adoption as a constant.

    So you haven't understood what it is all about.
    I rest my case,

    Jan

    The meter is defined as:

    1 metre = (c/299792458) s

    1 second = 9192631770/Δν_Cs

    Note that neither the definition of second nor the definition
    of metre depend on the speed of light.

    The constant 299792458 m/s is equal to the defined speed of light,
    but in the definition of the metre it is a constant.

    That means that it is possible to measure the speed of light
    even if it is different from the defined value.

    So if the speed of light, measured with instruments with better
    precision than they had in 1983 is found to be 299792458.000001 m/s,
    then that only means that the real speed of light (measured with
    SI metre and SI second) is different from the defined one.

    The point is that the metre isn't defined by the speed of light,
    but by the constant 299792458 m/s.
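
A minimal sketch of this bookkeeping in code (my own illustration; the calibration error below is a hypothetical number, not from any experiment):

```python
# SI defining constants relevant here (exact by definition):
DELTA_NU_CS = 9_192_631_770   # Hz, caesium hyperfine frequency
C_DEFINED = 299_792_458       # m/s

# The metre is fixed by the constant 299792458: the length light
# covers in 1/299792458 of a second.
metre_in_light_seconds = 1 / C_DEFINED

# Hypothetical experiment: a lab's secondary metre standard is slightly
# long, so timing light over a believed-1-metre baseline yields an
# apparent speed slightly below the defined constant.
baseline_true_metres = 1 + 1e-9          # hypothetical standard error
time_of_flight = baseline_true_metres / C_DEFINED
c_apparent = 1.0 / time_of_flight        # lab believes baseline = 1 m

# The discrepancy recalibrates the standard; the constant is untouched.
assert c_apparent < C_DEFINED
```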

    --
    Paul

    https://paulba.no/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From J. J. Lodder@21:1/5 to Paul B. Andersen on Thu Dec 5 19:42:24 2024
    Paul B. Andersen <relativity@paulba.no> wrote:

    Den 04.12.2024 21:17, skrev J. J. Lodder:
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    The mere fact that theory and over a century of experimental
    validation have led to the speed of light being adopted as a constant
    does not invalidate experiments intended to verify to increasing
    levels of precision the correctness of the assumptions that led to
    its adoption as a constant.

    So you haven't understood what it is all about.
    I rest my case,

    Jan

    The meter is defined as:

    1 metre = (c/299792458) s

    1 second = 9192631770/Δν_Cs

    Note that neither the definition of second nor the definition
    of metre depend on the speed of light.

    The constant 299792458 m/s is equal to the defined speed of light,
    but in the definition of the metre it is a constant.

    That means that it is possible to measure the speed of light
    even if it is different from the defined value.

    The point is that the metre isn't defined by the speed of light,
    but by the constant 299792458 m/s.

    So you didn't get the point either.
    (also suffering from a naive empiricist bias, I guess)

    The point is not about pottering around with lasers and all that,
    it is about correctly interpreting what you are doing.
    To do that you need to understand the physics of it.

    In fact, the kind of experiments that used to be called
    'speed of light measurements' (so before 1983)
    are still being done routinely today, at places like NIST, or BIPM.
    The difference is that nowadays, precisely the same kind of measurements
    are called 'calibration of a (secondary) meter standard',
    or 'calibration of a frequency standard'. [1]

    So if the speed of light, measured with instruments with better
    precision than they had in 1983 is found to be 299792458.000001 m/s,
    then that only means that the real speed of light (measured with
    SI metre and SI second) is different from the defined one.

    So this is completely, absolutely, and totally wrong.
    Such a result does not mean that the speed of light
    is off its defined value,
    it means that your meter standard is off,
    and that you must use your measurement result to recalibrate it.
    (so that the speed of light comes out to its defined value)

    In other words, it means that you can nowadays
    calibrate a frequency standard, aka secondary meter standard
    to better accuracy than was possible in 1983.
    This is no doubt true,
    but it cannot possibly change the (defined!) speed of light.

    In still other words, there is no such thing as an independent SI meter.
    The SI meter is that meter, and only that meter,
    that makes the speed of light equal to 299792458 m/s (exactly)

    Jan

    --
    Aber das ist Falsch! Sogar ganz Falsch!! (Wolfgang Pauli)


    [1] They publish 'preferred values' for the frequencies
    of a number of standard laser lines.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From J. J. Lodder@21:1/5 to Ross Finlayson on Fri Dec 6 11:48:39 2024
    On 2024-12-06 02:29:15 +0000, Ross Finlayson said:

    On 12/05/2024 10:42 AM, J. J. Lodder wrote:
    Paul B. Andersen <relativity@paulba.no> wrote:

    Den 04.12.2024 21:17, skrev J. J. Lodder:
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    The mere fact that theory and over a century of experimental
    validation have led to the speed of light being adopted as a constant
    does not invalidate experiments intended to verify to increasing
    levels of precision the correctness of the assumptions that led to
    its adoption as a constant.

    So you haven't understood what it is all about.
    I rest my case,

    Jan

    The meter is defined as:

    1 metre = (c/299792458) s

    1 second = 9192631770/Δν_Cs

    Note that neither the definition of second nor the definition
    of metre depend on the speed of light.

    The constant 299792458 m/s is equal to the defined speed of light,
    but in the definition of the metre it is a constant.

    That means that it is possible to measure the speed of light
    even if it is different from the defined value.

    The point is that the metre isn't defined by the speed of light,
    but by the constant 299792458 m/s.

    So you didn't get the point either.
    (also suffering from a naive empiricist bias, I guess)

    The point is not about pottering around with lasers and all that,
    it is about correctly interpreting what you are doing.
    To do that you need to understand the physics of it.

    In fact, the kind of experiments that used to be called
    'speed of light measurements' (so before 1983)
    are still being done routinely today, at places like NIST, or BIPM.
    The difference is that nowadays, precisely the same kind of measurements
    are called 'calibration of a (secondary) meter standard',
    or 'calibration of a frequency standard'. [1]

    So if the speed of light, measured with instruments with better
    precision than they had in 1983 is found to be 299792458.000001 m/s,
    then that only means that the real speed of light (measured with
    SI metre and SI second) is different from the defined one.

    So this is completely, absolutely, and totally wrong.
    Such a result does not mean that the speed of light
    is off its defined value,
    it means that your meter standard is off,
    and that you must use your measurement result to recalibrate it.
    (so that the speed of light comes out to its defined value)

    In other words, it means that you can nowadays
    calibrate a frequency standard, aka secondary meter standard
    to better accuracy than was possible in 1983.
    This is no doubt true,
    but it cannot possibly change the (defined!) speed of light.

    In still other words, there is no such thing as an independent SI meter.
    The SI meter is that meter, and only that meter,
    that makes the speed of light equal to 299792458 m/s (exactly)

    Not only "deep space in a vacuum, alone, at constant velocity",
    yet, what is the "radius of gyration"?

    There is no SI meter in "deep space in a vacuum, alone, at constant
    velocity"
    SI meters exist only in SI standards laboratories.
    Elsewhere there are only less accurate copies.

    Jan

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From J. J. Lodder@21:1/5 to rhertz on Fri Dec 6 11:48:39 2024
    rhertz <hertz778@gmail.com> wrote:

    Permittivity and permeability at the center of each galaxy are different
    from the values of ε₀ and μ₀ on the outer limits of each one.

    So, the value of c₀ = 1/√(ε₀μ₀) applies only locally.

    There we go again.
    Is there really no part of physics that you don't misunderstand?
    FYI, eps_0 and mu_0 are not physical quantities.
    They are artifacts of an ill-conceived unit system. (the SI)

    In any half-way decent unit system they are both equal to 1,
    with c appearing explicitly in Maxwell's equations in the right places.

    Even saying that they have been put equal to one is too kind to them.
    They just have no physical existence at all,

    Jan

    --
    They are as physical as the 'tractability of free space' tau_0.
    You know, the dimensionless constant with value one
    that appears in Newton's force law:

    F = tau_0 m a

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From J. J. Lodder@21:1/5 to Mikko on Fri Dec 6 11:48:39 2024
    Mikko <mikko.levanto@iki.fi> wrote:

    On 2024-12-01 00:28:14 +0000, rhertz said:

    Now, E = 3/4 mc² or E = mc²? Which one would the physics community
    adopt?

    The latter because the former was refuted by later experiments, in
    particular observations and analysis of radioactive decays.

    Yes, it was an elementary error that was easily fixed.
    Nevertheless, some people never give up.

    There are still papers being written
    about the so-called '4/3-problem',
    as if that really is a problem,

    Jan

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Paul B. Andersen@21:1/5 to You on Fri Dec 6 14:46:56 2024
    Den 05.12.2024 19:42, skrev J. J. Lodder:
    Paul B. Andersen <relativity@paulba.no> wrote:>
    So if the speed of light, measured with instruments with better
    precision than they had in 1983 is found to be 299792458?.000001 m/s,
    then that only means that the real speed of light (measured with
    SI metre and SI second) is different from the defined one.

    Note: measured with SI metre and SI second.


    So this is completely, absolutely, and totally wrong.
    Such a result does not mean that the speed of light
    is off its defined value,
    it means that your meter standard is off,
    and that you must use your measurement result to recalibrate it.
    (so that the speed of light comes out to its defined value)

    The 1983 definition of the speed of light is:
    c = 299792458 m/s

    The 1983 definition of second is:
    1 second = 9192631770/Δν_Cs

    The 1983 definition of meter is:
    1 metre = (c/299792458) s

    The 2019 definition of meter is:
    1 metre = (9192631770/299792458)⋅(c/Δν_Cs)

    If the speed of light is measured _with the meter and second
    defined above_ it is obviously possible to get a result slightly
    different from the defined speed of light.

    So I was not "completely, absolutely, and totally wrong".

    Are you saying that if we got the result 299792458.000001 m/s
    then the metre would have to be recalibrated to:
    1 metre = (9192631770/299792458.000001)⋅(c/Δν_Cs) ?


    In other words, it means that you can nowadays
    calibrate a frequency standard, aka secondary meter standard
    to better accuracy than was possible in 1983.

    Or are you saying that we would have to recalibrate the meter to:
    1 metre = (9192631770.0000306/299792458)⋅(c/Δν_Cs) ?
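
The 0.0000306 figure in that second option is just the fractional offset 0.000001/299792458 applied to 9192631770; a quick check of the arithmetic (my own sketch):

```python
# Check of the recalibration arithmetic in the two options above:
# a measured 299792458.000001 m/s is a fractional discrepancy of
frac = 0.000001 / 299_792_458          # ~3.34e-15

# Absorbed into the caesium number instead of c, it scales 9192631770
# by the same fraction:
delta = 9_192_631_770 * frac
print(delta)                           # ≈ 3.066e-05, i.e. 9192631770.0000306...
```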

    This is no doubt true,
    but it cannot possibly change the (defined!) speed of light.

    In still other words, there is no such thing as an independent SI meter.
    The SI meter is that meter, and only that meter,
    that makes the speed of light equal to 299792458 m/s (exactly)

    Jan


    You wrote:
    In fact, the kind of experiments that used to be called
    'speed of light measurements' (so before 1983)
    are still being done routinely today, at places like NIST, or BIPM.
    The difference is that nowadays, precisely the same kind of measurements
    are called 'calibration of a (secondary) meter standard',
    or 'calibration of a frequency standard'.

    Is any such recalibration of the meter ever done?
    And which "frequency standard" are you referring to?
    The definition of a second?

    --
    Paul

    https://paulba.no/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From J. J. Lodder@21:1/5 to Paul B. Andersen on Fri Dec 6 21:00:10 2024
    Paul B. Andersen <relativity@paulba.no> wrote:

    Den 05.12.2024 19:42, skrev J. J. Lodder:
    Paul B. Andersen <relativity@paulba.no> wrote:>
    So if the speed of light, measured with instruments with better
    precision than they had in 1983 is found to be 299792458.000001 m/s,
    then that only means that the real speed of light (measured with
    SI metre and SI second) is different from the defined one.

    Note: measured with SI metre and SI second.


    So this is completely, absolutely, and totally wrong.
    Such a result does not mean that the speed of light
    is off its defined value,
    it means that your meter standard is off,
    and that you must use your measurement result to recalibrate it.
    (so that the speed of light comes out to its defined value)

    The 1983 definition of the speed of light is:
    c = 299792458 m/s

    The 1983 definition of second is:
    1 second = 9192631770/Δν_Cs

    The 1983 definition of meter is:
    1 metre = (c/299792458) s

    The 2019 definition of meter is:
    1 metre = (9192631770/299792458)⋅(c/Δν_Cs)

    If the speed of light is measured _with the meter and second
    defined above_ it is obviously possible to get a result slightly
    different from the defined speed of light.

    So I was not "completely, absolutely, and totally wrong".

    You were, and it would seem that you still are.
    You cannot measure the speed of light because it has a defined value.
    If you would think that what you are doing is a speed of light
    measurement you don't understand what you are doing.

    Are you saying that if we got the result 299792458.000001 m/s
    then the metre would have to be recalibrated to:
    1 metre = (9192631770/299792458.000001)⋅(c/Δν_Cs) ?

    Of course not.
    All it would mean is that you have made some systematic error
    with your particular implementation of the SI meter.

    In other words, it means that you can nowadays
    calibrate a frequency standard, aka secondary meter standard
    to better accuracy than was possible in 1983.

    Or are you saying that we would have to recalibrate the meter to:
    1 metre = (9192631770.0000306/299792458)⋅(c/Δν_Cs) ?

    Neither. The SI meter is a secondary standard that must be calibrated
    such that the speed of light comes to 299792458 m/s.

    This is no doubt true,
    but it cannot possibly change the (defined!) speed of light.

    In still other words, there is no such thing as an independent SI meter.
    The SI meter is that meter, and only that meter,
    that makes the speed of light equal to 299792458 m/s (exactly)

    Jan


    You wrote:
    In fact, the kind of experiments that used to be called
    'speed of light measurements' (so before 1983)
    are still being done routinely today, at places like NIST, or BIPM.
    The difference is that nowadays, precisely the same kind of measurements
    are called 'calibration of a (secondary) meter standard',
    or 'calibration of a frequency standard'.

    Is any such recalibration of the meter ever done?

    Of course, routinely, on a day to day basis.
    Guess there are whole departments devoted to it.
    (it is a subtle art)
    The results are published nowadays as a list of frequencies
    of preferred optical frequency standards.
    (measuring the frequency of an optical frequency standard
    and calibrating a secondary meter standard are just two different ways
    of saying the same thing)
    And remember, there is no longer such a thing as -the- meter.
    It is a secondary unit, and any convenient secondary standard will do.

    And which "frequency standard" are you referring to?

    Any optical frequency standard of known frequency
    defines a secondary meter standard.
    (because given the frequency, you know the wavelength,
    so you can measure lengths by interferometry)

    A commonly used one is a certain stabilised He-Ne laser.
    (of specified construction)
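
The frequency-to-length conversion is one line: with c fixed, a known frequency is a known wavelength for interferometry. A sketch (the He-Ne frequency here is my own rounded, approximate figure; the fringe count is hypothetical):

```python
C = 299_792_458            # m/s, defined
f_hene = 473.612e12        # Hz, stabilised He-Ne line (approximate value)

wavelength = C / f_hene    # m; ~633 nm, the familiar He-Ne red

# Counting fringes of this known wavelength in an interferometer is a
# length measurement, which is why an optical frequency standard doubles
# as a (secondary) metre standard.
fringe_count = 3_159_557                        # hypothetical count
path_length = fringe_count * (wavelength / 2)   # m, two-way interferometer
```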

    The definition of a second?

    Of course not, that is fixed. (for the time being)
    It is the frequency that all other frequencies must relate to.
    It will be replaced in the not too far future
    by an optical frequency standard. (yet to be chosen)

    Finally, you really need to get yourself out of the conceptual knot
    that you have tied yourself in.
    Something is either defined, or it can be measured.
    It can't possibly be both,

    Jan

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From J. J. Lodder@21:1/5 to ProkaryoticCaspaseHomolog on Sat Dec 7 12:03:24 2024
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    On Fri, 6 Dec 2024 20:00:10 +0000, J. J. Lodder wrote:

    Finally, you really need to get yourself out of the conceptual knot
    that you have tied yourself in.
    Something is either defined, or it can be measured.
    It can't possibly be both,

    Sure it can, provided that you use a different measurement standard
    than the one used in the definition.

    Sure, you can be inconsistent, if you choose to be.
    Don't expect meaningful results.

    It would not make sense to quantify hypothetical variations in the
    speed of light in terms of the post-1983 meter. But they would make
    sense in terms of pre-1983 meters. Or (assuming some incredible ramp-up
    in technology, perhaps introduced by Larry Niven-ish Outsiders) in
    terms of a meter defined as the distance massless gluons travel in 1/299,792,458 of a second. Or gravitons... :-)

    Completely irrelevant,
    and it does not get you out of your conceptual error as stated above.

    Summary: There must be:
    1) a length standard, 2) a frequency standard [1], and 3) c

    Two of the three must be defined, the third must be measured.
    Pre-1983 1) and 2) were defined, and 3), c was measured.
    Post-1983 2) and c are defined, 1) must be measured.
    So in 1983 we have collectively decided that any future refinement
    in measurement techniques will result in more accurate meter standards,
    not in a 'better' value for c. [2]
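
This bookkeeping can be made mechanical: the same raw timing data yield a 'measured c' under the pre-1983 convention and a metre-standard correction under the post-1983 one. A sketch (all numbers hypothetical):

```python
C = 299_792_458              # m/s, defined post-1983

# Hypothetical raw data: light timed over a baseline the lab believes
# to be 8_000_000 m (realised with its local metre standard).
believed_path = 8_000_000.0  # m
t = 0.026685                 # s, measured time of flight (hypothetical)

# Pre-1983 reading: metre and second defined, so this is a measurement of c.
c_measured = believed_path / t

# Post-1983 reading: c and the second defined, so the same data instead
# calibrate the local metre standard.
actual_path = C * t                   # m, by definition
scale = actual_path / believed_path   # correction factor for the standard

assert c_measured != C   # nothing here changes the defined constant
```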

    Finally, an excercise for you personally.
    You quoted a pre-2018 experiment that verified that E=mc^2
    to some high accuracy. (using the measured value of Planck's constant)
    Post-2018, Planck's constant has a defined value,
    and E=mc^2 is true by definition. (of the Joule and the kilogram)

    So E=mc^2 can no longer be verified by any possible experiment.
    Now:
    Ex1) Does this make the experiment you quoted worthless?
    Ex2) If not, what does that experiment demonstrate?

    Jan


    [1] Or a time standard, which amounts to the same in other words.
    But defining it as a frequency standard is more 'natural'.

    [2] Note that all this has nothing whatsoever to do with physics.
    (like c being 'really' constant in some sense or something like that)
    It is all about metrology, so about the ways -we agree upon-
    to have standards in the most stable, accurate, and reproducible way.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From J. J. Lodder@21:1/5 to ProkaryoticCaspaseHomolog on Sat Dec 7 22:35:57 2024
    On 2024-12-07 16:03:31 +0000, ProkaryoticCaspaseHomolog said:
    [missing article on my server, sorry about mixed up quote levels]

    On Sat, 7 Dec 2024 11:03:24 +0000, J. J. Lodder wrote:

    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    On Fri, 6 Dec 2024 20:00:10 +0000, J. J. Lodder wrote:

    Finally, you really need to get yourself out of the conceptual knot
    that you have tied yourself in.
    Something is either defined, or it can be measured.
    It can't possibly be both,


    Sure it can, provided that you use a different measurement standard
    than the one used in the definition.


    Sure, you can be inconsistent, if you choose to be.
    Don't expect meaningful results.

    It would not make sense to quantify hypothetical variations in the
    speed of light in terms of the post-1983 meter. But they would make
    sense in terms of pre-1983 meters. Or (assuming some incredible ramp-up
    in technology, perhaps introduced by Larry Niven-ish Outsiders) in
    terms of a meter defined as the distance massless gluons travel in 1/299,792,458 of a second. Or gravitons... :-)

    Completely irrelevant,
    and it does not get you out of your conceptual error as stated above.

    Summary: There must be:
    1) a length standard, 2) a frequency standard [1], and 3) c

    Two of the three must be defined, the third must be measured.
    Pre-1983 1) and 2) were defined, and 3), c was measured.
    Post-1983 2) and c are defined, 1) must be measured.
    So in 1983 we have collectively decided that any future refinement
    in measurement techniques will result in more accurate meter standards,
    not in a 'better' value for c. [2]

    You don't "get" the point that I was trying to make. Let us review

    I do get it, and it is wrong.

    | Resolution 1 of the 17th CGPM (1983)
    [snip boilerplate material]

    Gamma ray burst observations have constrained the arrival times
    between the visible light and gamma ray components of the burst to
    be equal to within 10^-15 of the total travel time of the burst.
    [snip more irrelevancies]

    This is irrelevant for the issue of E=mc^2.
    Differential travel times are a test for a non-zero photon mass, if any.

    Definitions are BASED ON state-of-the-art known physics. They do not DETERMINE physical law.

    Are you really incapable of understanding
    that all this is about metrology, not physical law?
    No definition of units can ever determine or change any physical law.

    Finally, an exercise for you personally.
    You quoted a pre-2018 experiment that verified that E=mc^2
    to some high accuracy. (using the measured value of Planck's constant)
    Post-2018, Planck's constant has a defined value,
    and E=mc^2 is true by definition. (of the Joule and the kilogram)

    So E=mc^2 can no longer be verified by any possible experiment.
    Now:
    Ex1) Does this make the experiment you quoted worthless?

    Not at all.

    Correct.

    Ex2) If not, what does that experiment demonstrate?

    It would demonstrate an inadequacy in the definitions that must be
    addressed in some future conference when the discrepancies have been
    better characterized.

    I'm sorry, but this is not the right answer,

    Jan

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From J. J. Lodder@21:1/5 to Ross Finlayson on Sat Dec 7 22:35:57 2024
    Ross Finlayson <ross.a.finlayson@gmail.com> wrote:

    O.W. Richardson's "The Electron Theory ..." is really pretty
    great, he spends a lot of time explaining all sorts of
    issues in systems of units and algebraic quantities and
    derivations and the inner and outer and these things,
    it's a 100 years old yet I'm glad to be reading it now.

    Yes, in those long past times every competent physicist
    understood about systems of units and dimensions.
    This has been lost as a consequence of general 'SI-only' education.

    A more readily accessible (and excellent) source for the subject
    is in the appendices of Jackson, Classical Electrodynamics.
    Unfortunately the subject is not covered adequately in Wikipedia,
    (afaics)

    Jan

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Paul B. Andersen@21:1/5 to All on Sat Dec 7 22:19:50 2024
    Den 06.12.2024 21:00, skrev J. J. Lodder:
    Paul B. Andersen <relativity@paulba.no> wrote:

    Den 05.12.2024 19:42, skrev J. J. Lodder:
    Paul B. Andersen <relativity@paulba.no> wrote:>
    So if the speed of light, measured with instruments with better
    precision than they had in 1983 is found to be 299792458.000001 m/s,
    then that only means that the real speed of light (measured with
    SI metre and SI second) is different from the defined one.

    Note: measured with SI metre and SI second.


    So this is completely, absolutely, and totally wrong.
    Such a result does not mean that the speed of light
    is off its defined value,
    it means that your meter standard is off,
    and that you must use your measurement result to recalibrate it.
    (so that the speed of light comes out to its defined value)

    According to: https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
    (2019)
    The SI definitions are:

    The relevant defining constants:
    Δν_Cs = 9192631770 Hz (hyperfine transition frequency of Cs133)
    c = 299 792 458 m/s (speed of light in vacuum)

    The relevant base units:
    Second:
    1 s = 9192631770/Δν_Cs
    1 Hz = Δν_Cs/9192631770

    Metre:
    1 metre = (c/299792458) s = (9192631770/299792458)⋅(c/Δν_Cs)

    The home page of BIPM:
    https://www.bipm.org/en/measurement-units

    Give the exact same definitions, so I assume
    that the definitions above are valid now.



    If the speed of light is measured _with the meter and second
    defined above_ it is obviously possible to get a result slightly
    different from the defined speed of light.

    So I was not "completely, absolutely, and totally wrong".

    You were, and it would seem that you still are.
    You cannot measure the speed of light because it has a defined value.
    If you would think that what you are doing is a speed of light
    measurement you don't understand what you are doing.

    When you have a definition of second and a definition of metre,
    it is _obviously_ possible to measure the speed of light.

    If you measure the speed of light in air, you would probably
    find that v_air ≈ 2.99705e8 m/s.

    If you measure it in vacuum on the ground, you would probably
    get a value slightly less than 299792458 m/s because the vacuum
    isn't perfect.

    If you measure it in perfect vacuum (in a space-vehicle?) you
    would probably get the value 299792458 m/s.
    But it isn't impossible, if you had extremely precise instruments,
    that you would measure a value slightly different from 299792458 m/s,
    e.g. 299792458.000001 m/s.

    However, such precise instruments hardly exist, and probably never will.
    So I don't think this will ever be a real problem needing a fix.
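For the in-air figure above, a small sketch (n_air is an assumed textbook value for standard conditions):

```python
# Light in a medium travels at c/n; measuring in air therefore gives
# a value below the defined vacuum c.
C = 299_792_458              # m/s, defined
N_AIR = 1.000293             # refractive index of air (approximate)

v_air = C / N_AIR
print(f"v_air = {v_air:.5e} m/s")   # ~2.99705e8 m/s, the figure quoted above
```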

    But my point is:
    It is possible to measure the speed of light even if there exists
    a defined constant c = 299792458 m/s.

    If you are claiming otherwise, you are simply wrong.


    You wrote:
    In fact, the kind of experiments that used to be called
    'speed of light measurements' (so before 1983)
    are still being done routinely today, at places like NIST, or BIPM.
    The difference is that nowadays, precisely the same kind of measurements are called 'calibration of a (secondary) meter standard',
    or 'calibration of a frequency standard'.

    Calibration of a frequency standard is just that, and not
    a 'speed of light measurements'.


    Is any such recalibration of the meter ever done?

    Of course, routinely, on a day to day basis.
    Guess there are whole departments devoted to it.
    (it is a subtle art)
    The results are published nowadays as a list of frequencies
    of preferred optical frequency standards.
    (measuring the frequency of an optical frequency standard
    and calibrating a secondary meter standard are just two different ways
    of saying the same thing)
    And remember, there is no longer such a thing as -the- meter.
    It is a secondary unit, and any convenient secondary standard will do.
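The equivalence stated above (measuring an optical frequency = calibrating a meter standard) is just λ = c/f. A sketch, using what I believe is the approximate CIPM-recommended frequency for the iodine-stabilised HeNe laser; treat the exact digits as illustrative:

```python
# With c defined, knowing an optical standard's frequency in Cs terms
# fixes its vacuum wavelength, i.e. gives a secondary length standard.
C = 299_792_458                      # m/s, defined
f_hene = 473_612_353_604_000.0       # Hz (~473.6 THz), assumed value

wavelength = C / f_hene              # metres
print(f"{wavelength * 1e9:.3f} nm")  # ~632.991 nm
```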

    In:
    https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf

    I read:
    https://www.bipm.org/en/cipm-mra

    "The CIPM has adopted various secondary representations of
    the second, based on a selected number of spectral lines of atoms,
    ions or molecules. The unperturbed frequencies of these lines can
    be determined with a relative uncertainty not lower than that of
    the realization of the second based on the 133Cs hyperfine transition
    frequency, but some can be reproduced with superior stability."

    This is how I interpret this:
    The second is still defined by "the unperturbed ground state
    hyperfine transition frequency of the caesium 133 atom"
    Δν_Cs = 9192631770 Hz by definition.

    But practical realisations of this frequency standard,
    that is an atomic frequency standard based on Cs133 is
    not immune to perturbation, a magnetic field may affect it.

    So there exist more stable frequency standards than Cs,
    and some are extremely more stable.
    But the frequencies of these standards are still defined
    by Δν_Cs: 1 Hz = Δν_Cs/9192631770
    This is "Calibration of a frequency standard".

    The "secondary representations of second"
    don't change the duration of a second
    and the "secondary representations of metre"
    don't change the length of a metre.

    --
    Paul

    https://paulba.no/

  • From Maciej Wozniak@21:1/5 to All on Sun Dec 8 04:52:19 2024
    W dniu 05.12.2024 o 15:26, Paul B. Andersen pisze:
    Den 04.12.2024 21:17, skrev J. J. Lodder:
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    The mere fact that theory and over a century of experimental
    validation have led to the speed of light being adopted as a constant
    does not invalidate experiments intended to verify to increasing
    levels of precision the correctness of the assumptions that led to
    its adoption as a constant.

    So you haven't understood what it is all about.
    I rest my case,

    Jan

    The meter is defined as:

    1 metre = (c/299792458)⋅(1 s)

    1 second = 9192631770/Δν_Cs

    Anyone can check GPS, nobody serious cares
    about the moronic wishes of your moronic cult.
    But feel free to keep enchanting the reality,
    poor halfbrain.

  • From Paul B. Andersen@21:1/5 to All on Sun Dec 8 09:19:33 2024
    Den 07.12.2024 22:19, skrev Paul B. Andersen:
    Den 06.12.2024 21:00, skrev J. J. Lodder:
    Paul B. Andersen <relativity@paulba.no> wrote:


    According to: https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
    (2019)
    The SI definitions are:

    The relevant defining constants:
     Δν_Cs = 9192631770 Hz  (hyperfine transition frequency of Cs133)
     c = 299 792 458 m/s (speed of light in vacuum)

    The relevant base units:
    Second:
     1 s = 9192631770/Δν_Cs;  1 Hz = Δν_Cs/9192631770

    Metre:
     1 metre = (c/299792458)⋅s = (9192631770/299792458)⋅(c/Δν_Cs)

    The home page of BIPM:
    https://www.bipm.org/en/measurement-units

    Gives the exact same definitions, so I assume
    that the definitions above are valid now.


    https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf


    If the speed of light is measured _with the meter and second
    defined above_ it is obviously possible to get a result slightly
    different from the defined speed of light.

    So I was not "completely, absolutely, and totally wrong".

    You were, and it would seem that you still are.
    You cannot measure the speed of light because it has a defined value.
    If you would think that what you are doing is a speed of light
    measurement you don't understand what you are doing.

    Yes, I was indeed "absolutely, and totally wrong",
    but not completely wrong.


    When you have a definition of second and a definition of metre,
    it is _obviously_ possible to measure the speed of light.

    If you measure the speed of light in air, you would probably
    find that v_air ≈ 2.99705e8 m/s.

    If you measure it in vacuum on the ground, you would probably
    get a value slightly less than 299792458 m/s because the vacuum
    isn't perfect.

    OK so far.


    If you measure it in perfect vacuum (in a space-vehicle?) you
    would probably get the value 299792458 m/s.

    You would certainly measure the value 299792458 m/s.

    It is possible to measure the speed of light in vacuum, but not much
    point in doing so since the result is given by definition.

    But it isn't impossible, if you had extremely precise instruments,
    that you would measure a value slightly different from 299792458 m/s,
    e.g. 299792458.000001 m/s.

    This is indeed "completely, absolutely, and totally wrong".

    I somehow thought that the "real speed" of light in vacuum
    measured before 1983 was different from 299792458 m/s.
    (Which it probably was, but the difference hidden in the error bar)
    And since the definition of metre only contain the defined constant c,
    I thought "the real speed" of light could be different from c.

    But this is utter nonsense!
    Now I can't understand how I could think so.
    My brain seems to be slower than it used to be. :-(

    The real speed of light in vacuum is exactly c = 299792458 m/s,
    and 1 metre = (1 second/299792458)⋅c is derived from c,
    which means that the measured speed of light in vacuum will
    always be c.


    In:
    https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf

    I read:
    https://www.bipm.org/en/cipm-mra

    "The CIPM has adopted various secondary representations of
     the second, based on a selected number of spectral lines of atoms,
     ions or molecules. The unperturbed frequencies of these lines can
     be determined with a relative uncertainty not lower than that of
     the realization of the second based on the 133Cs hyperfine transition
     frequency, but some can be reproduced with superior stability."

    This is how I interpret this:
    The second is still defined by "the unperturbed ground state
    hyperfine transition frequency of the caesium 133 atom"
      Δν_Cs = 9192631770 Hz by definition.

    But practical realisations of this frequency standard,
    that is an atomic frequency standard based on Cs133 is
    not immune to perturbation, a magnetic field may affect it.

    So there exist more stable frequency standards than Cs,
    and some  are extremely more stable.
    But the frequencies of these standards are still defined
    by Δν_Cs: 1 Hz = Δν_Cs/9192631770
    This is "Calibration of a frequency standard".

    The "secondary representations of second"
    don't change the duration of a second
    and the "secondary representations of metre"
    don't change the length of a metre.



    --
    Paul

    https://paulba.no/

  • From Athel Cornish-Bowden@21:1/5 to J. J. Lodder on Sun Dec 8 10:19:52 2024
    On 2024-12-07 21:35:57 +0000, J. J. Lodder said:

    Ross Finlayson <ross.a.finlayson@gmail.com> wrote:

    O.W. Richardson's "The Electron Theory ..." is really pretty
    great, he spends a lot of time explaining all sorts of
    issues in systems of units and algebraic quantities and
    derivations and the inner and outer and these things,
    it's a 100 years old yet I'm glad to be reading it now.

    Yes, in those long past times every competent physicist
    understood about systems of units and dimensions.
    This has been lost as a consequence of general 'SI-only' education.

    A more readily accessible (and excellent) source for the subject
    is in the appendices of Jackson, Classical Electrodynamics.
    Unfortunately the subject is not covered adequately in Wikipedia,
    (afaics)

    Why don't you fix it, then? Anyone can edit Wikipedia.

    --
    Athel -- French and British, living in Marseilles for 37 years; mainly
    in England until 1987.

  • From J. J. Lodder@21:1/5 to Athel Cornish-Bowden on Sun Dec 8 12:56:15 2024
    Athel Cornish-Bowden <me@yahoo.com> wrote:

    On 2024-12-07 21:35:57 +0000, J. J. Lodder said:

    Ross Finlayson <ross.a.finlayson@gmail.com> wrote:

    O.W. Richardson's "The Electron Theory ..." is really pretty
    great, he spends a lot of time explaining all sorts of
    issues in systems of units and algebraic quantities and
    derivations and the inner and outer and these things,
    it's a 100 years old yet I'm glad to be reading it now.

    Yes, in those long past times every competent physicist
    understood about systems of units and dimensions.
    This has been lost as a consequence of general 'SI-only' education.

    A more readily accessible (and excellent) source for the subject
    is in the appendices of Jackson, Classical Electrodynamics.
    Unfortunately the subject is not covered adequately in Wikipedia,
    (afaics)

    Why don't you fix it, then? Anyone can edit Wikipedia.

    Because I have experience with the matters.
    I have had to argue units and dimensions with electrical engineers.
    In consequence my jaws are stronger than those of Father William.
    I'm not in the mood for going into it again,

    Jan

  • From J. J. Lodder@21:1/5 to Paul B. Andersen on Sun Dec 8 12:30:36 2024
    Paul B. Andersen <relativity@paulba.no> wrote:

    Den 07.12.2024 22:19, skrev Paul B. Andersen:
    Den 06.12.2024 21:00, skrev J. J. Lodder:
    Paul B. Andersen <relativity@paulba.no> wrote:


    According to: https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
    (2019)
    The SI definitions are:

    The relevant defining constants:
    Δν_Cs = 9192631770 Hz (hyperfine transition frequency of Cs133)
    c = 299 792 458 m/s (speed of light in vacuum)

    The relevant base units:
    Second:
    1 s = 9192631770/Δν_Cs;  1 Hz = Δν_Cs/9192631770

    Metre:
    1 metre = (c/299792458)⋅s = (9192631770/299792458)⋅(c/Δν_Cs)

    The home page of BIPM:
    https://www.bipm.org/en/measurement-units

    Gives the exact same definitions, so I assume
    that the definitions above are valid now.


    https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf


    If the speed of light is measured _with the meter and second
    defined above_ it is obviously possible to get a result slightly
    different from the defined speed of light.

    So I was not "completely, absolutely, and totally wrong".

    You were, and it would seem that you still are.
    You cannot measure the speed of light because it has a defined value.
    If you would think that what you are doing is a speed of light
    measurement you don't understand what you are doing.

    Yes, I was indeed "absolutely, and totally wrong",
    but not completely wrong.


    When you have a definition of second and a definition of metre,
    it is _obviously_ possible to measure the speed of light.

    If you measure the speed of light in air, you would probably
    find that v_air ≈ 2.99705e8 m/s.

    If you measure it in vacuum on the ground, you would probably
    get a value slightly less than 299792458 m/s because the vacuum
    isn't perfect.

    OK so far.


    If you measure it in perfect vacuum (in a space-vehicle?) you
    would probably get the value 299792458 m/s.

    You would certainly measure the value 299792458 m/s.

    It is possible to measure the speed of light in vacuum, but not much
    point in doing so since the result is given by definition.

    But it isn't impossible, if you had extremely precise instruments,
    that you would measure a value slightly different from 299792458 m/s,
    e.g. 299792458.000001 m/s.

    This is indeed "completely, absolutely, and totally wrong".

    I somehow thought that the "real speed" of light in vacuum
    measured before 1983 was different from 299792458 m/s.

    Of course it was. The adopted value was a compromise
    between the results of different teams.
    BTW, you are also falling into the 'das ding an sich' trap.

    (Which it probably was, but the difference hidden in the error bar)
    And since the definition of metre only contain the defined constant c,
    I thought "the real speed" of light could be different from c.

    Yes, that is where you go wrong.

    But this is utter nonsense!

    Beginning to see the light?

    Now I can't understand how I could think so.
    My brain seems to be slower than it used to be. :-(

    The real speed of light in vacuum is exactly c = 299792458 m/s,
    and 1 metre = (1 second/299792458)⋅c is derived from c,
    which means that the measured speed of light in vacuum will
    always be c.

    Correct.
    Perhaps I can explain the practicalities behind it in another way.
    If you measure the speed of light accurately
    you must of course do an error analysis.
    The result of this is that almost all of the error results from
    the necessary realisation of the meter standard (in your laboratory).
    So the paradoxical result is that you cannot measure the speed of light
    even when there is a meter standard of some kind.

    You may call whatever it is that you are doing
    'a speed of light measurement',
    but if you are a competent experimentalist you will understand
    that what you are really doing is a meter calibration experiment.
    Hence the speed of light must be given a defined value,
    for practical experimental reasons. [1]

    Jan

    [1] Which have not changed.
    (and will not change in the foreseeable future)
    Meter standards are orders of magnitude less accurate
    than time standards. (see why this must be?)

  • From Maciej Wozniak@21:1/5 to All on Sun Dec 8 15:01:05 2024
    W dniu 08.12.2024 o 12:56, J. J. Lodder pisze:
    Athel Cornish-Bowden <me@yahoo.com> wrote:

    On 2024-12-07 21:35:57 +0000, J. J. Lodder said:

    Ross Finlayson <ross.a.finlayson@gmail.com> wrote:

    O.W. Richardson's "The Electron Theory ..." is really pretty
    great, he spends a lot of time explaining all sorts of
    issues in systems of units and algebraic quantities and
    derivations and the inner and outer and these things,
    it's a 100 years old yet I'm glad to be reading it now.

    Yes, in those long past times every competent physicist
    understood about systems of units and dimensions.
    This has been lost as a consequence of general 'SI-only' education.

    A more readily accessible (and excellent) source for the subject
    is in the appendices of Jackson, Classical Electrodynamics.
    Unfortunately the subject is not covered adequately in Wikipedia,
    (afaics)

    Why don't you fix it, then? Anyone can edit Wikipedia.

    Because I have experience with the matters.
    I have had to argue units and dimensions with electrical engineers.

    And, of course, you were unable to listen to
    competent men, just like any other Shit's
    fanatic.

  • From J. J. Lodder@21:1/5 to ProkaryoticCaspaseHomolog on Sun Dec 8 21:35:14 2024
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    On Sun, 8 Dec 2024 5:42:07 +0000, ProkaryoticCaspaseHomolog wrote:

    On Sat, 7 Dec 2024 21:35:57 +0000, J. J. Lodder wrote:

    I'm sorry, but this is not the right answer,

    So what are you saying, then? Are you saying that, because of the definition of E=mc^2, it is totally required that 1 gram of electrons annihilating 1 gram of positrons completely to electromagnetic
    radiation must NECESSARILY yield the same amount of energy as 1 gram
    of protons annihilating 1 gram of antiprotons completely to electromagnetic radiation? That the equality of these two values is a matter
    of definition, not something to be established by experiment?

    Are you saying that, because the current definition of c is
    299,792,458 meters per second regardless of wavelength, questions
    as to whether gamma rays travel faster than visible light rays are
    totally nonsensical?

    In fewer words:

    No experiment can measure a difference between the amount of energy
    released by the complete annihilation of 1 g of (electrons + positrons) versus the complete annihilation of 1 g of (protons + antiprotons).
    True or false?

    False, see previous.

    No experiment can measure a difference between the speed of visible
    light photons versus the speed of gamma rays. True or false?

    False, already answered several postings back.
    A class of experiments relevant to this question
    are experiments that set an upper limit on the photon mass,
    (the most plausible mechanism for such an effect)

    Why for heavens sake would you even get such an idea?

    Jan

  • From J. J. Lodder@21:1/5 to ProkaryoticCaspaseHomolog on Sun Dec 8 23:32:09 2024
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    On Sun, 8 Dec 2024 8:19:33 +0000, Paul B. Andersen wrote:

    Den 07.12.2024 22:19, skrev Paul B. Andersen:
    Den 06.12.2024 21:00, skrev J. J. Lodder:
    Paul B. Andersen <relativity@paulba.no> wrote:


    According to:
    https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
    (2019)
    The SI definitions are:

    The relevant defining constants:
    Δν_Cs = 9192631770 Hz (hyperfine transition frequency of Cs133)
    c = 299 792 458 m/s (speed of light in vacuum)

    The relevant base units:
    Second:
    1 s = 9192631770/Δν_Cs;  1 Hz = Δν_Cs/9192631770

    Metre:
    1 metre = (c/299792458)⋅s = (9192631770/299792458)⋅(c/Δν_Cs)

    The home page of BIPM:
    https://www.bipm.org/en/measurement-units

    Gives the exact same definitions, so I assume
    that the definitions above are valid now.


    https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf


    If the speed of light is measured _with the meter and second
    defined above_ it is obviously possible to get a result slightly
    different from the defined speed of light.

    So I was not "completely, absolutely, and totally wrong".

    You were, and it would seem that you still are.
    You cannot measure the speed of light because it has a defined value.
    If you would think that what you are doing is a speed of light
    measurement you don't understand what you are doing.

    Yes, I was indeed "absolutely, and totally wrong",
    but not completely wrong.

    I disagree that you were wrong at all.

    So you are not there yet.

    Remember that nothing you say, and no definitions you make
    can have any effect on reality as it is.
    It can only change your way of looking at it,
    and your interpretations of what you see.

    1) The expression "c" has multiple meanings. On the one hand, it is,
    according to a widely accepted geometric model of spacetime, a
    constant that expresses the relationship between units of space and
    units of time. This "c" is given a defined value of 299792458 m/s,
    and because it has that value by definition, it cannot be measured.
    2) Another meaning of "c" is the speed of photons in vacuum. Photons
    are, to the best of our knowledge, massless, and according to the
    above geometric model of spacetime, all unimpeded massless
    particles travel at the speed "c" given in definition (1).

    All very true, but completely irrelevant
    from the point of view of metrology.

    Metrology is about how to realise units, and nothing else.
    Deep thoughts about the nature of things,
    or what words might mean, do not come into it at all.
    In particular, the whole theory of relativity is irrelevant
    as far the definition of the meter is concerned.
    [snip more irrelevancies]

    Jan

  • From Paul B. Andersen@21:1/5 to All on Mon Dec 9 15:21:01 2024
    Den 08.12.2024 12:30, skrev J. J. Lodder:
    Paul B. Andersen <relativity@paulba.no> wrote:

    Den 07.12.2024 22:19, skrev Paul B. Andersen:

    But it isn't impossible, if you had extremely precise instruments,
    that you would measure a value slightly different from 299792458 m/s,
    e.g. 299792458.000001 m/s.

    This is indeed "completely, absolutely, and totally wrong".

    I somehow thought that the "real speed" of light in vacuum
    measured before 1983 was different from 299792458 m/s.

    Of course it was. The adopted value was a compromise
    between the results of different teams.
    BTW, you are also falling into the 'das ding an sich' trap.

    (Which it probably was, but the difference hidden in the error bar)
    And since the definition of metre only contain the defined constant c,
    I thought "the real speed" of light could be different from c.

    Yes, that is where you go wrong.

    But this is utter nonsense!

    Beginning to see the light?

    Now I can't understand how I could think so.
    My brain seems to be slower than it used to be. :-(

    The real speed of light in vacuum is exactly c = 299792458 m/s,
    and 1 metre = (1 second/299792458)⋅c is derived from c,
    which means that the measured speed of light in vacuum will
    always be c.

    Correct.
    Perhaps I can explain the practicalities behind it in another way.
    If you measure the speed of light accurately
    you must of course do an error analysis.
    The result of this is that almost all of the error results from
    the necessary realisation of the meter standard (in your laboratory).
    So the paradoxical result is that you cannot measure the speed of light
    even when there is a meter standard of some kind.

    You may call whatever it is that you are doing
    'a speed of light measurement',
    but if you are a competent experimentalist you will understand
    that what you are really doing is a meter calibration experiment.
    Hence the speed of light must be given a defined value,
    for practical experimental reasons. [1]

    Jan

    This is my way of thinking which made me realise that I was wrong:
    How do we measure the speed of light?
    We measure the time it takes for the light to travel a known distance.
    So we bounce the light off a mirror and measure the round trip time.
    How do we calibrate the distance to the mirror?
    We measure the time it takes for the light to go back and forth
    to the mirror.
    L = c⋅t/2 = 299792458⋅(t/2) metres, where t is the round trip time in seconds
    AHA!!!
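The circular procedure just described can be made explicit in a few lines (the round-trip timings are hypothetical):

```python
# Sketch of the circular measurement described above: the distance to
# the mirror is itself calibrated from a light round-trip time, so the
# "measured" speed comes out as the defined c by construction.
C = 299_792_458                       # m/s, defined

def calibrated_distance(t_round_trip):
    """SI metre definition applied: light covers C metres per second."""
    return C * t_round_trip / 2       # metres

def measured_speed(t_round_trip):
    L = calibrated_distance(t_round_trip)
    return 2 * L / t_round_trip       # m/s

# Whatever the actual round-trip time, the result never depends on it.
for t in (2.0e-9, 6.7e-6, 2.563):
    assert abs(measured_speed(t) - C) < 1e-6
```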


    [1] Which have not changed.
    (and will not change in the foreseeable future)
    Meter standards are orders of magnitude less accurate
    than time standards. (see why this must be?)


    No, I don't understand.
    The definition of metre only depends on the two constants
    Δν_Cs and c and both have an exact value.
    Is it because the time standard only depends on one constant?

    I can however understand that practical calibration of the meter
    is less precise than the calibration of a frequency standard.

    ------------------

    I would like your reaction to the following;

    In:
    https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
    I read:
    https://www.bipm.org/en/cipm-mra

    "The CIPM has adopted various secondary representations of
    the second, based on a selected number of spectral lines of atoms,
    ions or molecules. The unperturbed frequencies of these lines can
    be determined with a relative uncertainty not lower than that of
    the realization of the second based on the 133Cs hyperfine transition
    frequency, but some can be reproduced with superior stability."

    This is how I interpret this:
    The second is still defined by "the unperturbed ground state
    hyperfine transition frequency of the caesium 133 atom"
    Δν_Cs = 9192631770 Hz by definition.

    But practical realisations of this frequency standard,
    that is an atomic frequency standard based on Cs133 is
    not immune to perturbation, a magnetic field may affect it.

    So there exist more stable frequency standards than Cs,
    and some are extremely more stable.
    But the frequencies of these standards are still defined
    by Δν_Cs: 1 Hz = Δν_Cs/9192631770
    This is "Calibration of a frequency standard".

    The "secondary representations of second"
    don't change the duration of a second
    and the "secondary representations of metre"
    don't change the length of a metre.


    --
    Paul

    https://paulba.no/

  • From J. J. Lodder@21:1/5 to Paul B. Andersen on Mon Dec 9 20:28:42 2024
    Paul B. Andersen <relativity@paulba.no> wrote:

    Den 08.12.2024 12:30, skrev J. J. Lodder:
    Paul B. Andersen <relativity@paulba.no> wrote:

    Den 07.12.2024 22:19, skrev Paul B. Andersen:

    But it isn't impossible, if you had extremely precise instruments,
    that you would measure a value slightly different from 299792458 m/s,
    e.g. 299792458.000001 m/s.

    This is indeed "completely, absolutely, and totally wrong".

    I somehow thought that the "real speed" of light in vacuum
    measured before 1983 was different from 299792458 m/s.

    Of course it was. The adopted value was a compromise
    between the results of different teams.
    BTW, you are also falling into the 'das ding an sich' trap.

    (Which it probably was, but the difference hidden in the error bar)
    And since the definition of metre only contain the defined constant c,
    I thought "the real speed" of light could be different from c.

    Yes, that is where you go wrong.

    But this is utter nonsense!

    Beginning to see the light?

    Now I can't understand how I could think so.
    My brain seems to be slower than it used to be. :-(

    The real speed of light in vacuum is exactly c = 299792458 m/s,
    and 1 metre = (1 second/299792458)⋅c is derived from c,
    which means that the measured speed of light in vacuum will
    always be c.

    Correct.
    Perhaps I can explain the practicalities behind it in another way.
    If you measure the speed of light accurately
    you must of course do an error analysis.
    The result of this is that almost all of the error results from
    the necessary realisation of the meter standard (in your laboratory).
    So the paradoxical result is that you cannot measure the speed of light
    even when there is a meter standard of some kind.

    You may call whatever it is that you are doing
    'a speed of light measurement',
    but if you are a competent experimentalist you will understand
    that what you are really doing is a meter calibration experiment.
    Hence the speed of light must be given a defined value,
    for practical experimental reasons. [1]

    Jan

    This is my way of thinking which made me realise that I was wrong:
    How do we measure the speed of light?
    We measure the time it takes for the light to travel a known distance.
    So we bounce the light off a mirror and measure the round trip time.
    How do we calibrate the distance to the mirror?
    We measure the time it takes for the light to go back and forth
    to the mirror.
    L = c⋅t/2 = 299792458⋅(t/2) metres, where t is the round trip time in seconds
    AHA!!!


    [1] Which have not changed.
    (and will not change in the foreseeable future)
    Meter standards are orders of magnitude less accurate
    than time standards. (see why this must be?)


    No, I don't understand.
    The definition of metre only depends on the two constants
    Δν_Cs and c, and both have an exact value.
    Is it because the time standard only depends on one constant?

    I can however understand that practical calibration of the meter
    is less precise than the calibration of a frequency standard.

    ------------------

    I would like your reaction to the following;

    In:
    https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
    I read:
    https://www.bipm.org/en/cipm-mra

    "The CIPM has adopted various secondary representations of
    the second, based on a selected number of spectral lines of atoms,
    ions or molecules. The unperturbed frequencies of these lines can
    be determined with a relative uncertainty not lower than that of
    the realization of the second based on the 133Cs hyperfine transition
    frequency, but some can be reproduced with superior stability."

    This is how I interpret this:
    The second is still defined by "the unperturbed ground state
    hyperfine transition frequency of the caesium 133 atom"
    Δν_Cs = 9192631770 Hz by definition.

    But practical realisations of this frequency standard,
    that is an atomic frequency standard based on Cs133 is
    not immune to perturbation, a magnetic field may affect it.

    So there exist more stable frequency standards than Cs,
    and some are extremely more stable.
    But the frequencies of these standards are still defined
    by Δν_Cs: 1 Hz = Δν_Cs/9192631770
    This is "Calibration of a frequency standard".

    The "secondary representations of second"
    don't change the duration of a second
    and the "secondary representations of metre"
    don't change the length of a metre.

    Instead of replying point by point I'll sum up the whole situation.
    (as I understand it, and perhaps repeating what I wrote earlier)

    For understanding all this you must realise
    that there are two kinds of frequency standards:
    microwave ones, typically in the (perhaps many) GHz range,
    and
    optical ones, typically in the hundreds of THz range.
    The GHz ones may serve as absolute frequency standards and as clocks.
    The optical ones (like the standard stabilised HeNe laser)
    may also serve as (secondary) meter standards.
    Standards labs supply lists of 'recommended' optical frequencies.
    The optical frequency sources are of course also 'floating' frequency
    standards on their own.

    The GHz ones can be calibrated against each other by direct counting.
    So their accuracy may equal that of the Cs standard. (by the definition)
    The stability of frequency standards can in general be established
    by comparing ensembles of them against each other. (so independently of Cs)
    Which kind of standard to use depends on what you need:
    relative or absolute accuracy.
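The ensemble comparison described above can be sketched as follows. The readings are invented, and the pairwise fractional difference shown is only the simplest version of such a comparison; it needs no absolute reference, which is the point.

```python
# Stability comparison without reference to Cs: compare an ensemble of
# nominally identical standards against each other. The fractional
# differences are dimensionless, so no absolute unit is needed.
import itertools

# Simultaneous frequency readings (Hz) of three standards; invented numbers.
readings = {"A": 1.000000000002e9, "B": 0.999999999999e9, "C": 1.000000000001e9}

def fractional_difference(f1: float, f2: float) -> float:
    """Dimensionless offset (f1 - f2) / mean of the pair."""
    return (f1 - f2) / ((f1 + f2) / 2)

for a, b in itertools.combinations(readings, 2):
    print(f"{a} vs {b}: {fractional_difference(readings[a], readings[b]):+.1e}")
```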

    AFAIK about those matters, the idea among metrologists at present
    is to leave things as they are,
    until a really big step forward can be made.
    (hopefully already at the next CGPM)

    Some of the optical frequency standards are far more stable indeed.
    (nowadays pushing 10^18, last time I looked)
    But their frequencies (in terms of the Cs standard!)
    are known to a much lesser accuracy.
    (pushing 10^12, again last time I looked)
    The use of frequency combs caused a revolution here. (see 2005 Nobel)

    Summary: optical frequency standards can be far more stable,
    but their frequencies are (relatively speaking!) poorly known.

    Once you have a calibrated optical frequency standard, [1]
    for which you know the frequency in terms of the Cs standard,
    you know its wavelength, by the definition of c,
    so you can start measuring distances and sizes
    in terms of its wavelength, hence in meters.
    It has become a secondary meter standard.
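The chain from calibrated frequency to secondary metre standard amounts to λ = c/ν. A sketch, using an approximate frequency for an iodine-stabilised HeNe laser (the value is illustrative, not a recommended one):

```python
# Wavelength from a calibrated frequency, via the defined value of c.
C = 299_792_458  # m/s, exact

# Approximate frequency of an I2-stabilised HeNe laser (illustrative).
nu_hene = 473.612e12  # Hz

wavelength = C / nu_hene  # metres; this is what lengths get measured against
print(f"{wavelength * 1e9:.2f} nm")  # about 633 nm
```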

    So measuring distances/lengths is inherently much less accurate
    than measuring time/frequency.
    And, circle closed, this was precisely the reason
    for giving c a defined value.
    So c really cannot be measured anymore,
    not because some crazed guru-followers decreed so,
    but because of hard experimental realities and necessities.

    Hope this clears up the questions you had,

    Jan

    [1] This is the ongoing, never-ending, program I mentioned earlier:
    finding optical frequency standards, aka secondary meter standards,
    to ever greater accuracy and reproducibility.
    The original <1983 series of measurements, then called 'measuring c',
    was just good enough to base the defined value of c on.
    Those decades of added precision had to go into better
    frequency/meter standards, not into a 'better' value of c.

    PS There are first indications that it may be possible
    to harness a gamma ray line from a nuclear transition
    in the not too far future, for again greatly increased stability.
    Very low frequency, as gammas go, but still in the very far UV.
    Challenges, challenges.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Paul B. Andersen@21:1/5 to All on Tue Dec 10 11:20:27 2024
    Den 09.12.2024 20:28, skrev J. J. Lodder:
    Paul B. Andersen <relativity@paulba.no> wrote:


    [quoted text trimmed]

    Hope this clears up the questions you had,
    Hope this clears up the questions you had,

    Yes. Thank you!





    --
    Paul

    https://paulba.no/

  • From J. J. Lodder@21:1/5 to Ross Finlayson on Tue Dec 10 23:16:53 2024
    Ross Finlayson <ross.a.finlayson@gmail.com> wrote:

    On 12/08/2024 12:35 PM, J. J. Lodder wrote:
    ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

    On Sun, 8 Dec 2024 5:42:07 +0000, ProkaryoticCaspaseHomolog wrote:

    On Sat, 7 Dec 2024 21:35:57 +0000, J. J. Lodder wrote:

    I'm sorry, but this is not the right answer,

    So what are you saying, then? Are you saying that, because of the
    definition of E=mc^2, it is totally required that 1 gram of electrons
    annihilating 1 gram of positrons completely to electromagnetic
    radiation must NECESSARILY yield the same amount of energy as 1 gram
    of protons annihilating 1 gram of antiprotons completely to
    electromagnetic radiation? That the equality of these two values is
    a matter of definition, not something to be established by experiment?

    Are you saying that because the current definition of c is
    299,792,458 meters per second regardless of wavelength, questions
    as to whether gamma rays travel faster than visible light rays are
    totally nonsensical?

    In fewer words:

    No experiment can measure a difference between the amount of energy
    released by the complete annihilation of 1 g of (electrons + positrons)
    versus the complete annihilation of 1 g of (protons + antiprotons).
    True or false?

    False, see previous.

    No experiment can measure a difference between the speed of visible
    light photons versus the speed of gamma rays. True or false?

    False, already answered several postings back.
    A class of experiments relevant to this question
    are those that set an upper limit on the photon mass
    (the most plausible mechanism for such an effect).

    Why for heavens sake would you even get such an idea?

    Jan



    O.W. Richardson's "The Electron Theory of Matter" has
    really a great account of various considerations of
    what "c" is with regards to electromagnetic radiation
    as opposed to the optical range of not-electromagnetic
    radiation and as with regards to wavelength versus
    wave velocity.

    Sort of like "photons" are overloaded and diluted these
    days, so are waves, and so is "c".

    The wave model is great and all and the energy equivalency
    is great and all, yet it's overloaded and diluted (i.e.,
    tenuous and weak).

    The popular public deserves quite an apology from the
    too-simple accounts that have arrived at having nothing
    at all to say and no way to say it about the wider milieu
    and the real-er parts of the theory.

    So, for a pretty great example when these differences
    were not just ignored and furthermore pasted over,
    wall-papered as it were, "The Electron Theory of Matter"
    is a bit antique yet it's perfectly cool and furthermore
    greatly expands a usual discourse on radiation that travels
    through space, _and_, the space-contraction (FitzGeraldian).

    Not at hand, but this 1914 (!) book, while perhaps a classic,
    is no doubt completely obsolete.
    From the available reviews it would seem
    that it is mostly a rehash of Lorentz' 'Theory of Electrons',

    Jan
