On Tue, 3 Dec 2024 19:02:45 +0000, rhertz wrote:
And I forgot:
The settlement of constants BY COLLUSION requires that ALL THE INSTRUMENTATION THAT EXISTS (used in any science) BE RE-CALIBRATED, to obey.
Do you get this?
If you manufacture mass spectrometers, voltmeters, timers, WHATEVER,
you had better RE-ADJUST the values that come from measurements.
Example: Your voltmeter measures 1 Volt as 0.9995743 OLD Volts? Then
RECALIBRATE THAT MF or you will sell NONE. Is that clear?
CALIBRATION is an essential part of the design and manufacturing OF ANY
INSTRUMENT! But you require MASTER REFERENCES (OR GUIDELINES LIKE THOSE
FROM THE BIPM).
Your laser-based distance meter measures 1 meter as 1.00493 meters?
RECALIBRATE THE INSTRUMENT RIGHT ON THE PRODUCTION LINE.
Not to mention the instrumentation used to compute atomic weights or
a.m.u.
ADJUST, COMPLY AND OBEY OR YOU'RE OUT OF THE BUSINESS.
Did you manufacture a single instrument in a university lab? ADJUST,
COMPLY AND OBEY or you are an OUTCAST.
How dare you measure c = 299,793,294 m/s? ARE YOU CRAZY? Adjust
the readings to c = 299,792,458 m/s, OR ELSE.
And this has been happening since the late 19th century. Read the
history behind the definition of 1 ohm, mainly commanded by British
institutions, with the Cavendish lab behind it.
E ≈ 1.0000000 mc^2 is not a calibration adjustment. It is a
measurement made with calibrated instrumentation whose consistency
with other instrumentation has been carefully verified by procedures
such as those you cast aspersions upon above.
On Wed, 4 Dec 2024 11:40:04 +0000, J. J. Lodder wrote:
ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:
E ≈ 1.0000000 mc^2 is not a calibration adjustment. It is a
measurement made with calibrated instrumentation whose consistency
with other instrumentation has been carefully verified by procedures
such as those you cast aspersions upon above.
Was, was, was. There is nothing to 'cast upon' anymore.
With the redefinition of the kilogram in 2018
those measurements have become irrelevant.
E = m c^2 now holds exactly,
by the definition of the kilogram.
(and the Joule)
Specious argument.
When the kilogram was defined in terms of a metal artifact held in
vaults in Paris, it was a legitimate question whether the mass of said
artifact varied over time, even though by definition it was _the_
kilogram. As a matter of fact, that mass was found to vary despite its
being the basis of the definition of the kilogram.
The mere fact that E = mc^2 holds exactly according to our present
definitions of the kilogram and the Joule does not make irrelevant
experiments intended to check whether the assumptions that led to
the adoption of our current set of standards are correct.
The mere fact that theory and over a century of experimental
validation have led to the speed of light being adopted as a constant
does not invalidate experiments intended to verify to increasing
levels of precision the correctness of the assumptions that led to
its adoption as a constant.
On Wed, 4 Dec 2024 20:17:25 +0000, J. J. Lodder wrote:
ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:
The mere fact that theory and over a century of experimental
validation have led to the speed of light being adopted as a constant
does not invalidate experiments intended to verify to increasing
levels of precision the correctness of the assumptions that led to
its adoption as a constant.
So you haven't understood what it is all about.
I rest my case,
You prematurely rest your case.
Since 1983, the speed of light in vacuum has been defined as exactly
equal to 299,792,458 meters per second.
Given this definition, is there any point to conducting experiments
to test whether there are anisotropies in the speed of light due to
Earth's motions in space? Such as these: https://tinyurl.com/8hkry7k3
The definition of the speed of light is such that there can't be.
Right?
Den 04.12.2024 21:17, skrev J. J. Lodder:
ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:
The mere fact that theory and over a century of experimental
validation have led to the speed of light being adopted as a constant
does not invalidate experiments intended to verify to increasing
levels of precision the correctness of the assumptions that led to
its adoption as a constant.
So you haven't understood what it is all about.
I rest my case,
Jan
The meter is defined as:
1 metre = (1 second/299792458) × (299792458 m/s)
1 second = 9192631770/Δν_Cs
Note that neither the definition of the second nor the definition
of the metre depends on the speed of light.
The constant 299792458 m/s is equal to the defined speed of light,
but in the definition of the metre it is just a constant.
That means that it is possible to measure the speed of light
even if it is different from the defined value.
The point is that the metre isn't defined by the speed of light,
but by the constant 299792458 m/s.
So if the speed of light, measured with instruments with better
precision than they had in 1983, is found to be 299792458.000001 m/s,
then that only means that the real speed of light (measured with
SI metre and SI second) is different from the defined one.
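A minimal sketch of the unit arithmetic Andersen describes (my addition, not from the thread; the only inputs are the two SI defining constants):

```python
# Sketch of the SI unit arithmetic. The two constants below are the SI
# defining constants, not measurements.
DELTA_NU_CS = 9_192_631_770   # Hz, Cs-133 hyperfine transition frequency
C_DEF = 299_792_458           # m/s, the constant in the metre definition

# One metre is the distance light covers in 1/C_DEF seconds, i.e. in
# this many caesium cycles:
cycles_per_metre = DELTA_NU_CS / C_DEF
print(f"Cs cycles per metre of light travel: {cycles_per_metre:.6f}")

# Andersen's hypothetical measurement result, expressed in these units:
c_meas = 299_792_458.000001   # m/s
excess = c_meas - C_DEF
print(f"excess over the defined constant: {excess:.6f} m/s")
```

The point of the sketch is only that 299792458 appears as a bare number in the definition, so a result can in principle be written down that differs from it; whether such a result means anything is exactly what the following posts dispute.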
On 12/05/2024 10:42 AM, J. J. Lodder wrote:
Paul B. Andersen <relativity@paulba.no> wrote:
Den 04.12.2024 21:17, skrev J. J. Lodder:
ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:
The mere fact that theory and over a century of experimental
validation have led to the speed of light being adopted as a constant
does not invalidate experiments intended to verify to increasing
levels of precision the correctness of the assumptions that led to
its adoption as a constant.
So you haven't understood what it is all about.
I rest my case,
Jan
The meter is defined as:
1 metre = (1 second/299792458) × (299792458 m/s)
1 second = 9192631770/Δν_Cs
Note that neither the definition of the second nor the definition
of the metre depends on the speed of light.
The constant 299792458 m/s is equal to the defined speed of light,
but in the definition of the metre it is just a constant.
That means that it is possible to measure the speed of light
even if it is different from the defined value.
The point is that the metre isn't defined by the speed of light,
but by the constant 299792458 m/s.
So you didn't get the point either.
(also suffering from a naive empiricist bias, I guess)
The point is not about pottering around with lasers and all that,
it is about correctly interpreting what you are doing.
To do that you need to understand the physics of it.
In fact, the kind of experiments that used to be called
'speed of light measurements' (so before 1983)
are still being done routinely today, at places like NIST or the BIPM.
The difference is that nowadays precisely the same kind of measurements
are called 'calibration of a (secondary) meter standard',
or 'calibration of a frequency standard'. [1]
So if the speed of light, measured with instruments with better
precision than they had in 1983, is found to be 299792458.000001 m/s,
then that only means that the real speed of light (measured with
SI metre and SI second) is different from the defined one.
So this is completely, absolutely, and totally wrong.
Such a result does not mean that the speed of light
is off its defined value,
it means that your meter standard is off,
and that you must use your measurement result to recalibrate it.
(so that the speed of light comes out to its defined value)
In other words, it means that you can nowadays
calibrate a frequency standard, aka a secondary meter standard,
to better accuracy than was possible in 1983.
This is no doubt true,
but it cannot possibly change the (defined!) speed of light.
In still other words, there is no such thing as an independent SI meter.
The SI meter is that meter, and only that meter,
that makes the speed of light equal to 299792458 m/s (exactly).
Permittivity and permeability at the center of each galaxy are different
from the values of ε₀ and μ₀ on the outer limits of each one.
So, the value of c₀ = 1/√(ε₀μ₀) applies only locally.
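For reference, the standard local relation cited here can be checked numerically; the ε₀ and μ₀ figures below are the CODATA 2018 values (my addition, not from the post):

```python
import math

# CODATA 2018 values, assumed here for illustration:
eps0 = 8.8541878128e-12    # F/m, vacuum permittivity
mu0 = 1.25663706212e-6     # N/A^2, vacuum permeability

# The local relation c0 = 1/sqrt(eps0*mu0):
c0 = 1.0 / math.sqrt(eps0 * mu0)
print(f"c0 ≈ {c0:.0f} m/s")  # close to the defined 299792458 m/s
```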
On 2024-12-01 00:28:14 +0000, rhertz said:
Now, E = 3/4 mc² or E = mc²? Which one would the physics community
adopt?
The latter because the former was refuted by later experiments, in
particular observations and analysis of radioactive decays.
Paul B. Andersen <relativity@paulba.no> wrote:
So if the speed of light, measured with instruments with better
precision than they had in 1983, is found to be 299792458.000001 m/s,
then that only means that the real speed of light (measured with
SI metre and SI second) is different from the defined one.
So this is completely, absolutely, and totally wrong.
Such a result does not mean that the speed of light
is off its defined value,
it means that your meter standard is off,
and that you must use your measurement result to recalibrate it.
(so that the speed of light comes out to its defined value)
In other words, it means that you can nowadays
calibrate a frequency standard, aka a secondary meter standard,
to better accuracy than was possible in 1983.
This is no doubt true,
but it cannot possibly change the (defined!) speed of light.
In still other words, there is no such thing as an independent SI meter.
The SI meter is that meter, and only that meter,
that makes the speed of light equal to 299792458 m/s (exactly).
Jan
In fact, the kind of experiments that used to be called
'speed of light measurements' (so before 1983)
are still being done routinely today, at places like NIST, or BIPM.
The difference is that nowadays, precisely the same kind of measurements
are called 'calibration of a (secondary) meter standard',
or 'calibration of a frequency standard'.
Den 05.12.2024 19:42, skrev J. J. Lodder:
Paul B. Andersen <relativity@paulba.no> wrote:
So if the speed of light, measured with instruments with better
precision than they had in 1983, is found to be 299792458.000001 m/s,
then that only means that the real speed of light (measured with
SI metre and SI second) is different from the defined one.
Note: measured with SI metre and SI second.
So this is completely, absolutely, and totally wrong.
Such a result does not mean that the speed of light
is off its defined value,
it means that your meter standard is off,
and that you must use your measurement result to recalibrate it.
(so that the speed of light comes out to its defined value)
The 1983 definition of the speed of light is:
c = 299792458 m/s
The 1983 definition of the second is:
1 second = 9192631770/Δν_Cs
The 1983 definition of the metre is:
1 metre = (1 second/299792458) × c
The 2019 definition of the metre is:
1 metre = (9192631770/299792458) × (c/Δν_Cs)
If the speed of light is measured _with the metre and second
defined above_ it is obviously possible to get a result slightly
different from the defined speed of light.
So I was not "completely, absolutely, and totally wrong".
Are you saying that if we got the result 299792458.000001 m/s
then the metre would have to be recalibrated to:
1 metre = (9192631770/299792458.000001) × (c/Δν_Cs) ?
In other words, it means that you can nowadays
calibrate a frequency standard, aka a secondary meter standard,
to better accuracy than was possible in 1983.
Or are you saying that we would have to recalibrate the metre to:
1 metre = (9192631770.0000306/299792458) × (c/Δν_Cs) ?
This is no doubt true,
but it cannot possibly change the (defined!) speed of light.
In still other words, there is no such thing as an independent SI meter.
The SI meter is that meter, and only that meter,
that makes the speed of light equal to 299792458 m/s (exactly).
Jan
You wrote:
In fact, the kind of experiments that used to be called
'speed of light measurements' (so before 1983)
are still being done routinely today, at places like NIST, or BIPM.
The difference is that nowadays precisely the same kind of measurements
are called 'calibration of a (secondary) meter standard',
or 'calibration of a frequency standard'.
Is any such recalibration of the meter ever done?
And which "frequency standard" are you referring to?
The definition of a second?
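The arithmetic behind the second recalibration option above can be checked directly; this is a sketch using the hypothetical numbers from the question, not an endorsement of either reading:

```python
# Defined constants and Andersen's hypothetical measurement result:
C_DEF = 299_792_458            # m/s, defined constant
DNU = 9_192_631_770            # Hz, defined Cs-133 frequency
c_meas = 299_792_458.000001    # m/s, hypothetical measurement

# Rescale the Cs cycle count so that light again crosses exactly one
# metre in 1/C_DEF seconds. The shift in the count is:
offset = DNU * (c_meas - C_DEF) / C_DEF
print(f"rescaled count ≈ 9192631770 + {offset:.7f}")
# offset comes out near 3.1e-5, i.e. close to the 0.0000306
# appearing in the question above.
```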
On Fri, 6 Dec 2024 20:00:10 +0000, J. J. Lodder wrote:
Finally, you really need to get yourself out of the conceptual knot
that you have tied yourself in.
Something is either defined, or it can be measured.
It can't possibly be both,
Sure it can, provided that you use a different measurement standard
than the one used in the definition.
It would not make sense to quantify hypothetical variations in the
speed of light in terms of the post-1983 meter. But they would make
sense in terms of pre-1983 meters. Or (assuming some incredible ramp-up
in technology, perhaps introduced by Larry Niven-ish Outsiders) in
terms of a meter defined as the distance massless gluons travel in
1/299,792,458 of a second. Or gravitons... :-)
On Sat, 7 Dec 2024 11:03:24 +0000, J. J. Lodder wrote:
ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:
On Fri, 6 Dec 2024 20:00:10 +0000, J. J. Lodder wrote:
Finally, you really need to get yourself out of the conceptual knot
that you have tied yourself in.
Something is either defined, or it can be measured.
It can't possibly be both,
Sure it can, provided that you use a different measurement standard
than the one used in the definition.
Sure, you can be inconsistent, if you choose to be.
Don't expect meaningful results.
It would not make sense to quantify hypothetical variations in the
speed of light in terms of the post-1983 meter. But they would make
sense in terms of pre-1983 meters. Or (assuming some incredible ramp-up
in technology, perhaps introduced by Larry Niven-ish Outsiders) in
terms of a meter defined as the distance massless gluons travel in
1/299,792,458 of a second. Or gravitons... :-)
Completely irrelevant,
and it does not get you out of your conceptual error as stated above.
Summary: There must be:
1) a length standard, 2) a frequency standard [1], and 3) c
Two of the three must be defined, the third must be measured.
Pre-1983 1) and 2) were defined, and 3), c was measured.
Post-1983 2) and c are defined, 1) must be measured.
So in 1983 we have collectively decided that any future refinement
in measurement techniques will result in more accurate meter standards,
not in a 'better' value for c. [2]
You don't "get" the point that I was trying to make. Let us review:
| Resolution 1 of the 17th CGPM (1983) [snip boilerplate material]
[snip more irrelevancies]
Gamma ray burst observations have constrained the arrival times
between the visible light and gamma ray components of the burst to
be equal to within 10^-15 of the total travel time of the burst.
Definitions are BASED ON state-of-the-art known physics. They do not DETERMINE physical law.
Finally, an exercise for you personally.
You quoted a pre-2018 experiment that verified E=mc^2
to some high accuracy (using the measured value of Planck's constant).
Post-2018, Planck's constant has a defined value,
and E=mc^2 is true by definition (of the Joule and the kilogram).
So E=mc^2 can no longer be verified by any possible experiment.
Now:
Ex1) Does this make the experiment you quoted worthless?
Not at all.
Ex2) If not, what does that experiment demonstrate?
It would demonstrate an inadequacy in the definitions that must be
addressed in some future conference when the discrepancies have been
better characterized.
O.W. Richardson's "The Electron Theory ..." is really pretty
great, he spends a lot of time explaining all sorts of
issues in systems of units and algebraic quantities and
derivations and the inner and outer and these things,
it's 100 years old yet I'm glad to be reading it now.
Paul B. Andersen <relativity@paulba.no> wrote:
Den 05.12.2024 19:42, skrev J. J. Lodder:
Paul B. Andersen <relativity@paulba.no> wrote:
So if the speed of light, measured with instruments with better
precision than they had in 1983 is found to be 299792458.000001 m/s,
then that only means that the real speed of light (measured with
SI metre and SI second) is different from the defined one.
Note: measured with SI metre and SI second.
So this is completely, absolutely, and totally wrong.
Such a result does not mean that the speed of light
is off its defined value,
it means that your meter standard is off,
and that you must use your measurement result to recalibrate it.
(so that the speed of light comes out to its defined value)
If the speed of light is measured _with the meter and second
defined above_ it is obviously possible to get a result slightly
different from the defined speed of light.
So I was not "completely, absolutely, and totally wrong".
You were, and it would seem that you still are.
You cannot measure the speed of light because it has a defined value.
If you would think that what you are doing is a speed of light
measurement you don't understand what you are doing.
You wrote:
In fact, the kind of experiments that used to be called
'speed of light measurements' (so before 1983)
are still being done routinely today, at places like NIST, or BIPM.
The difference is that nowadays precisely the same kind of measurements
are called 'calibration of a (secondary) meter standard',
or 'calibration of a frequency standard'.
Is any such recalibration of the meter ever done?
Of course, routinely, on a day-to-day basis.
I guess there are whole departments devoted to it.
(it is a subtle art)
The results are published nowadays as a list of frequencies
of preferred optical frequency standards.
(measuring the frequency of an optical frequency standard
and calibrating a secondary meter standard are just two different ways
of saying the same thing)
And remember, there is no longer such a thing as -the- meter.
It is a secondary unit, and any convenient secondary standard will do.
Den 04.12.2024 21:17, skrev J. J. Lodder:
ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:
The mere fact that theory and over a century of experimental
validation have led to the speed of light being adopted as a constant
does not invalidate experiments intended to verify to increasing
levels of precision the correctness of the assumptions that led to
its adoption as a constant.
So you haven't understood what it is all about.
I rest my case,
Jan
The meter is defined as:
1 metre = (1 second/299792458) × (299792458 m/s)
1 second = 9192631770/Δν_Cs
Den 06.12.2024 21:00, skrev J. J. Lodder:
Paul B. Andersen <relativity@paulba.no> wrote:
According to: https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
(2019)
The SI definitions are:
The relevant defining constants:
Δν_Cs = 9192631770 Hz (hyperfine transition frequency of Cs133)
c = 299 792 458 m/s (speed of light in vacuum)
The relevant base units:
Second:
1 s = 9192631770/Δν_Cs, i.e. 1 Hz = Δν_Cs/9192631770
Metre:
1 metre = (c/299792458) × (1 s) = (9192631770/299792458) ⋅ (c/Δν_Cs)
The home page of the BIPM:
https://www.bipm.org/en/measurement-units
gives the exact same definitions, so I assume
that the definitions above are valid now.
https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
If the speed of light is measured _with the meter and second
defined above_ it is obviously possible to get a result slightly
different from the defined speed of light.
So I was not "completely, absolutely, and totally wrong".
You were, and it would seem that you still are.
You cannot measure the speed of light because it has a defined value.
If you would think that what you are doing is a speed of light
measurement you don't understand what you are doing.
When you have a definition of second and a definition of metre,
it is _obviously_ possible to measure the speed of light.
If you measure the speed of light in air, you would probably
find that v_air ≈ 2.99705e8 m/s.
If you measure it in vacuum on the ground, you would probably
get a value slightly less than 299792458 m/s because the vacuum
isn't perfect.
If you measure it in perfect vacuum (in a space-vehicle?) you
would probably get the value 299792458 m/s.
But it isn't impossible, if you had extremely precise instruments,
that you would measure a value slightly different from 299792458 m/s,
e.g. 299792458.000001 m/s.
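The in-air figure quoted above is just c divided by the refractive index of air; a quick check (the value n ≈ 1.000293 for optical wavelengths near standard conditions is my assumption, not from the post):

```python
# Speed of light in air as c/n, using the defined vacuum value:
C_DEF = 299_792_458      # m/s, defined vacuum value
n_air = 1.000293         # assumed refractive index of air (optical, ~STP)

v_air = C_DEF / n_air
print(f"v_air ≈ {v_air:.6e} m/s")  # ≈ 2.99705e8 m/s, as quoted above
```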
In:
https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
I read:
https://www.bipm.org/en/cipm-mra
"The CIPM has adopted various secondary representations of
the second, based on a selected number of spectral lines of atoms,
ions or molecules. The unperturbed frequencies of these lines can
be determined with a relative uncertainty not lower than that of
the realization of the second based on the 133Cs hyperfine transition
frequency, but some can be reproduced with superior stability."
This is how I interpret this:
The second is still defined by "the unperturbed ground state
hyperfine transition frequency of the caesium 133 atom"
Δν_Cs = 9192631770 Hz by definition.
But practical realisations of this frequency standard,
that is, atomic frequency standards based on Cs-133, are
not immune to perturbation; a magnetic field may affect them.
So there exist more stable frequency standards than Cs,
and some are far more stable.
But the frequencies of these standards are still defined
by Δν_Cs: 1 Hz = Δν_Cs/9192631770.
This is "Calibration of a frequency standard".
The "secondary representations of second"
don't change the duration of a second
and the "secondary representations of metre"
don't change the length of a metre.
Ross Finlayson <ross.a.finlayson@gmail.com> wrote:
O.W. Richardson's "The Electron Theory ..." is really pretty
great, he spends a lot of time explaining all sorts of
issues in systems of units and algebraic quantities and
derivations and the inner and outer and these things,
it's 100 years old yet I'm glad to be reading it now.
Yes, in those long past times every competent physicist
understood about systems of units and dimensions.
This has been lost as a consequence of general 'SI-only' education.
A more readily accessible (and excellent) source for the subject
is in the appendices of Jackson, Classical Electrodynamics.
Unfortunately the subject is not covered adequately in Wikipedia,
(afaics)
On 2024-12-07 21:35:57 +0000, J. J. Lodder said:
Ross Finlayson <ross.a.finlayson@gmail.com> wrote:
O.W. Richardson's "The Electron Theory ..." is really pretty
great, he spends a lot of time explaining all sorts of
issues in systems of units and algebraic quantities and
derivations and the inner and outer and these things,
it's 100 years old yet I'm glad to be reading it now.
Yes, in those long past times every competent physicist
understood about systems of units and dimensions.
This has been lost as a consequence of general 'SI-only' education.
A more readily accessible (and excellent) source for the subject
is in the appendices of Jackson, Classical Electrodynamics.
Unfortunately the subject is not covered adequately in Wikipedia,
(afaics)
Why don't you fix it, then? Anyone can edit Wikipedia.
Den 07.12.2024 22:19, skrev Paul B. Andersen:
Den 06.12.2024 21:00, skrev J. J. Lodder:
Paul B. Andersen <relativity@paulba.no> wrote:
According to: https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
(2019)
The SI definitions are:
The relevant defining constants:
Δν_Cs = 9192631770 Hz (hyperfine transition frequency of Cs133)
c = 299 792 458 m/s (speed of light in vacuum)
The relevant base units:
Second:
1 s = 9192631770/Δν_Cs, i.e. 1 Hz = Δν_Cs/9192631770
Metre:
1 metre = (c/299792458) × (1 s) = (9192631770/299792458) ⋅ (c/Δν_Cs)
The home page of the BIPM:
https://www.bipm.org/en/measurement-units
gives the exact same definitions, so I assume
that the definitions above are valid now.
https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
If the speed of light is measured _with the meter and second
defined above_ it is obviously possible to get a result slightly
different from the defined speed of light.
So I was not "completely, absolutely, and totally wrong".
You were, and it would seem that you still are.
You cannot measure the speed of light because it has a defined value.
If you would think that what you are doing is a speed of light
measurement you don't understand what you are doing.
Yes, I was indeed "absolutely, and totally wrong",
but not completely wrong.
When you have a definition of second and a definition of metre,
it is _obviously_ possible to measure the speed of light.
If you measure the speed of light in air, you would probably
find that v_air ≈ 2.99705e8 m/s.
If you measure it in vacuum on the ground, you would probably
get a value slightly less than 299792458 m/s because the vacuum
isn't perfect.
OK so far.
If you measure it in perfect vacuum (in a space-vehicle?) you
would probably get the value 299792458 m/s.
You would certainly measure the value 299792458 m/s.
It is possible to measure the speed of light in vacuum, but there is
not much point in doing so since the result is given by definition.
But it isn't impossible, if you had extremely precise instruments,
that you would measure a value slightly different from 299792458 m/s,
e.g. 299792458.000001 m/s.
This is indeed "completely, absolutely, and totally wrong".
I somehow thought that the "real speed" of light in vacuum
measured before 1983 was different from 299792458 m/s.
(Which it probably was, but the difference hidden in the error bar)
And since the definition of the metre only contains the defined constant c,
I thought "the real speed" of light could be different from c.
But this is utter nonsense!
Now I can't understand how I could think so.
My brain seems to be slower than it used to be. :-(
The real speed of light in vacuum is exactly c = 299792458 m/s,
and 1 metre = (1 second/299792458) × c is derived from c,
which means that the measured speed of light in vacuum will
always be c.
Athel Cornish-Bowden <me@yahoo.com> wrote:
On 2024-12-07 21:35:57 +0000, J. J. Lodder said:
Ross Finlayson <ross.a.finlayson@gmail.com> wrote:
O.W. Richardson's "The Electron Theory ..." is really pretty
great, he spends a lot of time explaining all sorts of
issues in systems of units and algebraic quantities and
derivations and the inner and outer and these things,
it's 100 years old yet I'm glad to be reading it now.
Yes, in those long past times every competent physicist
understood about systems of units and dimensions.
This has been lost as a consequence of general 'SI-only' education.
A more readily accessible (and excellent) source for the subject
is in the appendices of Jackson, Classical Electrodynamics.
Unfortunately the subject is not covered adequately in Wikipedia,
(afaics)
Why don't you fix it, then? Anyone can edit Wikipedia.
Because I have experience with these matters.
I have had to argue units and dimensions with electrical engineers.
On Sun, 8 Dec 2024 5:42:07 +0000, ProkaryoticCaspaseHomolog wrote:
On Sat, 7 Dec 2024 21:35:57 +0000, J. J. Lodder wrote:
I'm sorry, but this is not the right answer,
So what are you saying, then? Are you saying that, because of the
definition of E=mc^2, it is totally required that 1 gram of electrons
annihilating 1 gram of positrons completely to electromagnetic
radiation must NECESSARILY yield the same amount of energy as 1 gram
of protons annihilating 1 gram of antiprotons completely to
electromagnetic radiation? That the equality of these two values is a
matter of definition, not something to be established by experiment?
Are you saying that, because the current definition of c is
299,792,458 meters per second regardless of wavelength, questions
as to whether gamma rays travel faster than visible light rays are
totally nonsensical?
In fewer words:
No experiment can measure a difference between the amount of energy
released by the complete annihilation of 1 g of (electrons + positrons)
versus the complete annihilation of 1 g of (protons + antiprotons).
True or false?
No experiment can measure a difference between the speed of visible
light photons versus the speed of gamma rays. True or false?
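For scale, E = mc^2 assigns one fixed number to either annihilation scenario, since 2 g of rest mass disappears in both. A quick computation (my sketch, not from the thread):

```python
# Energy released by complete annihilation of 1 g of particles with
# 1 g of antiparticles (2 g of rest mass total), per E = mc^2:
C_DEF = 299_792_458    # m/s
m = 2e-3               # kg: 1 g of particles + 1 g of antiparticles

E = m * C_DEF**2
print(f"E = mc^2 ≈ {E:.4e} J")  # same number for e+/e- as for p/pbar
```

Whether any experiment could reveal a species-dependent deviation from this single number is precisely what the "true or false" question asks.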
On Sun, 8 Dec 2024 8:19:33 +0000, Paul B. Andersen wrote:
Den 07.12.2024 22:19, skrev Paul B. Andersen:
Den 06.12.2024 21:00, skrev J. J. Lodder:
Paul B. Andersen <relativity@paulba.no> wrote:
According to:
https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
(2019)
The SI definitions are:
The relevant defining constants:
Δν_Cs = 9192631770 Hz (hyperfine transition frequency of Cs133)
c = 299 792 458 m/s (speed of light in vacuum)
The relevant base units:
Second:
1 s = 9192631770/Δν_Cs, i.e. 1 Hz = Δν_Cs/9192631770
Metre:
1 metre = (c/299792458) × (1 s) = (9192631770/299792458) ⋅ (c/Δν_Cs)
The home page of the BIPM:
https://www.bipm.org/en/measurement-units
gives the exact same definitions, so I assume
that the definitions above are valid now.
https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
If the speed of light is measured _with the meter and second
defined above_ it is obviously possible to get a result slightly
different from the defined speed of light.
So I was not "completely, absolutely, and totally wrong".
You were, and it would seem that you still are.
You cannot measure the speed of light because it has a defined value.
If you would think that what you are doing is a speed of light
measurement you don't understand what you are doing.
Yes, I was indeed "absolutely, and totally wrong",
but not completely wrong.
I disagree that you were wrong at all.
1) The expression "c" has multiple meanings. On the one hand, it is,
according to a widely accepted geometric model of spacetime, a
constant that expresses the relationship between units of space and
units of time. This "c" is given a defined value of 299792458 m/s,
and because it has that value by definition, it cannot be measured.
2) Another meaning of "c" is the speed of photons in vacuum. Photons
are, to the best of our knowledge, massless, and according to the
above geometric model of spacetime, all unimpeded massless
particles travel at the speed "c" given in definition (1).
Paul B. Andersen <relativity@paulba.no> wrote:
Den 07.12.2024 22:19, skrev Paul B. Andersen:
But it isn't impossible, if you had extremely precise instruments,
that you would measure a value slightly different from 299792458 m/s,
e.g. 299792458.000001 m/s.
This is indeed "completely, absolutely, and totally wrong".
I somehow thought that the "real speed" of light in vacuum
measured before 1983 was different from 299792458 m/s.
Of course it was. The adopted value was a compromise
between the results of different teams.
BTW, you are also falling into the 'das ding an sich' trap.
(Which it probably was, but the difference hidden in the error bar)
And since the definition of the metre only contains the defined constant c,
I thought "the real speed" of light could be different from c.
Yes, that is where you go wrong.
But this is utter nonsense!
Beginning to see the light?
Now I can't understand how I could think so.
My brain seems to be slower than it used to be. :-(
The real speed of light in vacuum is exactly c = 299792458 m/s,
and 1 metre = (1 second/299792458) × c is derived from c,
which means that the measured speed of light in vacuum will
always be c.
Correct.
Perhaps I can explain the practicalities behind it in another way.
If you measure the speed of light accurately
you must of course do an error analysis.
The result of this is that almost all of the error comes from
the necessary realisation of the meter standard (in your laboratory).
So the paradoxical result is that you cannot measure the speed of light
even when there is a meter standard of some kind.
You may call whatever it is that you are doing
'a speed of light measurement',
but if you are a competent experimentalist you will understand
that what you are really doing is a meter calibration experiment.
Hence the speed of light must be given a defined value,
for practical experimental reasons. [1]
Jan
[1] Which have not changed.
(and will not change in the foreseeable future)
Meter standards are orders of magnitude less accurate
than time standards. (see why this must be?)
Den 08.12.2024 12:30, skrev J. J. Lodder:
Paul B. Andersen <relativity@paulba.no> wrote:
Den 07.12.2024 22:19, skrev Paul B. Andersen:
But it isn't impossible, if you had extremely precise instruments,
that you would measure a value slightly different from 299792458 m/s,
e.g. 299792458.000001 m/s.
This is indeed "completely, absolutely, and totally wrong".
I somehow thought that the "real speed" of light in vacuum
measured before 1985 was different from 299792458 m/s.
Of course it was. The adopted value was a compromise
between the results of different teams.
BTW, you are also falling into the 'das ding an sich' trap.
(Which it probably was, but the difference was hidden in the error bar)
And since the definition of the metre only contains the defined constant c,
I thought "the real speed" of light could be different from c.
Yes, that is where you go wrong.
But this is utter nonsense!
Beginning to see the light?
Now I can't understand how I could think so.
My brain seems to be slower than it used to be. :-(
The real speed of light in vacuum is exactly c = 299792458 m/s,
and 1 metre = (1 second/299792458)c, is derived from c,
which means that the measured speed of light in vacuum will
always be c.
Correct.
Perhaps I can explain the practicalities behind it in another way.
If you measure the speed of light accurately
you must of course do an error analysis.
The result of this is that almost all of the error results from
the necessary realisation of the meter standard. (in your laboratory)
So the paradoxical result is that you cannot measure the speed of light
even when there is a meter standard of some kind.
You may call whatever it is that you are doing
'a speed of light measurement',
but if you are a competent experimentalist you will understand
that what you are really doing is a meter calibration experiment.
Hence the speed of light must be given a defined value,
for practical experimental reasons. [1]
Jan
This is my way of thinking which made me realise that I was wrong:
How do we measure the speed of light?
We measure the time it takes for the light to travel a known distance.
So we bounce the light off a mirror and measure the round trip time.
How do we calibrate the distance to the mirror?
We measure the time it takes for the light to go back and forth
to the mirror.
L = c·Δt/2, where Δt is the round-trip time in seconds
AHA!!!
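Paul's realisation above can be put in a few lines (a toy sketch of my own, not from the thread): with c exact by definition, "measuring" the speed of light over a path that was itself calibrated by light travel time can only hand back the defined value.

```python
# Round-trip light ranging with the exact SI value of c.
C = 299_792_458  # m/s, exact by definition since 1983

def distance_from_round_trip(dt: float) -> float:
    """L = c * dt / 2 for a pulse bounced off a mirror."""
    return C * dt / 2

# Suppose the mirror sits a ~6.67 ns round trip away (a ~1 m path):
dt = 2 * 1.0 / C                      # round-trip time for a 1 m path
L = distance_from_round_trip(dt)      # realised path length
c_measured = 2 * L / dt               # "measuring c" over that path
# c_measured can only come out as the defined 299792458 m/s
```

Any experimental error ends up in the realised length L, not in c, which is exactly Jan's point that such an experiment is really a meter calibration.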
[1] Which have not changed.
(and will not change in the foreseeable future)
Meter standards are orders of magnitude less accurate
than time standards. (see why this must be?)
No, I don't understand.
The definition of the metre only depends on the two constants
Δν_Cs and c, and both have exact values.
Is it because the time standard only depends on one constant?
I can however understand that practical calibration of the meter
is less precise than the calibration of a frequency standard.
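Illustrative numbers (rough assumptions of my own, not official BIPM figures) show why the meter loses: a Cs fountain realises the second to parts in 10^16, while realising a length by interferometry is limited by wavefront, index-of-refraction, and frequency-knowledge effects at a far coarser level.

```python
# Rough, ASSUMED fractional uncertainties (illustration only):
u_second = 1e-16   # realisation of the second, Cs fountain (assumed)
u_metre  = 1e-11   # realisation of the metre, interferometry (assumed)

C = 299_792_458  # m/s, exact

# A pre-1983-style "measurement of c" inherits the worse of the two:
u_c = max(u_second, u_metre)
spread = C * u_c   # m/s of scatter attributable to the length realisation
# ~3 mm/s of apparent scatter, all of it from the meter, none from light
```

With these assumed numbers the length realisation dominates by five orders of magnitude, so residual scatter in a "c measurement" says nothing about light and everything about your meter.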
------------------
I would like your reaction to the following;
In:
https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
I read:
https://www.bipm.org/en/cipm-mra
"The CIPM has adopted various secondary representations of
the second, based on a selected number of spectral lines of atoms,
ions or molecules. The unperturbed frequencies of these lines can
be determined with a relative uncertainty not lower than that of
the realization of the second based on the 133Cs hyperfine transition
frequency, but some can be reproduced with superior stability."
This is how I interpret this:
The second is still defined by "the unperturbed ground state
hyperfine transition frequency of the caesium 133 atom"
Δν_Cs = 9192631770 Hz by definition.
But practical realisations of this frequency standard,
that is an atomic frequency standard based on Cs133 is
not immune to perturbation, a magnetic field may affect it.
So there exist more stable frequency standards than Cs,
and some are very much more stable.
But the frequencies of these standards are still defined
by Δν_Cs: 1 Hz = Δν_Cs/9192631770
This is "Calibration of a frequency standard".
The "secondary representations of second"
don't change the duration of a second
and the "secondary representations of metre"
don't change the length of a metre.
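Paul's reading can be restated numerically (a sketch of my own with a hypothetical ratio): a frequency comb delivers the dimensionless ratio of a secondary standard to the Cs line, and multiplying by the defined Δν_Cs expresses it in Hz. That is calibration, not redefinition.

```python
# The hertz is fixed by the defined Cs hyperfine frequency:
DELTA_NU_CS = 9_192_631_770  # Hz, exact by definition

def calibrate_against_cs(ratio: float) -> float:
    """A comb measures the dimensionless ratio f / Δν_Cs;
    the defined value turns that ratio into a frequency in Hz."""
    return ratio * DELTA_NU_CS

# HYPOTHETICAL measured ratio for some optical transition:
ratio = 47_000.0
f_hz = calibrate_against_cs(ratio)
```

The secondary standard may be more stable than Cs, but its frequency in Hz can never be known better than the ratio measurement against Cs allows.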
Paul B. Andersen <relativity@paulba.no> wrote:
I would like your reaction to the following;
In:
https://www.bipm.org/utils/common/pdf/si-brochure/SI-Brochure-9.pdf
I read:
https://www.bipm.org/en/cipm-mra
"The CIPM has adopted various secondary representations of
the second, based on a selected number of spectral lines of atoms,
ions or molecules. The unperturbed frequencies of these lines can
be determined with a relative uncertainty not lower than that of
the realization of the second based on the 133Cs hyperfine transition
frequency, but some can be reproduced with superior stability."
This is how I interpret this:
The second is still defined by "the unperturbed ground state
hyperfine transition frequency of the caesium 133 atom"
Δν_Cs = 9192631770 Hz by definition.
But practical realisations of this frequency standard,
that is an atomic frequency standard based on Cs133 is
not immune to perturbation, a magnetic field may affect it.
So there exist more stable frequency standards than Cs,
and some are very much more stable.
But the frequencies of these standards are still defined
by Δν_Cs: 1 Hz = Δν_Cs/9192631770
This is "Calibration of a frequency standard".
The "secondary representations of second"
don't change the duration of a second
and the "secondary representations of metre"
don't change the length of a metre.
Instead of replying point by point I'll sum up the whole situation.
(as I understand it, and perhaps repeating what I wrote earlier)
For understanding all this you must realise
that there are two kinds of frequency standards:
microwave ones, typically in the (perhaps many) GHz range,
and
optical ones, typically in the hundreds of THz range.
The GHz ones may serve as absolute frequency standards and as clocks.
The optical ones (like the standard stabilised HeNe laser)
may also serve as (secondary) meter standards.
Standards labs supply lists of 'recommended' optical frequencies.
The optical frequency sources are of course also 'floating' frequency standards on their own.
The GHz ones can be calibrated against each other by direct counting.
So their accuracy may equal that of the Cs standard. (by the definition)
The stability of frequency standards can in general be established
by comparing ensembles of them against each other. (so independently of Cs)
Which kind of standard to use depends on what you need:
relative or absolute accuracy.
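The ensemble comparisons Jan mentions are conventionally quantified by the Allan deviation. A minimal non-overlapping estimator (my own illustration, not anyone's lab code) over fractional-frequency samples y_k, each averaged over the same interval tau:

```python
import math

def allan_deviation(y):
    """Non-overlapping Allan deviation of fractional-frequency
    samples y (each averaged over the same interval tau):
    sigma_y^2(tau) = 0.5 * mean((y[k+1] - y[k])**2)."""
    diffs = [(b - a) ** 2 for a, b in zip(y, y[1:])]
    return math.sqrt(0.5 * sum(diffs) / len(diffs))

# Comparing two standards: derive y_k from their beat note;
# the ensemble member with the smaller sigma_y is the more stable one,
# whether or not either frequency is accurately known in Hz.
```

This is why stability can be established independently of Cs: the statistic only uses differences between successive samples, never an absolute frequency.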
AFAIK about those matters, the idea among metrologists at present
is to leave things as they are,
until a really big step forward can be made.
(hopefully already at the next CGPM)
Some of the optical frequency standards are far more stable indeed.
(nowadays pushing 1 part in 10^18, last time I looked)
But their frequencies (in terms of the Cs standard!)
are known to a much lesser accuracy.
(pushing 1 part in 10^12, again last time I looked)
The use of frequency combs caused a revolution here. (see 2005 Nobel)
Summary: optical frequency standards can be far more stable,
but their frequencies are (relatively speaking!) poorly known.
Once you have a calibrated optical frequency standard, [1]
for which you know the frequency in terms of the Cs standard,
you know its wavelength, by the definition of c,
so you can start measuring distances and sizes
in terms of its wavelength, hence in meters.
It has become a secondary meter standard.
So measuring distances/lengths is inherently much less accurate
than measuring time/frequency.
And, circle closed, this was precisely the reason
for giving c a defined value.
So c really cannot be measured anymore,
not because some crazed guru-followers decreed so,
but because of hard experimental realities and necessities.
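The calibration chain Jan closes here is short enough to write down (a sketch of my own; the HeNe figure is the approximate CIPM-recommended frequency, quoted loosely for illustration):

```python
C = 299_792_458  # m/s, exact by definition

def vacuum_wavelength(f_hz: float) -> float:
    """lambda = c / f: with c exact, the wavelength of a calibrated
    optical frequency standard is known to the same relative
    uncertainty as its frequency -- it IS a meter standard."""
    return C / f_hz

# Iodine-stabilised HeNe laser, approximate recommended frequency:
f_hene = 473.612e12               # Hz (approximate, for illustration)
lam = vacuum_wavelength(f_hene)   # about 632.99e-9 m
```

Note that no length measurement appears anywhere: the meter comes out of a frequency measurement plus the defined c, which is why c itself is no longer measurable.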
Hope this clears up the questions you had,
Jan
[1] This is the ongoing, never-ending, program I mentioned earlier:
finding optical frequency standards, aka secondary meter standards,
to ever greater accuracy and reproducibility.
The original <1983 series of measurements, then called 'measuring c',
was just good enough to base the defined value of c on.
Those decades of added precision had to go into better frequency/meter standards, not into a 'better' value of c.
PS There are first indications that it may be possible
to harness a gamma ray line from a nuclear transition
in the not too far future, for again greatly increased stability.
Very low frequency, as gammas go, but still in the very far UV.
Challenges, challenges.
On 12/08/2024 12:35 PM, J. J. Lodder wrote:
ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:
On Sun, 8 Dec 2024 5:42:07 +0000, ProkaryoticCaspaseHomolog wrote:
On Sat, 7 Dec 2024 21:35:57 +0000, J. J. Lodder wrote:
I'm sorry, but this is not the right answer,
So what are you saying, then? Are you saying that, because of the
definition of E=mc^2, it is totally required that 1 gram of electrons
annihilating 1 gram of positrons completely to electromagnetic
radiation must NECESSARILY yield the same amount of energy as 1 gram
of protons annihilating 1 gram of antiprotons completely to electromagnetic
radiation? That the equality of these two values is a matter
of definition, not something to be established by experiment?
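For scale (a quick arithmetic sketch of my own, not a claim about the experimental question being argued): the defined c fixes how much energy corresponds to a given annihilated rest mass, whichever particle species carries it.

```python
C = 299_792_458  # m/s, exact by definition

def annihilation_energy_joules(total_mass_kg: float) -> float:
    """E = m c^2 for complete conversion of rest mass to radiation."""
    return total_mass_kg * C ** 2

# 1 gram of electrons + 1 gram of positrons -> 2 g of rest mass:
E = annihilation_energy_joules(2e-3)   # roughly 1.8e14 J
```

Whether the electron/positron and proton/antiproton figures agree per gram is then precisely the experimental question posed above; the formula alone does not settle it.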
Are you saying that because the current definition of c is
299,792,458 meters per second regardless of wavelength, that questions
as to whether gamma rays travel faster than visible light rays are
totally nonsensical?
In fewer words:
No experiment can measure a difference between the amount of energy
released by the complete annihilation of 1 g of (electrons + positrons)
versus the complete annihilation of 1 g of (protons + antiprotons).
True or false?
False, see previous.
No experiment can measure a difference between the speed of visible
light photons versus the speed of gamma rays. True or false?
False, already answered several postings back.
A class of experiments relevant to this question are those
that set an upper limit on the photon mass
(the most plausible mechanism for such an effect).
Why for heavens sake would you even get such an idea?
Jan
O.W. Richardson's "The Electron Theory of Matter" has
really a great account of various considerations of
what "c" is with regards to electromagnetic radiation
as opposed to the optical range of not-electromagnetic
radiation and as with regards to wavelength versus
wave velocity.
Sort of like "photons" are overloaded and diluted these
days, so are waves, and so is "c".
The wave model is great and all and the energy equivalency
is great and all, yet it's overloaded and diluted (i.e.,
tenuous and weak).
The popular public deserves quite an apology from the
too-simple accounts that have arrived at having nothing
at all to say and no way to say it about the wider milieu
and the real-er parts of the theory.
So, for a pretty great example when these differences
were not just ignored and furthermore pasted over,
wall-papered as it were, "The Electron Theory of Matter"
is a bit antique yet it's perfectly cool and furthermore
greatly expands a usual discourse on radiation that travels
through space, _and_, the space-contraction (FitzGeraldian).