I'm not sure what this would even mean. Since 1983, SI has defined the length of the meter in terms of the speed of light and the length of the second. So the speed of light in a vacuum is a constant by definition. Before that, the meter was defined in terms of a wavelength emitted by krypton. The second was then, as it is now, defined in terms of a frequency emitted by cesium. So the constancy of the speed of light then depended on whether the relative behaviors of those two elements varied. Before that, the meter was defined as the length of a specific object in Paris. If the speed of light had seemed to vary then, it could be that the size of the atoms in that object varied. Farfetched? Not really. The sizes of atoms depend on the values of various fundamental physical constants, including the speed of light.

I think it's only meaningful to speak of changes in dimensionless numbers, for instance the fine structure constant (e^2/(4 pi eps0 hbar c), about 1/137), whose value is the same in all possible systems of units. It depends on the speed of light, Planck's constant, and the charge of the electron. It's meaningful to talk about the fine structure constant changing. If it does change, I think it's completely arbitrary whether that change is attributed to a change in the speed of light, Planck's constant, the charge of the electron, or some combination of these.

(On the other hand, it seems to me that if Earth were to suddenly shrink, apparent gravity would drop as the shrinkage began, and would increase when it ended. Obviously, there would be no perceived change in gravity if Earth "shrank" only due to a redefinition of the meter. The numerical value of gravity might change due to the redefinition of the meter, but readings on scales would remain unchanged until the scales were recalibrated or replaced.)

Can anyone find an online copy of the original study that makes this claim? From the news coverage, I can't even tell whether they claim to have measured an effect, or whether this is entirely theoretical.

Warren D Smith <warren.wds@gmail.com>
Does this contradict the observed fact that light from halfway across the universe, hitting the left side of your telescope mirror, is still perfectly in phase with light hitting the right side of your telescope mirror, allowing interferometry?
No, so long as the speed varies smoothly with time and location, rather than there being any abrupt transitions. Exactly as happens with curvature of space-time due to gravity. In fact, this alleged variability would be indistinguishable from such curvature. Indeed, it's arbitrary to say that space-time is curved near masses rather than to say that c is slower there and that objects there are shrunken and slowed by just enough that they don't notice the slowdown.

One possible difference is that space-time curvature near masses is always equivalent to slower light, not faster. If this new effect allows light to go faster than c, it might allow causality violations, i.e. a way to send messages back through time.

On second thought, space-time curvature near *negative* masses would also be equivalent to faster light. Maybe that's why there are no negative masses. Unless the Casimir effect counts. That's a very weak effect, and probably any speedup of light due to the negative mass-energy between the plates would be swamped by the positive mass-energy *of* the plates.
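To put a rough number on the "slower light near masses" picture: in the weak-field limit, light passing a mass M behaves as if the vacuum had a refractive index of roughly n(r) = 1 + 2GM/(r c^2), which is what produces the Shapiro delay. A minimal sketch in Python, using standard textbook constants; the formula and coordinate choice are assumptions of the weak-field approximation, not anything from the papers under discussion:

    # Weak-field "effective refractive index" of the vacuum near a mass:
    #   n(r) ~ 1 + 2*G*M/(r*c**2), i.e. the coordinate speed of light is ~ c/n(r).
    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8          # speed of light, m/s
    M_sun = 1.989e30     # solar mass, kg
    R_sun = 6.957e8      # solar radius, m

    def effective_index(mass, r):
        """Effective index at distance r from a mass; > 1 means light looks slowed."""
        return 1.0 + 2.0 * G * mass / (r * c**2)

    n = effective_index(M_sun, R_sun)
    print("n at the solar surface:", n)        # ~ 1.0000042
    print("fractional slowdown:   ", n - 1.0)  # ~ 4.2e-6

For positive masses n > 1 always, so in this picture light near a mass only ever looks slower, never faster, in line with the point above.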
Keith F. Lynch:
Can anyone find an online copy of the original study that makes this claim? From the news coverage, I can't even tell whether they claim to have measured an effect, or whether this is entirely theoretical.
Theoretical. There are two papers in Springer's 'The European Physical Journal D'. Here are the abstracts:

1. The quantum vacuum as the origin of the speed of light: "We show that the vacuum permeability and permittivity may originate from the magnetization and the polarization of continuously appearing and disappearing fermion pairs. We then show that if we simply model the propagation of the photon in vacuum as a series of transient captures within these ephemeral pairs, we can derive a finite photon velocity. Requiring that this velocity is equal to the speed of light constrains our model of vacuum. Within this approach, the propagation of a photon is a statistical process at scales much larger than the Planck scale. Therefore we expect its time of flight to fluctuate. We propose an experimental test of this prediction."

2. A sum rule for charged elementary particles: "There may be a link between the quantum properties of the vacuum and the parameters describing the properties of light propagation, culminating in a sum over all types of elementary particles existing in Nature weighted only by their squared charges and independent of their masses. The estimate for that sum is of the order of 100."
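For a sense of scale on the second abstract: summing squared electric charges over just the charged elementary fermions we already know (three generations, with each quark counted three times for color) gives 8, far short of the quoted estimate of order 100, so whatever enters the authors' sum would have to go well beyond this naive count. A toy tally in Python, not the sum actually defined in the paper:

    # Toy tally: sum of squared electric charges over the known charged
    # elementary fermions (3 generations; each quark counted 3x for color).
    # This is NOT the sum defined in the paper, just a point of comparison.
    from fractions import Fraction

    per_generation = (
        3 * Fraction(2, 3) ** 2     # up-type quark, three colors
        + 3 * Fraction(1, 3) ** 2   # down-type quark, three colors
        + Fraction(1) ** 2          # charged lepton
    )
    total = 3 * per_generation      # three generations
    print(per_generation, total)    # 8/3 per generation, 8 in total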
If the statistical process is Gaussian, the tails fall off very fast, so 'much larger' may still only mean 4-5 orders of magnitude, which is still many orders of magnitude away from where we can 'see' today. E.g., the photons arriving from the other side of the universe have sampled such a large variety of variations that the dispersion is non-existent.

Also, if you believe Feynman, these photons have also sampled _all_ paths.

Also, perhaps that famous map of the radiation from the early universe has captured precisely these fluctuations?

At 06:09 AM 3/28/2013, Hans Havermann wrote:
Within this approach, the propagation of a photon is a statistical process at scales much larger than the Planck scale.
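The Gaussian-averaging point above can be made concrete: if each of N independent "steps" along the path contributes a delay with mean mu and spread sigma, the total flight time has mean N*mu but spread only sigma*sqrt(N), so the relative jitter falls off like 1/sqrt(N). A quick simulation in Python with made-up per-step numbers; the model's actual step length and sigma are not assumed here:

    import numpy as np

    rng = np.random.default_rng(0)

    def relative_jitter(n_steps, n_photons=2000, mean=1.0, sigma=0.5):
        """Each photon accumulates n_steps i.i.d. per-step delays;
        return std/mean of the total flight time over n_photons photons."""
        totals = np.zeros(n_photons)
        for _ in range(n_steps):
            totals += rng.normal(mean, sigma, size=n_photons)
        return totals.std() / totals.mean()

    for n in (100, 10_000):
        print(n, relative_jitter(n))   # jitter shrinks ~10x as n grows 100x

Over a cosmological path the effective number of steps is enormous, so any per-step jitter gets averaged down far below anything currently resolvable.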
Perhaps the Casimir effect can be used to detect a shift, given the change it makes in the population of virtual particles in the gap.

On Mar 28, 2013, at 9:32 AM, Henry Baker wrote:
If the statistical process is Gaussian, the tails fall off very fast, so 'much larger' may still only mean 4-5 orders of magnitude, which is still many orders of magnitude away from where we can 'see' today. E.g., the photons arriving from the other side of the universe have sampled such a large variety of variations that the dispersion is non-existent.
Also, if you believe Feynman, these photons have also sampled _all_ paths.
Also, perhaps that famous map of the radiation from the early universe has captured precisely these fluctuations?
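For a sense of how weak the effect in question is: the idealized Casimir pressure between perfectly conducting parallel plates a distance a apart is pi^2 * hbar * c / (240 * a^4), about a millipascal at a one-micron gap. A quick evaluation in Python, using the ideal-plate formula and ignoring real-material and geometry corrections:

    import math

    hbar = 1.054_571_817e-34    # reduced Planck constant, J*s
    c    = 299_792_458          # speed of light, m/s

    def casimir_pressure(gap):
        """Attractive Casimir pressure (Pa) between ideal parallel plates separated by `gap` meters."""
        return math.pi**2 * hbar * c / (240 * gap**4)

    print(casimir_pressure(1e-6))   # ~ 1.3e-3 Pa at a 1-micron gap

This is the "very weak effect" mentioned earlier in the thread.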
Naturally, since the speed of light is constant by definition, it must be the length of a meter that potentially can vary.

--Dan

On 2013-03-27, at 7:50 PM, Keith F. Lynch wrote:
I'm not sure what this would even mean. Since 1983, SI has defined the length of the meter in terms of the speed of light and the length of the second. So the speed of light in a vacuum is a constant by definition.
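To make the "only dimensionless numbers can meaningfully change" point concrete: the fine structure constant e^2/(4 pi eps0 hbar c) is a pure number, so its value does not depend on how the meter or the second is defined. A minimal check in Python with rounded CODATA values; nothing below comes from the papers under discussion:

    import math

    # Rounded CODATA values; c has been exact by definition since 1983.
    c    = 299_792_458           # speed of light, m/s
    e    = 1.602_176_634e-19     # elementary charge, C
    hbar = 1.054_571_817e-34     # reduced Planck constant, J*s
    eps0 = 8.854_187_8128e-12    # vacuum permittivity, F/m

    # Fine structure constant: a pure number, independent of the choice of units.
    alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
    print(alpha)        # ~ 0.0072973525...
    print(1 / alpha)    # ~ 137.036

A change in c alone, with the other constants adjusted so that alpha stays fixed, would be unobservable; only a change in alpha itself (or some other dimensionless combination) could show up in an experiment.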
participants (5)
- Dan Asimov
- Hans Havermann
- Henry Baker
- Keith F. Lynch
- Thomas Knight