On 2/9/2014 12:40 PM, Wouter Meeussen wrote:
In http://www.nist.gov/pml/div688/2013_1_17_newera_atomicclocks_3.cfm an atomic clock, using a Y-Sr "lattice", is said to achieve 3E-18 accuracy & stability. This would allow one to detect the time dilation caused by a mere 4 cm altitude difference.
If I am guesstimating correctly, this would be equivalent to the gravitational effect of a 19 kg mass at a horizontal distance of 10 cm.
The time-dilation factor is (to first order) sqrt(1 + 2*P/c^2), where P is the (negative) gravitational potential energy per unit mass at your location, G is Newton's gravitational constant, and c is the speed of light. Using P = -G * (19 kg) / (10 cm), one finds a factor of about 1 - 1.4*10^(-25). So your estimate is off by a factor of more than ten million.

To make the time-dilation factor equal 1 - 3*10^(-18), which is the accuracy claimed for the clock, we would want to lower the clock in Earth's gravitational field by a height h, where h = 3*10^(-18) * c^2 / g ≈ 2.8 cm.

You might also want to consider the fact that the ground under your feet rises and falls twice a day by about 50 cm due to the Moon (the "solid tide"). You don't normally notice this happening.

This would seem to suggest that any "true" time standard would have to be produced by a space-based clock, since Earth-based clocks are inherently too inaccurate? Or a software adjustment? Or what? The very meaning of "time" depends on where you are on the Earth, and it would seem we are reaching the limits of what one might call "usable accuracy." Further clock improvements would seem futile.
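For anyone who wants to redo these two estimates, here is a minimal sketch (mine, not from the original post) using standard constants; the numbers are approximate:

# Check the two figures quoted above.
G = 6.674e-11   # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s
g = 9.81        # Earth surface gravity, m/s^2

# 1) Fractional shift from a 19 kg mass at 10 cm:
#    sqrt(1 + 2*P/c^2) ~= 1 + P/c^2 for small P, with P = -G*M/r
M, r = 19.0, 0.10
P = -G * M / r
print("shift from 19 kg at 10 cm:", P / c**2)   # ~ -1.4e-25

# 2) Height change in Earth's field giving a 3e-18 fractional shift:
#    delta_f/f = g*h/c^2  =>  h = 3e-18 * c^2 / g
h = 3e-18 * c**2 / g
print("height for a 3e-18 shift:", h, "m")      # ~ 0.028 m, i.e. 2.8 cm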