I suppose they could connect the two clocks via fibers following different paths, to create a large interferometer. For that matter, it might be possible to cover a very large area with a bunch of clocks interconnected by a "web" of fibers.
But fibers do have problems. Their optical path length depends on temperature, and on how much the fiber is stretched. Understanding those issues is probably good for a dissertation or two. ;-)
Just a wild guess - the path-length changes due to temperature and wear should grow and shrink slowly, no? A gravitational wave, however, should be a spike, affecting the lag in a way that could potentially be measured.
The LIGO lasers are probably much more precise, but we have way more fiber and clocks than LIGO observatories. Thoughts?
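Just to make the hand-waving concrete, here is a rough Python sketch of the separation I have in mind (mine, not from the article): the slow thermal drift of the fiber lives down at millihertz frequencies, while a hypothetical burst is fast, so a simple high-pass filter pulls one apart from the other. All of the numbers are invented.

    # Separate a slow thermal path-length drift from a fast transient "spike"
    # with a high-pass filter. Purely illustrative; all amplitudes are made up.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 1000.0                       # sample rate of the lag record, Hz (assumed)
    t = np.arange(0, 600, 1 / fs)     # ten minutes of data

    slow_drift = 5e-9 * np.sin(2 * np.pi * 0.001 * t)    # thermal/mechanical drift (s)
    burst = 1e-12 * np.exp(-((t - 300) / 0.05) ** 2)     # fast transient near t = 300 s (s)
    noise = 1e-13 * np.random.randn(t.size)              # measurement noise (s)
    lag = slow_drift + burst + noise                     # measured time lag

    # 4th-order Butterworth high-pass at 1 Hz removes the slow drift
    b, a = butter(4, 1.0 / (fs / 2), btype="highpass")
    lag_hp = filtfilt(b, a, lag)

    print("rms residual drift after filtering:", np.std(lag_hp[t < 100]))
    print("peak of recovered transient:       ", np.max(np.abs(lag_hp[(t > 299) & (t < 301)])))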
Thinking about it more, it was mentioned that the laser clocks might have a fair amount of phase noise, and might not be good for picking up fast time-lag signals. On the other hand, you can feed multiple lasers down the same fiber, so a better experiment might be to wait for the fiber network to be built and then use a separate wavelength ("color") from a specialized laser with excellent short-term stability.
No, LIGO works on a different principle. It splits a single beam down two long arms, bounces it between distant mirrors, and then measures the interference pattern that results from the phase difference when the light travels a slightly different distance in each arm.
I am pretty sure this is an attempt by somebody who doesn't work on DWDM/optical transport systems to describe industry-standard dispersion compensation modules (DCMs), such as you would install for a 40 x 10 Gbps DWDM link on two strands between two sets of chassis and linecards.
This is not dispersion. The light sent down the fiber is essentially monochromatic, hence there is no need to cancel dispersion.
A whole bunch of processes can add phase noise to the light as it propagates in the fiber: refractive index changes, temperature fluctuations, stresses due to bending, etc. This is suppressed by sending the light back through the same fiber and actively locking the phase of the light that has travelled through the fiber with respect to the incident light.
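For the curious, here is a toy numerical version of that round-trip trick (a big simplification of my own; real links use an acousto-optic modulator and a servo loop rather than an instantaneous correction). The point is just that measuring the round-trip phase against the incident light lets you subtract most of the one-way fiber noise.

    # Toy model of round-trip fiber phase-noise cancellation. Assumes the
    # forward and return passes see nearly the same fiber phase; the only
    # imperfection modelled is that the correction is one round-trip "late".
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    fiber_phase = np.cumsum(1e-3 * rng.standard_normal(n))   # random-walk phase added by the fiber (rad)

    # Crude stand-in for the round-trip delay: the measured round trip is the
    # current forward phase plus a slightly older copy for the return pass.
    delay = 50
    return_phase = np.concatenate([np.zeros(delay), fiber_phase[:-delay]])
    measured_round_trip = fiber_phase + return_phase

    # Pre-correct the launched light by minus half the measured round trip.
    corrected = fiber_phase - 0.5 * measured_round_trip

    print("rms phase without cancellation:", np.std(fiber_phase), "rad")
    print("rms phase with cancellation:   ", np.std(corrected), "rad")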
That's interesting, because in most metrology clock applications phase noise is not the main concern; long-term drift is. Here they are going to the trouble of maintaining low phase noise (jitter).
I compare it with something I've played with personally: NTP. Here the drift is covered by the clock technology itself, while the jitter of the line, just like in NTP, is why the cables are specially tuned. Both are needed.
This is just my guess, however, and if somebody knows more, please write!
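To make the NTP comparison concrete, here is the standard offset/delay calculation from the four timestamps of one request/response exchange (variable names are mine). The symmetric-path assumption baked into it is exactly where line jitter and asymmetry hurt you, which is the part the tuned fibers are fighting.

    # NTP-style offset/delay estimate from one exchange.
    # t1: client send, t2: server receive, t3: server send, t4: client receive.
    # Assumes a symmetric path; asymmetry and jitter go straight into the
    # offset error, which is why the physical path matters so much.
    def ntp_offset_delay(t1: float, t2: float, t3: float, t4: float) -> tuple[float, float]:
        offset = ((t2 - t1) + (t3 - t4)) / 2.0   # server clock minus client clock (s)
        delay = (t4 - t1) - (t3 - t2)            # round trip minus server hold time (s)
        return offset, delay

    # Example: 10 ms each way, 1 ms server hold, client clock 2 ms ahead of the
    # server -> offset comes out as -2 ms (server behind client), delay 20 ms.
    offset, delay = ntp_offset_delay(t1=0.000, t2=0.008, t3=0.009, t4=0.021)
    print(f"offset = {offset * 1e3:.1f} ms, delay = {delay * 1e3:.1f} ms")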
I guess phase noise could broaden the laser linewidth, which would increase the averaging time needed to achieve the desired precision.
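As a very rough back-of-the-envelope (my own numbers, and assuming the phase noise behaves like white frequency noise so the line is Lorentzian), the instability after averaging for a time tau scales like sigma_y(tau) ~ sqrt(linewidth / (2*pi*tau)) / nu0, so the time to hit a given precision grows linearly with the linewidth:

    # Rough scaling of averaging time with laser linewidth, under the
    # assumption of white frequency noise (Lorentzian line):
    #   sigma_y(tau) ~ sqrt(delta_nu / (2*pi*tau)) / nu0
    # Solving for tau at a target instability. Illustrative numbers only.
    import math

    nu0 = 4.3e14          # optical carrier frequency, ~430 THz (assumed)
    target = 1e-18        # desired fractional frequency precision (assumed)

    for delta_nu in (0.01, 0.1, 1.0, 10.0):   # laser linewidth, Hz
        tau = delta_nu / (2 * math.pi * nu0 ** 2 * target ** 2)
        print(f"linewidth {delta_nu:5.2f} Hz -> averaging time ~ {tau:.2e} s")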
But this solution seems to require complicated active elements such as VCOs. Can these run at the frequencies that are of interest in this particular application?
I'm just guessing here but I think an accurate log of astronomical events might be something for which this would be useful. If you have observatories distributed throughout the world, then an accurate time log would perhaps be invaluable in making very accurate measurements.
Probably also useful for frequency synchronization of large-baseline radio telescopes (VLBI). I believe they now use hydrogen masers. One needs to keep receivers coherent at several GHz over thousands of km.
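A rough way to see how demanding that is (my own estimate): to keep the accumulated phase error below about a radian at observing frequency nu over an integration time T, the frequency reference needs a fractional stability better than roughly 1/(2*pi*nu*T).

    # Rough coherence requirement for long-baseline interferometry:
    #   delta_phi ~ 2*pi*nu*sigma_y*T < 1 rad  =>  sigma_y < 1/(2*pi*nu*T)
    # Illustrative numbers; real budgets depend on how much coherence loss
    # one is willing to accept.
    import math

    for nu_ghz in (1.4, 22.0, 86.0):         # observing frequency, GHz
        for T in (10, 100, 1000):            # integration time, s
            sigma_y = 1.0 / (2 * math.pi * nu_ghz * 1e9 * T)
            print(f"nu = {nu_ghz:5.1f} GHz, T = {T:5d} s -> sigma_y < {sigma_y:.1e}")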
"Also, the apparent rate of a clock depends on the local gravitational potential: comparing two clocks measures the gravitational redshift between them, and thus yields their height difference. Such measurements provide data points for the geodetic reference surface, the so-called “geoid”. This research approach is pursued jointly by physicists and geodesists in the Collaborative Research Centre 1128 (“geo-Q”) of the German Science Foundation (DFG)."
I don't think this would be due solely to the height difference, as the article suggests, but also to the composition of the Earth beneath the facilities.
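For scale, the weak-field formula gives a fractional frequency shift of about g*dh/c^2 ~ 1.1e-16 per metre of height difference, so clocks at the 1e-18 level resolve centimetres (quick sketch, my numbers):

    # Gravitational redshift between clocks at slightly different heights,
    # weak-field approximation: delta_f / f ~ g * delta_h / c^2.
    g = 9.81        # local gravitational acceleration, m/s^2
    c = 2.998e8     # speed of light, m/s

    for delta_h in (0.01, 1.0, 100.0):   # height difference, metres
        shift = g * delta_h / c ** 2
        print(f"delta_h = {delta_h:7.2f} m -> delta_f/f ~ {shift:.2e}")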
They can measure height differences of less than a meter, so I would guess they could measure the contribution of the moon. But this particular link currently seems to measure time differences between two fairly nearby clocks, so the moon position should have almost no effect on that.
I'm not so sure about that. The distance between the two locations is ~0.25% of the distance from the Earth's surface to the moon, while one meter is only 0.000016% of the Earth's radius. So if the moon were on the side of the Earth where it was noticeably closer to one of the locations than the other, it would probably matter.