I am pretty sure this is an attempt by somebody who doesn't work in DWDM/optical transport to describe industry-standard dispersion compensation modules (DCMs), such as you would install on a 40 x 10 Gbps wavelength DWDM link running over two strands between two sets of chassis and linecards.
This is not dispersion. The light sent down the fiber is essentially monochromatic, hence there is no need to cancel dispersion.
A whole bunch of processes can add phase noise to the light as it propagates in the fiber: refractive index changes, temperature fluctuations, stress in the fiber due to bending, etc. This is suppressed by sending the light back through the same fiber and actively locking the phase of the light that has travelled through the fiber with respect to the incident light; see the sketch below.
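Here is a minimal numerical sketch of that locking scheme (Python; the noise level and servo gain are invented for illustration, and the correction element is assumed to be something like an AOM at the launch end). It assumes the noise is reciprocal and slow compared to the round-trip time, so driving the round-trip phase to zero also quiets the one-way phase:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    # random-walk phase picked up in the fiber (index, temperature, bending...)
    phi_fiber = np.cumsum(rng.normal(0.0, 0.01, n))   # rad

    phi_corr = 0.0    # correction applied at launch, e.g. via an AOM (assumed)
    gain = 0.4        # integrator gain, made up
    one_way = np.empty(n)
    for i in range(n):
        # the returned light has picked up (phi_corr + phi_fiber) twice
        roundtrip_error = 2.0 * (phi_fiber[i] + phi_corr)
        # integrate the error so the round-trip phase is driven to zero
        phi_corr -= gain * roundtrip_error / 2.0
        # residual one-way phase seen at the far end
        one_way[i] = phi_fiber[i] + phi_corr

    print("free-running phase rms:", phi_fiber.std())
    print("stabilized phase rms  :", one_way.std())

If reciprocity holds, locking the round-trip phase to zero means the one-way phase (half of it) is quiet as well, which is the whole trick.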
That's interesting, because in most metrology/clock applications phase noise is not the concern; long-term drift is. Here they are going to the trouble of maintaining low phase noise (jitter).
I compare it with what I have played with personally: NTP. Here the drift is covered by the clock technology itself; the jitter of the line is, just like in NTP, why the cables are specially tuned. Both are needed; see the toy illustration below.
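To make the drift/jitter split concrete, a toy decomposition (Python, with made-up numbers for the drift rate and jitter level): the drift is the slow part the clock discipline removes, the jitter is the residual the transport itself has to keep small.

    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(1_000.0)                  # sample times, s
    drift = 5e-9 * t                        # assumed steady clock drift, 5 ns/s
    jitter = rng.normal(0.0, 2e-9, t.size)  # assumed white line jitter, 2 ns rms
    offset = drift + jitter                 # what you actually measure

    # the clock/servo handles the drift (modelled as a linear fit here);
    # what is left over is the jitter the link itself must keep low
    slope, intercept = np.polyfit(t, offset, 1)
    residual = offset - (slope * t + intercept)

    print(f"estimated drift : {slope:.2e} s/s")
    print(f"residual jitter : {residual.std():.2e} s rms")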
This is just my guess, however, and if somebody knows more, please write!
I guess phase noise would broaden the laser linewidth, which would increase the averaging time needed to achieve the desired precision; a quick way to see the broadening is sketched below.
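A Python sketch (all numbers invented): build a pure tone and the same tone with random-walk phase noise, then compare the widths of their power spectra.

    import numpy as np

    rng = np.random.default_rng(2)
    n, dt = 2**16, 1e-6                        # samples and time step (assumed)
    t = np.arange(n) * dt
    f0 = 50e3                                  # carrier offset, Hz (made up)

    phi = np.cumsum(rng.normal(0.0, 0.05, n))  # random-walk phase noise, rad
    clean = np.exp(2j * np.pi * f0 * t)
    noisy = clean * np.exp(1j * phi)

    def fwhm(sig):
        # crude full width at half maximum of the power spectrum, Hz
        psd = np.abs(np.fft.fftshift(np.fft.fft(sig)))**2
        f = np.fft.fftshift(np.fft.fftfreq(n, dt))
        above = f[psd >= psd.max() / 2.0]
        return above.max() - above.min()

    print("clean tone linewidth:", fwhm(clean), "Hz")
    print("noisy tone linewidth:", fwhm(noisy), "Hz")

The broadened line means each measurement carries more frequency uncertainty, so you have to average longer to reach the same precision.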
But this solution seems to require complicated active elements such as VCOs. Can these run at the frequencies that are of interest in this particular application?
Where could these fluctuations originate?