you can google the spec sheet for the receiver described. it can measure carrier phase to 0.2mm rms. that translates to an accuracy after post-processing of 3mm horizontal (with a fancy antenna).
the trick is to use the carrier itself, which is higher frequency than the signal modulated onto it. also, relative offsets are easier than absolutes (differencing removes many systematic errors).
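rough numbers, as a quick python sketch (standard L1 carrier frequency and C/A chip rate; the 0.1%-of-a-cycle phase resolution is just an assumption for illustration):

    # back-of-the-envelope: why the carrier beats the code for resolution.
    c = 299_792_458.0        # speed of light, m/s
    f_carrier = 1_575.42e6   # GPS L1 carrier frequency, Hz
    chip_rate = 1.023e6      # C/A code chipping rate, chips/s

    carrier_wavelength = c / f_carrier   # ~0.19 m
    chip_length = c / chip_rate          # ~293 m

    print(f"carrier wavelength: {carrier_wavelength * 1000:.0f} mm")
    print(f"C/A code chip:      {chip_length:.0f} m")
    # resolving phase to ~0.1% of a cycle is sub-mm on the carrier:
    print(f"0.1% of a cycle:    {carrier_wavelength * 1000 / 1000:.2f} mm")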
i wrote (the software for part of) one of these (not for leica, for some geophysical survey company) back in the day (although it was not mm resolution!)
Current GPS receivers have a couple of sources of error, including processing lag and ionospheric delays. If you don't know what the ionosphere looks like, you can't be accurate to 0.2mm. There are ground stations that measure this (based on the fact that they aren't moving, so any change in the GPS-calculated position means ionospheric changes) and transmit the correction data to other GPS receivers, but this doesn't get you to 0.2mm, at least not for a moving object.
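The textbook way around this with a dual-frequency receiver (not necessarily what any particular product does) is the ionosphere-free combination: the first-order ionospheric delay scales as 1/f^2, so a weighted difference of L1/L2 measurements cancels it. A minimal Python sketch with made-up numbers:

    # ionosphere-free combination of dual-frequency pseudoranges.
    f1 = 1_575.42e6   # GPS L1, Hz
    f2 = 1_227.60e6   # GPS L2, Hz

    def iono_free(p1, p2):
        """Ionosphere-free pseudorange from L1/L2 pseudoranges (meters)."""
        return (f1**2 * p1 - f2**2 * p2) / (f1**2 - f2**2)

    true_range = 20_200_000.0         # m, roughly GPS orbital altitude
    iono_l1 = 5.0                     # m of ionospheric delay at L1 (made up)
    iono_l2 = iono_l1 * (f1 / f2)**2  # same TEC, larger delay at the lower frequency
    p1 = true_range + iono_l1
    p2 = true_range + iono_l2
    print(iono_free(p1, p2) - true_range)   # ~0: first-order term cancelled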
To be fair, Andrew wasn't claiming 0.2mm accuracy in position measurement - he was quoting the device's ability to resolve phase differences in the carrier wave (and noted that this is an order of magnitude or so better than the resulting position accuracy).
Having said that, I wonder what magnitude of effect ionospheric changes have on the phase difference of the carrier signals from satellites in different directions?
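Back-of-the-envelope, the first-order ionospheric delay is roughly 40.3 * TEC / f^2 meters (TEC in electrons/m^2; the carrier phase is advanced by about the same amount the code is delayed). The TEC values below are just typical orders of magnitude:

    # rough magnitude of first-order ionospheric delay at the L1 frequency.
    f_l1 = 1_575.42e6                # Hz
    TECU = 1e16                      # 1 TEC unit, electrons/m^2
    for tec in (1, 10, 100):         # quiet night .. strong daytime/storm
        delay_m = 40.3 * tec * TECU / f_l1**2
        print(f"{tec:3d} TECU -> {delay_m:6.2f} m of delay at L1")

So the raw effect is meters, i.e. many whole carrier cycles, which is why it has to be modelled or differenced away.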
(Even though I know how it works, the idea of getting millimeter precision in measuring distances to something that's at least 20,000km away and traveling at almost 4km/sec seems like very black magic to me… Surely that can't actually _work_ in practice…)
i don't know what the current state of the art is, and even 3mm sounds crazy good when i think about it (i was just repeating the spec sheet). i wonder if that also requires separate / multiple receivers to fully model the ionosphere?
As I understand it - once you can get the time-based position fix accurate enough, you add in the phase information from the 1.2GHz carrier wave - with a wavelength of ~240mm, resolving that to 0.2mm seems reasonable.
The problem of working out which of the peaks/troughs in the carrier wave you're in almost certainly requires terrestrial DGPS assistance (http://en.wikipedia.org/wiki/Differential_GPS). If you can use that to get ~100mm precision - well under half the ~240mm wavelength, so the integer cycle count is unambiguous - that allows you to use the phase difference in the carrier to get sub-mm measurements.
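A toy version of that last step, assuming the coarse fix is already good to well under half a wavelength (all numbers made up):

    # toy carrier-phase integer-ambiguity resolution.
    c = 299_792_458.0
    wavelength = c / 1_227.60e6      # L2 carrier, ~244 mm

    def resolve_range(coarse_range_m, fractional_phase_cycles):
        """Pick the integer cycle count that best matches the coarse fix."""
        n = round(coarse_range_m / wavelength - fractional_phase_cycles)
        return (n + fractional_phase_cycles) * wavelength

    true_range = 20_200_000.1234             # m (made up)
    frac = (true_range / wavelength) % 1.0   # what the receiver measures
    coarse = true_range + 0.05               # coarse DGPS fix, 50 mm off (made up)
    print(resolve_range(coarse, frac) - true_range)   # ~0 in this noiseless toy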
Like I said, I understand how it works – I just find it hard to believe it's actually practically possible… Deep magic…
There is a network of hundreds of (static) GPS receivers used to monitor seismic activity. Their data are post-processed to identify and remove the ionospheric effects you mention.
They used to (ca. 2000) get ~1mm accuracy in the plane of the Earth, and ~1cm accuracy in the radial direction. The accuracy seems to have improved in the meantime, and it appears to be ~0.1mm in the plane of the Earth.
GPS is combined with other sensors, like strain meters, into something called the Plate Boundary Observatory: http://pbo.unavco.org/instruments/gps
this is actually easier in some ways because the motion is so slow you can integrate forever and so reduce noise (both "normal" noise and the deliberately added dither of selective availability, if they are still doing that).
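a tiny simulation of that averaging effect (the 3mm per-epoch noise figure is made up; rms error shrinks like 1/sqrt(n)):

    # averaging a static position estimate over many epochs.
    import random, statistics

    def averaged_error_mm(n_epochs, noise_mm=3.0, trials=500):
        errs = [statistics.fmean(random.gauss(0.0, noise_mm)
                                 for _ in range(n_epochs))
                for _ in range(trials)]
        return statistics.pstdev(errs)

    for n in (1, 100, 10_000):
        print(f"{n:6d} epochs -> {averaged_error_mm(n):.3f} mm rms")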