According to ChatGPT, the speedup factor for getting 10 cm higher is 1 + 1.09e-17 (using the fractional shift Δf/f = gh/c^2; the arithmetic seems to check out, but I'm not sure whether the formula itself is correct). Surely, if the clock's resolution is 1e-19 s per tick, i.e. one tick is a hundred times smaller than the dilation difference accumulated in a second, the clock would still need at least a hundredth of a second before the two tick counts differ by even one tick because of the dilation.
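A quick back-of-envelope check of those numbers (a sketch only; g, c, and the 1e-19 s tick are the assumed values from above):

```python
# Weak-field fractional rate difference between two heights: g*h / c^2.
g = 9.81        # m/s^2, surface gravity
h = 0.10        # m, height difference (10 cm)
c = 2.998e8     # m/s, speed of light

frac = g * h / c**2
print(f"fractional speedup: {frac:.3e}")  # ~1.09e-17, as quoted

# If one tick resolves 1e-19 s, the accumulated difference frac*t
# reaches one full tick only after t = 1e-19 / frac seconds.
tick = 1e-19
print(f"time to differ by one tick: {tick / frac:.2e} s")  # ~9.2e-3 s
```

which is indeed about a hundredth of a second.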
The frequency that is actually counted with a digital counter in this clock is only 500 MHz (i.e. after a frequency divider, since no digital counter can operate at the hundreds of THz of an optical signal).
Nevertheless, in order to measure a frequency difference between two optical clocks you do not need to count their signals. The optical signals can be mixed in a non-linear optical medium, which will provide a signal whose frequency is equal to the difference between the input frequencies.
That signal can have a frequency no greater than 1 GHz, so it is easy to count with a digital counter.
Of course, the smaller the frequency difference, the longer the counting time must be to get enough significant digits.
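As a rough sketch of that trade-off (assuming an idealised ±1-count frequency counter; the beat frequencies below are illustrative):

```python
# A +/-1-count counter reading a beat note of frequency f_beat for gate
# time T resolves it to a relative precision of about 1 / (f_beat * T).
def gate_time_for_digits(f_beat_hz: float, digits: int) -> float:
    """Gate time (s) needed for 'digits' significant digits."""
    return 10**digits / f_beat_hz

print(gate_time_for_digits(1e9, 9))  # 1 GHz beat: 9 digits in 1 s
print(gate_time_for_digits(1e3, 9))  # 1 kHz beat: 9 digits in ~11.6 days
```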
The laser used in this clock has a frequency of around 200 THz (like optical-fiber telecom lasers), i.e. about 2e14 Hz. This choice of frequency allows standard optical fibers to be used to compare the frequencies of different optical clocks, even when they are located far apart.
Mixing the beams of two such lasers with a fractional frequency difference of 1e-17 would give a difference signal with a period of many minutes, which might need to be counted for several days to reach acceptable precision. The time could be reduced by a small factor by selecting a harmonic, but it would still be on the order of days.
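Putting numbers on that (illustrative values, not any particular clock):

```python
f_opt = 2e14                # Hz, ~200 THz optical carrier
frac_diff = 1e-17           # fractional frequency difference
f_beat = f_opt * frac_diff  # 2e-3 Hz difference signal
period = 1 / f_beat
print(f"beat period: {period:.0f} s (~{period / 60:.1f} min)")  # 500 s
# ~3 significant digits from a +/-1-count counter needs ~1000 beat cycles:
print(f"1000 cycles: {1000 * period / 86400:.1f} days")  # ~5.8 days
```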
Let's imagine that there is a huge amount of time dilation (we live on the surface of a neutron star or something). By climbing a bit, we experience 1.1 seconds instead of the 1.0 seconds experienced by someone who stayed down.
We have a clock whose smallest tick is a millisecond. But climbing up, coming back down, and comparing the number of ticks won't let us conclude anything after a single millisecond. We must spend at least 11 milliseconds up to get a noticeable 11-vs-10 tick difference.
Now, if the dilation were 1.01 seconds vs 1.00, we would need to spend at least 101 milliseconds up to get a minimal 101-vs-100 tick comparison.
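The same arithmetic as a tiny sketch (the rate ratios and the 1 ms tick are the hypothetical values from the thought experiment):

```python
# With rate ratio r (the upper clock runs r times faster) and tick size
# dt, the two counts first differ by one tick after upper-clock time
# dt * r / (r - 1).
def upper_time_for_one_tick(r: float, tick_s: float) -> float:
    return tick_s * r / (r - 1.0)

print(upper_time_for_one_tick(1.10, 1e-3))  # 0.011 s -> the 11 ms case
print(upper_time_for_one_tick(1.01, 1e-3))  # 0.101 s -> the 101 ms case
```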
Thinking in terms of 'ticks' over-discretises it. In practice you can measure frequency to much greater precision than any discrete cycle time in the system you're using, because you can usually measure phase: you're not just seeing some on-off flash, you're seeing a gradual pulse or wave, and how accurately you can measure that pulse (in terms of SNR) is what sets your integration time for a given precision, not how rapidly you're measuring it.
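To make that concrete, here's a minimal synthetic sketch (all values assumed for illustration): it recovers a millihertz-scale frequency offset from only 2 s of data by fitting the phase drift against a reference, even though a full beat cycle would take over five minutes.

```python
import numpy as np

rng = np.random.default_rng(0)

f_ref = 1000.0    # Hz, reference oscillator (illustrative)
df_true = 0.003   # Hz, tiny offset to recover; one beat cycle = ~333 s
fs = 10_000.0     # Hz, sample rate
t = np.arange(0.0, 2.0, 1.0 / fs)  # just 2 s of data, << one beat period

# Noisy measurement of the offset oscillator.
x = np.sin(2 * np.pi * (f_ref + df_true) * t) + 0.1 * rng.standard_normal(t.size)

# Mix down against the reference; the wanted term lands near DC, and the
# image at 2*f_ref is removed by a crude moving-average low-pass filter.
z = x * np.exp(-2j * np.pi * f_ref * t)
z_lp = np.convolve(z, np.ones(200) / 200, mode="valid")

# The residual phase grows as 2*pi*df*t: fit its slope to read off df.
phase = np.unwrap(np.angle(z_lp))
slope = np.polyfit(t[: z_lp.size], phase, 1)[0]
print(f"estimated offset: {slope / (2 * np.pi):.4f} Hz (true: {df_true} Hz)")
```

The fit pulls the offset out of the phase slope long before the two oscillators would disagree by a single whole cycle.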