I'm not entirely clear on the author's thinking. But consider the case of watching a real-world voltmeter read 1.02 V, 1.01 V, 1.00 V, 0.99 V. Did we really lose a digit of precision there? The leading digit's transition from 1 to 9 is a boundary condition that messes up that rule.
I think the catch is that you aren't keeping the number normalized. Significant digits somewhat assume scientific notation. So that last number is 9.9e-1. Two digits.
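To make that concrete, here's a quick Python sketch (my own naive counting, nothing official) that rewrites each reading in normalized scientific form and counts the mantissa digits:

```python
from decimal import Decimal

def normalized(reading: str):
    """Rewrite a decimal string in scientific notation, keeping every
    significant digit (trailing zeros included), and count those digits."""
    sign, digits, exp = Decimal(reading).as_tuple()
    mantissa = "".join(map(str, digits))
    power = exp + len(mantissa) - 1   # exponent once the point follows the first digit
    body = mantissa[0] + ("." + mantissa[1:] if len(mantissa) > 1 else "")
    return ("-" if sign else "") + body + f"e{power}", len(mantissa)

# Same voltmeter, same 0.01 V resolution for every reading.
for r in ["1.02", "1.01", "1.00", "0.99"]:
    sci, n = normalized(r)
    print(f"{r} V -> {sci}  ({n} significant digits)")
```

So by the usual counting rule the last reading really does carry only two significant digits, even though the resolution is still 0.01 V the whole time.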
I'm not sure what you mean here, probably because I worded it poorly. Of course your instrument will have different scales, and you should use the appropriate one.
But the measurement was done with a certain number of significant figures; that is independent of how you write it. Nothing is "lost": if the measurement had 2 sig figs it is 0.99, if it had 3 it was 0.990. Or 0.991 or whatever.
They said if your device measures 1.00, it is three significant digits. If it measures 0.01, it is one significant digit. From the same measuring device.
So it is counterintuitive that the measurement just dipping below 1 somehow causes it to "lose" significant digits. It didn't lose precision, though, just the number of digits that are significant.
I'm not sure what you are saying. 0.01 has just one significant digit.
Say you have a 10-meter stick with centimeter markings. It can give 0.01 as its smallest measure, a number with 1 significant digit. It can also give up to 10.00, a number with 4 significant digits. In both cases, the precision of the measurement is one centimeter.
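A quick sketch of that point (naive digit counting, and treating the 1 cm marking spacing as the resolution): both extremes are read to the same 0.01 m, but the significant-figure count, which loosely tracks relative rather than absolute precision, is very different.

```python
# Meter stick with centimeter markings: absolute resolution is 0.01 m for
# every reading, but the significant-figure count (and the relative
# precision it loosely encodes) differs between the two extremes.
for r in ["0.01", "10.00"]:
    value = float(r)
    resolution = 0.01                           # one centimeter, same for both
    sig = len(r.replace(".", "").lstrip("0"))   # naive sig-fig count
    rel = resolution / value                    # relative uncertainty
    print(f"{r} m: {sig} sig fig(s), resolution {resolution} m, ~{rel:.1%} relative")
```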