In Europe the grid phase is supposed to stay pegged to UTC. Over the last year or two there have been fluctuations because some south-east European countries have disputes about paying versus being cut off, which creates an energy imbalance in the network that is not trivially corrected for like normal small leakage or rounding errors. Over long time periods it is usually very stable, considering how rare power outages lasting more than a few seconds are. Most are just fractions of a second, when the 10/30 kV network (underground in cities) needs to be switched to a different upstream transformer; it's basically a circuit breaker with automatic failover. (Multi-second outages are explained by emergency load shedding, followed by reconnecting residential consumers but not the industrial ones. This becomes necessary after long-distance transmission lines lose their redundancy and cause regions to be partially cut off, and then it's load shedding versus cascading failures.)
In the US the grid used to guarantee 5,184,000 cycles per day (60 Hz × 86,400 seconds), with only short-term fluctuations (standard WEQ-006; the exact deviation allowed depends on which interconnection you are talking about). This was relied on by e.g. cheap clocks in coffee makers, but also by synchronous motors used to drive telescope mounts so they track the stars.
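To get a feel for how that guarantee works, here is a small sketch of time error correction: the grid intentionally runs slightly fast or slow until the accumulated clock error cancels. The frequency values and the 0.02 Hz correction offset below are illustrative assumptions, not quotes from the standard.

```python
# Sketch of time error correction (TEC): after the grid has run slow
# for a while, it is scheduled to run slightly fast until a
# cycle-counting clock's accumulated error cancels out.
# Frequencies and the 0.02 Hz offset are illustrative assumptions.

NOMINAL_HZ = 60.0

def accumulated_error(freq_profile):
    """freq_profile: list of (avg_freq_hz, duration_s) segments.
    Returns total seconds a cycle-counting clock ends up ahead (+)
    or behind (-) real time."""
    return sum((f / NOMINAL_HZ - 1.0) * dt for f, dt in freq_profile)

# A day averaging 59.995 Hz leaves such clocks 7.2 s slow ...
err = accumulated_error([(59.995, 86_400)])
# ... which six hours at a 0.02 Hz-fast offset cancels exactly:
corrected = accumulated_error([(59.995, 86_400), (60.02, 21_600)])
```

The point is that consumers never see the correction directly; their clocks simply stay right over days and weeks even though any given hour may run fast or slow.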
More recently, with today's larger interconnected grids, it is more efficient to let the frequency deviate from 60 Hz. I do not understand the exact reasons, but for some reason the 5,184,000-cycles-per-day guarantee turns out to come with some cost when you have hundreds of gigawatts of power spread across a thousand miles. There have been two petitions in North America to remove time error correction (TEC) and allow synchronous clocks to accumulate error; I believe this is going forward but has not yet been implemented.
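To quantify what "accumulate error" would mean for such clocks, here is a short sketch; the average frequency used is an illustrative assumption, not a measured value.

```python
# Without time error correction, a synchronous (cycle-counting)
# clock's error simply accumulates day after day.
# The example frequency 59.99 Hz is an illustrative assumption.

NOMINAL_HZ = 60.0
SECONDS_PER_DAY = 86_400

def drift_per_day(avg_freq_hz: float) -> float:
    """Seconds a cycle-counting clock gains (+) or loses (-) per day
    when the grid averages avg_freq_hz instead of exactly 60 Hz."""
    return (avg_freq_hz / NOMINAL_HZ - 1.0) * SECONDS_PER_DAY

# A persistent 0.01 Hz shortfall costs about 14.4 seconds per day,
# i.e. on the order of seven minutes per month:
loss = drift_per_day(59.99)
```

Even a tiny sustained offset therefore adds up quickly, which is why cheap appliance clocks depended on the old guarantee.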