
Transformers for 60 Hz are smaller and cheaper than transformers for 50 Hz. In general this continues to be true as you increase the frequency, although there are some effects you need to compensate for at higher frequencies.

Your phone charger might have a transformer operating at something like 1 MHz. As a result, it can be made very, very small.
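(A rough back-of-the-envelope sketch of why higher frequency means a smaller core: for a fixed voltage, turn count, and peak flux density, the standard transformer EMF equation V_rms ≈ 4.44·f·N·B_max·A gives a required core cross-section proportional to 1/f. All numbers below are purely illustrative, and real high-frequency designs also switch to ferrite cores and have to deal with skin effect and core losses.)

```python
# Back-of-the-envelope sketch: minimum core cross-section from the
# transformer EMF equation  V_rms ~= 4.44 * f * N * B_max * A_core.
# For a fixed voltage, turn count, and peak flux density, A_core ~ 1/f.
# All values are illustrative, not a real design.

V_RMS = 230.0    # primary voltage, volts (illustrative)
N_TURNS = 500    # primary turns (illustrative)
B_MAX = 1.2      # peak flux density, tesla (typical for silicon steel;
                 # a 1 MHz ferrite design would use a much lower B_max)

def core_area_cm2(freq_hz: float) -> float:
    """Core cross-section (cm^2) needed to avoid exceeding B_MAX."""
    area_m2 = V_RMS / (4.44 * freq_hz * N_TURNS * B_MAX)
    return area_m2 * 1e4

for f in (50, 60, 400, 1_000_000):
    print(f"{f:>9} Hz -> {core_area_cm2(f):9.4f} cm^2")
```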



Aircraft use 400 Hz AC for exactly this reason.


There's also a very nice property of 60 Hz: your clocks don't lose time very often, since they have a very predictable source of truth built right in.


I don't see a difference there between 60 Hz and 50 Hz. One has 1/60 of a second between cycles and the other has 1/50 so as long as you can count to 50 instead of 60 it's the same, right?


Which one can you use to make the second hand tick accurately without having to count at all?


Your second hand ticks 60 times every second?


Base 12. Think gears turning gears turning gears.


A gear reduction of 2x5x5 is just as easy as a gear reduction of 3x4x5.
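(A minimal sketch of the counting argument, hypothetical code: a mains-synchronous clock just divides cycles down to seconds, and the only thing that changes between grids is the divisor, whether it's implemented as a counter or as a gear train like 2x5x5 or 3x4x5.)

```python
# Minimal sketch of the divide-down a mains-synchronous clock performs:
# count cycles and tick the second hand each time the counter reaches the
# line frequency. Only the divisor differs between 50 Hz and 60 Hz grids;
# a mechanical clock does the same division with a gear train
# (e.g. 2*5*5 for 50, 3*4*5 for 60).

def seconds_ticked(total_cycles: int, line_freq_hz: int) -> int:
    """How many times the second hand ticks after total_cycles mains cycles."""
    ticks = 0
    counter = 0
    for _ in range(total_cycles):
        counter += 1
        if counter == line_freq_hz:
            ticks += 1
            counter = 0
    return ticks

# Ten minutes of mains cycles give the same 600 ticks on either grid:
print(seconds_ticked(50 * 600, 50))  # 600
print(seconds_ticked(60 * 600, 60))  # 600
```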


In Europe the grid phase is supposed to be pegged to UTC. In the last year or two we have had some fluctuations because some southeast European countries have disputes about paying vs. being cut off, which creates an energy imbalance in the network that isn't trivially corrected for like normal small leakage or rounding errors.

It is usually very stable over long periods, considering how rare power outages lasting more than a few seconds are. Most are just fractions of a second, when the 10/30 kV network (underground in cities) needs to be switched to a different upstream transformer; it's basically a circuit breaker with automatic failover. (Multi-second outages are explained by emergency load shedding, followed by re-connecting residential consumers but not the industrial ones. That becomes necessary after long-distance transmission lines lose their redundancy and regions get partially cut off, and then it's load shedding vs. cascading failures.)


In the US the grid used to guarantee 5,184,000 cycles per day (60 Hz × 86,400 s), with only short-term fluctuations (standard WEQ-006; the exact deviation allowed depends on which grid you are talking about). This was used by e.g. cheap clocks in coffee makers, but also by synchronous motors that keep telescopes tracking the stars.

More recently, with our larger power grid, it has become more efficient to let the frequency deviate from 60 Hz. I do not understand the exact reasons, but the 5,184,000 cycle/day guarantee apparently comes with some cost when you have a hundred gigawatts of power spread across a thousand miles. There have been two petitions in North America to remove Time Error Correction (TEC) and allow synchronous clocks to accumulate error; I believe this is going forward but has not yet been implemented.

See the section "Time Error Correction": https://en.wikipedia.org/wiki/Utility_frequency

See the section "Accuracy": https://en.wikipedia.org/wiki/Electric_clock


Relying on network frequency as being accurate to any degree is madness. It's not even useful for any correction.

Even the most basic RC oscillator outperforms it by orders of magnitude.


At least for Europe, it is accurate. The frequency is actively adjusted to maintain 50 Hz in the long term: https://www.mainsfrequency.com/gridtime.php


> madness

Yeah but that doesn't mean people don't do it. https://arstechnica.com/tech-policy/2018/04/european-grid-di...


How is that a property of 60Hz?


It isn't.



