Until the 1990s, Grand Central Station in New York had almost everything - 60Hz commercial power, 40Hz LIRR power (Pennsylvania Railroad standard), 25Hz NYC Subway power, 700VDC Metro-North power, 600VDC subway third rail power, and some old Edison 100VDC power. There was a huge basement area full of rotary converters to interconvert all this. Various ancient machinery and lighting ran on different power sources.
In the 1990s, Grand Central was rewired, and everything except railroad traction power was converted to 60Hz. All conversion equipment was replaced with solid state gear. It took quite a while just to find everything that was powered off one of the nonstandard systems.
It wasn't until 2005 that the last 25Hz rotary converter was retired from the NYC subway system. (Third rail power is 600VDC, but subway power distribution was 13KV 25Hz 3-phase.)
> 60Hz commercial power, 40Hz LIRR power (Pennsylvania Railroad standard) 25Hz NYC Subway power, 700VDC Metro North power, 600VDC subway third rail power, and some old Edison 100VDC power.
I really don't understand the downvotes, this was a sincere question. Please note, the OP states:
>"Until the 1990s, Grand Central Station in New York had almost everything - 60Hz commercial power, 40Hz LIRR power (Pennsylvania Railroad standard) 25Hz NYC Subway power, 700VDC Metro North power, 600VDC subway third rail power, and some old Edison 100VDC power."
and then subsequently states:
."In the 1990s, Grand Central was rewired, and everything except railroad traction power was converted to 60Hz."
I am asking because I am trying to make sense of the entire comment. They also mention 4 or 5 other electrical frequencies and voltages, so there's a lot packed into that comment. I am genuinely interested in the comment and trying to learn something.
A summary: once upon a time, Grand Central Station had AC at multiple frequencies in addition to 60Hz (it also had some DC power). In the 1990s it was rewired so that the AC only used the (American) standard 60Hz.
I suppose it's hard to make sense of if you didn't know that 60Hz is the standard in the U.S.?
I don't think 50Hz was ever used in the US. What are you curious about?
They probably had some old 60Hz three phase supply, maybe something weird like a "wild leg" configuration, but when everything was modernized they likely switched to 480V three phase for the big stuff like the ventilation motors, and 240V single phase with neutral for supplying smaller motors and 120V systems.
Incidentally, Japan faces this very issue to this day; part of the country runs on 50Hz, the rest on 60Hz.
This made matters trickier after Fukushima, as the nation is effectively two smaller electricity grids, not one large one - so making up for the shortfall became harder than it could have been. (However, there's a massive frequency converter interface between the two grids.)
Edit: Aw, shucks - now that I revisit the article, I see the exact same points being made in that article's comment section. My bad.
It is not much of an issue on a micro scale - the biggest issue today is probably direct-drive electrical motors, which will run too fast (a 50Hz motor on a 60Hz grid) or too slow (vice versa).
Also, running a 60Hz motor on a 50Hz grid will cause it to run hotter (larger current than nominally on 60Hz; also, less efficient cooling as it turns slower). If the designer designed to a price point rather than to a standard, this may be an issue.
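To put rough numbers on the "too fast / too slow / runs hotter" point, here's a back-of-the-envelope sketch. The 4-pole motor and 230V rating are just illustrative assumptions, not any particular machine:

    # Why a fixed-speed AC motor cares about grid frequency (illustrative values).
    def synchronous_rpm(freq_hz, poles):
        """Synchronous speed of an AC machine: 120 * f / poles."""
        return 120.0 * freq_hz / poles

    for f in (50.0, 60.0):
        print(f"{f:.0f} Hz, 4-pole: {synchronous_rpm(f, 4):.0f} RPM (minus a few % slip)")

    # A motor wound for 230 V / 60 Hz keeps its designed magnetic flux only if the
    # volts-per-hertz ratio stays constant; at the same voltage on a 50 Hz grid the
    # flux (and magnetizing current, and heating) rises by roughly 60/50.
    print(f"flux at 230 V / 50 Hz vs design: {(230/50) / (230/60):.0%}")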
On the macro scale, however, you've got a problem if, say, you suddenly lose a power plant in the 50Hz region of your country. You cannot then simply make up for the shortfall by distributing the load across all plants in the country (or, unless you're an island nation like Japan, from your neighbours too) - only the ones in the 50Hz part of it.
This is just a long-winded way of saying that the larger your power grid, the more robust it is when a power plant goes offline.
The Japanese are in the unenviable position of having two small national grids rather than one large one - and, for an encore, they're on a bunch of islands and likely unable to import significant power from neighbours, too!
Hence any plant downtime is felt much harder than it would be anywhere else. Tough luck.
For some other fun with California's 60hz legacy...
The timezone database (maintained by people who are very particular about making sure that a specified time is a well-known time) has a note in the northamerica data file:
My mind always boggles at humanity's general inability to standardize on one thing without great pain and fighting. Whether it's Metric vs. Imperial, Beta vs. VHS, Blu-ray vs. HDDVD, OpenGL vs. DirectX, USB speeds, power connectors, instant messaging protocols. Nobody can just sit together and cooperate--we always have to go through that painful period with multiple incompatible standards that fight it out until (hopefully) one of them wins.
I'm actually rather impressed with humanity's ability to standardize. It's perfectly natural (and in many cases desirable) for multiple parties to go through that "painful period" with different solutions that will compete to become "standard". It just takes a little time for everything to align and settle, that's perfectly natural.
As for the U.S. refusing to adopt the planet-wide metric system? You've got me there. That's just… Weird.
It should be obvious why the US stays on the imperial system. There is an opportunity cost to switching that is greater than the current ongoing cost of translation. Thus, someone has to bite the bullet and assume losses to push metric. Throughout the world, the organization that put its foot down and ate the cost has almost always been the government - but the US government operates on short-term results. The politician who inconveniences the public today with a metric conversion doesn't have his seat tomorrow, and his replacement immediately halts proceedings.
Language isn't close to a solved problem. Hopefully the next hundred years can finally see the reconstruction of the Tower of Babel for a generation of the world soon to come. But there also needs to be a legitimate reason for us to want to unify language. If we operate independently of one another and let businesses control any interactions by proxy of money, we will stay separate.
As the U.S. loses political importance and mindshare, sites will stop using U.S. units. Conversion to metric will eventually happen.
As for language, I'm kind of split. On one hand language is a tool for communication that becomes more efficient the more standardized it is. On the other hand, language is closely related to art and culture. I'm Icelandic (and a poet at heart) and I love my language deeply. I can succinctly say things in Icelandic that can't easily be expressed in English. And that probably applies to every language.
So I'm torn. I don't want to lose my language. But I also want a more integrated world.
I have three kids and I seriously doubt their kids won't speak the language of their parents (assuming they'll live in Iceland).
The transition happens much slower, although I've been watching it accelerate over the past couple of decades. English is encroaching on the language. We speak Icelandic, but there are small grammatical changes happening, making the language more English - and at the same time, we're losing Icelandic vocabulary.
My prediction is that my grandkids (that's probably at least 13-15 years in the future (hopefully)) will speak Icelandic very similar to what I do, it'll just be a liiiittle different.
But these small changes will certainly accumulate and the pace of change will probably accelerate with more communication and integration. So who knows what will happen in the next 50-100 years.
Welsh was pretty much wiped out as the far more practical English pressed in over the centuries. About 50 years ago there was a political movement to re-introduce it: laws were passed to ensure that government communicated in Welsh on request, road signs were bilingual, and Welsh was taught in schools (which is probably just as useful as French being taught in schools).
Language is the major barrier to freedom. I can theoretically get a job and move to Dublin, London, Berlin, Rome, Cadiz, Porto, Budapest, Tallinn or Athens. In reality the language prevents it in most cases.
In 100 years' time we could either live in a feudal society where you can live/work in a small area owned by a local baron who reports to an international mega-corporation, with international-style communications in a language nobody knows - just like 14th century England and Latin - or we could have a free borderless world where there is frictionless movement and communication. I hope technology will enable the latter before the former entrenches again.
The problem is public opinion. Americans either don't care or think imperial is superior.
It doesn't have to be a federal law. Change can start gradually with private companies. Pokemon Go, for instance, shows only kilometers and doesn't offer imperial units as an option. All water bottles in the US have both milliliters and fluid ounces on their label.
From what I understand, most construction and other raw materials are all measured in metric units, then converted to imperial at final sale if needed.
For instance, a 4 x 8 foot sheet of plywood down at Home Depot isn't really 4 x 8 feet; it actually has an imperial measurement of 47.938 x 95.938 inches, which is approximately 1217 x 2437 mm - the standard metric size is 1220 x 2440 mm:
Thicknesses (as noted in the table too) have similar "whole" metric numbers. You see this also on some (not all) general materials sold to the American consumer public - the actual material is metric sized, but marketed with imperial measurements.
I think this is done both for cultural reasons, as well as just "momentum" and that the public is used to it. But the reality is (almost) everything is now metric, the public just isn't fully aware of it yet.
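For what it's worth, the arithmetic lines up almost exactly either way you look at it:

    \[ 48\ \text{in} \times 25.4\ \tfrac{\text{mm}}{\text{in}} = 1219.2\ \text{mm}, \qquad 96\ \text{in} \times 25.4\ \tfrac{\text{mm}}{\text{in}} = 2438.4\ \text{mm} \]

so a 1220 x 2440 mm panel is just the nominal 4 x 8 ft panel rounded to whole millimetres, and whichever system the mill actually works in, the two sizes agree to within a couple of millimetres.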
This doesn’t pass common sense for me. Plywood measured in metric, converted to imperial if needed? I can imagine someone coming up with 4 feet by 8 feet as a standard size for building material. I don’t see them coming up with 1220mm x 2440mm and thinking “nailed it, perfect size for calculating how many of these to make a wall!”
Also, since the gap between plywood takes space, plywood (and drywall) is a bit less than stated size to give you breathing room to put 4x8s repeating every 4 and 8 feet.
That said I don’t know why a nominal 2×4 is actually 1½ × 3½. Used to be finishing work shaved it down, now they just make ‘em that way to be ornery.
The sizing is also dependent on intent. Floor underlayment is not the exact same size as furniture grade plywood. And MDF is actually 49"x97" to give you some extra room if you are laminating a veneer on it. It is true that various plywoods come in metric thickness though. Especially those made in China.
Legacy of the 1970s-1980s attempt to move the US onto the metric standard.
Don't tell people who are annoyed by metric, but all the 'imperial' units we use now are 100% based on metric units. Our metrology has been metric since that time, all the day-to-day usage is derived units.
Which is why an inch is exactly 2.54cm instead of some rounded fraction.
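Concretely (these are the exact legal definitions from the 1959 international yard and pound agreement, which is what today's US customary length and mass units hang off):

    # US customary units, defined exactly in SI terms since 1959.
    INCH_MM  = 25.4         # 1 inch  = 25.4 mm, exactly
    FOOT_M   = 0.3048       # 1 foot  = 0.3048 m, exactly (12 * 25.4 mm)
    YARD_M   = 0.9144       # 1 yard  = 0.9144 m, exactly (3 ft)
    MILE_M   = 1609.344     # 1 mile  = 1609.344 m, exactly (1760 yd)
    POUND_KG = 0.45359237   # 1 pound = 0.45359237 kg, exactly

    def inches_to_mm(inches):
        return inches * INCH_MM

    print(inches_to_mm(1))  # 25.4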
Dvorak is something anyone can use on their own any time. I imagine it is much more successful than Esperanto, because the only point of language is communication. It is the Facebook problem - you need a large network of participants to make a language work, or the language fails its purpose.
Also, Esperanto is ugly as fuck. It has, I believe, 4 accented letters, and I have no idea why anyone would think when devising a new language that accents or umlauts are good ideas (unless all the letters supported them and had consistent meaning). Otherwise, just use new characters.
Imperial units are better, and I expect the world will convert to them in the next century. They're sane and easily divisible into eighths and more with a single significant digit of the next smaller unit, while going from 1 kilo to 125 grams is an awkward increase in digits. They're human scale: a pound is generally about a big handful, while a gram is hardly an easily estimated quantity for humans and a kilogram is awkwardly big to hold in one's hand. A cup and a pint are useful amounts of water to drink, while a liter is a bit more than a useful quantity to measure such things by. An inch is a knuckle on my finger, and a foot is the length of my forearm, while a meter is difficult to measure without a reference and a centimeter is a useful unit but I don't have an inbuilt measuring unit that long. I could go on, but I think this is a sufficient set of examples: metric is difficult to use for casual estimation (which is, after all, what one does most of the time), while imperial is designed to be on a human scale and is better for everyday use and estimation.
Edit: Not sarcasm. I work with imperial daily, farming, and worked with metric doing chemistry in college.
> expect the world will convert to them in the next century.
> Edit: Not sarcasm.
Heh. I think you'd be disappointed. I haven't heard of any modern country seriously considering moving from metric to imperial units.
> They're sane and easily divisible into eighths and more with a single significant digit of the next smaller
Divisible maybe. But definitely not sane.
Let's take a look at lengths and weights for example:
How many inches in a foot? 12. Ok. Then it must be 12 feet in a yard. Nope. It's 3. How many yards in a mile? Guessing 1000 or 1200. Nope. Apparently 1760.
Let's try weights. Going up from ounces. Since we already did imperial lengths and saw there are 12 inches in a foot, guessing there should be 12 ounces in a pound. Wrong again. It's 16. Ok, so then 1 US ton should be 1600 pounds right? Nope, wrong again. Apparently it is 2000!
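To make the "no pattern" point concrete, here is the whole chain in one place (standard factors; the 2000 lb ton is the US short ton):

    # US customary conversion factors: no single base, you just memorize them.
    length = {"inches per foot": 12, "feet per yard": 3, "yards per mile": 1760}
    weight = {"ounces per pound": 16, "pounds per short ton": 2000}
    # Metric, by contrast, is the same factor at every step:
    metric = {"mm per m": 1000, "m per km": 1000, "g per kg": 1000}

    for table in (length, weight, metric):
        for name, factor in table.items():
            print(f"{name}: {factor}")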
I grew up with metric then came to US and lived here for more than a decade. Apart from knowing how many inches in a foot I had to look up all the other ones because they are non-intuitive and don't follow any pattern to me. I can estimate how much is a yard, an ounce, a pound, an inch, and a foot. But again I could intuitively estimate things even better when it came to meters, kilometers and kilograms.
This is so America-centric it baffles me; you sound delusional, especially in thinking that countries will convert to imperial. There is no country in the world thinking about that, and a lot of countries have converted from imperial to metric.
Your biggest complaints are that (1) metric units are not quantities you use in everyday life and (2) that you don't know how large a metric quantity is.
Both are just because you are unfamiliar with the system; people working with this system do not have this problem at all. Regarding (1), people don't care that a litre is too large to drink, they just know that a glass is 0.2-0.25L, so you can get 4-5 glasses out of a litre bottle. The benefit of base-10 units is that they're easy to convert and use; people don't care whether something is in cL/dL/L because it's just a comma placement away.
Regarding (2), people have a very good feeling about how big their everyday units are. E.g. a metre is about a step (large/small step depending on your size), that's fine for rough measurements, and about as (in)accurate as your "a foot is my forearm".
The point about fractions is actually somewhat interesting. The fractions thing seems to be a uniquely American thing to me. When I grew up in Germany people rarely used fractions other than half and third. You'd never hear 1/8 or 1/16. I still remember being surprised when I first encountered a quarter. I still don't think in fractions but in multiples of ten. Now you make the point that even the imperial grab bag of unrelated units somehow lends itself better to that. I wonder where that cultural difference comes from.
It's really useful in cases where you need to answer a question like "I want to drill a hole in the middle of this board which is 12.25 (12 and 1/4) inches long." It's easier for me to split that in half in my head than to divide the fraction and find the corresponding mark on my tool.
Half of 12 is 6, half of 1/4 is 1/8, I drill the hole at 6 and 1/8".
It's easy to get a fractional measurement of whatever is in front of you and immediately work with that measurement in your head then find the corresponding mark on your measuring tool. There is no reason the same system would not work with metric, it just seems to be more common with imperial units.
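If you want to play with that sort of arithmetic away from the tape measure, Python's fractions module does the same thing exactly; the 12 1/4 in board is just the example from the comment above:

    from fractions import Fraction

    board = 12 + Fraction(1, 4)   # 12 1/4 inches
    middle = board / 2            # half of 12 is 6, half of 1/4 is 1/8
    whole = middle.numerator // middle.denominator
    print(middle)                 # 49/8
    print(whole, middle - whole)  # 6 1/8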
In metric-land, standard measurements for things are used to make things like that easily divisible.
E.g. IKEA cabinets are 60 cm wide (90, or 120 for larger cabinets), common countertop depth is 60 cm, and so on.
That way we get to use sane and consistent units and we can easily see/measure the middle of a 60cm piece of wood or whatever.
It also tickles me how eyeballing these things is apparently something desirable. Whenever I work around my house I measure everything twice to be sure...
This has nothing to do with eyeballing, I hope I did not give that impression. In the imperial measurement world there are also standard sizes of things but that's not the point. Using a fractional scale (not necessarily imperial!) does have benefits in some situations, such as measuring and working with some arbitrary object placed in front of you. Measuring twice is not unique to the metric system and is very common in the imperial unit world as well.
Not everything in the world is (or should be) provided by Ikea :)
You're right about meters not being as good a "human scale" measurement as feet. The idea of average height being 1.8 m is pretty awkward on its face. Turns out, though, the metric world has settled on starting with centimeters. But it's not clear that this is better, because now you end up throwing around high-magnitude numbers like "183 cm" rather than "6 ft".
Given that we're clearly comfortable going with a diminutive unit (viz. cm vs m), I've always thought it would be better for the world to settle on the decimeter as the reference unit instead. It's larger than a centimeter but smaller than a meter, which is what we're after, and it's about the width of one's hand, which is arguably a more natural choice for something "human scale" than the foot. The snag is that "foot" still rolls off the tongue a lot more easily than "decimeter". So we go ahead and say 1 dm = "1 hand". It's a great unit, because if we want, we can scale up or down to meters and centimeters with (base 10-derived) constant factors, which is so easy that anybody can do it in their head.
The only snag left is that "hand" is already in use as a unit. This turns out to be less problematic than it sounds, because the legacy hand is an obscure unit really only used in horse breeding. And we're in luck, because as its name suggests, the imperial "hand" is named after the span of one's hand (with fingers extended), so they're roughly the same—it's not as if you end up with one name for two wildly differing sizes. This is the same kind of "conversational equivalence" we get with a ton and a metric tonne. That is, in conversation you're basically never reduced to needing the speaker to clarify which it is that he or she means, because you just don't need that kind of precision—a ton and a tonne are both very large masses that are in the same ballpark as one another.
Perhaps most importantly, the transition from feet to hands is fairly straightforward in conversational use, because you end up saying that "1 ft" equals "about 3 hands".
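And the legacy unit really is in the same ballpark:

    \[ 1\ \text{hand} = 4\ \text{in} = 10.16\ \text{cm} \approx 1\ \text{dm}, \qquad 1\ \text{ft} = 30.48\ \text{cm} \approx 3\ \text{dm} \]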
People make a big deal about pi versus tau, but getting widespread adoption of the "hand" as a unit seems to me to be a much more worthwhile cause, because it would have a much bigger practical impact on everyday life than tau ever would.
The EU states pint glasses must be 568ml (or larger with a line to the level) and measured in a sane and consistent way. Bit like BS standards and crown marking, but makes trade easier - if I make pint glasses for use in the UK, in Greece, and in Spain, I don't need to get 3 standards bodies to approve them.
The organizations that spend a lot of time thinking about units are heavily weighted towards the sciences, where those kind of properties are less useful and various properties of metric are very useful.
The U.S. is on metric; NIST defines and curates standards on the SI and is a critical, founding participant in the GCWM. Manufacturing and engineering specifications (especially for the military) are usually SI. It's just that commercially, imperial units are still popular. Most people aren't engineers or scientists and so they don't think about this a lot, but marketing loves inertia.
The US is one of the original signatories of the Treaty of the Metre, so technically we are metric. And it turns out that our common units are defined in terms of the metric system.
I manufactured plastic sheeting and all product was measured in inches, leading to a huge amount of conversion (24-3/8 sheet with +3/64 tolerance) and to me learning what a mil is (1/1000 of an inch). 10 mil plastic runs and stacks totally differently from 15 mil, and 40 and 60 mil take different amounts of time to cool and shrink, so there truly would have been switching costs going to meters (plus all the people who ordered 48x96 sheets, and rolls on 6-inch diameter cores, and so forth).
There's a lot of domain-specific units in use which aren't even imperial originally (fluid barrel, troy ounce/carat, ton of cooling, horsepower, AWG) that will stick around for a long time since they're used so frequently in finance, planning and B2B transactions, along with other imperial units that'll remain because they've taken on domain-specific uses (acre of land, fabric yard, bushel, mils) even when the engineers and logistics folks touching that same stuff are using SI equivalents.
There was a time in the mid-1990s when certain models of vehicles had both metric and imperial fasteners on them; made wrenching on them a major pain. But today, though, everything is metric.
You're right that consumer marketing is still toward imperial. I think it's just going to be a slow process of gradual weaning away as older people pass on. I'm personally of a "in-the-middle" generation (X) - I tend to be more comfortable with imperial units, but if metric is required, or I think it might be a better measurement to use (depending on the purpose), then I'll use it instead.
The US government adopted it in 1975, and Reagan abolished it in 1982.
It's frustrating not being able to repair the tools I buy from home depot using tools bought from home depot. I couldn't even go metric if I wanted to. They don't sell raw materials in metric. You get 4x8' sheets of plywood and something called a 2x4 which seems to have nothing to do with either measuring system. It's not even Cartesian.
The metric system was based on one great (mostly) idea, and one stupid idea.
The great idea was having larger and smaller units for a given thing related to the base unit by powers of 10. That's a lot easier to work with than 12 inches to a foot, 3 feet to a yard, 1760 yards to a mile.
The stupid idea was basing the meter on the distance between the North Pole and the Equator.
What they should have done is based it on existing systems of units. The meter should have been equal to 3 feet. The definition of the foot at the time, in both England and on the continent I believe, was a bit fuzzy, but they all had such a unit and it was reasonably consistent everywhere. Then they could have worked at making the definition of the foot/meter more precise and reproducible, keeping it within the range of existing practice.
That would have given us the benefits of a metric system, but with easier conversion between metric and prior units. Converting feet to meters would then have been a simple division by 3. An inch would be exactly 25 mm. (25 is easy to multiply by in your head: just divide by 4 and shift the decimal point two to the right. Similarly, it is easy to divide by in your head if you do it as a multiply by 4 and then shift the decimal two to the left.)
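With that hypothetical 25 mm inch, for example:

    \[ 36\ \text{in} \times 25 = \tfrac{36}{4} \times 100 = 900\ \text{mm}, \qquad 925\ \text{mm} \div 25 = 9.25 \times 4 = 37\ \text{in} \]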
Similarly for all other metric base units except for temperature.
For temperature, as far as I can tell, there was NO good reason to switch to Celsius. Unlike with distance, we don't make bigger or smaller units out of multiples of the base unit. We just state the numeric value, in decimal, possibly with modifiers like kilo or micro, and the base unit. In other words, with temperature we were kind of already doing metric, in the sense of using powers of 10 and writing things in decimal.
So the only thing switching from the older, widely in use Fahrenheit scale to Celsius actually made better was changing the anchor points for the scale. Fahrenheit based his scale on a 0 point of the coldest temperature he could make in his lab, and his 100 point at what he thought was human body temperature.
Celsius based his on water freezing at 100 and boiling at 0, but the whole universe thought having lower numbers mean hotter was stupid, so it was quickly flipped to freezing at 0 and boiling at 100.
Now those anchor points are better than the ones Fahrenheit picked, because they are easier to reproduce and more consistent. (They still suck...but they suck a lot less). Fahrenheit's degree size was better, though, especially for the range of temperatures that most people live in. Celsius' degree is too coarse.
The right approach would have been to make Fahrenheit the scale for metric temperature, but anchor at it better points. Instead of 0 as coldest Fahrenheit could make in his lab and 100 human body temperature, use the same anchors Celsius did: freezing and boiling of water. Define freezing as 32 Fahrenheit and boiling as 212 Fahrenheit.
This would have made as much sense as adopting Celsius, and no one would need a new thermometer.
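Pinning freezing at 32 and boiling at 212 reproduces exactly the degree size (and conversion) we already have, which is why the existing thermometers would keep working:

    \[ \frac{212 - 32}{100} = 1.8\ \tfrac{^\circ\mathrm{F}}{^\circ\mathrm{C}}, \qquad T_{\mathrm{F}} = \tfrac{9}{5} T_{\mathrm{C}} + 32 \]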
Centigrade makes far more sense. -10 means stay inside. 0 means ice on the road, 10 means jumper still needed, 20 is comfortable, 30 means barbecue required, 40 means you're abroad, stay inside.
The metric system was first adopted during the French Revolution. Why would they base the meter on the British foot? The pied du roi was 324.84 mm, compared to about 304.8 mm for the English foot.
Same comment for temperatures. Both scales emerged around the same time (mid-1700s). It's not like people switched from Fahrenheit to Celsius...
> The stupid idea was basing the meter on the distance between the North Pole and the Equator
Why is that stupid?
The point of SI (and its predecessors like MKS and CGS dating from the early 19th century) is readily achievable, coherent and replicable standards for not just length, but also mass, time, temperature, electric current, amount (of a uniform substance), luminous intensity and several derived units like volume (of an arbitrarily shaped container), pressure, electric charge, force, and so forth.
Some of the fundamental units were harder to insert into a coherent system in an easily replicable and achievable way.
You've chosen to look at two such units -- length and temperature.
The metre has an interesting history whose beginnings suffered from difficulties in achieving independent reproducibility of the standard metre. Starting in the 17th century, various approaches were explored, with two leading candidates surviving into the 19th century, both relating to the geometry of the Earth and in principle measurable everywhere with suitable equipment. One candidate required a detailed survey of a meridian and an almanac of angular measurements one could make against objects in the sky or objects receding over the horizon, or alternatively, with a map, angular measurements of objects of known height disappearing over the horizon. The other candidate required an excellent portable frequency standard and an almanac relating that to time-of-day in a location-dependent fashion.
The first is the version that survived until the definition of the metre was tied to the properties of atoms and the universality of the speed of light, mostly because it was more reproducible. This was the "meridional" version; one could readily produce a high precision metre prototype with good equipment and a stable platform on a large calm body of water extending to the horizon along a meridian (this means one could do so essentially on the shore of a large lake). The definition is 1/10,000,000 of 1/4 of a great circle through both poles of a geoid that averaged out slight differences in oblateness of the Earth's surface. Apart from error terms relating to the non-uniformities in the Earth's true surface, realizations using trigonometry against objects in the sky suffered uncertainties because of the several sources of variation in the rotational speed of the planet. With the advent of GPS and decent approaches to defining a working geoid, a "meridional" approach is still viable, although I would be surprised if anyone seriously proposed doing so as a replacement for the present definition based on the speed of light.[1]
The other leading candidate was the "pendulum" method. When one constructs any pendulum whose half-period is one second anywhere on the surface of the planet, the arm of the pendulum will have a length very close to one metre. One critical problem is that local mass concentrations, altitude, and latitude all influence the length, and already in the 18th century it was clear that the length of the pendulum could vary by several millimetres within a radius of even a few hundred kilometres in some places, and there was no a priori way to determine all of the local contributions in order to achieve the same accuracy possible with the meridional method. Worse, precisely calibrating a seconds pendulum was a difficult technical challenge even in laboratories, even though the underlying mathematical formula was fairly simple. The problem is that in ideal situations the dominant driving term is "g_0", the standard acceleration of terrestrial free fall (as it is now known). Unfortunately the actual acceleration of objects near the surface in free fall in vacuum varies significantly across the whole of the planet, and can even vary over relatively short timescales at one location, and there is no a priori way to determine the expectation value with great reliability. Indeed we've had relatively poor data with which to build a global almanac until the 21st century with satellite observatories like GRACE and GOCE[2], and even now relying on a seconds pendulum for defining a metre is an unattractive proposition.
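The "seconds pendulum is about a metre" coincidence drops straight out of the small-angle period formula; with a nominal g of 9.81 m/s² and a full period T of 2 s (a half-period of one second),

    \[ T = 2\pi\sqrt{\tfrac{L}{g}} \;\Rightarrow\; L = \frac{g\,T^{2}}{4\pi^{2}} = \frac{9.81 \times 4}{4\pi^{2}} \approx 0.994\ \text{m} \]

and since g itself runs from roughly 9.78 m/s² at the equator to about 9.83 m/s² at the poles, L already moves by about 5 mm before you even get to local mass concentrations and altitude - which is the several-millimetre variation described above.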
The metric unit of temperature is the kelvin, not the degree Celsius. The Celsius scale is based on kelvins, but with a 0 point (0.01 K below the triple point of a particular standard of purified water at a particular pressure) that is fairly straightforwardly achieved with decent precision even in a typical school science classroom setting. The kelvin is defined as an exact fraction of that triple point, and it "only" suffers some difficulties in the exact definition and realization of 0 K.
The kelvin is due to be redefined in the next few years taking a fixed value for the Boltzmann constant k_B which can be expressed in terms of J K^-1 where J is Joules and K is kelvins, while the modern Joule is already defined in terms of the Planck constant h, the speed of light in vacuum, and the second. This redefinition is aimed principally at coherency as mentioned at the top, as we replace features common near the surface of the Earth everywhere people live with physical constants expected to be the same everywhere in the observable universe. A strong parallel goal is reproducibility and realizability of the units; the kelvin was already easily reproduced with high precision, and few metrologists are wholly comfortable with a definition that could be much harder to "show" in a lab or factory.
> "more consistent"
The Kelvin scale has always been well-integrated with the other units of SI and its predecessors, particularly since the beginning of the late 19th century programme of defining units in terms of universal physical constants.
The equivalent in U.S. Customary Units is the Rankine scale, which has the same 0 point as the Kelvin scale, but using Fahrenheit-sized degrees (which in the U.S. are anyway defined by NIST as 5/9 K, and NIST prefers "rankine" over "degree Rankine"). There is no (formal) "Imperial" rankine; AFAICT the whole of the former British Empire uses kelvins either in the Kelvin scale or (when discussing weather or cooking, for instance) the Celsius scale, although proximity to the USA and aborted-by-1980s-politics conversion to metric leads to Canada using a mix of units -- e.g. Celsius in weather reporting and Fahrenheit in household cooking.
However, given the 5/9 constant conversion factor, using rankines vs kelvins is a matter of choice. The placement of a zero point in a scale with such degrees is essentially arbitrary (although there are obvious attractions to some realizable ground state as the zero point; while "absolute zero" might be reached asymptotically with close approximations of an ideal gas, it is not an obviously perfect choice on theoretical grounds), and is a matter of suitability. Thus there is no clear "better" between the two everyday temperature scales; each has advantages and drawbacks. Equally importantly, neither offers more scope than the other for improving the definition of the degree or the zero point.
Essentially all scientific applications use the Kelvin scale; most of the world is comfortable using Celsius in non-scientific applications. There is almost no use of the Rankine scale (even in the USA), and most people in the USA are comfortable using Fahrenheit in non-scientific applications. Some cultures use a mix of Celsius and Fahrenheit in everyday situations. And of course, many cultures have never used the Fahrenheit scale. None of these cultures or the economies they participate in seem to be on the verge of collapse because they have made a "wrong" choice of temperature scale; however, it is notable how quickly the everyday use of Fahrenheit in most of the former British Empire collapsed.
--
[1] advances in (long-baseline) interferometry in the late 19th century could have been directed at improving the definition of the "meridional" metre, but it would have been odd not to take advantage of short-baseline interferometry that is at the heart of the "wavelength" definition proposed by Michelson (of the famous Michelson and Morley experiment). Unfortunately a consistently reproducible monochromatic emitter was unachieved until the middle of the 20th century, and even with the advent of solid-state lasers, the "wavelength" interferometry approach has more sources of uncertainty than the present light-second definition.
Standards forfeit control. If you can become a de facto standard, without standardizing, you wield insane power over lives and business. Of your examples, Blu-ray, DirectX, and the "new age" closed silo IM protocols like WhatsApp that superseded an XMPP world are all corporate controlled properties that have no standards body considering the needs of the industry. They behave and act to fulfill the needs of their creators, to the detriment of the users and broader ecosystem.
That is why standards are hard. Because it requires those with power to voluntarily relinquish it, which very rarely will happen. And then to never try to circumvent standards with first mover advantage to reclaim that kind of monopoly power. At least right now societies around the world are very poorly structured to align the incentives towards cooperation.
My favorite example of this at the moment is UEFI Time Services. Time is the current local time, TimeZone is an offset from UTC in minutes, and Daylight establishes whether DST should apply, and whether it has been applied. So yay, no more time incompatibilities with multiple OS's. Except no: not even Microsoft Windows 10 sets the TimeZone value to the actual UTC offset! They set it to EFI_UNSPECIFIED_TIMEZONE. And so does Linux. And so does macOS. So we're still fakaked.
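A sketch of what consuming those fields could look like if everyone actually filled them in; the field semantics and the 2047 sentinel follow my reading of the spec's EFI_TIME, and the sign convention on the offset in particular is an assumption worth double-checking:

    from datetime import datetime, timedelta, timezone

    EFI_UNSPECIFIED_TIMEZONE = 0x7FF  # 2047: "offset unknown, treat Time as local"

    def efi_time_to_utc(local_wall_time, tz_minutes):
        """Convert UEFI-style (local time, TimeZone-in-minutes) to UTC."""
        if tz_minutes == EFI_UNSPECIFIED_TIMEZONE:
            return None  # what Windows, Linux and macOS apparently give you today
        # Assumes local = UTC + tz_minutes; verify the sign against the spec.
        return (local_wall_time - timedelta(minutes=tz_minutes)).replace(tzinfo=timezone.utc)

    print(efi_time_to_utc(datetime(2017, 7, 1, 12, 0), -240))  # a UTC-4 zone -> 16:00 UTC
    print(efi_time_to_utc(datetime(2017, 7, 1, 12, 0), EFI_UNSPECIFIED_TIMEZONE))  # None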
UEFI is a good example of a standard that was over-designed by a committee. Way too many bad decisions that are now part of the standard.
My favourite is using the FAT file system format for the system partition. No checksums, a fragile on-disk format, and no built-in support for mirroring, just to name a few.
And why they kept the idea of having the clock in local time is baffling. That just leads to errors when DST failed to apply or was applied twice.
If you're not in that country and are speaking English and it's your native tongue, then it's highly probable that your country (or its foreign ruler's Queen) is the one that originally imposed this system on us.
Maybe if every American took a thermodynamics course where the teacher insisted on memorizing the unit conversions we would not have this issue! What an awful semester...
Sometimes standards are mostly arbitrary, and it doesn't matter which one "wins" as long as we all agree on one. However, it can also be frustrating when society adopts a standard nearly universally, and then that standard becomes an impediment when some new alternative comes along or some new use case is found that isn't supported by the old standard.
There is still an incredible amount of isolation. My primary forums are reddit and Hacker News, and both are so disproportionately America-centric it's scary. Google only ever gives me results from American publications, and even DuckDuckGo has a hard time finding information relevant to my searches from outside the US or Western Europe.
Meanwhile, more people in Africa have cell phones than plumbing, and there is realistically almost nowhere in the world without some form of cellular data service now. All these people, however, are in isolated language specific silos of content. Even the Indian Internet is radically different from what I see despite both being in mostly the same language (with some Hindi mixed in).
I am always worried about how little interaction there is, through a medium of effectively no barriers than the ones we make ourselves, between the people of western powers and everyone else who is currently online but either not informed about what the Internet is (and thus just uses SMS) or is isolated from us by language barriers.
Blame all those cheeky buggers who built machines powered by steam engines or water turbines. They basically built their machines to run on whatever was available. Got a waterwheel doing 1000RPM and you need a big saw attached to it? Your saw will be running at 1000RPM (~16Hz).
So many crazy machines were built before there was either a need or an effective way to standardize anything power related. Keep in mind that many kinds of manipulations of electricity are easy and cheap now, but that certainly wasn't the case when people started converting from steam/water to electricity!
> So many crazy machines were built before there was either a need or an effective way to standardize anything power related.
One of my favorite, somewhat related, examples was those multi-pole "generators" that were used to create carrier waves for radio transmission in the thin era between spark gaps and vacuum tube oscillators.
Doesn't work. If you want fast standardization at all costs, the only way to get it is forced standardization, with "obey or go to the gulag" orders sent from a "tyrannical government" and "violent" and feared standards enforcement authorities.
When you let people vote, standardization takes aaaages. When you let corporations vote, it gets even worse... it may never truly happen, because the agreed-upon "standards" have purposeful ambiguities sprinkled everywhere ('cause "that's too hard to implement"), and practical implementations never fully obey ("what are you gonna do, you can't practically sue us for not implementing this"). Also, standardization favors commoditization, hence it doesn't make much business sense to do it if you're the big guy, and it isn't so easy to do if you're the little guy (and you're also eating away your future profits if you're the small guy who knows he's gonna grow big fast).
Well, different systems have different tradeoffs. It makes sense that people have different priorities, and thus prefer different systems. Almost everyone likes cooperation, but for that compromise is needed, which is always difficult to achieve in a manner satisfactory to all parties involved.
With regard to OpenGL vs. DirectX, the competing standards push each other to keep improving. I suspect that graphics APIs and thus hardware would have really stagnated without the competition.
There's more to the history of DirectX and OpenGL than you may be aware of.
Personally, I believe Microsoft helped to engineer the downfall of SGI, via the whole Fahrenheit graphics "co-operation" - which also led to SGI making Windows NT workstations (in place of IRIX).
That isn't to say that SGI didn't have some bumbling management of the time, but I do think that Microsoft took advantage of the situation to get them to chase a red herring to accelerate their demise, while also gaining a lot of new knowledge and IP via the sharing agreements they had for Fahrenheit.
Then again, had SGI not imploded, we wouldn't have NVidia today...
> Then again, had SGI not imploded, we wouldn't have NVidia today...
nvidia is basically one of the worst actors in the market, though. Artificially increasing prices and artificially segmenting the market, using anticompetitive methods to gain an advantage (oh how everyone loves gameworks), etc.
> That isn't to say that SGI didn't have some bumbling management of the time
Understatement of the century.
> Then again, had SGI not imploded, we wouldn't have NVidia today...
That gets the timeline all wrong. Nvidia is a much older company than you are probably aware of, and might have been quite successful (and quite capable of attracting top talent) had SGI managed to stagger along for quite a bit longer. SGI never, ever had a shot at making the mass-market stuff that let Nvidia grow as quickly as they did. It wasn't in their DNA.
Microsoft also bought SoftImage and didn't do much with it except make sure it worked on NT. This breaking of the SGI toolchain was not a friendly move.
You are right about Fahrenheit, it doomed everything.
It took a long time to standardize and integrate the US power grid (which even today is basically 3 loosely-connected systems). Some parts held out longer than others.
My brother recently visited a hydro dam in northern Minnesota that had one turbine operating at 25hz even as recently as the 90s, serving at least one industrial customer still running equipment that predated the interconnected 60hz grid.
I've seen a few cases where there was a need to run 120VAC a thousand feet or so. The most cost-effective solution was to step the voltage up to 240VAC and then back down at the load. Basically you save 75% on wire, which pays for the two transformers. Bonus: you can use a tapped transformer to compensate for line loss.
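The 75% falls out of the I²R arithmetic (resistive losses in the run only, ignoring the transformers' own losses): doubling the voltage halves the current for the same power, so

    \[ P_{\text{loss}} = I^{2}R, \qquad I_{240} = \tfrac{1}{2}I_{120} \;\Rightarrow\; R_{240} = 4R_{120}\ \text{for the same loss} \;\Rightarrow\; \text{about } \tfrac{1}{4}\ \text{the copper cross-section} \]

i.e. roughly a quarter of the copper for the same power delivered and the same watts lost in the wire.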
American homes are typically 240vac, 60hz, between both hot lines. (That's what feeds car chargers, baseboard heaters, dryers, and stoves.) 120vac comes from taking one of the hot lines and one of the neutral lines. It allows a home to have two voltages without a transformer.
I have no information on the voltages used in the early AC grid, but in the US distribution is 15kV, so the step-down even for 240VAC is about 60 to 1. Some experience with pre-WWII house wiring says 20-50 amp service was common. The difference between the economics of AC vs DC might not have been so stark when people were running a couple of 60W lights.
[1] I've seen three old houses run off two 20-amp fused circuits.
The open question for me is why did Edison even bother with DC at all? A DC generator requires a commutator. If there is anything that screams dodgy, that would be it.
Sliding friction. High-current contacts. 60 times a second? Seriously?
There used to exist these "buttons" (for lack of a better term) that you could place at the bottom of a lamp socket to make your bulbs last longer. They were actually a diode shaped like a disc (more or less), that ultimately halved the amount of voltage going to the lamp (while also acting as a half-wave rectifier to form a noisy DC current). So in such an application - yeah, running a lamp at half its rated voltage would lengthen the life of the lamp (while reducing its brightness of course).
There was once the argument that DC would cause the lamp filament to vibrate like AC would (in the presence of the Earth's magnetic field?) - and thus if you ran your lamp on DC it would last longer due to not being mechanically stressed (I think this was one of the pitches behind those buttons, too).
I think it was later found that the argument had little validity, and was more a marketing pitch. That said, a lamp does experience a moment where, when the current (AC or DC) is switched on, the filament does "flex" - partially from magnetism, partially from thermal loading as it heats up. This flex, over time, does produce a mechanical stress on the filament. It's a major reason why incandescent lamps typically burn out when you turn them on.
Just like in politics, a lot of that opposition was manufactured by PR. Apparently Edison's company publicly executed an elephant using AC current to show the "dangers of AC".
Perhaps 100 years from now, it will seem amazing how much opposition there was to the idea of human-caused global warming today – a connection that none may doubt then.
Ironically, I was reading recently (wish I could find the source) that some power distribution is switching back to DC due to the inductive load that the AC wave generates over long distances. Solid state has come along far enough that the conversion losses are comparable with AC transformers.
This is because over long distances synchronizing power phases becomes a serious problem, with serious consequences. By running HVDC you avoid the issue by pushing the phase synchronization to the endpoints.
Fascinating -- I'd always heard that 24 fps developed as a US film standard compared to 25 fps in Europe because of the difference in AC power frequency (24 and 60 being a pretty straightforward integer ratio). And yet, during this time, LA became firmly entrenched as the center of the American film industry while producing 24 fps films. I wonder how that squares -- was this something people had to deal with, or does this article possibly overstate how widespread 50 Hz power was?
The power frequency isn't that relevant for film, since it just powers motors on the cameras and projectors (or often just hand cranked). In fact, prior to the introduction of sound in films, the exact playback speed wasn't as important, and cinemas often varied.
Where it becomes relevant is for television, and in both cases the refresh rate matches the mains frequency. PAL is 50Hz interlaced, and NTSC is 60Hz interlaced.
I've heard of European broadcasters speeding up a film slightly so that it's 25 fps instead of 24. Thus the same movie could be several minutes shorter on European broadcast TV. As long as the speed difference is small enough, people don't notice.
Digressing a bit from the power grid, but American broadcasters will often speed up older shows (e.g. Friends reruns) because an hour long block of TV has more commercials now than it did in the 90s.
I've always noticed this in the PAL DVD releases of The Shield compared to the NTSC captures. In PAL, the music and voices are noticeably "sped up", mostly noticeable by the pitch shift.
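The size of the effect, for reference:

    \[ \tfrac{25}{24} \approx 1.042, \qquad 120\ \text{min} \times \tfrac{24}{25} = 115.2\ \text{min}, \qquad 12\log_{2}\tfrac{25}{24} \approx 0.71\ \text{semitone} \]

so a two-hour film loses nearly five minutes and the audio comes out roughly three-quarters of a semitone sharp, unless it's pitch-corrected in the transfer.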
This article covers the topic in more detail, but the general gist is that the film speed was controlled by the projectionist, not the cameraman recording the film, which was hand-powered rather than running at a set speed:
Nah, 24 frames per second is the lowest frame rate before we start to sense the individual images, which leads to less film usage.
Note, by the way, that we need 60 frames per second to look lifelike.
And gamers often want higher because the input sampling of a game is hooked up to the frame rate, so the higher the rate the more responsive the controls are...
I went to film school in Finland around 2000, and I remember that 25 fps was the preferred production and projection rate for 35mm. I got the impression that this was common practice elsewhere in Europe as well, but I don't know for sure.
I think the point is the film framerate was chosen to match the TV framerate, not the other way around.
Before TVs were invented, we had actual films being rotated by hand crank or by a motor - their fps is independent of electrical mains. But cathode ray tube TVs were made to match mains frequency. Then to avoid having to do some kind of conversion, it was easier to film your movies to match TVs.
Right, standardizing on 25 fps in Europe came from PAL's 25/50Hz rate.
I think 24 fps for sound film originally arose from the requirements of the optical soundtrack: slower running 35mm film wouldn't have enough resolution for decent sound quality. That's my guess anyway.
Silent films were often shot on slower rates around 16-18 fps (which explains the widespread comic "speed-up" look for video copies of silent movies, as the transfer was done the easy way by playing it back at 24).
Actually it has to do with how film is shot. Lightbulbs pulse at mains frequency, shoot indoors at 24fps in 50hz land and you'll get a horrible 1hz strobe effect.
There are tricks you can do with shutter angle that negate this, but they are beyond the scope of this comment.
The story of why parts of the US used 25 Hertz power instead of the standard 60 Hertz is interesting. Hydroelectric power was developed at Niagara Falls starting in 1886. To transmit power to Buffalo, Edison advocated DC, while Westinghouse pushed for polyphase AC. The plan in 1891 was to use DC for local distribution and (incredibly) compressed air to transmit power 20 miles to Buffalo, NY. By 1893, the power company decided to use AC, but used 25 Hertz due to the mechanical design of the turbines and various compromises.
In 1919, more than two thirds of power generation in New York was 25 Hertz and it wasn't until as late as 1952 that Buffalo used more 60 Hertz power than 25 Hertz power. The last 25 Hertz generator at Niagara Falls was shut down in 2006.
I had a Waring Blendor [sic], cat. no. 700A, with the widest range I've ever seen: "115 Volts, 6 Amps, 25 to 60 cycle A. C. - D. C." I haven't been able to pin down an exact date on this model, but it seems to date from the 1940s or so, when the U. S. power grid still hadn't completely settled on a standard. I've read that portions of Boston still had 110 volts DC in residential areas up through the 1960s, though I've been unable to find much detail about this.
It basically doesn't make a difference what polarity you feed it: if it's positive going into the rotor then it's positive going into the stator as well. When the rotor changes polarity, so does the stator.
Growing up in Southern California, I remember always finding old clocks with conversion stickers on them - I've been looking for a good source on the technical details to find out what they needed to do to accomplish the changeover. I'm not willing to pay 35 bucks to read the IEEE article however.
I have a ~10 year old GE microwave. After using a portable power generator which probably does not have a very accurate frequency, this winter when the grid was out, it seemed to switch itself over to 50Hz, and stayed that way after electricity from the grid was restored.
It was very confusing as the clock consistently ran too fast, and the timer ended before food got as hot as it previously had. I was surprised that something like that would be built into the microwave, and that it would be able to guess something like that. Eventually I unplugged it for a couple hours, and it went back to normal when I plugged it back in.
I recall finding a 'frequency converter' at a surplus store in LA from around the time of the war. Basically it was a motor generator set, the motor ran on 120V AC 50Hz and the generator produced 120V AC 60Hz. I asked about it and it was the first I had heard that LA had been 50Hz at one time.
I presume it was used in a lab or something, it weighed quite a bit and didn't look like something you'd have on your kitchen counter.
Having read the article, yes this was a common way - sometimes no conversion was required. Or a new motor was required, because they were using a 60hz motor to start out with (a 60hz motor running on 50hz makes more torque), and with the frequency change the motor would no longer make adequate torque.
The most recent episode of the podcast 99% invisible talked about this. Basically anything with a motor needed to be replaced or somehow changed so the motor would run at the right speed.
If you didn't, your electric clocks would run 12 minutes fast every hour (for example).
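That 12 minutes is just the frequency ratio: a synchronous clock motor built for 50Hz runs

    \[ \tfrac{60\ \text{Hz}}{50\ \text{Hz}} = 1.2 \]

times too fast on a 60Hz grid, i.e. it gains 0.2 × 60 = 12 minutes every hour.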
Incandescent bulbs certainly don't care about frequency, and I can't think of a frequency-dependent component in 40s fluorescent lamps. The starters and ballasts from that era were pretty simple, unless there's some difference for 120V operation I don't know about.
Fluorescent lamps became really popular during WWII, so there were many around in 1945.
There are other, smaller countries that have mixed frequencies.
PG&E (California' primary gas and electric utility) still has DC tariffs, thought I believe they provision it by installing a converter at the pint of use. I believe this is just for elevators.
Parts of Back Bay in Boston were still wired for 100V DC mains voltage into the 1960s.
The vibrating reed frequency meter which is the lead image in your link is super cool - never thought about a meter using tuned resonators to measure frequency instead of a discrete time-based counter.
Higher-frequency reeds were also used to switch street lights and household low-tariff water heating on and off here in NZ - hearing the switching tones coming through the stereo was part of my childhood.
A local ham friend built a high-power tone generator back in the 70's ... sent slow Morse across town in the early hours one night by turning all the streetlights in his neighbourhood on and off.
With global trade, electronics are simply made to work anywhere on 100~240V 50/60Hz so the manufacturers don't have to make several different models. (They obviously still ship with different plugs, but the actual converters work with almost any input.)