Guessing the split here is between internal and external use-cases. Maybe 7% of e.g. Amazon EC2 is ARM, because that's the percentage of orgs AWS and other cloud vendors predict are "ready" for ARM. But internally? How much of Amazon S3 is ARM? How much of Cloudflare? Google Search? Netflix? I'd expect pretty high numbers in these vertically-integrated DC niches: they can literally do whatever they want to solve these problems, and one of the largest KPIs for any of these services would be performance + scalability (i.e. requests served per OpEx dollar).
Why do you think it's gaining market share? Because more ARM is being deployed than non-ARM.
I couldn't find the direct source anymore, but I remember that in 2020 AWS deployed more ARM than non-ARM, and today it is at around 80% ARM to 20% non-ARM for newly deployed systems. You only get to single-digit market share because of the enormous amount of non-ARM that still exists and is being used. ARM is growing faster than non-ARM. That's what I said.
This gives me some cognitive dissonance. On the one hand, having a common connector will reduce waste, be much more convenient (e.g. "Hey, have you got an iPhone charger?"), and make losing a charger a cheaper mistake. On the other hand, I'm not sure I like the idea of government mandating electrical connectors on devices, which could stifle innovation or be very shortsighted in the typical government-rules-on-tech way (e.g. "banning encryption").
Have you seen this outside Italy? Everywhere else in Europe I've always seen the Europlug; it's only Italy where you sometimes get the weird three-pins-in-a-line plug, at least in Western Europe.
Denmark has the smiley-face plug. It works fine if you don't need a grounded connection, but still: they have a lot of sockets that don't follow the EU standard.
I think France has the one where the ground pin sticks out? That can be a problem depending on which cable you have.
And then there's the UK, of course, though they're not part of the EU anymore.
We have the UK plugs in Ireland too and we’re in the EU. I wish it was feasible to switch to the europlug, I wouldn’t have to carry a handful of adapters around when travelling.
Your comment really got me thinking. Or rather, continued a recent train of thought: I recently met some N.I. relatives in France and noticed their plethora of plug adapters. I almost forgot that Ireland (the whole island) uses a UK-style plug, as I haven't been there in a while due to life and covid, and meanwhile have travelled a lot in Europe. I grew up in Canada (US-style plugs), recently lived in Japan for a decade (US-style plugs), and recently traveled a lot for work between Japan and Europe, but sometimes also the UAE (UK-style plugs). I just moved to mainland Europe, and living in Ireland is a possibility someday.
In tech circles like HN, I have seen the UK-style plug, with its fuse and local power switch, lauded for its superior safety, but it's really just belt and suspenders added to make up for an old, cheap, not-forward-thinking design. I like US-style plugs because they are small, and portable stuff often has a folding design that saves space. But I'm biased, because until recently I had only lived in Canada and Japan, so naturally all my stuff uses this standard.
In addition to that, I am a huge nerd about plugs, chargers, and travel adapters. Anything I have that is remotely portable is USB-C powered. I've hacked every old wireless mouse, keyboard, and pair of headphones that I own to use USB-C, to reduce the number of cables and adapters I need for life and travel to the minimum[1]. I don't recommend this to you or anyone else, but when I do end up with a UK-plug power source, I use a plastic pen lid to brute-force my Europlug adapter into a UK socket.
Changing to the Europlug in Ireland is an interesting topic. It would be so tough to phase in Europlugs, given that one of the differences is that the fused UK plug can be used with ring circuits. And changing existing ring circuits to radial (each circuit goes back to the breaker box on an independent wiring run) would be expensive enough in North America's wood-framed houses, but incredibly expensive in Ireland. Ireland is famously mostly deforested, so old houses are usually stone and new ones are usually concrete. On top of that, Ireland didn't experience recent destruction and postwar rebuilding like mainland Europe did. A lot of houses[2] are very old, and many are protected cultural assets even if privately owned, so you couldn't just send any old contractor in to start drilling holes and carving out channels or adding wall-mounted conduit for new wiring. It wouldn't be popular to spend on this, or to potentially scar a lot of historical or family homes.
On the other hand, post-Brexit there could be enough will to align more with the single EU marketplace. I don't want to be that guy who speculates "just do this" on a topic I am not an expert in, but I didn't find any serious proposal or discussion by googling it. I think the solution would be to require local outlets or permanent adapters that assume the function of the fuse in UK plugs at each outlet, and provide both EU and UK plug compatibility. You wouldn't want to allow temporary adapters, because any unqualified person could make a working but unsafe configuration. You wouldn't want to force EU plugs on everyone either, because that would encourage unsafe modifications/adapters, or be unpopular/expensive by requiring modification of antique devices.
[1] If you are comfortable with soldering, filing, 3D printing, and taking things apart, you can find USB-C sockets with the appropriate resistors installed, and tiny DC-DC converter boards with USB-C input, on AliExpress, and easily update old low-power stuff from micro-USB, barrel connectors, etc. to USB-C. I even added USB-C power to my NES, SNES, GameCube, Wii and Wii U.
[2] I say houses and not buildings because I want to highlight the considerations for the average person, and I also presume anything commercial or open to the public already mandates using the latest building codes and safety features above all other considerations. This is evident when you visit a beautiful ancient castle or church and see a jarring green exit sign, or handrails and conduit laid all over the place on top of ancient stonework. Of course, many of these places do their best to conceal modern things and keep everything beautiful, except for the green exit signs. But Ireland would have to commit to funding the work to upgrade all of the family-heirloom old houses.
Traveling much of Europe (humble brag), the only places where I have seen some noticeable difference are Britain (of course), Italy, and Malta. That being said, any remotely modern building or establishment expecting travelers in Malta or Italy has outlets with at least USB-A or the two-prong "European" type (with the round prongs; I don't know the standard name).
The UK is not in the EU and uses all kinds of weird units and other nonsense. If they had stayed in the EU long enough, they would have caught up eventually, as will Italy.
I doubt we're changing the side of the road we drive on, but it's now much more expensive to import RHD used cars from the UK. I doubt we're changing power plugs, because there are hundreds of millions of them out there.
I'm not sure what other EU standards there would be that we, being an EU member, don't already have, except for decent cycling and mass-transport infrastructure, and that would be a dramatic improvement.
Having travelled the EU with Irish/UK plugs on my electronics, it's a pain in the ass unless you have swapped the plug on several power strips.
Also, basically all of the adapters say they're not for long-term use, and they're nowhere near as stout or secure as either type of plug alone.
Anyway, the UK plug is the best one out there, except for stepping on them. They're worse than Legos, and that's saying something. (And maybe size, weight, and cost. But electrically, they're good.)
True but also false: the Italian, French, and German standards (and those of the smaller European countries) are different but compatible. And I'm glad for that.
This is a great example, in that residential electrical systems evolve at a snail's pace. There's almost no innovation, and most houses are full of dangerous legacy systems because regulation makes updating difficult and expensive.
Also of note: you aren't required by national or state law to use any kind of electrical system in the U.S. The National Electrical Code is not a binding document. It's up to individual municipalities to decide whether they want to adopt it (or some other standard, which they're free to do).
Except that electricians, the companies that insure them for professional liability, and the companies that insure houses against electrical fires all use that national code as a reference; same for building codes, etc.
It's interesting to read the comments here from people that have equipment and stuff in their houses and cars and workplaces that are constrained by regulations and laws all around them, but are worried about a connector on their phone.
A "free" market needs regulation otherwise it develops into monopolies. Setting standards is a way to regulate to ensure a level playing field for competition.
It's not like the EU has rushed this decision, or that anyone is proposing some other standard that wasn't considered.
There are plenty of non-governmental standards. Some, like OpenGL or HTTP, are driven by industry consortiums. Others, like Keurig/Nespresso generic pods, x86 chips, or cup-holder sizes, are driven by compatibility with wildly successful products.
Edit: USB-C is both the former and the latter. MacBooks and iPads had already switched; it seems likely the next iPhone would have been USB-C anyway, or perhaps portless.
You don't even know what else could have happened if there were no standardization. Maybe the industry would have progressed past the need for outlets in the first place.
Yeah, we would definitely use nuclear mini-reactors in our vacuum cleaners otherwise, if only it wasn’t for the damn EU!
These are very well-understood areas; there are only so many ways you can transport electricity. Also, there is nothing stopping innovation "after" the plug. It's like saying that HTTP somehow stifles innovation.
I am similarly of two minds. As an engineer, I can imagine how legal mandates for technology choices could stifle innovation and make the future of technology worse.
But at the same time, I think there may be a place for mandating interface compatibility.
To give some old-school examples, how differently might industrialization have gone if rail systems weren't mandated to operate under some shared standards? Or where would we be if every country had a handful of competing power grids with different voltages and outlet types?
So I am not sure the cable which plugs into our phones is the correct target for standardization, but in spirit I think it's worthwhile to consider how the public good of interoperability should be weighed against a handful of corporations' interest in building technological moats around themselves and their users.
> To give some old-school examples, how differently might industrialization have gone if rail systems weren't mandated to operate under some shared standards?
I'm not sure that analogy holds particularly well. The UK did build rail systems without shared standards, and only standardised on gauge in 1846, which was very late in the industrial revolution: https://en.wikipedia.org/wiki/British_Gauge_War - Standards for other things like signalling also came late, and broad gauge was still in use until 1892. (Notably though, this did happen just at the turn of Railway Mania, so affected a lot of the new lines.)
Effectively, the UK let (mainly by omission, not intentionally) the solutions compete and then selected the most broadly used solution to standardise on.
(I'd love to see any scholarly research on this though; I haven't seen a huge degree of it. Obviously things are different outside of the UK, but given it's where (uncoincidentally) both the industrial revolution and railways were born, that's the clear case to look at.)
> Effectively, the UK let (mainly by omission, not intentionally) the solutions compete and then selected the most broadly used solution to standardise on.
We're down to two connectors here, one of which is only used by a single vendor; do you think that hasn't already happened?
I’m not questioning that at all! I’m only responding to the question posed by the parent about railway standardisation in the context of industrialisation.
What happened is that the worse, more common standard was converged on, so instead of the west coast main line having a comfortable, stable 7' gauge, we ended up with the less comfortable standard gauge, which requires longer trains to seat the same number of people.
I think that's what weirds me out. In cases like rail, it's necessary for interoperability and prevents anti-competitive practices. There's a clear state interest there.
But swapping a plug doesn’t buy us that. This is evidenced by the fact that no company is forced to use lightning to compete.
Almost everything uses USB-A at the outlet and choose your other end. We’re starting to see USB-C at the outlet, but same thing.
I also don’t see the e-waste argument. I haven’t had a cable outlive a device. My kids go through them every other month. No plug fixes that, it’s almost always where the cable joins the plug. (Except micro-USB, we shall never speak of that again).
My other biggest practical problem is that, despite appearances, USB-C != USB-C, and the off-the-shelf cables are awful, especially for normal humans.
In my dream world, I'd love to see clearly labeled device and cord capabilities, USB-C all the things, and a reined-in USB-C spec.
I just think that government is a bad place to do it, and don’t see a necessary state interest case for their intervention.
> I also don’t see the e-waste argument. I haven’t had a cable outlive a device.
My old Nokia Micro-B cables outlived their bundled devices by a factor of… 3x or so? And I only took those out of use because I don't have a lot of stuff that still uses Micro-B.
Actually, I don't think I've ever managed to break a USB cable yet. What on earth are you doing to the things?!
Apple cables were notoriously shitty for many years because they didn't build strain relief into them (until a couple of generations ago), and they would fray with time.
I imagine this is what GP experienced, based on my experience with it on iphones and apple laptops.
Come to think of it, I don’t break them. It’s when I let my kids borrow/take them. Except micro, those connectors just wear out in a year or two.
My kids keep their tablets plugged in while using a lot of the time. I think it’s the stress on the joint from having the cable taut or yanking it around or something.
I did change to making them pay for replacements. That definitely slowed the rate :).
I can “reassure” you: this is unambiguously a bad idea.
First, market forces already standardized on two main standards.
Second, the interop problem is a $10 cable between wall-wart and device.
So the problem solved is a small one. I think this idea got its momentum years ago, when practically every model of phone had a different charger and wall wart.
Meanwhile, think about what you lose.
Is USB-C really the end-game of charging connectors? It has existing issues with labeling and capabilities confusion. What about MagSafe-style connectors? Does a phone necessarily need a charging connector at all, and might there not be advantages to not having one?
As an aside, e-waste will increase in the short term, of course, as people will have to throw out their Lightning stuff and buy USB-C stuff. (Don't worry, it's not a lot: small problem, remember.)
> First, market forces already standardized on two main standards.
The EU sat the manufacturers down and told them that if they didn't come to an agreement, the EU would make the choice for them. Most of them came to an agreement. So you are right, market forces did make manufacturers standardize on two main standards. That force is named the EU.
> Is USB-C really the end-game of charging connectors?
No, and EU has no illusions that it is.
> Does a phone necessarily need a charging connector at all and might there not be advantages to not having one?
No, and the EU doesn't think that it does.
> As an aside, ewaste will increase in the short term, of course, as people will have to throw out their lighting stuff and buy USB-C stuff. (Don’t worry, it’s not a lot — small problem, remember.)
This is just stupid fearmongering. People won't have to throw out their Lightning stuff.
Even if they throw it out, it's better to rip off the band-aid than to go on creating more and more redundant cables and e-waste for the next 30 years.
> First, market forces already standardized on two main standards.
So, one standard too many. Especially when one of those standards is held hostage by one corporation.
> Is USB-C really the end-game of charging connectors?
It's close enough; we can already see that improvements have been incredibly incremental. Hell, it would have been fine to stick with micro-B or even mini-B.
> What about mag-safe-style connectors?
You already have people building those as third-party dongles. Clearly it doesn't have to be part of the spec.
> Does a phone necessarily need a charging connector at all
Yes. Wireless charging is inherently far less efficient than wired charging.
> and might there not be advantages to not having one?
No. To steal a quote from yourself: this is unambiguously a bad idea.
I'd be interested to see an actual proposed improvement that isn't just "we changed the plug so you have to buy new cables (from us)!" or "we made charging 80% less efficient!".
But so far there has been no meaningful improvement in the last… 20 years or so, and an awful lot of new incompatible standards.
Yes. The one improvement that has happened has been the advent of fast chargers… but that has always been rolled out as a backwards-compatible upgrade that is negotiated between the phone, charger, and cable.
Huh, I didn't know web devs were so bad that you get EU GDPR popups in the US. Either the idiots are showing you what dirty things they're doing without having to, or maybe the devs are smart and wanted you to see that all those 100+ partners are giving US citizens the option to opt out. If it's the latter, great of them to give you the option.
But you're assuming this USB-C decision is eternal, right? Couldn't the problems you raised be solved by having a standards organization which could monitor and update the status quo over time based on the technologies available?
We have plenty of free market examples of why "innovation" in the connector space almost always spawns lemons and makes everything worse in the meantime.
Do we? By the same token what are examples of where innovation in the connector space helped out?
I find laws like this problematic because measuring success or failure is super tricky. How do you know about innovation that hasn't happened? If in 5 or 10 years we're still using USB-C connectors, is that great, or does it mean there's some better idea that isn't being taken to market? I'm sure there are subject-matter experts who might confidently weigh in on that, but could they realistically get this law changed? Even if those experts agreed there was a better solution, would the majority of the incumbent players in the space want that solution?
Laws like this create a situation where opposition depends upon people missing things they've never had and wanting things that they can't conceive of.
> On the other hand I'm not sure I like the idea of government mandating electrical connectors on devices
Depends on the device.
Standards for EV fast charging are a very good thing. North America has two different plugs and protocols for DC fast charging (three if you count CHAdeMO) and it's a mess. The EU has standardized on CCS Type 2 Combo so any brand of CCS charging car can charge on any brand of CCS charging network. That's good common sense and it benefits everyone.
Incompatible charging infrastructure doesn't benefit any EV owner. It's a drag on the EV market. Closed networks like Tesla's and Rivian's are continuing to be part of the problem.
> Closed networks like Tesla's and Rivian's are continuing to be part of the problem.
All new Tesla stations in Europe use CCS and most of the older stations have been retrofitted with CCS connectors in addition to Type 2.
I have an adaptor for my 2017 Model S that allows it to use CCS instead of Type 2 and newer cars can be converted so that CCS connectors can be used without adaptors. All new EVs in Europe already use CCS and Tesla are slowly opening up their network to other brands of car.
> All new EVs in Europe already use CCS and Tesla are slowly opening up their network to other brands of car.
Yes. You're restating what I have said. Tesla should hurry up and open them all.
> So how is Tesla part of the problem?
Because Tesla chargers are not open to all brands everywhere.
A better question to ask is why is Tesla not the leader in multi-brand charging. Other charging networks have delivered chargers which support all brands of EV with longer cables for more vehicle types and are faster than Tesla's chargers to boot.
I wouldn’t be too concerned. They did it before to standardise on micro USB, in such a way that didn’t stop manufacturers from moving to USB-C. This is really just an update of that same rule.
This, so much. That era was defined by vendor lock-in. Getting e.g. a dedicated in-car phone audio setup was a tough decision, since you could not just change the phone.
I still have a box of old phones with custom chargers, headsets and data cables. The Nokia barrel plug (different sizes), the Siemens comb style connector, the Ericsson break away type connector, the Motorola two prong, the Bosch connector and some really weird LG/NEC/SAGEM plugs.
Yes, exactly: the market could innovate and develop better connectors. For example, Apple's Lightning, which is older (and smaller/more robust) than USB-C. Imagine instead the EU mandating micro-USB, and how we would be forced until the end of time to have a 50% chance of plugging it in wrong.
Is USB-C the best connector or form factor we could ever do?
Maybe not, but it is good enough. "Perfect is the enemy of good" is a phrase I like to keep in mind while programming or making something and can also apply here. Should we wait around for a perfect connector? Or is USB-C as close as we can get as of now?
Forcing everyone to change cables creates waste. Apple is the only company that might not have already been using USB-C by the time this went into effect, and now all their customers that already had Lightning cables and accessories will have to throw them out or get adapters. Not to mention that by then we will probably already be talking about USB-D or whatever is coming next. Maybe the USB consortium can just conglomerate everything under the USB-C name so that companies can use whatever they want. "Micro USB-C, it's just micro-USB but with a new name!" Maybe that is why they have been renaming all their SKUs every 6 months.
I wonder if they will just ship iPhones with Lightning cables plus a USB-C adapter and a USB-C-to-Lightning port adapter, so customers can just throw out the adapters and use the Lightning cables.
At this point, I think even the average household in developing countries has USB-C. And that's probably due to entry-level android phone makers choosing to go with it.
> On the other hand I'm not sure I like the idea of government mandating electrical connectors on devices, which could stifle innovation, or be very shortsighted in the typical government-rules-on-tech way
You seem unaware that governments regulate a LOT of things, and while this has stifled some innovation, the cost is probably worth it.
When was the last time a house burned down due to electrical issues? In the past, prior to standards, this was rather common. Look at that history if you want a general understanding of what things were like prior to "regulation".
It will end up like the terrible-quality fake USB cables and chargers we have today, thanks to the USB organisation not being stricter. It already seems hard to find specific types of USB 3/USB-C cables. In some cases they have damaged the connected devices, which should be considered a failure of the "U" part.
If anything, this will increase the value of official Apple chargers, because at least those will be well designed.
Even if you're buying from Amazon or AliExpress, there have been very few cases over the last few years of them damaging your device compared to the first years of USB-C.
They might still have other issues, but nothing that damages your device.
> On the other hand I'm not sure I like the idea of government mandating electrical connectors on devices, which could stifle innovation, or be very shortsighted in the typical government-rules-on-tech way (eg "banning encryption").
This was a response to an old law under which bakers were accused of "cheating" customers by overpricing undersized loaves or intentionally creating giant air pockets in their bread to minimize the amount of flour that went into the bread the customer would be getting.
That article doesn’t have any sources and I’m hesitant to take it as a primary source. A blog isn’t an encyclopedia, even with the same name.
Also doesn’t pass some cursory thinking. If the law is about loaves being too light or small, how does giving out an extra loaf to people who buy 12 help? Who is even buying 12 loaves of bread when restaurants are rare and refrigeration non-existent? Armies, but then they’re buying even way more.
I don’t know why this needs a backstory. A dozen is a common number for objects because it’s highly composite. Then buy X get 1 free promotions are one of the simplest ways to give discounts. No one has to be the first to do it. It could spread and people could come up with it on their own.
> If the law is about loaves being too light or small, how does giving out an extra loaf to people who buy 12 help?
If the baker gives you 13 and calls it 12, that makes it harder for a greedy baker narrative to stick. It doesn't have to be logical, it's about managing impressions.
> Who is even buying 12 loaves of bread when restaurants are rare and refrigeration non-existent?
The average family used to be the size of a small army. 12 loaves of bread could be eaten in 1 or 2 days if you've got 12 hungry kids and bread is a major component of their diet.
It doesn't add up that it would just be about managing impressions or controlling a narrative. The fact that there was a law regulating the price of loaves of bread is well recorded. Anyone selling loaves too small but also selling a baker's dozen would be in violation.
I regret trying to say buying a dozen would be uncommon. It’s more that even if they sell a dozen, of course there’d frequently be orders smaller than that.
Bakers were under constant suspicion of cheating customers and skirting the regulations. Adulterated flour was a big concern too, not just loaves too small or too airy. To manage their reputations, I think bakers would rationally take any edge they could get.
I totally agree that bakers would give an extra loaf to help their reputation. I seriously doubt this was done as a way to stay in line with the law as the linked article claims.
Intel is supposed to do a process shrink every 2 years, which is meant to increase compute density and lower power. The last time they successfully did that was 2014 with 14nm, and since then they spent about 7 years getting to 10nm, and even now I'm not sure they're completely there. So basically all the assumptions about anything built with Intel CPUs getting faster were blown out of the water.
If you sneak "eject" into a shell script somewhere then, permissions allowing, it will suddenly open the user's optical drive if they have one. This works extra well with laptop drives that are spring-loaded.
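For the curious, here's a minimal sketch of the prank in Python rather than as a shell one-liner (my own illustration; it assumes a Linux box with the standard eject utility installed and permission to use the drive):

    import random
    import subprocess
    import time

    # Wait a random while so the tray pop isn't obviously tied to anything,
    # then open the optical drive (needs the 'eject' CLI and permissions).
    time.sleep(random.uniform(60, 600))
    subprocess.run(["eject"], check=False)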
I think from a hardware perspective this would be non-trivial, despite sounding like an obvious thing in principle.
In sending the entire frame with every refresh you get pixel addressing "for free", since you just send the data for each pixel sequentially in a predefined order. If you only wanted to update a single pixel, you would need to effectively send an instruction saying "set pixel x to rgb: a, b, c" or whatever, including both the pixel colour values and the pixel address. You'd also need some sort of edge detection for when a pixel is supposed to change which adds a delay.
This is fine for one pixel, but if every pixel on the screen changes at once then suddenly you are going to be sending a hell of a lot of instructions that not only contain the colour values as in the "old crt style" method, but also the address of every individual pixel too, which will then have to be decoded by the screen-side hardware.
All in all, you'll be using a lot more bandwidth and will need a much faster clock to do all of that in the same period of time. In the old school way, you just have a pixel clock that matches the pixel rate and some serdes for serialisation/deserialisation on each end which imo is considerably simpler.
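To put rough numbers on that (my own back-of-the-envelope, with a made-up encoding of 3 address bytes plus 3 colour bytes per updated pixel):

    # Back-of-the-envelope bandwidth comparison at 1080p60.
    # Hypothetical addressed-update encoding: 3 bytes of pixel address
    # (1920*1080 pixels fit in 21 bits) plus 3 bytes of RGB.
    W, H, FPS = 1920, 1080, 60

    full_frame = W * H * 3 * FPS               # classic raster scan
    def addressed(changed_pixels):
        return changed_pixels * (3 + 3) * FPS

    print(full_frame / 1e6)                    # ~373.2 MB/s, constant
    print(addressed(W * H) / 1e6)              # every pixel changes: ~746.5 MB/s, 2x worse
    print(addressed(10_000) / 1e6)             # small update: ~3.6 MB/s

So addressed updates only win when a small fraction of the screen changes; in the worst case they roughly double the required bandwidth.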
If you don't send the entire buffer, then a monitor would need at least 6,220,800 bytes of RAM dedicated to the frame buffer (1920 x 1080 pixels x 3 bytes per pixel), do auto-refresh (standard 60Hz) on the panel, and accept commands to overwrite said memory with new data, partially or completely.
That solution is far from what we have now, and much more like a serial LCD controller.
I think people can come up with more efficient protocols than sending x,y,rgb for every pixel. What you call "edge detection" is not necessary if you use shadow buffers. Yes, the display would need some memory, but this is measured in tens of megabytes which is not much by today's standards.
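For illustration, a minimal sketch of the shadow-buffer idea (my own toy code, not any real protocol): keep the last frame that was sent and emit updates only for pixels that differ.

    # Shadow-buffer diff: compare the new frame against the previously
    # sent one and emit (x, y, colour) updates only for changed pixels.
    def diff_frames(prev, curr):
        updates = []
        for y, (prev_row, curr_row) in enumerate(zip(prev, curr)):
            for x, (p, c) in enumerate(zip(prev_row, curr_row)):
                if p != c:
                    updates.append((x, y, c))
        return updates

    # A real protocol would fall back to a full-frame send whenever the
    # update list grows larger than the frame itself.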
* Sink side, a memory of tens of megabytes is either on-chip memory (very costly) or on-PCB DRAM (still costly). For high-refresh-rate monitors, the memory would need to be high-BW too. Note that there aren't any modern memory standards that are high-BW but low-storage. DRAMs with a capacity of just a few MB would be very much pad-limited.
* source side, you'd need a shadow buffer as well, and you need double the read BW to detect the difference between previous and the current frame.
All of that is technically achievable, but none of it matters for desktop monitors: the power savings are just too low to matter.
Laptops are a different matter. Many laptop LCD panels already support self-refresh. (Google "Panel Self Refresh")
But that's for cases where the screen is static: the user is staring at the screen, not moving their mouse, the cursor is static. The benefit is not just putting the link but also putting the GPU to sleep.
That's the low hanging fruit. Partial screen update doesn't save a lot more, because you'd need to power up the GPU for that.
Seems like having memory embedded in the display for a shadow buffer, and the diffing algorithm you’re proposing, could easily undo any power savings you’d get, and then some. Why are you certain that saving power is simple? How much power does the data protocol consume compared to the display itself? Isn’t the data transmission power in the noise margin of the power requirements for an active display, CRT or LCD?
Also consider MHRD[0], which is similar to nandgame, except you write your designs out in a (primitive) HDL, starting with just NAND gates and working up to a functional CPU. Like the Zachtronics games, it blurs the line between "game" and "work" just the right amount.
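If you want a taste of the bootstrapping these games make you do, here's a tiny sketch in Python rather than MHRD's actual HDL (whose syntax I won't attempt from memory): everything built from a single NAND primitive.

    # The bottom-up exercise MHRD and nandgame walk you through:
    # every gate derived from NAND alone.
    def NAND(a, b): return int(not (a and b))

    def NOT(a):    return NAND(a, a)
    def AND(a, b): return NOT(NAND(a, b))
    def OR(a, b):  return NAND(NOT(a), NOT(b))
    def XOR(a, b): return AND(OR(a, b), NAND(a, b))

    # Half adder: XOR produces the sum bit, AND the carry.
    def half_adder(a, b): return XOR(a, b), AND(a, b)

    assert half_adder(1, 1) == (0, 1)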