This is a good time to bring up the fact that there was never an industry-wide standardization effort for laptops. A standard form factor means components would be reusable between upgrades: the laptop case, power supply, monitor, keyboard, and touchpad could all be reused without any additional effort. This improves repairability, is much better for the environment, and means higher-end components can be selected with the knowledge that their cost can be spread out over a longer period.
For desktop PCs, the ATX standard means that the entirety of a high-end gaming PC upgrade often consists of just a new motherboard, CPU, RAM and GPU.
A 2007 Lenovo ThinkPad X61 chassis is not that different from a 1997 IBM ThinkPad chassis (or a 1997 Dell Latitude XPi chassis). If the laptop industry had standardized, manufacturers would have produced a vast ecosystem of compatible components.
Instead we got decades of incompatible laptops using several different power supply voltages (and therefore ten slightly-differently shaped barrel power plugs), many incompatibly shaped removable lithium-ion batteries, and more expense and difficulty in sourcing parts if and when components break.
A little bit of forward thinking in the late 1990s would have saved a lot of eWaste.
Some parts, such as batteries, storage, RAM, etc., should at least be standardized.
Manufacturers probably don’t want to standardize the remaining motherboard/graphics/chassis/cooling because a laptop isn’t like an ATX computer, where you get modularity at the expense of wasted space. A laptop is basically a 3D puzzle with thermal components. Few consumers would buy a laptop with even a little wasted volume or weight, even if it meant better serviceability and upgradeability. Same with phones. We aren’t going to see modular phones beyond the concept stage either.
I generally agree with your comment. However, when you wrote,
> Same with phones. We aren’t going to see modular phones beyond the concept stage either.
I disagree. I'm writing this on a Fairphone 2, which I bought for its modularity & because running Lineage OS (or any other OS you choose) doesn't void the manufacturer's warranty. While I'm sure Fairphone's sales are small compared to the broader industry, I think they've shown a market exists for ethical, modular phones. I've seen other Fairphones in the wild here in France, as well as seeing them for sale on used goods sites like leboncoin.fr.
Batteries (or rather, individual cells) do have a standard: 18650. Unfortunately it's too thick for ultra-thin laptops, but the older ThinkPads use them. I suspect safety is the reason why no one makes replacement laptop battery "empty shells" that take 18650s and have the appropriate balancing/protection circuitry to interface with a laptop, but then again you see mobile phone powerbanks being sold this way... go figure.
There's always been a trickle of machines like that. The problem is that they're targeted towards industrial usage and RIOTOUSLY expensive.
https://www.bsicomputer.com/products/fieldgo-m9-1760 for example (the first vendor I saw that actually shows prices, as opposed to just request-for-quote)
It starts at nearly $2400 for a low-spec Celeron, and I'm not sure it even has an onboard battery.
What I could see as viable would be a micro-ATX case of similar dimensions, sold as a barebones for like $300-- use the extra volume from not accommodating ATX mainboards to store batteries and charging circuitry, which can be off the shelf because space constraints are minimal. Pop in some reasonably priced desktop components, and you'd have a competent luggable for under $1000.
> A standard form-factor means components would be re-usable between upgrades
We don't even have to go that far. Just ensuring that laptops can be serviced by their own users would go a long way to reduce e-waste: not soldering RAM chips to the motherboard, making it feasible to remove every single part (not gluing the keyboard to the MB, for example), etc... instead of pursuing an ever-thinner laptop design, which serves practically no purpose.
MacBook Pros not being user serviceable at every component level doesn’t mean they’re not environmentally friendly - not by a long shot. In fact, building a device like that might even shorten lifespan in a laptop form-factor, not to mention no one wants to carry around a heavy, clunky machine so it likely wouldn’t sell anyway.
When people are done with their MacBooks they don’t just throw them out - they sell them or hand them down to their relatives/kids because they still work well enough, are supported by the manufacturer, are durable and have very high resale value in secondary markets.
Robust engineering, longevity, support, and resale markets do more for the environment than making components user-replaceable.
My old 2011 MacBook Air is still going strong and being used by my mother. If anything goes wrong, she can take it to the Apple store and get help promptly. She still gets software updates, and that thing can STILL be sold for ~$250-300 on eBay, Swappa or Nextdoor. If the machine breaks completely, she can take it into the Apple store to get it properly recycled in almost any part of the world.
That’s what minding the environment looks like. You have to look at the entire lifecycle of the product from the moment the raw materials are sourced all the way to making it easy to recycle when a product is end-of-life.
Apparently you haven't seen any of Louis Rossmann's videos on YouTube. Let's say your grandma's MacBook stops working because of a blown fuse on the motherboard. Something like that would take Louis 5 minutes to repair, but the Apple Store would just replace the whole motherboard and charge $$$. How is that environmentally friendly?
One: shout out your favorite YouTubist, I guess, but a repair Apple makes is a repair Apple has to support.
Two: it's much, much harder to support a repair done on-site with a soldering iron than it is to replace a part. These repairs are much more likely to fail under both normal and unconventional use and then will come back for more repairs--which are themselves, still, expensive to provide.
Three: waste concerns have to factor in what Apple does with the part after they do the swap. (I have no insight into what they do, but your comment ignores this.)
>> One: shout out your favorite YouTubist, I guess, but a repair Apple makes is a repair Apple has to support. <<
Saying he is my favorite YouTuber is a bit condescending. I mentioned him because he is a loud proponent of the right-to-repair movement.
>> These repairs are much more likely to fail under both normal and unconventional use and then will come back for more repairs--which are themselves, still, expensive to provide.<<
If that was true, I am sure Apple would choose to repair parts instead of replacing them. ;)
> If that was true, I am sure Apple would choose to repair parts instead of replacing them. ;)
Of course it's true--everything from "that fan's just going to get dirty again, and faster, because it's been blown out but can't be re-sealed outside a factory" to "that solder joint is being done by somebody making fourteen bucks an hour, boy I hope I'm not relying on that long-term".
Why would a company that makes its money off of selling the closest thing to a unified end-to-end experience take the risk of a dissatisfied customer because of a frustrating defect remediation experience?
The quoted point is an example of a fundamental misunderstanding of how Apple views its customers and how Apple makes its money. But stuff like that is a closely-held truth in the various repair-uber-alles communities on the web regardless of reality. (And then, as 'Operyl notes, your cited YouTubist attempts to shore up his own little slice of community by instilling in them the "enlightened"/"sheep" dynamic. Petty little cult leader, that.)
Sorry that you read some real distaste for that mess as condescension, but not sorry to voice that distaste.
>> Why would a company that makes its money off of selling the closest thing to a unified end-to-end experience take the risk of a dissatisfied customer because of a frustrating defect remediation experience?
You make it sound like Apple has never done it before.
Case in point: the overheating early 2011 Macbook Pros - a problem experienced by thousands of customers.
Apple basically pretended the problem didn't exist for well over a year (there was a gigantic thread about the issue in the Apple support forums). By the time they did issue their recall (or "repair order", if you want to use Apple's euphemism), a lot of people had already sold off their dead Macbook Pros at a loss.
Mine had bricked just after my AppleCare expired, and I wasn't about to spend $500+ to get a replacement logic board (which basically had the same defect, except it was a brand new board. Source: I had replaced my logic board under AppleCare only to have the problem recur within two months). I was lucky that I didn't dispose of my Macbook Pro before the repair order, but I had bought a replacement laptop by the time it was issued (spoiler alert: it was my first non-Apple laptop purchase in a decade).
They also put up barriers to getting the repair order. You had to prove you had the heat issue and that it was causing crashes. Since mine was bricked, it was easy. But a friend of mine (who had two of the affected models) had to jump through hoops at the Apple Store to get his fixed.
Those early 2011 Macbook Pros were mostly high end 15" i7 models, meaning they were not on the lower end of Apple's Macbook Pro line. People paid good money for them. If Apple hadn't had their heads in the sand and had given everyone replacements (i.e., a 2012 model, which didn't have heat issues) as the problem occurred, it would have been a rounding error for them. But they didn't do that.
>> fundamental misunderstanding of how Apple views its customers and how Apple makes its money.
Speaking from my one experience - I didn't feel like Apple was interested in my experience at all. While I never considered myself a fanboy, I was very loyal to Apple and totally invested in the ecosystem. After my experience with the 2011 Macbook debacle, I abandoned them completely. It meant writing off a lot of money spent on Mac software, mobile apps, etc.
He’s cringe at best, just as bad as the rest of them at worst. He’s playing for the camera, the audience. I wouldn’t take much of what he says seriously, but that’s just me I guess.
My son spilled milk on our 2015 MacBook Pro. Apple wanted $1,100 to fix it. It took two separate shipments to New York, but Louis Rossmann fixed it for $550. You need to wake up, grow up, and grow a brain. Apple's excessive greed is real. The fact that you were lucky and haven't dropped or spilled anything on your laptop in the last 5 years is not evidence that Apple is a great company!
I’m sorry, what? This is the exact shit I’m annoyed about. He is inciting stuff like this, telling his user base to call anybody that disagrees with them things like “sheep”, it’s even in his logo. I do not agree that the best thing to do is call people you disagree with “asleep” or “sheep,” or to “grow up/a brain.”
"User serviceable" doesn't imply that the user will actually perform any service. I would be willing to posit that for the vast majority of users, an ATX desktop is just as "serviceable" as a Macbook Pro. In the case of the desktop, if it breaks, they take it to their IT department or Best Buy and get a technician to fix it. In the case of the Macbook, they take it to their IT department or the Apple Store and get a technician to fix it. And the Macbook Pro is a darn sight lighter, more portable and more attractive to have sitting on your desk...
This is not taking into account that most people won’t know how to fix the problems that arise from connectors wiggling loose or the replaceable hard drive failing. Additionally, there’s also the problem of the connectors themselves wearing out and breaking: e.g. I have a Lenovo X220 that no longer charges because the power cord connector is broken.
That requires significant trade-offs in durability, weight, and design.
Not to mention you don’t want the typical user (forget the HN audience) to replace the components themselves.
Most professional users are on corporate enterprise device plans and you don’t want employees or the IT department replacing components either. It’s far better and cheaper to get the employee back up and running with a new machine while the one in need of repair gets shipped off under enterprise warranty.
In many cases, a well-designed, durable product can be repairable. While the actual earbuds were not wonderfully well-reviewed, the Samsung Galaxy Buds were both tiny and legitimately repairable: https://www.ifixit.com/Teardown/Samsung+Galaxy+Buds+Teardown...
And they were a shit product. As you yourself admitted. It doesn't help if something is supposedly "repairable" (by the 0.5% of buyers who might be inclined to do such things) if the product is such crap that it gets thrown away after a few weeks.
There’re upgrade-friendly laptops on the market. I’ve replaced RAM, disks, keyboard, LCD, CPU, wireless cards in my laptops. Soldered CPUs are unfortunately inevitable on modern ones, but many other components are still replaceable if you pay attention at the time of purchase. Usually voids warranty but I don’t care too much.
As a nice bonus it sometimes saves money. I picked my netbook solely for its CPU (i3-6157U, 64MB L4 cache), GPU (Iris 550, 0.8 TFlops) and display (13.3” FullHD IPS). Upgraded to an adequate amount of RAM (16GB) and a larger, faster M.2 SSD. Both were inadequate out of the box, and even today there aren't many small laptops with 16GB RAM.
> Soldered CPUs are unfortunately inevitable on modern ones...
To be fair, even on desktops, replacing a CPU on the same motherboard is a pretty niche thing in my experience. Not to say people don't do this, but most of the people I know upgrade both at the same time, either because of incompatibility or because of substantial gains with the newer MB. So soldering the two together is not as bad as gluing the keyboard to the motherboard in my eyes.
In some cases, upgrading a CPU prolongs the useful life of the device.
The desktop I’m using now had an i5-4460 at the time of purchase, eventually upgraded to a Xeon E3-1230v3. I'm only going to upgrade the motherboard after AMD releases Zen 2 desktop CPUs.
A family member uses a laptop that initially had an i3-3110M. I put an i7-3612QM in there; it's comparable to modern ones performance-wise despite the 6-year difference, e.g. cpubenchmark.net rates the i7-3612QM at 6820 points and the i5-8265U at 8212 points (the gap is small because the former is a 35W part versus 15W).
I agree about glued keyboards. Keyboards are exposed to the outside world and also subject to mechanical wear. The only thing worse than that is soldered SSDs. It makes data recovery very hard, and the pace of innovation is still fast for them: SSDs that become available a couple of years from now will be both much faster and much larger, so upgrading them regularly makes sense for UX.
On a desktop it is possible to replace the motherboard; on a laptop, not so much. So not soldering the CPU would at least give you the ability to upgrade the processor to the fastest supported by that motherboard.
My current weapon of choice is a Lenovo y50-70. Adding RAM is super easy, but I had to replace the keyboard at one point which was a nightmare since it was all glued and had small pins to hold it together, not to mention for some reason you have to disassemble absolutely everything before you actually get to it. In the end I basically just tore the thing out semi-carefully and the new one is just "pinned-in" (the glue isn't even needed).
Another adventure was the screen frame which was breaking more and more each time I opened it. For that I drilled holes in a few places and bolted it together with some really small nuts so it still closes fine. It was annoying but a fun experience, got me over my hardware tweaking anxiety for good. I doubt it gets crazier than drilling holes in your laptop.
So yes, laptops should absolutely be made easier to modify, the components get old really fast and I don't wanna buy the whole thing each time I want an extra bit of RAM or some small part gets broken. It's one of the things that make me steer way clear of Apple stuff.
> There’re upgrade-friendly laptops on the market.
yes, but very few, and fewer and fewer as we speak. Even Lenovo, which was famous for that, has ended up soldering RAM in their recent models and making the battery a hassle to replace, while it used to be accessible from the outside before.
The enterprise market is huge. Consumers like thin and shiny things; corporations don't care, but they employ people whose full-time job is counting expenses. Upgradeable laptops are good for them because they can get exactly the right hardware without paying for what they won't use. They rarely upgrade themselves, vendors do, but unless the laptop is designed to be upgradeable, vendors are gonna have a hard time serving their enterprise customers' requests, let alone doing it fast.
Update: and if you're gonna install Linux, these laptops can always be bought without a Windows license. Corporate customers use volume licensing; they don't need these OEM Windows keys and aren't willing to pay for them either.
I doubt that enterprises (or even their vendors) do much upgrading. Instead, they have those machines on a refresh cycle and replace them every three years. They do, however, often prize repairability: if you have a fleet of hundreds of the same model of machine, it's easy to maintain spares of the components most prone to failure/damage.
Not always; sometimes you cannot see the proof because a company just doesn't offer any other option. If there was a modern laptop that ran macOS that was user-upgradeable, I would absolutely get it in my next upgrade cycle. Alas, there isn't one.
Also companies aren't always superrational logic machines that have coldly calculated their every move; there's sometimes a lot of collective delusion going on that can leave their consumers in the cold, who then just make do with the best out of the bad lot that's offered. Recall the recent iPhones - suddenly every other phone had a notch even when it served no purpose; or the removal of audio jacks, for example. There was NO consumer preference expressed there, just one company that decided it that way for its own purposes, and others blindly copying it.
I'm not by any means knowledgeable on the hardware design front. But I think the notch was a hardware design solution to a problem (fitting the front camera and sensors while maximizing screen area), and all the others copied it because it was a clever solution to an existing problem and it didn't appear to affect users much.
Consumers "prefer" what the billions of pounds spent on marketing tells them to prefer.
If companies spent money telling consumers to value upgradability and not to buy new stuff all the time, then we'd value that more .. but that doesn't sell more stuff, it just helps save the planet, so why bother ....
Consumers don't know that they have the option to fix their machines. They are trained to toss devices (not just computers, but also cellphones, TVs, and appliances) instead of taking them to a repair shop.
In defense of upgrade culture, you ever wonder where those old phones you trade in go? Phone companies have been turning them into profit by shipping them to the developing world. We live in an era where even the most remote and impoverished places on Earth have, at minimum, a cell phone in their villages. And it's that crappy circa-2000 Nokia you had that plays Snake. Now they can call emergency services on demand or keep in touch with long-distance loved ones.
Are you serious? No one is using a 2000 Nokia, even in the third world. Do you think it's cheaper to collect phones in the first world, wipe them, test them, and ship them to the third world (assuming they could even connect to any cellular network) than to mass manufacture $15 plastic Androids?
Has Apple offered an upgradeable laptop alongside a non-upgradeable one?
It would be possible to draw that conclusion if they sold a new-style MacBook Pro alongside an older-style one with similar specs.
Yes. Between 2012 and 2016, Apple sold a version of the 2008-2012 unibody MacBook Pro that had an optical drive and upgradeable RAM, while simultaneously offering the Retina MacBook Pro, the first to solder the RAM to the motherboard. Arguably from a specifications standpoint the Retina MacBook Pro was the superior model due to its high-resolution display and its use of a fast proprietary flash drive (replaced with standard NVMe in later versions) instead of slower SATA flash drives. Eventually the unibody MacBook Pro would get long in the tooth due to lack of updates compared to Apple's annually-updated Retina MacBook Pro models, but it was still sold until it was quietly discontinued in 2016 upon the announcement of the controversial touchbar MacBook Pro.
Consumers are multi-modal, though. Many can't be bothered to dig in and debug, or want a sleek, highly integrated product. Others care less for those things and want an upgradeable, repairable product.
It's my hope that economic solutions will find the resources to accommodate both modalities.
No, because such options are almost completely gone from the market. And I can't honestly believe that there is "no market" for it. It's an anomaly because most PC manufacturers are just trying to imitate Apple.
It makes sense to think this when looking at modern consumer tech, but I haven't met people who actually want that sort of thing. It always seems like people are having to settle.
The 2015s don't have the "a single speck of dust gets in the keyboard and disables a key" problem. All later models do. I don't think upgradeability factors into it much.
Modern Intel (and probably most other) boards use hardware scramblers for other reasons (storing all zeros or all ones causes signal integrity issues / current spikes), and seed the scrambling routine with a different code at each boot.
So, unless I’m mistaken, cooling the RAM and moving it to another machine doesn’t work any more.
There are two types of scrambling here - one is the bitswapping and byteswapping that you use to make DDR3/4 routing possible. The other is the whitening function for high speed data that ensures you don't have long sequences of the same value without a transition. The latter is a simple pseudorandom scrambler with a fixed random seed generated at boot. It is not cryptographically secure. The former is a simple substitution and quite easy to reverse (and trivial if you have either a schematic or a board to reverse). Both are deterministic and extremely vulnerable to known plaintext attacks. This is not a security feature.
Source: I'm working on a DDR4 layout right now, and the memory controller scrambling and swapping functions are documented in publicly available Intel datasheets (for example, see https://www.intel.com/content/dam/www/public/us/en/documents... sections 2.1.6 and 2.1.8)
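To make the known-plaintext point concrete, here's a toy sketch in Python (my own illustration under simplified assumptions, not the documented Intel algorithm): model the whitening as an XOR against a PRNG keystream seeded once at "boot". Write a known pattern, read back the raw cells, and the keystream falls right out:

    import random

    def keystream(seed, n):
        # Deterministic, non-cryptographic PRNG, seeded once "at boot".
        rng = random.Random(seed)
        return bytes(rng.randrange(256) for _ in range(n))

    def scramble(data, seed):
        # XOR whitening: applying it twice descrambles.
        return bytes(d ^ k for d, k in zip(data, keystream(seed, len(data))))

    boot_seed = 0xC0FFEE                 # unknown to the attacker
    known = b"\x00" * 16                 # attacker writes a known pattern...
    raw = scramble(known, boot_seed)     # ...and reads the raw scrambled cells
    ks = bytes(r ^ p for r, p in zip(raw, known))    # recovered keystream

    secret = scramble(b"hunter2 password", boot_seed)
    print(bytes(s ^ k for s, k in zip(secret, ks)))  # b'hunter2 password'

(A real controller mixes the address into the keystream, so you'd recover it per location, but nothing about the scheme resists this attack.)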
Nobody is going to do this, because good components are a competitive advantage. I can’t see any good manufacturer wanting their good {trackpad, keyboard, case} either being put in a computer that undercuts them or being forced to dumb down their computer to fit the “lowest common denominator”.
To successfully define a laptop standard, it would have taken a consortium of companies. Likely companies which aren't necessarily in the business of selling integrated laptops themselves, but would benefit from the existence of a laptop standard.
It's likely companies like Microsoft (pre-Surface), peripheral manufacturers (eg, Logitech) and motherboard manufacturers (eg, Gigabyte) would have gladly got on board in that era.
It's likely too late to start this in 2019 (but I may be wrong). Certainly the late-1990s would have been the ideal time for this.
This doesn't work if the peripheral manufacturers are themselves big players (which they are): they can already afford to fight for a place in the oligarchy of big players, and it's not in their interest to open up space for direct competitors. Whenever you become big enough, you start to share a substantial interest with every other big company: not letting smaller producers step in.
As I understand it, only a handful of firms actually design their own laptops. Most of them buy from firms like Clevo or Quanta and maybe do final configuration (CPU/RAM/discs). So you'd really only need to convince them.
In a way, this is much like the situation with desktops-- Dell and HP were/are big enough to come up with their own custom mainboards and cases, but most smaller shops are going to go ATX.
I suspect part of the reason we didn't see much laptop standardization was that the second-tier manufacturers are weaker in the laptop sector than desktop, as well as being weaker as a whole than they were in 2000 when ATX was becoming a thing.
Outside of a few narrow gaming and workstation niches, there are few use cases where you can't find a suitable big-brand laptop, so the second-tier brands (and the manufacturers that supply them) are in a position of fighting for scraps, not one where they can start promoting the benefits of standardization.
This is likely worsened by the mindset that laptops are unupgradeable-- people bought ATX desktops figuring they'd buy new mainboards in 3 years, but generally assume the laptop is going to be stuck in place.
Because most of the money is in pandering to the lowest common denominator and scale, most manufacturers are starting to make their own hardware/software integrated combo. Apple did this, Microsoft is now doing it as well, Razer is going there, and all of them are (commercially) better for it. On the other hand, it's bad for 'us' (the more hacker-y users), as we have fewer options. It's why so many stick with sub-optimal solutions like Apple MacBooks (they are not ideal, but the other off-the-shelf options are so much worse) or custom stuff (modified ThinkPads and Dell laptops). The former isn't ideal, but it's at least standard and scalable, while the latter isn't. Not really, at least.
This would allow smaller players to step in and start grinding away some market share from the big players. It would also turn the laptop market from a high-margin market into a low-margin one. Standardization is just not in the interest of any big player, so it's probably never gonna happen. If you are a small player and want to go in that direction, you're probably gonna be bought out.

The only way I see would be to somehow get pervasive open standards and libre schematics implementing them, then cut out the big players and get several manufacturers to produce them. But that too is hardly gonna happen, because of geopolitical problems: most of these manufacturers are domiciled in China, and thus this move would cut too much income from Western (and Korean, Japanese) countries. So for that to happen we would have to relocate some manufacturing industry and somehow not put it in the hands of any of our local big players.

The problem here is not some problematic business decisions by companies; it's how we organized our economy. It would take radical changes in economic/industrial policy to make that happen: much stronger anti-trust laws, which would keep companies smaller and force cooperation; public- instead of private-regulated prices, so that you don't die to foreign companies' exports when you start doing that; etc. This would drive cooperation up in all of the economy, take power away from multinationals, reduce waste, and hinder "useless innovation". Long road ahead, but I think that's what we need, and that's what's gonna happen at some point anyway: the capitalist class may still be floating currently, but at some point the systemic crisis (financial instability, natural disasters, political instability, scarcity of energy and other basic resources) is gonna hit them too. What we have to make sure is that they don't get more out-of-sync with it than they currently are.
It’s interesting how competition used to be encouraged in the U.S. and now it’s pretty much the opposite. It’s all about consolidation, oligopolies and monopolies.
If standards lower margins and make entering easier, that’s what should be regulated for.
Adding more competition isn't an answer, because competition is about individual profit, and thus tends toward monopoly. Neo-liberalism, the global free market, etc. do encourage world-wide competition: that's already the current trend. What you probably mean is some kind of "healthy/fair competition": the fact that we need to add another adjective hints that this is about doing something to balance it. I argue this is about encouraging cooperation. A good balance between both leads to interdependence, which is exactly what we would like: a state where everyone has some possibility of moving around a bit, but where nobody is free to ignore what the people they interact with have to say.
A ref I really want to push: "Mutual Aid: A Factor of Evolution" (1902, Kropotkin). There is a whole part mostly about the evolutionary analysis of altruism (the rest is about analyzing several human social orders throughout history: pre-medieval villages, medieval cities and 19th-century industrial cities).
There was a trend for a while of making business/power laptops much more configurable (I have an old Dell with a hard drive cage that swaps out without removing any screws). But most laptops are more about form rather than function; their design requires reworking all the internals to prevent getting a big clunky heavy box that overheats.
For very low-power machines you might have tons of internal space free, but more powerful laptops need complex heat management in addition to reducing size and weight. It's only now that we have very advanced fabrication techniques and energy-saving designs that we no longer have to hyper-focus on heat exchange.
If size and heat and weight weren't a factor, you can bet that a standard would have arisen to manage interchanging parts. But soldered RAM is a good example of why that's just not necessary, and can be counter-productive for reducing cost and maximizing form factor.
My experience has been limited by the fact that components advance at the same rate, and to get everything to play nice(r) with each other, you have to upgrade everything. "A new motherboard, CPU, RAM and GPU" is almost buying an entirely new computer. You save a few hundred bucks by keeping the PSU (or maybe changing it too after 5 years) and the case, assuming the ports didn't change.
Being able to continue to use the display, keyboard, mouse/trackpad means you can choose higher-end components.
Even if you don't want to keep your widescreen DVI display from 2008, the interoperability means that when you drop it off at an e-waste center, it's more likely to be reused in its current state for a few more years, rather than immediately recycled (reduce, reuse, recycle!)
I do agree there is some degree of change in interfaces over time (like IDE to SATA to NVMe M.2), but if you build a system for a similar intended use case, the changes within any given 5-year period are small. This means the upgrades you do over a 15-year period will go from a 2.5" IDE drive to a 2.5" SATA drive, or from an 800x600 to a 1024x768 resolution display, but not both at the same time (with a different but significant set of components being shared every upgrade).
Standardization limits innovation. If we had standardized on laptop form factors in the late 1990s all laptops would still be an inch and a half thick, and all screens would still be 4:3.
The standard ATX form factor has been revised to reduce size over the years, with the vast majority of accepted iterations maintaining the same mounting hole and IO panel locations. I literally have a mini-ITX board sitting in a case I purchased in 1999. This probably fits more with the fear you state in your comment, with a reasonably new technology "forced" to consume more space than is necessary, but I think it argues for the opposite by showing that incremental changes to a standard format can allow for wide-ranging compatibility.
For example, when ATX was altered by squaring the board to its shortest length (microATX), it didn't require a new case or a new power supply to be placed on the market in order to be used, because it fit within the "too big" ATX case. Then, when cases that only fit microATX became abundant and another incremental change shrank the motherboard size to DTX, we again didn't have to release new cases or power supplies or IO cards to start using this version. It allowed consumers to purchase and use the boards until they decided they wanted to reduce their case size, amortizing the upgrade costs over months instead of requiring larger up-front payments.
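For reference, the published board sizes (my summary, to make the nesting concrete):

    ATX       305 x 244 mm (12.0" x 9.6")
    microATX  244 x 244 mm ( 9.6" x 9.6")
    DTX       203 x 244 mm ( 8.0" x 9.6")
    mini-ITX  170 x 170 mm ( 6.7" x 6.7")

Each smaller board keeps a subset of the larger board's mounting holes and the same IO panel location, which is why it drops straight into the older case.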
> For desktop PCs, the ATX standard means that the entirety of a high-end gaming PC upgrade often consists of just a new motherboard, CPU, RAM and GPU.
And that's great, if you're into generic beige boxes.
It's been years since I put together my own IBM compatible computers. But in the time since then, I haven't really seen any innovation in desktops.
Yes, for a while the processor numbers ticked up, but then plateaued. Graphics cards push the limits, but that has zero to do with the ATX standard, and more to do with using GPUs for non-graphics computation.
The laptop and mobile sectors seem to be what is driving SSD adoption, high DPI displays, power-conscious design, advanced cooling, smaller components, improved imaging input, reliable fingerprint reading, face recognition for security, smaller interchangeable ports, the move from spinning media to solid state or streaming, and probably other things that I can't remember off the top of my head.
Even if you think Apple's touchbar was a disaster, it's the kind of risk that wouldn't be taken in the Wintel desktop industry.
All we've gotten from the desktop side in the last 20 years is more elaborate giant plastic enclosures, LED lights inside the computer, and...? I'm not sure. Even liquid cooling was in laptops in the early part of this century.
Again, I haven't built a desktop in a long time, so if I'm off base I'd like to hear a list of desktop innovations enabled by the ATX standard. But my observation is that ATX is a pickup truck, and laptops are a Tesla.
Nearly all of the tech you have in your laptop was developed, tested, and refined on desktops. PCI-based SSDs were in desktops before NVMe was a thing. Vapor-cooled processors were on budget gaming PCs 10 years ago. Even the MacBook trackpad was based on a desktop keyboard produced by a company called FingerWorks. High-DPI monitors came first to desktop. High refresh rates came first to desktop. Fingerprint reader? Had one on my secure computer 15 years ago. Face unlock a couple years after that.
Desktop is still the primary place for innovation. Laptops use technology that was introduced and pioneered on desktop, then refined until it could fit in Mobile/Laptop. Don't get me wrong, there's probably more work in getting the tech into Mobile than developing it in the first place... But the genesis of the ideas happen on desktop.
Desktop has the opposite mix of freedom and constraints as mobile. Standard internals, but freedom of space. There are dozens of heat-sink manufacturers for PC... Dozens of small teams focused on one problem. There's some variation between chipsets, but nothing that requires major design changes. These teams can afford to innovate... And customers can afford to try new solutions. If the heat-sink doesn't perform, you're out 5% of the total cost. But there's no similar way to try things out for laptops.
For example... Should a laptop combine all of its thermal dissipation into one single connected system or have isolated heat management? It completely depends on usage and thermal sensitivities of the components... It was desktop water-cooling that gave engineers the ability to test cooling GPU and CPU with the same thermal system to determine where to draw the line.
>And that's great, if you're into generic beige boxes.
>All we've gotten from the desktop side in the last 20 years is more elaborate giant plastic enclosures, LED lights inside the computer, and...?
Have you ever built an ATX computer? I assure you, there are plenty of different standard form factor cases out there. The beige box thing was in vogue in the 90s, but today the big trends are sleek black with tempered glass.
And standard form factor desktop does not equal giant tower. You could also do a mini ITX build, a standard that's been around since 2001 for what it's worth.
High DPI displays? This implies high end displays weren't available to desktops first (they were.) A decent CRT could produce much higher DPI than LCDs could (in that era.) Part of the reason why Windows DPI independence sucks is because Microsoft implemented it super early in, without all of the insights Apple had to do it right, and now there's like 4 different DPI mechanisms in Windows.
All in all I'm not sure what really needs "innovating" so badly with desktop form factor. Do we need to solder our RAM to the main board, is that "innovation?"
You kind of say it yourself:
>that has zero to do with the ATX standard,
So would be the case for any form factor standard. It only dictates how things interoperate.
>All we've gotten from the desktop side in the last 20 years is more elaborate giant plastic enclosures, LED lights inside the computer, and...?
Improved efficiency and the demise of bulky storage devices has created a proliferation of small-form-factor designs. We have two proper standards in widespread use (mini-ITX and mini-STX) and an array of proprietary designs from Intel, Zotac and others. It's now possible to get a fast gaming machine the size of a Mac Mini, or a monstrously powerful workstation that'll fit in a shoulder bag.
Strawman. Nobody claimed the ATX standard enabled innovation. It enabled the reduction of e-waste as OP indicated. Think of this the next time you trash your innovative phone or laptop because the non-user replaceable battery/ram/ssd/display/whatever failed.
Novel form factors are often how laptop manufacturers distinguish themselves from their competitors. There is enough space within a desktop PC case to formalize a standard. As laptops get thinner and thinner, however, many engineering/layout tweaks are needed to fit everything within a thin chassis. Standardizing across different device models would be asking OEMs to stop putting effort into competing with each other. And I say this as someone who has just had a catastrophic motherboard failure on their 8-month-old laptop and had to do a replacement that would've cost me a new laptop if outside warranty.
Because size and weight are important distinguishing features for laptops. Customers pay more for smaller, lighter laptops. Using standardized components and chassis would mean a big competitive disadvantage.
Maybe we could try to write an open letter to companies and promise our support, even if it means less value at first. Chances are slim, but at least we would have done our part.