Framework Laptop 16 Deep Dive: 180W Power Adapter (frame.work)
183 points by pimterry on June 8, 2023 | 103 comments



Framework's modular design has been advertised as a solution to upgradability, which is huge, but I see the real strength in being able to have a GPU plugged in when I'm using my laptop at home, and swapping it out for an extra battery when I'm using my laptop away from an outlet.

Being able to transform the capabilities of the laptop based on your situation is a huge game changer.


I find that hilarious because we did that back in the day on laptops that let you put batteries in instead of a cd or floppy drive.

Then Apple happened, and every laptop maker only saw the cash, abandoned what users needed and wanted, and copied what Apple was peddling, including making parts non-replaceable.

Now, after wasting all these resources on form over function, even Apple has started to go back to function (at least a tiny bit).


>Then Apple happened

As much as I dislike Apple's anti-repair and anti-upgrade practices, they're hardly to blame for this. They only gave consumers what they wanted, and consumers collectively voted with their wallets for sexier, slimmer, and lighter notebooks at the expense of repairability and upgradability, regardless of who made them: Apple, Lenovo, Dell, Asus, etc.

Consumers are a lot more likely to prioritize "hey look, my laptop fits inside a Manila Envelope" rather than "hey look, I can unscrew and replace/repair all these components in my laptop if something breaks".

Now the entire consumer electronics industry has conditioned consumers so that whenever their laptop breaks they just take it to a "genius" bar, where some dude with no electronics repair knowledge (because that would be too expensive) takes a look at it and says "sorry mate, looks like it's fucked and you're gonna need a new motherboard for $800", when in fact it could be fixed for $50 with a 10-minute soldering job by a technician who actually knows electronics, but who is either retired or out of work because his job was offshored to China. We don't repair things anymore; we just throw them in the landfill, "because we're that wealthy and stuff made in China is that cheap".

Hopefully right-to-repair laws, stricter environmental rules, and trade tariffs will put an end to this consumerist throwaway madness that just jacks up corporate profits at the expense of everything else, even if that means MacBook Airs will have to be 1.5mm thicker.


"They only gave consumers what they wanted"

This is a complete myth and not how consumer economics works.


Sub-4lb laptops were a game changer. It was worth paying the premium for them, and many consumers did.


Then please enlighten us how it works.


> Then Apple happened and ever laptop maker only saw the cash, abondend what users needed

99% of users don't need a powerful discrete GPU or an extended battery, as evidenced by the Macbook Air (the base, sub-$1k model) being Apple's bestselling computer by far.


> 99% of users don't need a powerful discrete GPU or an extended battery, as evidenced by the Macbook Air (the base, sub-$1k model) being Apple's bestselling computer by far.

To be fair to Apple, the m1 air has an absurdly good battery life.


The Intel MacBook Air was Apple's best-selling computer even before their switch to ARM.


In fairness as well, Apple's laptop battery life is top notch. So paradoxically it could be said that proves people do need more battery


If we're really being fair, the vast majority of people buy cheap laptops with poor battery life and even poorer everything else.


Let's be honest... That's a new thing. Laptops have sucked for a long time prior.


The PowerBook G3 had expansion ports, too. Maybe it was the 1-inch-thick titanium PowerBook G4 that changed the game.


This is a really strange take on how product development works.

If Apple came out with something that users did not want, and other manufacturers were still shipping what users wanted, why would anyone have ever bought Apple products?

And even in the Windows PC market, if users really want modular batteries, wouldn't you have been the last manufacturer to move away from them? If you're Acer, and Dell / HP / Lenovo are shipping products users don't want, wouldn't you clean up by continuing to ship what users want?

I think you should be more upset that users want the wrong thing (from your perspective). Product design, especially in competitive markets, is all about differentiating by better meeting user needs. This tinfoil idea that users are sheep who are tricked into buying what they don't want is not realistic.


There is no single user with a single taste. Some users sincerely like the razor-thin, flat-keyed MacBook Airs with their mirror-finish Retina screens. Others, like me, strongly prefer the bulky, perfectly user-serviceable ThinkPad T series laptops, with their superior keyboards and matte screens. Some users love their tiny 12" mini-laptops. There are customers who consciously choose Alienware gaming laptops, or Toughbooks.

What Apple makes is always partly a fashion accessory, though. Many wanted MacBooks because they look cool, even though the hard edges are manifestly uncomfortable while typing. Some laptop producers made devices with similar aesthetics, again because it was a fashion statement, not because cardboard-thin laptops are more comfortable to carry around or functionally superior.


I think you may be misunderstanding the motivations of people who have different priorities than you.


I am mostly trying to say that there are different priorities. Looking cool is a valid priority for some, and I'm not going to devalue it. Compactness and light weight are also very important for a lot of people. This is on top of some very impressive engineering that Apple put into their machines.


My first "real" laptop was a beast of a Dell workstation-class laptop in 2005. One of its key features for me was being able to swap out the CD drive for an extra battery.

Of course, by modern standards the thing weighed a ton. These days, my 15" MBP feels like a brick in my bike bag, and it's likely 30-50% lighter!

(Oh and fun fact: recently I wanted to test out Moore's Law, and see how the RasPi 4 compared to that Core Duo laptop (that I still have!) Well, the 18y-old laptop still handily beat out the RasPi4 in single-CPU performance! Moore's law can't quite make up for opposite market segments)


Yeah, this used to be pretty common.

Through the 90s laptops came with expansion cards and often swappable bays. Even late-90s Mac laptops had two bays that you could swap batteries, DVD/CD/floppy drives into and out of, plus a PC Card slot that could take things like TV video capture cards or Wifi adapters.

I don't remember if PCMCIA slots had enough bandwidth to make external video cards practical (plus... it would require a separate monitor, I guess), though.

They all went away in the quest for lightweight size and capacity... especially as more things got built in.


> I don't remember if PCMCIA slots had enough bandwidth to make external video cards practical (plus... it would require a separate monitor, I guess), though.

PCMCIA surely not, but express card most likely.

I remember some mods installing an external GPU onto the mythical ThinkPad X220 by means of a bridge card that would fit in the ExpressCard slot and allow a full PCI Express card to be connected. See: https://artemis.sh/2021/08/04/eGPU-on-thinkpad-x220.html


> I don't remember if PCMCIA slots had enough bandwidth to make external video cards practical (plus... it would require a separate monitor, I guess), though.

Original PCMCIA was 16-bit ISA (not sure of the speed); CardBus was 32-bit, 33 MHz PCI with DMA and whatnot. Certainly there were many video cards on 16-bit ISA, but I wouldn't think you'd want to stuff one into a PCMCIA slot, maybe a Hercules card so you could do monochrome/color multi-monitor; but multi-monitor didn't really come into popularity until Windows 98 (and 2000 on the NT side). Advanced graphics cards back then were AGP, though some were released as 32-bit PCI as well; they almost certainly couldn't have been compacted to fit in the slot, but you could probably have a CardBus-to-PCI slot adapter and some ugly contraption. A quick search doesn't find any, but I'd expect something to exist as a development tool.
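
For a rough sense of scale, here's a back-of-the-envelope comparison of the theoretical peak bandwidths involved (my own numbers, ignoring wait states and protocol overhead):

    # rough theoretical peak bandwidth, no overhead accounted for
    buses = {
        "16-bit PC Card (ISA, ~8.33 MHz)": 2 * 8.33e6,    # ~17 MB/s
        "CardBus (32-bit PCI @ 33 MHz)":   4 * 33e6,      # ~133 MB/s
        "AGP 4x (32-bit @ 66 MHz, 4x)":    4 * 66e6 * 4,  # ~1.06 GB/s
    }
    for name, rate in buses.items():
        print(f"{name}: {rate / 1e6:.0f} MB/s")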


PCMCIA

I remember those slightly larger than a credit card (and thicker).

I had two: an IBM one which had Ethernet and a modem on it (and a really fun sparkly label), and a SCSI one for a Zip drive (they had parallel-port Zip drives... but).


I had one I got at the Spy Museum in DC that was a little compartment! In high school I used to keep a $20 bill in there in case I wanted to do something with my friends that cost money


I have a 2008 Dell workstation laptop (M4400) that had the swappable bays feature that I bought used around 2012 and used as my primary machine for several years (which as an aside, worked so, so much better than buying a brand new machine for the same price would've).

In my case I swapped the optical bay for a SATA bay which enabled quick swapping of SATA drives. For my usage at that point in time that ability was very handy… perfectly possible with an external USB gadget of course, but it was nice to have that without the cables and gadget. Most of the time that bay held a large HDD for "cold" storage, allowing the primary SATA slot to be filled with a fast 256GB SSD.


"(Oh and fun fact: recently I wanted to test out Moore's Law, and see how the RasPi 4 compared to that Core Duo laptop (that I still have!) Well, the 18y-old laptop still handily beat out the RasPi4 in single-CPU performance! Moore's law can't quite make up for opposite market segments)"

shouldn't power consumption also be a factor in comparing performance?

I would think an ARM would use less wattage than a Core 2 Duo.


Certainly, in a MIPS/W or MIPS/$, the RasPi beats the pants off the old laptop (Edit: which I now remember was a Dell Precision M65, which specs say weighed 2.81kg (6.2lb) (likely without extra battery), whereas my current MBP M1 Pro is 1.6 kg (3.5 lbs) so more than half as heavy!)

But I had the mistaken assumption that 15+ years of Moore's Law would certainly leave the old laptop in the dust, in the same way that "modern smartphone more powerful than an old supercomputer". It was interesting to be corrected!


How hard is that to do? I know Frameworks are repairable, but you can really do a swap like that completely on the fly?

Honestly, this comment is a somewhat decent sales hook for me; I've been thinking about getting a Framework for my next laptop whenever the current one I use really starts to fail (depending on how their AMD stuff works out), and this comment makes me more curious to look into them.


Swapping an Expansion Bay Module like a discrete GPU will take ~2-3 minutes:

1. Shut down

2. Slide out the input modules covering the Expansion Bay connector, and open the cover door.

3. Unscrew three or four captive fasteners to remove the Fan Cable or GPU Cable.

4. Unscrew two captive fasteners to release the Expansion Bay Module.

5. Slide out the Expansion Bay Module.

6. Reverse the steps with the new module.

It's not something we specifically designed to happen on the fly like swapping Expansion Cards or Input Modules, but it is quick enough that you don't have to plan ahead for it.


Thanks for the reply/clarification.

This kind of direct response from an owner is also somewhat of a selling point that makes me more interested in Framework laptops. :)

> on the fly like swapping Expansion Cards or Input Modules

I knew that Framework was doing input modules, but I didn't realize that stuff like input modules would be hot-swappable. That's pretty cool to see in action.

I'm sure it's not something there are any current plans to do, and I'm sure it's not possible to talk about even if there were plans, but just as a side note: seeing the keyboard get replaced on the fly like that... if there were a way to get a digitizer/screen with good stylus support thin enough to be used as a keyboard replacement, I could easily see that module turning the 16-inch Framework into not just a laptop but also a replacement for my travel drawing tablet (especially knowing that I can stick a GPU on the back).


I don't know about the GPU, but in general, you unscrew five screws, lift up the magnetic lid/keyboard, and you have access to all the internals. I imagine that swapping the GPU is as easy as undoing one more screw.

Removing the battery is a cable and a few screws, for example. Everything is very accessible.


From what I've seen the GPU add-on is not an internal component (within the laptop's case) but similar to swappable batteries on oldschool laptops, slotted into a port on the back. It makes the laptop larger just like extra large batteries on Thinkpads.


Hopefully they use some crazy hard screws so the heads don't strip with repeated use. (I don't know much about Framework... yet)


No, they're built to unscrew easily, and they're captive, so you can't even lose them.


I don't think anybody outside Framework knows for sure yet (maybe not even internally) but my impression from what they've shared so far is that the expansion bays are intended to be very easily swappable (external, a click & push like the expansion slots) but almost certainly not hot-swappable while the machine is powered on.

They do have some initial specs at https://github.com/FrameworkComputer/ExpansionBay that somebody more knowledgeable than me might be able to glean some practical info from.


Hot swapping would be amazing, but I wasn't really expecting that to be supported.

The images/videos I'm seeing linked in your reply and sibling replies are basically about as good as I would hope for, that it would be the equivalent of swapping out a battery, that it wouldn't require a screwdriver, and that the components wouldn't be so fragile that I couldn't carry them in a backpack or computer case. Definitely something I'll be paying more attention to.

If somehow it is hot-swappable then even better of course, but I'm assuming that wouldn't be the case.


It looks really easy in their product preview: https://www.youtube.com/watch?v=km3MVZ8HZeY


I think we'll have to wait and see whether these are hot-pluggable.


Interesting that nobody had brought the desktop-PC concept to market as a laptop before.

Most likely due to the perceived niche, the lack of a standard format for laptop motherboards, and laptop motherboards not being easy to buy off the shelf.

The technology was also still evolving, so manufacturers maybe didn't want to restrict themselves to a certain layout and a certain way components could be laid out and connected.


I would love to see an external monitor with embedded GPU/extra CPU etc that would serve as an extension to a laptop, tablet, phone or stick. If such devices were popular enough in remote-first offices, coworkings and hotels, it would be a game changer making computing more inclusive. Imagine having a $100 laptop with easily replaceable components that is transformed into $1500 workstation on demand.


External Thunderbolt GPU enclosures exist and can behave as a dock. A friend has one at his place, I can just plop my own cheap laptop in and it gets a massively upgraded GPU, access to a few monitors, USB PD, a USB hub with all the peripherals plugged in, and wired networking.

It's not all built into the monitor, but IMO having the several-hundred-dollar GPU outside of the monitor gives really good flexibility. The dock isn't much larger than the GPU plus a few-hundred-watt power supply, which you'd need anyway.

The only thing really missing from that equation is the extra CPU power and it being integrated into the monitor. But having a powerful CPU that scales down to be power friendly when on the go seems easier than finding a powerful GPU that doesn't suck the battery dry just rendering a desktop and a browser window.


Isn't that kind of what "DisplayLink" USB monitors are (without the powerful GPU and large screen real estate)?

I've used a USB-to-monitor adapter from time to time (with my own screen). The performance wasn't great, but it's not bad and has been more than usable for work and such.

With USB 3 it should be better than the one I used.

https://www.synaptics.com/products/displaylink-graphics/disp...


Well, the connector doesn't really matter; the key is the hardware at the extension: CPU + GPU + screen + network + camera + HDD, with some software that you only need in workstation mode, etc. The software could be sandboxed by the OS on your device; the security of the extra CPU and GPU is an interesting, but probably solvable, problem.


That's pretty much an all-in-one PC. E.g. an iMac, or a Surface Studio.


Yes, if all components are present, it looks like an all-in-one PC. But again, I'm talking about an extension, where the OS and your data belong to your portable device and run in a trusted environment, using the extension's resources when necessary.


I had the same idea before; it's basically possible now if you pass the SSD through to a host computer.

I even thought about having an external SSD module so you could swap the SSD over to a beast machine when you need it.


This solution would work even with a good old HDD, but you need a whole host computer, and you're swapping rather than extending your hardware.


Standing on the shoulders of giants is a pointless endeavor if the giants simply aren't there or if their abilities are too narrowly focused on the very hard problems.

Very fast (on a contemporary scale) mobile extension interfaces have come and gone before, but only PCIe in its USB-C and m.2 incarnations not only excels at high performance but also scales down nicely to things as pedestrian as plugging in a mouse or a non-exotic storage extension. PCMCIA for example was dead weight unless you happened to be one of the very few who had one of those super exotic cards. An unused lightning 4 slot? There can never be enough of those, the more the merrier. You might find yourself using it for things as simple as topping up your bike light battery when it's not running an external GPU.


(eh, sorry for typing lightning when I meant thunderbolt)


I strongly believe the tech just wasn’t there 10 years ago. It shouldn’t have taken as long as it did to make a modular laptop, but making some ten billion smartphones (and the constant drive to make them ever thinner and smaller) made it so we can pack far more power into a smaller space.

Modularity and repairability is always going to lead to less space efficient designs. I will gladly concur that Apple and others have taken severe advantage of this in an excuse to push planned obsolescence, long after anyone cared about shaving another millimeter from a device. But there was absolutely a time where it just wasn’t feasible to make a laptop do what people wanted, at the desired price point, and also make it modular. Computer architecture’s all about tradeoffs and there just wasn’t a viable product on that part of the “efficient frontier.” Similar argument explains why gaming laptops were also a massive disappointment for so many years. There are conflicting requirements.

People also didn’t really care as much about right to repair, when hardware improvements made you want to upgrade every few years anyway, and we didn’t have such rapacious anti-consumer monopolies.


There was a short period a few years back when external GPUs that could connect to your laptop were kind of big. They never really took off though, and that was the only big component I saw that happen with.


Any idea why they didn't take off?


They often just don't justify the cost. The enclosures themselves are expensive, and there's a really rough performance hit: less than half the framerate of the same card connected via PCIe in some cases [0]. A decent gaming laptop probably costs about the same while having better performance [1].

They're not particularly portable either, so you'd be better off getting the gaming laptop, or building a desktop with a cheaper GPU for roughly the same price.

The size/power delivery of the enclosure can be a limiting factor as well, new GPUs have tended to be bigger and more power hungry, resulting in many enclosures just not being compatible with newer cards.

[0] https://www.youtube.com/watch?v=NlYHPj-0DTE

[1] https://egpu.io/forums/mac-setup/pcie-slot-dgpu-vs-thunderbo...
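
The underlying bandwidth gap explains a lot of that hit. Very roughly (nominal link rates, not measured throughput, and the ~22 Gbit/s usable PCIe tunnel over Thunderbolt 3 is just a commonly quoted figure, not something from the links above):

    # nominal per-direction link rates a GPU can see, in Gbit/s
    links = {
        "Thunderbolt 3/4 total link":        40,
        "TB3 PCIe tunnel (commonly quoted)": 22,                  # assumption: widely cited figure
        "Desktop PCIe 3.0 x16":              16 * 8 * 128 / 130,  # ~126 after encoding
    }
    for name, gbps in links.items():
        print(f"{name}: ~{gbps:.0f} Gbit/s")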


Guess on my part, but likely cost.

The cheapest eGPU case I could find is 300 euros, and that's just the case with an internal PSU. You still need to buy an incredibly expensive GPU to put into it.

So now you're spending probably 1k just for the laptop, plus 300 for the case, and an additional, let's say, 350-400 on the GPU. For that amount of money, most would probably look for a good laptop that has an integrated one, buy a handheld or a console, or just decide not to spend that amount of money on it.


"Interesting that nobody came with a concept of a desktop PC, but as a laptop (commercially) before."

Oh, they did, it was just a niche market you needed to go looking for, because they were and still generally are expensive. I remember my father once brought home a laptop from work to evaluate whether or not they wanted to use it to do data collection in car certification tests. It was a hardened laptop, so the entire thing was made out of sheet metal. I think you could literally run it over with a car and it would be fine. The most noticeable feature was that it featured 4 ISA expansion ports and the volume necessary to contain the cards. I think it was a 486/100 or so. They called these "luggables", but, yes, technically it fit on my lap. I wouldn't guarantee it would fit on everyone's lap, though; I'm quite tall. It was ~$8,000 in ~1995 dollars, and I'm pretty sure I had a Pentium 233MHz by then, and I was never cutting edge on that stuff, was always scrounging around. So, very expensive.

This sort of thing has been available for a while; you just had to go looking for it. A lot of advancements combine to make something like Framework possible, such as the fact that any such testing platform today would have all of its data-gathering gear connected over USB at worst, quite possibly Bluetooth or Wi-Fi, with software driving the rest, leaving a GPU as just about the only thing you might still want to wire to a laptop at such speeds, so that market has toned down.

A couple of years ago it lived on in the gamer laptop space where, at the very top end, the laptops are basically just desktops jammed into a luggable laptop shell with desktop GPUs in them. You paid through the nose for these things, though you did recover a bit of the price on the grounds that you were using off-the-shelf desktop parts and not paying the mobile premium. I'm qualifying that with "a couple of years ago" because I haven't looked since then, and the external GPU enclosures are getting steadily more practical and well-supported and I expect that will probably eat that market. Hauling around two bits of kit is nominally worse than one, but an external GPU enclosure can have such massively better cooling than trying to jam it in a case with everything else that it will probably be able to perform even better and at this level performance is everything.


The inclusion of GaN technology in Framework's new 180W adapter is a game-changer. GaN's superior power efficiency and ability to sustain higher voltages translate to smaller, more efficient chargers. Great to see tech reducing waste and improving performance!


We don't know how efficient this is. A single number given in the article doesn't mean anything; an SMPS comes with an efficiency curve over the whole range of provided power. It can be 70% efficient at 20W or 85% at 180W, with the 93% reached only at 70W. Complete information was not provided by the article.

And USB-C supplies can also switch voltages, so the efficiency will also depend on the voltage (both input and output, as this can probably be used on both 110V and 230V grids)...

It's just marketing... "up to 93%" :)
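
To put made-up numbers like those in perspective, here's what an efficiency curve translates to in heat the brick has to shed (illustrative values only, not measurements of this adapter):

    # hypothetical SMPS efficiency curve: (output watts, efficiency)
    curve = [(20, 0.70), (70, 0.93), (180, 0.85)]
    for p_out, eff in curve:
        p_in = p_out / eff
        print(f"{p_out:>4} W out -> {p_in:6.1f} W in, {p_in - p_out:4.1f} W of heat")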


Exactly, and the components mentioned are not impressive at all. Guess what: almost every name-brand laptop power supply has an Onsemi chip; the only difference is that big manufacturers like Delta or Flextronics use customized Onsemi chips made for them.

What makes it even more interesting is the mention of Weltrend in marketing material: almost 50% of the projects I used to work on included a cost-down step to switch from TI chips to Weltrend or similar. Whoever wrote this material doesn't understand what that really implies.


I happen to have a 180W adapter that came with my 2020 laptop.

Crude measurements with a ruler indicate that Framework's unit is approximately 23% smaller by volume.

I don't know how far this field has moved forward over the past three years, but I see this as a win; mine is especially annoying to handle due to its length of ~150mm.

Would love to know something about the weight, because that thing is a brick, which is why I don't normally take it along with me on trips and opt for a 60W USB-C charger instead.


I've gotten sucked into videos about building low-power, low-noise rack-mount servers for home offices lately, and that's come up a couple of times. Yeah, you have a PSU that's 90% efficient under load and stays above 80% over a reasonable range of power levels, but some of them are pretty bad when you get down to 40 watts.


Slimq makes a wide range of GaN chargers all the way up to 330W. There are reviews of their 240W charger on YouTube that provide fair critiques of its design, and Slimq also responds thoughtfully. I'm looking to get a 330W charger for my Legion 5i Pro 2022, which came with a huge 1kg power brick (300W).


I can't read and not think of WE GAAN


Generative Adversarial Networks is my first thought.


Wait, this isn't a deep dive at all, this is just an ad.

(Is this a deep dive for the laptop itself or a deep dive for the adapter? I can't tell.)


We're doing a series of write-ups of each of the major subsystems of the Framework Laptop 16. This is a post we wrote about the power adapter, explaining the design decisions and sharing which key ICs and manufacturing partners we used to develop it. We wrote it specifically for our newsletter, which people subscribe to in order to get updates on our product developments. It being on HN means a newsletter subscriber probably submitted the URL, and enough people were interested in it to vote it up.


It would've been nice to at least see the PCB. Not much of a deep dive if all you do is drop a few names.


I guess a deep dive would have had to get into more gritty details about USB-PD? Or maybe show how they crammed all the requisite capacitors and heat dissipators into the thing.

It did answer my question, though, which was "how do they pull that much power over a USB-C cable?" A: USB PD 3.1 allows up to 48V * 5A to be supplied so long as the cable has the right e-marker on it.

Though honestly, I would be careful trying to get deeper than that. The USB standard has become a bit of a spaghetti mess. Unless you are building your own equipment or trying to daisy-chain some DisplayPort stuff through Thunderbolt 3 hubs, you're probably better off thinking of it as `capabilities(USB network) = min(capabilities(computer), capabilities(cable), capabilities(hub))` and then dropping $50 for a nice cable that you can brag to your friends about ("ALL MY POWER, PERIPHERALS, AND VIDEO OVER THIS ONE [insanely overengineered] WIRE WOOHOO").
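
That min() rule of thumb is easy to sketch (a toy model, not how PD actually enumerates its power data objects):

    # toy "weakest link" model for a USB-C power chain
    def usable_watts(*advertised_watts):
        """Each element advertises the max power (W) it can source or pass through."""
        return min(advertised_watts)

    # e.g. a 180 W charger, a 240 W-rated e-marked cable, and a 100 W hub in between
    print(usable_watts(180, 240, 100))  # -> 100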

Anyway, my two cents are that I'm glad they kept the new power brick mostly-the-same-dimensions as the old ones so that my 3D-printed gridbeam-mountable power brick holders[1] continue to work.

[1] https://www.thingiverse.com/thing:5400078


> I guess a deep dive would have had to get into more gritty details about USB-PD? Or maybe show how they crammed all the requisite capacitors and heat dissipators into the thing.

I personally went in with the expectation of seeing the PCB inside the adapter, but they don't even show that. :(


Yeah this is not anywhere close to a 'deep dive'. Surprised Framework would editorialize the title so much, not a good sign.


I am very happy with my Framework laptop, and love their vision of modular design and upgrades.


That's still 12W to get rid of from a small enclosed space. I wouldn't want to operate this at peak power all the time. And SMPS don't usually have peak efficiency at rated output power.

Do these things re-negotiate power contract, or will they just go into thermal shutdown when overheated?
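
That ~12 W falls straight out of the headline number (rough arithmetic, and it depends on whether the 180 W is measured at the wall or at the USB-C plug):

    P, eff = 180.0, 0.93
    print(P * (1 - eff))  # ~12.6 W of heat if 180 W is the input power
    print(P / eff - P)    # ~13.5 W of heat if 180 W is the delivered output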


Are there other laptops/chargers out there that support USB PD 3.1 EPR? Curious how those efficiencies compare.


The latest MacBook 140W charger is USB PD EPR; it's not 180W, but it's one of the first running EPR.

It appears the MacBook charger gets about ~93% efficiency at ~100W load, and 87% efficiency at ~27W load. No metrics on 140W charging that I can find, though.


https://www.theverge.com/2021/10/19/22734233/apple-140w-macb...

The 16-inch MBP in 2021 was apparently the first to come with a PD 3.1 Apple charger.


Related threads:

- https://news.ycombinator.com/item?id=35277660 "Framework announces AMD, new Intel gen, 16“ laptop and more"

- https://news.ycombinator.com/item?id=35286544 "Framework Laptop 16"


There's some guerrilla marketing going on with Framework; I see posts on Framework on the front page of HN every few days, and I think everyone's tolerating it because we all support Framework's goal. But this is effectively advertising at this point.


Some of us are just really excited about a laptop which is intentionally built for upgrades, repairs, and re-usability.

The current trend of the household name manufacturers is to solder and glue as many components together as possible, meaning if one thing inside goes ka-plooey outside the warranty then the whole unit is shot, forcing you to buy another. They say their goal is to get their products thinner and thinner (for some reason) but of course we all know the likely real reason is designed obsolescence.

Edit: I don't own a Framework laptop yet, but the 16 is near the top of the list of contenders when I upgrade within the next year or so.


> designed obsolescence

I am old enough to remember "reseating" memory to fix faults. Connectors fail, often disgustingly intermittently. Solder is used for reliability, performance, and price reduction: I think most people prefer those attributes over upgradability (although it's nice to get the choice with Framework!). Connectors now limit performance: there's a reason why the highest-performance memory does not use pluggable formats. HBM2. Remember when cache was a separate stick of SRAM memory?

The majority of PCs were never upgraded, even if they could be. I know, because I would do it for non-technical friends back in the day and eke a couple of extra years of life out of a PC; without my free labour (and usually trickle-down free parts), though, those friends would just have bought a new PC and been happy. Upgrades are mostly for geeks.

Meanwhile good hardware (phones, laptops) is often becoming obsolete due to lack of software upgrades, not lack of grunt. I just bought a new iPad because of this. A friend’s MacBook Air: working fine but needed new battery and no more MacOS upgrades. But also they wanted a new device with a better screen since they spend many hours a day on it, so less value in upgradeability?

Yeah, batteries. I haven’t seen a good cost-benefit analysis of sealed devices. Modern batteries now seem to outlive the device, which wasn’t the case 10 years ago with iPhone6. Certainly I appreciate sealed mobile phones that don’t die from water damage.


Yeah I think the 16's will be very popular. Do you know if the Ethernet adapter still sticks out past the edge of the unit?


That's how it's designed; I'd stick to a dongle if it's an issue. The RJ45 size is pretty legacy (see Lenovo T14 vs T14s). Hopefully they'll eventually be able to design one that doesn't stick out to the side but extends a bit downwards instead.


Legacy? I use them all over the place whenever I travel to client sites, and all over my house and office. More reliable than Wifi.


Maybe I am missing something, but the Framework-focused posts I have seen seem to mostly come from long-active accounts belonging to actual community members. I think this is simply a case of the product being interesting to the people on here.

Framework isn't even engaging in any tactics that may help to get more attention organically, like Cloudflare when they group a lot of releases into a very small window to dominate the frontpage.


Yep, we don't post these, and often times don't realize that they were posted until we check analytics later.


I am one of the people that posted a Framework announcement in the past. I do have a Framework laptop, but no relation to the company other than being a customer.

I posted the announcement because I follow Framework pretty closely, and figured other people might be interested in a company that works in a direction opposed to the mainstream manufacturers. It's Hacker News after all.


Or, and hold your horses here, because so many people support Framework here we see so many posts about it.


Advertisements? On hackernews? Perish the thought


So why didn't this support the 240 watt mode (ie. 48v 5 amps)?

I can't really envision any standard component choices which would support 36 volts and yet not support 48 volts...
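
For context, USB PD 3.1 EPR defines fixed 28 V, 36 V and 48 V levels on top of the old SPR range, all at up to 5 A on an EPR-rated cable, so a quick check of the rungs (my numbers, from the spec's fixed levels):

    # USB PD 3.1 EPR fixed voltage levels at the 5 A cable limit
    for volts in (28, 36, 48):
        print(f"{volts} V x 5 A = {volts * 5} W")
    # -> 140 W, 180 W, 240 W; this adapter stops one rung short of 48 V / 240 W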


I don't understand why they don't put two or three USB-C ports to the charger. It could serve as universal charger for other devices as well.


Obviously that would cost more. Anker sells that BTW.


Apple has a couple models with 2 ports now but none of them are particularly beefy. One is only 35 watts, which means it's basically saving you from routing a whole bunch of current into your Air and out the other side to charge your giant cellphone.


"while also outputting enough power to handle the Graphics Module with a discrete GPU"

Oh? This is news to me. Will it have an nVidia graphics card?


the 16" will have a slot for discrete cards (I don't think it will be a PCIe slot but could be wrong).


No need to wonder or guess, according to The Horse's Mouth[0], the expansion bays in the 16 will have a PCIe x8 interface. (Not to be confused with the expansion cards from the Framework 13.)

[0]: https://frame.work/blog/introducing-the-framework-laptop-16


Thanks, I remember seeing that, but wasn't sure of the form factor (e.g. I am not sure you'll be able to plug in any old GPU).


It's not a desktop PCIe slot on the back of a laptop, but the interface is already documented. The electrical, shell-fit PCB, and general form-factor specs of the 16" laptop's expansion bay are already published under a CC-BY license by Framework: https://github.com/FrameworkComputer/ExpansionBay

Direct link to the pinout PDF: https://github.com/FrameworkComputer/ExpansionBay/blob/main/...

It's a 4th-gen x8 interface, so there are limits to what "any old GPU" might be capable of performance-wise, but creating a passthrough-port accessory seems possible.
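
For a ballpark of what a Gen 4 x8 link can move (nominal per-direction figure after 128b/130b encoding, before any protocol overhead):

    # PCIe 4.0: 16 GT/s per lane, 128b/130b encoding
    lanes, gt_per_s = 8, 16e9
    bytes_per_s = lanes * gt_per_s * (128 / 130) / 8
    print(f"~{bytes_per_s / 1e9:.1f} GB/s per direction")  # ~15.8 GB/s; a desktop Gen 4 x16 slot is double that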


Thanks, I had forgotten they had published specs. Note that with Thunderbolt and an eGPU chassis, you can get that adapter interface today. I was considering this to repurpose an older machine's GPU for my 12th-gen 13", but decided to turn the older machine into a Steam box instead.


A Power Adapter like this would be great for a small 3d printer.


Holy moly; 180? Sufficient to power an entire house, my God.


You must have a very efficient house.


Actually I do. I have two solar panels, a 60W USB-C power brick for my laptop and phone, and a small light... it was a joke, OK? Why so defensive? :)


We need a sarcasm tag/symbol... that would be super useful. I honestly can't tell tone anymore.



I would not say we need a tag or symbol. First of all, it was not sarcasm at all, just a plain joke. And second, instead of a symbol, what we need is to read more books.


if "my God" part wasn't evident, and you need a mark for sarcasm/joke/take the piss... I guess that part is beyond help



