To put that into context, it will be 18 mm and 4.4 pounds, versus 15.5 mm and 4 pounds for the new 15" MBP. Apparently the new battery is 79 watt-hours (the old one was 64 watt-hours), versus the 76 watt-hours on the 2016 rMBP. With the power-guzzling 4K screen, it'll almost certainly get less battery life than the new MBP.
Which confirms the consternation over the 2016 rMBP. Its 76 watt-hour battery is a huge regression compared to the 99 watt-hour battery in the 2015 rMBP 15", but with few exceptions PC manufacturers were shipping 50-60 watt-hour batteries in similar configurations. And going with power-guzzling 4K displays instead of the much more efficient 3K panel in the MBP.
On a tangent, weight I can understand, as folks physically haul around a laptop, but I've never understood shaving a millimeter or two from the thickness of a laptop. I've heard genuine complaints about heavy laptops, or too-large form-factor (big screen or overly large bezel), but thinness seems to be just bragging rights. I've known people who bought a laptop to fit in a leather bag they already liked (rather than the other way around!), but I've never heard anyone complain that laptops are just too thick.
In addition to what others have said here and elsewhere, I think some of it is also internal competition gone too far among mechanical designers to make the thinnest edge or the most subtle bezel.
When I worked with product designers in the past, there was always a lot of oohing and ahhing whenever they'd come across a competing product with tighter tolerances than they'd seen before.
It's like the product design version of needing to use the latest pre-alpha NodeJS framework or transpiled front-end language; it doesn't necessarily have a direct effect on the end user experience, but it satiates its creators' own need for new and shiny.
I agree, but I just bought a 2015 15" MBP: 15.4-inch screen (2880 x 1800), ~2 kg, and the battery life feels just right to me (YMMV). It's ~2/3 the size of my previous 15" MBP from 2010, the battery life is ~25% better, and the screen is noticeably brighter and crisper (partly due to the old screen being matte and 6 years old).
I didn't have to go to an extremely low-weight device. I didn't have to go to a small battery, or to low- or high-resolution extremes. Sure, the GPU isn't great, but I decided not to play games on it from the get-go. I wouldn't have my gaming equipment with it anyway (mechanical keyboard with binds, Naga mouse, headset, etc).
There's a video[1] that has some history of the calculator in Japan that kind of mirrors this. There's a race to get the smallest and thinnest, but it ultimately ends with the calculator not being particularly good to use.
Laptops sitting on desks are nicer to type on when they're thinner, because the height difference between your arm on the desk and your palm on the laptop is minimized. Thinner also usually correlates with lower weight.
This can be at least partially mitigated with a trapezoidal laptop shape, thicker towards the hinge. Though I will admit that a trapezoidal profile is much harder to fill with components than a rectangular one.
Well, I don't type on laptops as a rule anyway. My dev machine may be a laptop, but it sits on a stand plugged into an external display and external keyboard.
In any case, if you do the ergonomically correct thing and keep your palms off the keyboard, it's still probably easier to type on a slimmer laptop.
I used to not care, until I stopped using workstations and desktop replacements. 2.5 mm doesn't feel like much until you switch from one to the other: the machine fits in your hands that much better when you're moving around with it. That was one of my favorite things about the Spectre.
Battery life and increased battery life are almost always a feature that is marketed by laptop manufacturers. On the Macbook Pro one of the key features being marketed is "up to 10 hours of battery life". I guess you may mean that making it thinner increases the "sleek look" and visual appeal?
While a lot of people clearly care about it, I'm just saying that the selling power of the visual and tactile aspects of these products is much greater than almost anything else.
I feel that the introduction of "HD" came with some issues for computers. Previously we saw a much wider variety of screen resolutions and thus had more choices. "HD" introduced fixed resolutions that non-technical users could relate to. However, those resolutions were optimized for movies and nothing else. Ever since, we've been dealing with resolutions that make sense for watching movies on big TVs. For TVs and movies 3K doesn't make much sense; nobody will try to sell you a 3K movie. There is huge marketing money spent on "4K", so everyday Joe doesn't want to settle for "3K" even if it would be better. It's sad.
It really is, thanks for bringing this up. I don't know if it's just me, but most of the work I do (reading/programming/PCB design) just seems to work best fullscreen on 4:3 or similar ratios. When LCDs weren't mainstream yet you could rather easily get monitors with nice and comfortable (IMO) 4:3 resolutions up to 1600x1200. Then came LCDs and the trend briefly continued (though initially at lower resolutions, IIRC). Then out of nowhere(1) came the HD trend, and it became a struggle to find non-wide monitors and/or monitors with at least some vertical resolution to work with (seriously, what was I supposed to do with 1024 pixels vertically?). Saddest thing is that when the average laptop/PC user isn't watching movies, he/she is probably browsing a website, which is the exact opposite shape: tall and narrow. And everywhere around the world scroll wheels were being scrolled more...
(1) I really don't know where this came from. Just pushed by the movie industry? Seems far-fetched. Screen producers? They were fine with creating 4:3 LCDs for a while, so it can't really be that either. Or maybe one of the reasons is that laptops became more and more popular, and due to the keyboard it makes no sense having a non-wide screen on them?
Yeah, one thing Apple really did well was branding it "Retina". It let them avoid a lot of the HD vs 4K vs Full HD marketing buzzwords and sell people something that makes more sense for the device.
It actually makes even more sense for customers than the xHD specs. Retina mostly means double the resolution (per dimension), so it's relative, and across all Mac devices it stands for good image quality. Compared to that, HD/FHD/UHD/... are all absolute. And even though FHD is great on a smartphone or 8" tablet, you wouldn't consider it brilliant image quality on a 15" notebook anymore in 2017.
Yeah but MBP owners are convinced they have the best screen on the market now because "It's a retina screen" when PC laptops are shipping with 4K and the "retina screen" is 3K.
In this case, it sort of is the best. 4K laptop screens are significantly over 200ppi, which is kind of a waste if you're displaying it to an actual human being at a typical viewing distance. And in return, you get a massive decrease in battery life.
So Apple's screen is "good enough" in resolution, but uses significantly less battery. And as I understand it, has top-end color reproduction and gamut range.
retina itself is a buzzword and literally means "high pixel density"
No, 'Retina' means that the pixel density exceeds what the eye can distinguish at the expected viewing distance.
IMO, it's a much smarter way to market than 1080p, 4k, 5k, etc. since it has real meaning to the consumer instead of chasing what may be arbitrary specs.
Given current software and hardware constraints, the sweet spot for a laptop is about 200 ppi.
There’s no reason to put a 300 ppi display on a laptop, unless you’re expecting users with perfect vision to hold the laptop <12 inches from their faces. Which, frankly, nobody should ever do for any length of time.
If GPU, display cost, battery life, etc. (not to mention software support) were no object, I’d love to have arbitrarily pixel-dense displays that I could look at perfectly sharply under a magnifying glass. But in the world we live in, it’s a very poor choice of trade-off.
I certainly appreciate high DPI screens, and clearly a lot of people agree with me. There's certainly not "no reason" - it's the primary point of interface from the computer to the human, so of course there's value in optimizing it.
Sure, "optimize" it. But if you put a 300 ppi and 200 ppi screen next to each other and asked users "Do you want the higher resolution screen or do you want an extra 2 hours of battery life?" I don't think they'll choose the screen. I have 4K in a 24" desktop screen, and that pixel density is plenty high.
In the tradition of digital camera manufacturers and countless other tech companies, HP is chasing the consumers who know that "4K" is the next big thing and that bigger numbers are always better.
To be fair to HP, they're doing it because it works. Doesn't mean it's "optimized." Or that I'd take one over the 2016 MBP.
Lots of manufacturers have done that. The high resolution screens are frequently an optional item with known costs to battery life, and which directly cost money. If as you suggest they wouldn't sell, I seriously doubt we'd be seeing them so pervasively.
Also, pixel densities don't scale linearly. Having a 200 DPI desktop screen isn't equivalent to a 200 DPI laptop screen, else phones would still be at 480 x 270 or whatever.
PPI (DPI) is actually a bad unit for a display's "sharpness", because it makes it difficult to compare different types of devices meant to be viewed from different distances. The more universal unit is angular pixel density (e.g. pixels per degree of visual angle), which combines PPI with viewing distance, making it possible to compare e.g. a mobile screen with a video projector image.
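A minimal sketch of that conversion (Python; the example devices and viewing distances are just assumptions for illustration):

    import math

    def pixels_per_degree(ppi, viewing_distance_in):
        # Pixels packed into one degree of visual angle: the arc length
        # subtended by 1 degree at this distance, times the pixel density.
        return ppi * viewing_distance_in * math.tan(math.radians(1))

    # Assumed examples: a ~220 PPI laptop at 24", a ~460 PPI phone at 12"
    print(round(pixels_per_degree(220, 24)))  # ~92 px/deg
    print(round(pixels_per_degree(460, 12)))  # ~96 px/deg

Note how two very different PPI numbers land in the same ballpark once viewing distance is factored in.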
Oh I'm not suggesting they don't sell, just that they shouldn't.
It's like how people will buy an 18MP camera with worse optics instead of a great 12MP camera because it has a bigger number. And god forbid they ever need to make a 20x30 print, you wouldn't want to see any pixels in that.
To take the opposite extreme of your 480 x 270 cell phone example: You can get an Xperia Z5 with a 5.5" 4K screen, but that doesn't mean it makes any sense.
4k phones aren't taking over the same way 4k laptops are. The best selling flagships don't have 4k screens. The largest push for 4k phones seems to be with VR as a justification, and VR sorely needs pixels.
Your comparison with phones would be more convincing if 4K phones had ever really taken off, but as far as I'm aware that never happened.
The problem is that the choices aren't well balanced. You can either have 1080p, which is noticeably low-resolution, or you can take a 30-40% battery life hit for 4K. Almost nobody goes after the sensible middle ground of a 200 dpi display.
I expect the primary advantage of 4k for most people will be the ability to scale 2x and retain a decent resolution. You don't save much (cost, battery life) by going to an intermediary size, but lose flexibility. On top of that, there are definitely noticeable improvements to be had significantly above 200 dpi.
I wonder how many people opt for the higher density assuming it'll make a significant difference without really knowing what the visual threshold for being able to see a noticeable difference is.
Have you actually compared the same content on a 200 PPI screen vs. a 300 PPI screen? Even though you cannot see the individual pixels, text on a 300 PPI screen appears much sharper (when scaled to be the same physical size, of course).
The "visual threshold" doesn't mean "you will not see anything smaller than this so it cannot physically matter". You can still see a hairline even if it's less than 1/300 inch wide.
I had massive issues with the others (KDE5/Plasma, Unity, GNOME 3, etc). Cinnamon is actually the best in terms of usability. The other window managers were straight-up busted in terms of UI scaling, ability to sleep, disabling the touchpad while typing, screensaver, etc.
"6/6 vision is defined as the ability to resolve two points of light separated by a visual angle of one minute of arc, or about 320–286 pixels per inch for a display on a device held 25 to 30 cm from the eye."
Around 200 PPI seems ideal, with a good margin, for typical laptop screen viewing distance (50-60 cm). An empirical test with my 15" Retina Macbook confirms this: I have to put my face impractically close to the screen before sharp edges (e.g. in letters) start to look soft.
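A rough sketch of that threshold (Python; the 1-arcminute acuity figure comes from the quote above, the distances are assumptions, and the rounding differs slightly from the quoted 320-286 range):

    import math

    def threshold_ppi(distance_cm, acuity_arcmin=1.0):
        # Pixel pitch at which one pixel subtends `acuity_arcmin` of visual
        # angle at this distance; its reciprocal is the PPI beyond which a
        # 6/6 eye can no longer resolve individual pixels.
        distance_in = distance_cm / 2.54
        pitch_in = distance_in * math.tan(math.radians(acuity_arcmin / 60))
        return 1 / pitch_in

    for d_cm in (25, 30, 50, 60):
        print(d_cm, round(threshold_ppi(d_cm)))
    # 25 ~349, 30 ~291, 50 ~175, 60 ~145 -- so ~200 PPI at laptop
    # distance is already past the 1-arcminute limit, with margin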
A 1080P screen on a 15" laptop is high enough resolution that the smallest readable text is getting hard to read. Doubling that resolution in either direction may not be pointless, but if it also destroys my battery life then it's certainly not improving the device as a whole.
Edit: As pointed out below, that's a terrible way to describe it. :P I meant that text at the smallest legible pixel size is physically small enough that it'd be hard to read even at a larger resolution.
The smallest readable text is always hard to read, as otherwise you'd go smaller. I'm not sure what that measure proves. Going from 1080p to 2160p makes that same size text much easier to read, because it's much sharper.
You’re comparing a 150 ppi (2 megapixel) 1080p display to a 300 ppi (8 megapixel) 2160p display. Many of us agree that displays in the 100–150 ppi range are noticeably pixelated at typical laptop viewing distance.
But you should be comparing to a ~200–230 ppi (4–5 megapixel) display, which at laptop viewing distances (let’s say 24 inches) is just about at the limits of human perception. Dense enough to not be worth the hit to performance/cost/battery to bother going much denser. The Apple 15 inch “retina” display is 220 ppi, 2880 x 1800 pixels. (That’s the analog of 1620p but at a 16:10 rather than 16:9 aspect ratio.) The Apple 13 inch display is 227 ppi, 2560 x 1600 pixels, the 16:10 analog of 1440p.
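Those ppi figures fall straight out of resolution and diagonal (a quick Python check; the 15.4" and 13.3" diagonals are the published sizes, and the 15.6" 4K panel is an assumed comparison point):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixel density from pixel dimensions and the screen diagonal.
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(2880, 1800, 15.4)))  # ~220 -- 15" retina MBP
    print(round(ppi(2560, 1600, 13.3)))  # ~227 -- 13" retina MBP
    print(round(ppi(3840, 2160, 15.6)))  # ~282 -- a 15.6" 4K laptop panel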
I was just continuing with the comparison in taneq's post.
Personally, I noticed the (admittedly very minor) improvement going from ~245 ppi to ~260 ppi, so I doubt I'd have missed a much larger jump, but what's true for me isn't necessarily true for everyone else.
Up to about 300 ppi (maybe even 400 ppi), it’s still possible for someone with good vision to notice a difference at 24 inch viewing distance. I can tell the difference too. If I bring my face to 8 inches from the display, I can notice up past 600 ppi.
But being able to notice something isn’t the same as benefiting in a substantial way. 300 ppi is well past the point of diminishing returns for a laptop. If there were no competing trade-offs, having a 300 ppi display on desktops, a 400 ppi display on laptops, and a 600 ppi display on phones would be great. I just don’t think they’re worth it in return for a big hit on performance and battery life.
Apple’s choice to take early-2000s ~100–120 ppi resolutions and just double them all (using a 2x mode for legacy software support) got them to IMO just about the current sweet spot for PC displays.
The immediate next things to improve in displays are bit depth (12 bit/channel), dynamic range, frame rate (120 Hz? variable?), maybe color gamut (personally I wish we could get more than 3 primaries to widen the color gamut while avoiding spectrum spikiness, but there’s a big content chicken–egg problem), and power use (I want a laptop with 2 day battery life).
In a decade, more software will be properly designed to handle arbitrary-resolution UI, integrated GPUs will be beefier, display protocols will support higher resolutions and framerates, and it will maybe make sense to hop up to 300+ ppi PC displays.
To be clear, I'm not saying that everyone should use 300 ppi displays, or that everyone would benefit. But you need to remember this goes both ways. Not everybody cares about battery life. Some people want 5 hours, and everything above that is superfluous. Some people basically never even unplug their laptops. Not everybody cares about the performance difference; as long as it renders browsers and office apps, it's fine. Others want 4k because it's a pixel doubled 1080p, and they like gaming at 1080p. Some people never even use the built-in monitor. There is no single, homogeneous sweet spot.
I agree that there are other useful steps. OLED laptops are happening, though scaling slowly. 144hz, variable framerate laptops are happening, and are set to completely take over the gaming laptop segment. Touch screen happened. Power efficiency is improving, a recent step being IGZO. It's not like these other aspects are being ignored.
There's a lower limit determined by pixel size, and a lower limit determined by the angular size of the letters as seen by the user. For me, at least, the two are about the same on a 1080p 15" screen. A little more resolution (1440p maybe) could be useful but even that's past the point of diminishing returns.
I've had very few issues, especially since the latest update. For sure, there are some older applications that balk, but even those seem to behave.
Have you tried a multi-display setup where the other monitors are not 4K? It was a while ago that I tried it, but I seem to remember running into several showstoppers.
My issue is that it sometimes gets confused about whether it should be scaling the UI or not, so on moving from the high-DPI screen to the low-DPI screen an app develops ginormous controls. Generally fixed by dragging the window from one side to the other, but annoying.
I had that setup for a while, no real issues. Biggest offenders were Steam and various games (for some reason game devs seem to have never heard of UI scaling).
Office 2013 completely falls apart on separate-DPI monitors. It's small on one screen and gigantic on the other, and doesn't automatically adjust as it should. It's more than just a few applications. The downsides of 4K don't really seem worth the benefits (if there are any).
I don't have that problem with Office 2016 and Windows 10.
Some old corners of Windows (e.g. Device Manager) are blurry and poorly scaled but that's about it. I never hit any real issues with my 4K monitor or its 1080p sidekick that I use in portrait.
That can't be true because every single app I've ever moved between monitors works fine and switches DPI, and I doubt every app has been updated for this.
This was a huge annoyance of mine for the longest time with Windows 10. 1080P laptop screen with external 4K monitor, everything on the 4K monitor was blurry due to scaling not working properly. Everything fine when just running the external monitor on its own.
Anyway, to cut a long story short, they have finally fixed this in the latest update. No idea why it has taken them so long but it's a huge improvement.
Hyper-V's GUI is just a horrible embarrassment in general, especially when it comes to dealing with non-Windows guests. VMware Workstation is still lightyears ahead in terms of UX even though the team behind it has been disbanded for a whole year now.
Wouldn't it be great if laptops had a DeWalt-powertool-style bottom plate? You could decide if you want the "thin" or "thick" baseplate to balance all-day battery life vs. ultra-thin portability with 3-4 hour life.
This would give the option to create a solid "3-inch thick" battery that lasts a week. It would weigh a ton, but for some situations like remote fieldwork it could be a really great solution.
This is only tangentially related to what you say, but I do like reminding people of this when we talk about laptop batteries.
FAA regulations state that single batteries larger than 100Wh cannot be brought on commercial flights. This guides a LOT of hardware development; why do you think the battery in the MBPr pre-2016 was 99Wh?
"With airline approval, devices can contain larger lithium ion batteries (101-160 watt hours per battery), but spares of this size are limited to two batteries in carry-on baggage only. This size covers the largest aftermarket extended-life laptop batteries and most lithium ion batteries for professional-grade audio/visual equipment."
Good point! My Boosted board electric longboard has a 99Wh battery. My ego longboard has a much larger battery and it was a super pain having it shipped overseas.
Have you travelled a lot with the Boosted? I hear that after the hoverboard fires the airlines sometimes won't allow it on board, even though it's under the limit.
Thinkpads have this. I use "the tumor" (a 9-cell battery) and it's amazing. Also, you can hibernate the computer, remove the tumor, replace it with another tumor (or the 6-cell battery), and resume - no reboot required to switch batteries.
Hot-swapping batteries is so amazing. I use my small one when I keep it plugged in and change to the big one when I take it out. Never have to restart my laptop just because of that.
And I'm actually not sure it's a bad thing. I used to be in the "just give me a phone that's a bit fatter already" camp. But now, to be honest, I'm more inclined to think I want the thinner/lighter phone for the 80%+ of days when the thin battery has enough juice, and just carry the recharging brick when I need something more.
Depends how easy it is to slip on and off. Also, a battery case is more or less specific to a particular phone model. You basically toss it when you upgrade to a new phone.
I'm working from a tablet these days, so it's been a while since I've charged a laptop with it, but I've seen both the MacBook and the new MBP charge from it.
I've used this configuration, but noticed it could be a problem for battery longevity for the pack to be in contact with hot spots on the underside, which rear-positioned or otherwise confined batteries can avoid.
Also, as a battery pack gets bigger, it rather begs for its own independence, so that it can take the weight off your lap and charge other things too. For these reasons I prefer a universal brick to a power slice.
My Sharp MM10 had swappable batteries: one was thin (total 0.54 inches for the laptop), and another would protrude out the back, but provided 8 hours of work. Not bad for 2003.
Why don't HP desktops and laptops get more attention in 'hacker' communities? I hardly ever see them discussed.
IME with large numbers of HP corporate (not consumer) machines, they have by far the best quality of any Windows options. They run for so long that users get frustrated, wanting shiny new equipment but having no reason to replace their old ones. An EliteBook not far away from where I'm sitting even has a tool-less case - you can pop off the cover and service it by moving one lever, no screwdriver required - on a laptop!
However, I don't have data on quality; that is hard to come by.
> Because HP did a LOT of crappy things and produced a lot of crappy hardware and provided a really crappy customer experience.
In my experience, they still do.
I have an Elite X2 1011 G1 2-in-1 business laptop that cost a fortune when I bought it. It has numerous serious, well-documented issues that HP has done nothing about since release.
A few that I've personally experienced include:
- Touchpad randomly stops working every few weeks until you remove and reinstall drivers
- Severe power leakage during sleep, even with connected standby turned off (I'm talking about 100% to 0% if left unplugged overnight)
- WiFi stops working after wake from sleep until disabled and re-enabled
- Frequent intermittent audio crackling that requires a reboot to fix, plus an audio "enhancement" suite that's seemingly impossible to disable and makes anything coming out of the headphone output sound like absolute shit (and this is not just my snobby audiophile side speaking; even my parents can tell something's off with the headphone out on this laptop, and so can everybody else in this 14-page thread http://h30434.www3.hp.com/t5/Notebook-Audio/DTS-Audio-Contro...), which severely hampers its use even as a media consumption device.
All of which I would consider to be inexcusable issues even for the most basic machines, without factoring in the device's $2000+ price tag.
The only redeeming factor was its WiGig chip and the accompanying WiGig dock (separate purchase), which is seriously cool tech that I really hope catches on, along with wireless charging. Though it has limited utility on a machine with such debilitating defects.
Needless to say I'm now firmly in the "I will _NEVER_ buy another HP" camp.
I used to work for HP, and I'll never buy anything from them after they screwed me in the pay department: a -5% pay adjustment on hire/transfer from EDS, and they also did not pass on the 15% shift differential for 3rd shift that they were paid for my work.
Paying your workers may not motivate them (above a certain baseline), but screwing them certainly will!
I can count on two hands the number of printers and consumer-level laptops from HP that have died on me. I'm sure it's different in EliteBook land, but I am still trying to find out which Windows laptop vendor offers the best build quality and performance.
Buy the corporate line laptops from whatever vendor you use. Speaking generally,
* Consumer products are sold based on cost; when consumers shop, that's all they look at. Corners are cut to keep costs down - that's what consumers demand.
* Corporate products are sold based on availability (reliability), serviceability, and support; that's what IT departments shop for, because those line items cost businesses far more than the extra couple hundred in up-front cost. Imagine the cost of just a few hours of downtime over the entire lifetime of the computer - lost productivity, skilled labor for repairs, parts, distraction and disruption.
Yes, corporate products cost more; if you insist on lower up front cost, the manufacturer is going to give it to you. You get what you pay for. The same goes for support; pay for premium support.
My impression of HP's consumer-level stuff is that it's cheap in every way, but I never use it. I do have a ~12 year old heavily used HP corporate line laptop not far from where I'm sitting; it works perfectly except for the trackpoint (touchpad is fine), and the finish is a bit worn where the user's palms rest.
I have a Skylake Envy 15. In general, I like it as a Linux development machine, but there are a lot of things that I'd change about it if I could. The polish just isn't there...and somehow it manages (with an aluminum body) to feel more rickety than the plastic Lenovo that work issued me.
I'd rather just use it without getting into a discussion about why I'm using an imperfect computer rather than [insert other guy's preferred brand].
edit: I suppose it's not really a business model, but it is supposed to be their premium consumer model, and supposed to be kind of an HP MacBook "replacement".
I don't know about their other lines but I had a Zbook for work that made me never want to own anything made by HP. Huge clunky piece of flimsy plastic with terrible battery life and a worse trackpad. The display had a habit of breaking, meaning I went through several replacements before they finally just gave me a MacBook.
HP has tried to reform themselves with some new premium devices like the Spectre line (x360), but they really did make some horrible hardware back in the day.
Since my last HP work laptop died, I've just been 'upgraded' to a 1.5 year old 15" ZBook G2, after flat out refusing the 17" version (which weighs about as much as me, and is bloody massive).
It is heavy, has a pretty big screen bezel, and generally feels clunky. The i7-4710MQ is also slightly slower than the >3-year-old i7 it replaces. The only slight positive is I now get ~3.5 hours of battery life instead of ~2.
That depends; it's a diverse crowd. If you're going to install Linux then it's not an issue. If you're keeping the OEM version of Windows then it is. IME with a fresh install of Windows it's often hard or impossible to track down all the drivers you need, and those drivers often come with their own crapware.
I recently did a complete reinstall of Windows 10. I had to re-download some drivers from Dell to get touchpad gestures working, but otherwise everything was fine.
Recently upgraded a ZBook from 8.1 to 10 and the audio went awry: the DTS audio stopped working, and that took time to fix. There were also some issues with Windows 10 switching between Wi-Fi and a cabled connection, which should happen automatically but required some fiddling to make work.
On the plus side: virtual desktops in Windows! Not the best implementation and still not as good as Linux's, but usable.
I have an EliteBook from 2012, and the build quality is pretty bad compared with my Lenovo (Thinkpad) computers. Everything except the motherboard has broken at least once.
> Thinkpad T or X series. Which also have near perfect Linux kernel driver support.
Isn't that because Linux (and other *nix) devs typically use Thinkpads?
And does anyone know why that is? I'd guess that it's because when these projects started in the 1990s, IBM Thinkpads had a great reputation for quality.
That, and the hardware used by Thinkpads is usually very close to the Intel reference design for the northbridge chipset associated with the CPU, with things like Intel or Broadcom Ethernet controllers and not shitty ones.
I was happy with my HP till I moved north of Darwin: 99% humidity, 34C/93F 9 months of the year.
It became impossible to use unless I wanted to cool my workspace to the point where stepping in and out of the workspace all day could shock my system.
I switched to a Thinkpad. It worked. Humidity/dust killed it, like everything else, but it took 18 months.
I've got an EliteBook that has issues with this... at high load after about 10 - 15 minutes it throttles itself WAY back because it can't get rid of the heat.
I think it's because of model fragmentation and short lifetimes - there aren't enough devices with a similar enough configuration available for long enough time that any community can really develop.
Interesting. Generally the components in different manufacturers' laptops are subsets of the same set of options: Intel, AMD, NVidia, Broadwell, etc. Using the same components, I'd expect all manufacturers have similar ranges of variation and rates of change.
There are corporate machines that promise platform stability for 1-5 years in order to give enterprise IT more standardized platforms to support, but I'd be surprised if many FOSS projects targeted those models.
There's a small community of Linux on Mac hardware hackers because they have the slowest rate of change and fewest models to support. In comparison, all the PC makers have high rates of churn and configurability. If there were a Mac-class hardware build that integrated out of the box with Linux and showed reasonable product-line stability, I'd be buying that in a heartbeat.
The corporate machines may be more stable, but generally they're tied to corporate sales channels that aren't built to be accessed by individual buyers.
> There's a small community of Linux on Mac hardware hackers because they have the slowest rate of change and fewest models to support.
Interesting; I hadn't thought how Apple's culture of minimizing complexity for users would benefit even FOSS.
> The corporate machines may be more stable, but generally they're tied to corporate sales channels that aren't built to be accessed by individual buyers.
IME: You just have to look hard for those with platform stability; it's not prominently advertised and your non-corporate sales rep will have no idea what you are talking about. But once you find one, you can buy it online direct or through many channels, at least SMB channels (e.g., Connection.com); it's just not available from Best Buy or Staples.
Come to think of it, I would think the FOSS operating system community would happily adopt them.
I have an older, refurbished EliteBook that I picked up for about $140. It's fantastic. It has a removable battery and the tool-less backplate. Every component inside can be serviced easily with plain old mini screwdrivers.
The best feature is the hinges. I've had several other laptops over the years that I had to abandon because the hinges seized up and the attachment to the screen was so weak that eventually I opened the laptop and busted the screen. I don't see that ever happening with this elitebook.
I used a 2007 EliteBook for years before my current MacBook. They're great machines.
Over the holidays, I found out that my mother's old (2012?) Dell had broken. I gave her the EliteBook; it was a dramatic upgrade from the newer Dell, along every dimension: build quality, keyboard and touchpad, CPU speed, memory upgradability. It was ridiculous.
Personally, it's because my experience with their high end consumer laptops made me never trust their engineering again.
Never have I had a laptop that was such a failure in terms of thermal engineering and also mechanical design. Constant thermal throttling, and both hinges cracking after a year from normal opening and closing of the screen.
I've had some issues getting several HP laptops to play nicely with Ubuntu. Some of this was the graphics chip, but there were some basic driver issues as well.
Though, thinking about it now, the machines were not bad. No Macbook Pro in build quality but it worked well enough.
Can we stop using MBPs as the quality standard, please? I don't know how such a fragile case, mushy keyboard, heating issues, driver incompatibility everywhere except OS X, etc. even qualify as "quality".
No question they are good laptops, but there are several out there with much higher build quality. Thinkpads, XPS, ...
Edit:// Before I get more hate on this, think about it. If we make manufacturers think we want MacBooks, all we get are shiny slim cases; if we measure quality with Thinkpads, we get a lot more.
Having gone through Dells, HPs, Toshibas, and Apple laptops...
The reason I use MBPs as the reference is because it is still the laptop I enjoy using the most on almost all metrics.
The thinness is nice; a touch thicker to get more battery life would be welcome, but I already use my laptop heavily for several hours unplugged without issue.
The aluminum case feels better than any other laptop I've held. XPSes feel nice, but not like my laptop feels.
The trackpad is excellent. Smooth surface instead of that bumpy stuff on most laptops.
Last-gen Macbook keyboards (pre-butterfly) feel very nice.
I use MBPs as a reference because that's what I mostly want. If I could get a MBP with dedicated GPU and no driver issues for Linux, I'd be pretty happy. Thin things are nice.
I have the Skylake HP x360 with the 4K screen. The 4K screen has awful color rendering that makes everything look like barf and the trackpad is even worse.
Nobody ever said anything good about HP's consumer laptops. You can basically think of their consumer and enterprise lines as separate divisions, almost separate companies.
I'm not sure that's true anymore. 4K resolution means that you can now fit two 4:3 monitors' worth of pixels (1920x1440 each) side by side on a single 4K (3840x2160), 16:9 screen. It's almost 2 monitors in one.
TL;DR - the 4K screen sucks regular batteries dry; HP needed a bigger battery for the same battery life. :(
I got excited that this might be a good trend until I read this paragraph:
"Unfortunately, the claimed three hours of additional battery life aren’t meant to make this laptop into some long-lasting wonder — they’re really just meant to normalize its battery life. HP will only be selling the 15.6-inch x360 with a 4K display this year, and that requires a lot more power."
"By increasing the laptop’s battery capacity, HP is able to push the machine’s battery life from the 9.5 hours it estimated for the 4K version of its 2016 model to about 12 hours and 45 minutes for this model. So it is adding three hours of battery life, but in doing so, it’s merely matching the battery life of last year’s 1080p model."
It is still a net gain over the previous year's 4K model's battery life.
I believe Apple is already making this argument, but how much battery life do you really need? 8 hours seems like an easy figure to throw out because it's a "work day", although most people can just plug in their laptops.
What use case needs 12+? That seems "good enough" to me for most applications. Is it building in more life so that as the battery degrades there is still acceptable battery life?
Are there more applications than I think where the laptop is actually not plugged in all day? When does that happen? Manufacturing? Some kind of in-the-field work?
I have no experience with it so honestly curious how people use laptops as more than a mobile terminal in an office setting.
Generally speaking "12+ hours" means "basic browsing" or watching a movie which can be natively decoded with the built-in video processor. I don't think that includes any serious workload (tens of tabs open and any kind of developer/creative workflow). New Intel processors have greatly improved idle battery life, but "working" battery life still requires the watts.
In Apple's case they are reducing watts and presuming lots of idle time. This may match a large number of their users - I only hear complaints from developers.
EDIT: watt-hours? Sorry, my electrical knowledge is not awesome.
The problem with Apple is that extra battery is thrown out the window for the sake of thinness above all else. Can you imagine how much battery life a MacBook Pro could have if they just added back 3mm of height to the mold? Throwing away 4-6+ hours of battery life in the supposedly professional models for the sake of shaving off a couple of millimetres... it's a joke.
The latest MacBook "Pro" is no longer a professional machine... it's nothing more than the new generation of MacBook Air. There is no longer a Pro line, only upgrades for the least common denominator of consumer.
Source: desperately not wanting to give up OS X (it's the only operating system I can stand), but I'm stuck in a position where I flat out refuse to shell out $4200 CAD on a new laptop, when 3 years ago the same level of upgrades cost $1000 less. Their pricing has reached unacceptable levels of greed. That, and no fucking escape key... on a "professional" machine. Give me a break.
I hope you're taking into consideration that over roughly the last 3 years the Canadian dollar depreciation has added around $1000CAD to the same levels of upgrades by itself.
What they don't tell you is that it's only 8 hours under ideal conditions, and when the battery is new. After a year or two of heavy use it might become 4 hours. That's when the 12+ hours really shine.
Someone who migrates from conf room to conf room and doesn't want to carry the wall wart with them?
I know that I move my laptop (MBP) from location to location in my house and don't have a dozen chargers. I like that I can get everything done on battery and just take it back to my regular "desk" when done if I want.
Also 12h+ battery life today means 10+ next year and 7-8ish the year afterwards... future proofed in a way against both expected degradation and future software/bloat requirements.
A few use cases where I have wanted long battery life include travelling, and spending the day at university studying. (IME there were quiet corners of my campus, but they had no power outlets.)
I routinely make 14-hour plane flights. Having long battery life for that is killer, at least until we finally see plug-in power ports in economy class.
But even day to day, a "12 hour battery" assumes CPU- and GPU-light workloads. The actual work I do, requiring a lot of compilation and GPU crunching, cuts battery life in half or by two-thirds depending on the model. A typical laptop is then reduced to about ~2 hrs of battery life, and an extended "12 hour battery" is more like 4-6 hrs.
How much more work would it be to put in more RAM that could only be used when plugged into AC power? My understanding of the MacBook Pro design constraint is that they had to hit <100 watt-hours while using LPDDR3. Could a laptop easily have another 16 or 32 GB of RAM that consumes more power but is only available when plugged into AC power?
I would be happy to have such a constraint. I love my MacBook Pro. I love being able to carry it around and deal with things like email when on battery. When I'm doing serious development work, I'm sitting down somewhere plugged in anyhow.
The OS could gracefully page out to swap and power down the less efficient memory when switched to battery.
Are there any existing operating systems that can handle a dynamically-changing amount of RAM?
> The OS could gracefully page out to swap and power down the less efficient memory when switched to battery.
According to benchmarks, the SSD on the latest MacBook Pro hit 1.4GB/s. Even at this speed, it would take at least 10 seconds to flush 16GB of RAM to disk. I doubt your computer could do much while that was happening. I wouldn't call that very graceful.
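The arithmetic, for a rough sense of scale (Python; the 1.4 GB/s write speed is the benchmark figure cited above, and this is a best case assuming pure sequential writes with no other I/O):

    SSD_WRITE_GBPS = 1.4  # sequential write speed from the cited benchmark

    for ram_gb in (16, 32):
        # Best-case time to page this much RAM out to the SSD.
        print(ram_gb, "GB ->", round(ram_gb / SSD_WRITE_GBPS, 1), "s")
    # 16 GB -> 11.4 s, 32 GB -> 22.9 s, before any other contention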
I agree with that in this particular case. I would imagine the next rev of the MacBook Pro will use some Intel chipset that can support LPDDR4 or something.
But I'm wondering about this in the general case. For some definitions of "power user", it would make sense for the system to behave differently when running on battery. Pushing these design decisions down into hardware is something Apple does better than others, as they control the whole stack.
Most of the weight is due to the toughened design, not the battery, actually. We are talking about laptops you can use as a weapon to bludgeon someone, then take with you to swim, and they'd still be in a working state.
Actually you can run over them with an SUV and they still work!
Of course, unlike common laptops, they don't need a case or anything. Just carry them as is with their built-in handle.
If they didn't cost so much I would consider buying one. The added weight may make them slightly less comfortable in one way, but in another way, if you think about it, they're not flimsy pieces of denting aluminum you always have to treat with care and put in protective bags during transport. Using a laptop like this must feel... liberating.
I want something that's identical in size/weight to a 2009/2010 Macbook Pro 17" (huge in comparison to today's stuff), but with all of the internal volume that was occupied by the DVD-RW drive full of battery... I bet you could fit 100Wh of battery.
I used to use (and still own) one of those PowerBook G3 laptops where you could swap the media bay for a CD-RW, a floppy, or a Zip drive, or just more batteries. With modern Li-ion battery technology in both the main battery and the media bay, that thing ran for 20+ hours. It was amazing.
I strongly suspect that the weight of the battery that fills the volume of the DVD-RW drive is much greater than the weight of the drive, which would violate your initial condition. So which constraint do you break to satisfy your requirements?
Neither. I did not literally mean use the DVD-RW drive space only, but keep the same chassis thickness and dimensions: much smaller motherboard, no 2.5" drive bay (M.2 SSD only), and fill the rest of the volume with battery. It would probably get 12 hours of low-CPU-load battery life even with the big screen.
You have to cap either battery life or weight/size. They capped battery life at ~10 hours and decrease size every year. Maybe one day laptops get light and thin enough and that changes.
The 2016 13" macbook pro with 70 hours of battery life would weigh ~3kg [0]. Have you ever used a light laptop (new macbook, macbook airs or ms surface to some extent)? The (lack of) weight feels really nice. The macbook would imo be the best laptop if it weren't for the keyboard. In fairness I've never owned a laptop with day long battery life so can't compare. Actually if I could buy a 100gram laptop with 1hour battery life I'd do that, the battery is less than 20% of the weight of a laptop though.
Portable 700gram 100Wh (the 13" pros are 55Wh and 50Wh) battery packs are less than $100 on amazon.
Ultimately, if you _really_ believe in battery life, line the back of your laptop with 18650s (one cell is 10Wh for less than $10); all you need is like 2 circuit boards, some solder and a bunch of electrical tape. You can theoretically fit 47 on the bottom of a 13" MacBook Pro (you need ~35 cells for 72 hours[1]). Or get someone to manufacture a portable laptop battery case; MacBook Pros vent through the hinge, so heat should be less of a problem.
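Back-of-the-envelope version of that cell count (Python; the average power draw is an assumption, inferred from a ~10 h rating on a ~54 Wh internal pack):

    CELL_WH = 10.0      # one 18650 cell, per the estimate above
    BUILTIN_WH = 54.0   # roughly a 13" MBP's internal battery
    AVG_DRAW_W = 5.5    # assumed light-use draw: ~54 Wh / ~10 h rated life
    TARGET_HOURS = 72

    extra_wh = TARGET_HOURS * AVG_DRAW_W - BUILTIN_WH  # energy the cells add
    print(extra_wh / CELL_WH)  # ~34 cells, matching the ~35 figure above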
That's the "eat your vegetables" line of smartphone marketing. Good for you but a hard sell. Besides if battery life really matters to you then you would get much more utility from an external battery that can store far more than a single smartphone charge.
My charging brick is almost as big as my phone. I think I can charge my phone 3 times with it. Sure, it has its uses for times when I might be away from a charging source for a few days or I need make sure multiple devices stay charged or when I might be on it way more than normal (ie, train/plane travel). But while those use cases do exist, they are actually pretty rare. I don't want to have to carry it every single day just to get through the day. I rarely have a bag with me during my normal day-to-day activities so any external battery is going to be in my pocket (or I guess left in my car if I know I'll have access to it when I need a charge). But 95% of the time I would much rather just have a phone ~10% thicker that could really go all day instead of needing to always carry a second device w/ 3 days of charge. I don't really need 3 days of charge very often. I just need a good solid day, every day.
Why is it hard to sell people vegetables? I agree with you but I don't actually know why. Is it just because there's no industry group running a "got carrots" campaign?
I sell people vegetables professionally. It's not difficult at all. People will pay a fair price for quality organically grown tomatoes, cucumbers, kale, chiles, peas and snow peas, and various Southern and Asian vegetables. There's too much hand work involved in root vegetables to pay easily for first world labor, but you can make it work.
The problem being invoked is that the junk food market is much bigger.
There's only one phone maker that is getting to absolutely ridiculous levels of thinness, and that's Apple, while everyone else has understood that people enjoy 3000+ mAh batteries and care more about screen size, overall compactness and weight. I've never heard anyone say they wanted a <7mm thick phone (rather the opposite in fact).
Speaking as someone who's owned every model of iPhone (except the iPhone SE, but that's the same form-factor as a previous model), I actually do appreciate the thinness. Picking up an older phone is kind of weird now since it seems crazy thick. Heck, I don't even use a case on my phone, and a big part is because they make the phone significantly thicker.
I don't give a rats ass about shaving off an extra 1mm of phone thickness. I wouldn't even be able to tell the difference, to be honest.
On the other hand, being unable to both charge my phone (which needs it, because of the short battery life) and listen to my conference call on headphones is ridiculous.
Only one vendor is pushing for crazy thin phones, and they are sacrificing the needs of their users to do so.
> On the other hand, being unable to both charge my phone (which needs it, because of the short battery life) and listen to my conference call on headphones is ridiculous.
That would be ridiculous. Good thing that's not the case in reality, huh? Thanks, twenty-year-old Bluetooth!
This was the biggest factor to my switching to Ubuntu when deciding to upgrade from my 2010 MacBook Pro.
I can't stand glossy displays and I want battery life. High res displays tend to be glossy because at some point the grain on the matte coating is larger than the pixels.
On the PC side, it's easy to pick a matte 1080p display on the XPS13 and Asus UX305 and it greatly increases battery life. The 1080p XPS13 gets 14 hours! And it's matte! Win-win.
Whereas it's Tough Shit For You on a MacBook. Don't know why, though; my old MacBook Pro had a matte display as a configurable option.
1080p? Why not just go back to individual blinking LEDs? A non-retina screen is a throwback to previous centuries and should be tolerated only by the visually impaired.
As an alternative point of view (that is to say, I'm not calling you wrong at all, just noting that there are other opinions on the matter): I don't care about weight at all, or how thin the thing is. I want battery life, bigger screen size, and horsepower. (Apple about lost me when they killed the 17" MBP.) I call my laptops "luggable".
The good thing about bringing back the 17" MBP would be that all the objections over battery life and low-power GPUs would be moot. Even a thin 17" is going to be huge. The last one looked like 4 cafeteria trays stacked up.
I'm just not sure that there is much of a market for it. It's fun to imagine though. Mobile Xeon, ECC memory, 32 or 64 GB DDR4-2400 RAM, RAID NVMe SSDs. Priced like a Mac Pro :)
My Zbook Studio G3 is the size of my old rMBP and has a Xeon 1545m, 32GB of ECC RAM, a 950 NVMe SSD, and space for another (but m.2, not NVMe, unfortunately).
The 17" allow me to fit a dinner plate, side dish, and drink on top of it (closed), then carry it back to my desk or to the next meeting. The 15" won't fit the side dish anymore. I miss my Silicon Valley Lunch Tray.
Same here. I have a 15" Lenovo W530. It's not thin and it's not light, but it packs enough of a punch and has a large, hot-swappable battery. I just wish it came in 17 inches and had a better trackpad.
I loved my Lenovo W700, which was 17 inches. I keep thinking I should replace the keyboard and battery and put it back into service as a secondary machine. It was fantastic, but they stopped making the W7xx line several years back. :(
Windows and Linux have issues with HiDPI when scaling is not just 2x. 1080p is conveniently exactly 1/4 the pixels of 4K (half in each dimension), so 2x scaling of height and width makes it look normal.
Also, panel manufacturers want to pitch 4K, not 2.5K or whatever would make sense. It's the same reason we have 16:9 instead of 16:10, which many people (including me) liked much more: panel manufacturers and the companies selling these to consumers care more about marketing (and selling) than how good the products are to use. 100% sRGB coverage is easier to sell than, e.g., backlight uniformity or low backlight bleed.