Keep in mind that probably the most important spec when considering a new laptop is one that is often not directly stated: the processor series.
I'm not talking about i3/i5/i7, but rather U/Y/H. This letter determines the TDP (thermal design power/point) the machine is designed to run at. The TDP will govern the setting for the base clock speed and, just as importantly, the throttling behavior under load.
Processor series TDPs are Y: 4.5W, U: 15W, H: 45W.
The new MacBook Air appears to have a Y series processor, like the MacBook, which means it will be aggressively throttled to keep power consumption and heat generation low.
Practically, that means that the new Air will not be capable of running sustained workloads much above its base clock speed, which makes it unsuitable for many programming-related tasks.
The Pro is still a much better choice for programmers. The 13 is suitable for many things, but the 16, with the H series processor, is really preferred for computationally intensive work.
You can get away with this machine if your workflow primarily involves a text editor and remote servers, but otherwise I would still opt for the Pro.
Totally. To back this up with a real-world example, Apple originally screwed up the throttling of the i9 MacBooks, causing them to be over-throttled. A software update that pretty much amounted to an MSR write on boot fixed it.
Why does Intel let OEMs adjust that? It gives OEMs a knob they can turn to balance performance and price for their segment. It lets them cheap out on cooling (or go with new form factors like the tiny GPD models) and set the low minimums, or build a full system that's capable of a higher TDP, like Apple's tend to be.
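For anyone curious what that knob looks like in practice: on Linux you can peek at (and, if the firmware hasn't locked it, nudge) the package power limits yourself. A rough sketch, assuming msr-tools is installed and that 0x610 (MSR_PKG_POWER_LIMIT) is the relevant register on your part -- check your CPU's documentation before writing anything:

    sudo modprobe msr
    sudo rdmsr 0x610    # dump the current package power limit bits (PL1/PL2)
    # The same limits are exposed more safely via the RAPL powercap interface:
    cat /sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw   # PL1 in microwatts
    # e.g. raise PL1 to 25 W, assuming the platform allows it:
    echo 25000000 | sudo tee /sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw

Whether the chip actually sustains that depends entirely on the cooling the OEM bolted on.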
"Boost well" probably means different things to different people.
On my fanless 12" MB with a 7W TDP m7-6Y75 (1.3 GHz base, 3.1 GHz max turbo), running a Prime95 torture test stabilizes at the following for the CPU via Intel Power Gadget: 9.5W, 1.95 GHz, 90°C. (The laptop feels warm to the touch - it only ever feels "hot" under GPU load.) Normal all-thread load stabilizes around 2.5 GHz.
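If you want to sanity-check your own machine the same way without Intel Power Gadget, a couple of stock tools give you roughly the same numbers (flags from memory, so double-check the man pages):

    # macOS: sample package power/frequency once a second for 10 samples while a load runs
    sudo powermetrics --samplers cpu_power -i 1000 -n 10
    # Linux: similar picture from turbostat (in linux-tools / kernel-tools)
    sudo turbostat --interval 5

Run your usual workload alongside it and watch where the clocks settle after a few minutes; that sustained figure matters far more than the peak turbo number on the spec sheet.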
This is underestimating modern processors and overestimating programming compute resource usage.
Your post may apply if we're working with 4K+ resolution video files, rendering and other activities, but the modern programmer, even compiling binaries, will be fine on a MacBook Air.
How do I know?
I use a 2 generation old Macbook as my primary personal development machine. I write Go, Rust, Java and TypeScript using the common tool chains for all those languages.
Generalizing "programming" is a fool's errand at this point. For some, their IDE alone would warrant a pro-spec machine. Some people need to virtualize their development environments. Some people need to virtualize several environments interconnected. And for some people, programming is just a simple terminal session with tmux running - Vim on one screen, a shell in another, tests in another.
I totally agree that you can't fit all programmers into one bucket. I think we should be able to describe some general patterns in programmer workflows such that we can give concrete advice to programmers about what kind of hardware they would need.
A perfect example of this is the recent Level1Techs video about compiling an Unreal Engine game using the AMD Threadripper 3990X[1]. Some concrete advice:
1. Often the heaviest aspect of a programmer's workflow isn't compiling the code, but running automated tests using VMs and stuff
2. Compilers tend to benefit from large cache sizes and good single-threaded performance. Multi-threaded performance only makes a difference in limited scenarios.
I've also heard that storage performance, and in particular storage latency, can make a big difference on compiler performance, but the difference is starker on Windows than on Linux because NTFS is a lot slower than many Linux file systems (EXT4, XFS, etc.).
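If you want to see where your own builds sit on that spectrum rather than guessing, the crude-but-useful experiment is to time the same clean build at different parallelism levels and with a cold filesystem cache. A minimal sketch, assuming a Make-based project on Linux:

    make clean && time make -j1               # single-threaded baseline
    make clean && time make -j"$(nproc)"      # all cores
    # Rough storage-latency check: drop the page cache so header reads hit the disk
    sudo sh -c 'sync; echo 3 > /proc/sys/vm/drop_caches'
    make clean && time make -j"$(nproc)"

If -j barely helps, you're single-thread or cache bound; if the cold-cache run is much slower, storage latency is part of your problem.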
The thing is, we as programmers don't have a good mental model for how the software tools we use (IDEs, virtualization/docker, compilers) scale with modern hardware, and very few technology reviewers are producing content that would help us understand.
You haven't been forced to deploy something that spins up a local K8s cluster and builds and runs 10+ containers just to get a microservices app up and ready for development, I see... (only partially /s).
I don't get this argument. Run local for basic testing, else use it as an SSH machine. I work with supercomputers. The power of my development machine usually doesn't make much difference; I spend a lot of my time SSH'd in anyway. So if something has a nice screen, good battery life, and a comfortable keyboard then I'm good. As long as it works for presentations and can compile my LaTeX files, I'm good.
I don't see Airs as development platforms. They are portable platforms.
It doesn't even take that much. Docker Desktop on MacOS is widely known to be terrible and have extremely poor performance. Running webpack-dev-server with filewatching through Docker nearly grinds my 2017 Macbook Pro to a halt. Obviously most node/webpack/JS stuff can run natively on Mac but it's not uncommon to see dev stacks that dockerize everything because that's how the production stack works.
docker-sync helps a little bit, and so does limiting Docker to 1 CPU core, but I still routinely get spurts of 100% CPU from Docker that, despite supposedly being on 1 core, still grind my machine to a halt.
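A good chunk of that pain (at least circa 2019/2020) is the file-sharing layer rather than the containers themselves. Relaxing the bind-mount consistency and capping the VM helps a bit; a sketch, with flag names from memory:

    # 'delegated' lets the container's view of the mount lag the host's,
    # which cuts down on sync work for every file the watcher touches
    docker run --cpus=2 --memory=2g \
      -v "$(pwd)":/app:delegated \
      -p 8080:8080 my-dev-image        # image name is just a placeholder
    # or in docker-compose.yml:
    #   volumes:
    #     - .:/app:delegated

It doesn't make osxfs fast, but it can turn "grinds to a halt" into "merely sluggish".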
I often have up to 10 VSCode editors, an IntelliJ window and two browser instances open.
Add some server processes, UI watchers/builders, and my Thinkpad starts to freeze up. In an ideal world, I would have a low-latency remote desktop going, but even in 2020 it seems like a pipe dream.
I’ve been using paperspace cloud machines with parsec for low latency Remote Desktop. Originally used it for gaming. It’s remarkable how good it is for everything else too.
That's pretty much my work, except it's not k8s, but it does spin up between 4 and 8 containers (depending on which system), some of which require 4GB of RAM or more (don't ask..).
On the flip side, standard issue laptops are now 16" MacBook Pros with 32GB of RAM, which is pretty serious hardware.
Well my "programming laptop" spends all of its CPU these days on corporate garbage such as splunkd, which regularly melts down, some kind of corporate managed software updater which is terrible, Zoom, and trying to render dashboards from Grafana which is among the most flagrantly wasteful javascript hacks ever devised. It doesn't help me much that vim and screen are efficient.
In fairness, as a Rust dev my computer sounds like it's a jet engine anytime I compile. I wouldn't even think about using a slimmer CPU. Not that you're wrong, it's just where my head is at haha.
I thought the same thing, but then I got a MacBook Air and couldn't believe how slow it was in practice. I didn't know if it was thermals or memory bandwidth or something, but Java compiles which used to take 5 minutes would suddenly take 30 minutes and it just wasn't even really tractable to use.
Generally speaking, if you need any heavy computational workload for more than a few minutes, the MacBook Air doesn't fit. The 3.4 GHz boost only lasts seconds, and in the long run it can only sustain about 2 GHz.
Not to mention previous MacBook Airs were dual-core only.
Even Chrome brings my machine to a halt. I've been running Docker recently and 8GB of RAM is instantly gone.
Sometimes I think most folks are simply more patient with their computers, and will wait a few seconds for operations to finish, while I get a feeling something is taking too long and start messing around with ps aux, killing processes and such...
I agree with you :) but let's not tell the employers who gladly buy the top-of-the-line models for their programmers ;)
I am quite happy with my new 2020 MBP.
Or how about just having a workstation at work? E.g. the Dell XPS 15 is a very popular laptop, but an equivalent cost workstation is around 8x faster for sustained well-parallelizable loads. Use a KVM switch with your work monitor, or use it as a headless server, do whatever. Want to work on machine learning stuff? Throw in an RTX 2080 Ti, off you go, never worry about cloud costs. Are you I/O bound? RAID0 off a couple NVMe drives.
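The RAID0 bit really is about two commands on Linux; a sketch, with example device names (and obviously this destroys whatever is on those drives):

    sudo mdadm --create /dev/md0 --level=0 --raid-devices=2 /dev/nvme0n1 /dev/nvme1n1
    sudo mkfs.ext4 /dev/md0
    sudo mount /dev/md0 /mnt/scratch   # point your build/data directories here

No redundancy, of course - treat it as scratch space for builds and datasets, not for anything you can't regenerate.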
It's immensely flexible and I'm surprised it's not more common.
It's more cost upfront, which is often a show-stopper. "Why can't you spin up a beefy instance on EC2 when you need it for these 2-3 hours of intense work? It's so much cheaper!"
Also, a cloud VM, if it breaks, is replaced "for free" and "instantly"; fixing a physical machine has a cost and is a delay. Unless you're big enough to keep a spare (probably makes sense for many dozens of machines), it also looks a bit uneconomical from the business's POV.
Never mind that your cloud costs in a year end up being comparable to said workstation.
That was my experience at previous employers, but not my current one. We have some shared test environments in the cloud, but individualized test environments are discouraged. They're in the process of rolling out more powerful workstation laptops to provide us with enough horsepower to do testing locally.
That's why I'm tempted to build a custom workstation at home: so that I always have access to the compute resources I need to do my job.
I'm using a 4-year-old Thinkpad with a dual-core U-series i7 and 32GB of RAM. The results are...adequate.
A typical compile for me takes about twice as long compared to colleagues with newer H-series processors, but the individual projects are pretty small so that often isn't a huge deal.
However, once you try and run nearly a dozen docker containers for local testing, things slow down quite a bit. Multitasking while those tests are running is very difficult.
I picked up the new 2018 Air right after the redesign, the 16GB and 512GB SSD model. I have had no problems with Xcode doing iOS dev (even using the simulator on occasion), vscode and nodejs/redis/mysql work fine, and I have no problems running Windows in Parallels Desktop - it's super fast and quiet!
Great machine; I only wish I had waited for the True Tone and keyboard redesign, but those are minor, and I have grown to love the keyboard.
I did the same but got 32GB. It is a fantastic machine, but it is even better with the blackmagic eGPU. Heavy recommend on that if considering a home dock setup.
Can someone explain chip speed and relevance in a way I'd understand pretty please? I dialled out of this stuff years ago - at a point when bigger numbers just meant better, but now they just seem the same?
I'm still using a 2013 MBA as my primary machine day to day. It has an i7 and 8GB of RAM. I've been waiting for the keyboard change before I upgraded, but today I see my options are i5 and i7 (same as 2013). I'd get 16GB of RAM, but literally have no clue about the chip stuff. I'm assuming we've moved on somewhat from 2013?!
For Intel chips, i3/i5/i7 have generally represented feature ranges, so to really identify the chip you have to find out its product code or at least processor generation. Once you have that you can use ARK to distinguish between them. Here's an example comparing an older 2017-era desktop i5 and i7: https://ark.intel.com/content/www/us/en/ark/compare.html?pro...
Complicating things recently is the introduction of the Xeon W and i9 lines, but for the most part the Xeon parts add a few more features and the i9s have higher core counts to compete with AMD.
In broad strokes, in the laptop space the majority of enhancements since 2013 are higher RAM capacity (which Apple as a manufacturer has chosen not to pursue compared to Lenovo, etc.), still lower overall power consumption to improve battery life, DDR3 to DDR4 RAM, bigger L1/L2/L3 caches, normal improvements to embedded graphics to support bigger onboard and external displays, and PCIe enhancements for drives, as NVMe-style drives are what I would consider the new standard.
That is it. Apple has kept it fairly simple most of the time.
Worth noting the i7 is only very, very slightly faster, as you are limited by thermals. I don't think it is worth the price. If you ever need the processor power then the current (and future) MBA isn't for you.
And in all honesty, the CPU performance between your 2013 MBA and this MBA isn't all that different if you are getting the dual core. After all, your 2013 CPU had 15W of thermals to play with; this newer CPU only has 9W peak (and 7.5W average).
But the overall package of the MacBook Air is still many times better: screen quality, ports, SSD speed, speakers. Not sure if they have updated the webcam; if not it will be worse than your 2013 MBA's.
Same question here. My 2011 Air is basically unusable now, so I've been waiting for this machine. i3, or $300 more for the i5 (which was my first inclination). Then you get into the i5 machine and they offer an i7 for another $150.
My work won't be processor intensive, just a personal machine. But given how long I kept my old one, I intend to keep this one for a long time, and I'm not sure where the best price:performance tradeoff is. I could argue going low-end and replacing it sooner, or high end, and keeping it forever.
Great to hear others getting the same mileage as me - my late 2012 Air, bought in March 2013, is still going strong with only a battery replacement done so far. 4GB RAM, 128GB SSD. Used daily for browsing, watching videos online, building IoT software, playing with unikernels. The only time it hasn't performed was when it was used last year as a work machine while waiting for a MacBook to arrive - and that was down to the collection of corporate stuff required. Once I got the new work machine, I removed all the work stuff, and the Air went back to being the solid daily hack it has always been. In the same time my friends with other laptops have been through several cycles.
It's largely turned into a doorstop since the battery started dying a year or so ago. I bought a $70 battery from a random seller on advice from a friend who had good luck with the fit and instructions. Replacement was a breeze, and it seemed great at first, but it pretty quickly started just dying on me. I use the laptop so rarely that, no matter what the SOC was last time I used it, the next time I pick it up (usually a week or more later) it's dead and needs power.
That said, about a year or two ago was also when it just started feeling too slow to be usable (beachballs all the time). I blame web bloat, mostly :-/
I bought it new, so I definitely got my money's worth out of it.
Well, I'm on a 2016 MacBook Air with 8GB RAM, doing web programming every day: 3 PhpStorm windows, sometimes Photoshop open, more than 20 tabs in Chrome, iTerm, Figma, Paw... Honestly it runs pretty well, and I was waiting for this update.
I'll wait for some benchmarks then. Maybe there's not much improvement. I see Apple is comparing the last model that had 2 cores with this new 4-core one and saying there's a 2x improvement.
Yeah, but that's a lot of us. Obviously there are some programmers out there doing computation-bound work, but an awful lot of us aren't. If you're doing web stuff or making apps, then basically any computer manufactured in the last 5-8 years is going to be fine. I do iOS side work on a 2012 quad-core Mac Mini. It's totally fine.
It's OK to geek out about hardware, if that's your thing. But it's not really necessary for most people -- even programmers.
I think you’ll be happy with the update, especially if you were happy with the previous machine. Remember you can try it out for 14 days risk free, so if you have any doubts you have time to test it thoroughly before committing. https://www.apple.com/shop/help/returns_refund
I'm on a 7200U (i5, 15W) and I could really use the extra oomph of it being 45W (rounding things, it'd be 40% extra performance, assuming power consumption scales with the cube of the performance improvement -- which is pretty close to what I've seen in practice).
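(To spell out the back-of-the-envelope math: assuming power scales roughly with the cube of performance, (45 W / 15 W)^(1/3) ≈ 1.44, i.e. roughly 40-45% more sustained throughput for 3x the power budget. It's only a rule of thumb - the real answer depends on the voltage/frequency curve and how many cores the extra budget feeds.)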
Been looking at the newer gaming notebooks, and I'm waiting for the "real world confrontation" of 10th-generation i7s (especially those that have AVX512) vs Renoir (lots of cores, full-width AVX2 in a single op, and (not as much as desktop) lots of cache), since I run some numeric-heavy code and it's not quite clear how they'll stand up to each other.
Why do I run this on a notebook and not a desktop? Because I'm always on the move, and I can pack the notebook and go anywhere I have to and be able to work with or without internet connection (which is sometimes the case).
You might be interested in the ThinkPad P1 (Gen 2). It has the option of not including a dedicated GPU, which will save a considerable amount on battery power, while still packing in a 6-core i7 H-series processor.
But I am also interested in the upcoming processor generations.
Thanks for the suggestion. I was willing to wait a bit more to see that standoff: What will get me more bang out of the processor, the "super" architecture of Zen 2, or the coveted AVX512?
The P1 comes with a 9th gen i7, I assume they'd eventually get a gen 3 with 10th gen i7 and then I'll be able to compare.
> What will get me more bang out of the processor, the "super" architecture of Zen 2, or the coveted AVX512?
By chance, have you seen the article from Puget Systems on how to use MKL with Ryzen[1]? It explores some of the different trade-offs between AMD and Intel for numeric computing.
Some people need to carry their computer around, so the laptop has to be a compromise between stationary computing power and portability. The MacBook Pro strikes this balance quite well all around. Of course there are bigger and heavier laptops, if you require even more performance and are not limited by portability.
They don't advertise their memory models or disk manufacturer either. Probably typical customer does not care. Wait for teardowns if you're interested in those details.
I'm an FE developer (mainly React/Node/React Native development) and I'm still using a 2012 13" MacBook Air (8GB) as my daily driver at home. There's a noticeable difference between it and my work (granted, still relatively old) 2017 MBP 17". Running builds and an entire test suite maybe takes twice as long, but it's overall very usable and I see no reason yet to upgrade.
>You can get away with this machine if your workflow primarily involves a text editor and remote servers, but otherwise I would still opt for the Pro.
I bought a 2018 15-inch MBP with a 6-core i9, 32GB of RAM and a Vega GPU with the idea that it would be my one-stop development machine. The system is terrible, especially considering the outrageous price. It does have sufficient compute power, but the thermals are insanely bad - it's constantly turning the fans to 100% and I can't even tune the power usage. When I run a VM/emulator alongside the IDE, people in the office start turning their heads in my direction because of the fan noise.
And the performance still isn't even close to a mid range desktop machine which would have cost 1/4 the price.
I'm seriously considering selling my MBP, buying this MacBook Air top config (I need an OSX machine to develop iOS/OSX apps, otherwise I would go for the Lenovo X1 Carbon), and building a desktop that will be always on VPN for heavy lifting + can replace my gaming console.
I have a MacBook Pro 2018 and Intel NUC running Linux with exactly the same CPU. In typical C/C++/Rust builds, the Linux NUC is perceptibly much faster than the MacBook.
Didn't dive much deeper, but I'd guess the difference is thermals (despite the NUC having a small enclosure) and much higher system call overhead in macOS.
That said, I still like having a Mac as well, since there are so many great applications that I use frequently. But a sweet spot (as you suggest) is a reasonably powerful MacBook for the things that macOS is good at and a powerful Linux workstation or server for compute.
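The least scientific but most convincing comparison is just timing the same clean build on both boxes with the same toolchain; a sketch for a Rust project:

    cargo clean && time cargo build --release
    # repeat a few times, and watch clocks/power while it runs
    # (powermetrics on macOS, turbostat on Linux) to see whether it's
    # thermals or OS overhead that's costing you

If the gap persists even on short builds that never heat the chip up, the syscall-overhead explanation gets more plausible.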
Unlike most other Y-series devices (including the MacBook), the MacBook Air has a fan. This allows it to dissipate more heat and sustain higher clock rates.
This is why there is a huge difference between 8th-gen (or later) and earlier Intel mobile CPUs. Up through the 7th gen, the only mobile CPUs that had quad cores were the HQ-series 45W TDP parts, which required a totally different cooling design. Then, with the 8th gen, all the regular low-TDP CPUs got quad cores and the 45W parts started offering six cores.
For example, the only ThinkPad of that generation's T-series that had a quad-core was the T470p, which had a significantly different design in terms of batteries, Thunderbolt features, etc. to accommodate the significantly larger cooling system. With the very next year's models, all of them got quad cores with their usual low-voltage/low-power (and low battery usage) design.
The MacBook Air always seemed like a tablet with a keyboard to me. I have seen ordinary people complaining that they can't multitask apps there, that they have to run one app at a time. I don't think the Air is, or ever was, supposed to be used as a development or video processing machine.
Where do you even find this info? Even on the Tech Specs page for the MacBook Air on apple.com it says "3.0GHz 6-core Intel Core i5 Turbo Boost up to 4.1GHz 9MB shared L3 cache" and doesn't give the model number.
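You usually can't get it from Apple's spec page, but the machine itself will tell you. On an Intel Mac:

    sysctl -n machdep.cpu.brand_string
    # prints the full brand string, e.g. "Intel(R) Core(TM) i5-#### CPU @ #.##GHz",
    # which you can then look up on ark.intel.com

Reviewers and teardown sites also tend to publish the exact SKU within a week or two of release.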
> Practically, that means that the new Air will not be capable of running sustained workloads much above its base clock speed, which makes it unsuitable for many programming-related tasks.
Does that still hold when using the power adapter?
OP's comment was about thermal dissipation, which, if anything, would be a more difficult problem with the power adapter (since the processor doesn't need to be throttled to conserve battery, for example). So, yes.
So for peak load, it’s an upgrade compared to the previous generation MacBook Air. But for sustained load, it seems to actually be a downgrade, as the previous gen still had processors of the U series.
I've been issued an MBP at work and am now thinking of picking up a 5-year-old MBP to work at home, since switching between two OSes is overwhelming at this point. Is this a good idea?
> The Pro is still a much better choice for programmers.
I'm not totally sure I agree. I typically do a lot of programming on my MacBook Air. It's simply a box I use to ssh to something else more powerful to do the actual runs.
Yes, if you're running computationally intensive stuff on your MacBook Air you're going to be disappointed, but if what you're mostly doing is typing into a terminal screen then it's probably perfect. And probably even over-powered? :)
> The refined scissor mechanism with 1 mm of travel delivers a responsive, comfortable, and quiet typing experience. The inverted-T arrow keys help you fly through lines of code, spreadsheets, or game environments.
This isn't progress. This is the baseline. Apple have gone from bad to OK, and they're celebrating as though they've achieved something amazing.
We all (most of us anyway) wanted them to go back to the scissor design. Are we going to now complain that they did what the community has been begging them to do? Was butterfly a mistake? Yes. Were they slow to correct the issue? Yes. Now that they fixed it we should be happy about it.
As far as talking about it being amazing, it's called marketing spin. This is how it works. However, those two sentences do not say anything about it being amazing; they simply focus on the positive features of the keyboard. The two sentences above clearly communicate to Mac users that the company has fixed the problems that people wanted fixed. Did you really expect a bunch of public self-flagellation? They are telling us clearly that they did what we asked for. Perfect.
I think it's the marketing copy most people are taking issue with.
They tried a new design, which was horrible to use and had a high failure rate. They continued to claim the new keyboard was amazing, and stubbornly continued to use this crappy keyboard long after the problems were apparent.
And now they are touting a "normal" keyboard mechanism as if they've invented something new and wonderful... only Apple could get away with such transparent BS.
I assumed that was because it’s the same key-mech that’s in their (external, Bluetooth) Magic Keyboard. Where it itself was branded “magic” just in reference to its two sibling peripherals, the Magic Mouse and Magic Trackpad.
IIRC, the peripheral line itself was started with the original Magic Mouse, which was branded as such because it didn’t have separate external actuatable buttons, but rather was just a smooth surface with a multitouch digitizer on the top half + a single actuatable microswitch underneath the shell†. Apple wanted the image of it “magically” figuring out when you left/right/middle-clicked (despite no L/M/R buttons) or scrolled (despite no scroll-wheel.) Also, the “plug in to pair” experience might have contributed to the claimed “magic”—it was a fairly unique approach to pairing at the time.
† Which is a design with some real benefits, like being easily disinfectable, with no crevasses close to the hand for filth and germs to accumulate in. (There is a crevice on the Magic Mouse, but it’s on the bottom, where your hands will never touch it.)
There is also a bit of “magic” in the Magic line of peripherals that’s not in the hardware itself, but rather in the OS: when the Magic line of peripherals—Apple’s first Bluetooth line of peripherals—was introduced, Apple added a feature to macOS where macOS will “train” the Apple EFI firmware to recognize devices paired in macOS itself, such that the firmware will later attempt to connect to such paired devices on boot. This means that e.g. holding Option on your Bluetooth keyboard to select an alternate boot device on an iMac would actually work. Which was kind of necessary, as those are the peripherals iMacs shipped with.
The previous sentence: "Now features the new Magic Keyboard."
"Refined" is also an adjective that has a definition close to "new and wonderful". It's not just a word to bring out an emotional response to make people feel like buying the thing.
They refined the scissor mechanism. It's an improved scissor mechanism with better key stability than previous scissor keyboards on their laptops (so sort of the "best of both worlds" between the butterfly keyboard and their old scissor keyboard). The new mechanism is also used in the Magic Keyboard and 16" MacBook Pro.
"Refined" means made better. This is better than the previous keyboard, and also features 1mm of travel, making it better in that sense than any of their previous keyboards.
Imagining a word to mean what you want it to, and then reacting negatively to that, that doesn't say as much about Apple as it does the observer.
Yes, that's by design; it's a purposefully ambiguous choice of words that can be read either way depending on what the reader's subconscious wants to hear. Either way, they don't have to admit that they were wrong, customers who hated the old one now feel relieved and vindicated, and people are probably more likely to buy the new one. That particular choice of words is probably the result of millions of dollars of marketing psychology, focus groups and A/B testing.
Where does it say "we stuffed up, and in response to your feedback we've gone back to basics"? Sometimes, people want to hear acknowledgement of error.
A good value trade in program would have been nice. I managed to sell my 2017 15” for a bit under $2000 CAD to upgrade to the current 16”. I would have rather dealt with Apple than deal with hagglers and low ballers.
Last time I remember that happening was Porsche's recent launch of the 992 generation 911, in which they poked fun at the fried-egg headlights on the 996. If they can do it, so can Apple. They are both Jedi-level marketing orgs.
They did basically say that in the live announcement of the 16” MBP, but remember they still sell some models with the old keyboard, plus millions of people still will be using that one for years to come, so there’s no way they will disparage it for the next few years at least.
They said that when they announced the extended warranty program on the old keyboard. People who want to hear an acknowledgement of error can go back and read that, if they missed it.
People want to hear wailing and gnashing of teeth, which is silly.
It’s marketing man. Next you will be complaining that the burgers look a lot better on the billboards compared to when you unwrap them, or that the shirt looked a lot better on the mannequin than when you put it on, or that car ads always show their car speeding on an open road instead of stuck in a traffic jam during the morning commute.
Ignore the marketingese and don’t let it bother you so much. The important thing is that they listened to their customers and created a better product as a result.
We do not expect people to be honest in other aspects of their life. Just look at the cosmetics industry. Politicians with their campaign promises. Management with their corporate right-sizing and synergies. Kids and Santa Claus. (haha!)
Anyway, I consider fake reviews to be dishonest .. I consider this to be more like "putting a positive spin on it".
Well, it isn't normal, and it isn't the old keyboard either.
Despite having double the key travel of the butterfly (0.5mm to 1mm), it still felt much the same as the butterfly to me. The old scissor had 1.3mm; while that's only a 0.3mm difference, it felt like night and day.
The new scissor also claims higher stability, although I doubt this has anything to do with the design rather than the lower "height" of the key.
It is indeed new, but I don't know if it is wonderful yet. I haven't had a long period of time to try and use it.
Marketing aside, thank god we're back to normal. Some things are just better left alone and don't need innovation (vim comes to mind). Refinement sure. I love my 2014 MBP keyboard.
I dunno, I used to buy Apples because they had great hardware for a fair price (premium, sure, but if you're in the market for a premium machine...), but my last two machines have been a Dell and a Surface. It's different trade-offs, but ¯\_(ツ)_/¯ I was never wedded to their operating system anyway.
I held off buying a new MBP replacement for two years now thanks to the massive criticism over the butterfly fiasco. I'd say the complaining was useful and had an impact. Not wasteful at all.
To be honest, yes, I'd like them to say "we messed up, and we finally recognised that, so we're fixing it".
I think it would be amazing if the most valuable, most design-focused company in the world admitted to everyone that they made a mistake. It would do a lot towards allowing everyone else to make mistakes without beating themselves up over it. After all, if the thousands of specialist engineers, paid billions in salaries, given the best equipment in the world, in a company that really (and I mean really) values design, can make a mistake, then it's kinda OK that your home page looks a little crappy on mobile.
Wasn't that the dust problem, though? Admitting that there's a problem when people are taking out class-action lawsuits against you seems a little late to me. And again, rather than a "we're so sorry, we made a huge mistake" speech, it's more of a "we're such an amazing company, we're going to fix your faulty laptops for free!" speech.
>To be honest, yes, I'd like them to say "we messed up, and we finally recognised that, so we're fixing it".
Exactly. In the Steve Jobs days he would have either jokingly admitted there was a mistake or at least said something acknowledging that people didn't like it (hence admitting there was a problem).
The new Apple put up a big middle finger and didn't act until there were class actions.
Even with the fix, I'm ddisappointed in their response. II had a 2013 MBP that worked great for years and years, and was excited to finally get a maxed out MBP about eiighteen months ago. The ffirst year was great and then this damn keyboard started doing its thing. I'm deliberately leaving the keyboard errors in place for this comment, they aren't typos. Yes, they havee the keeyboard replaceement program for the enext 3-4 years. But then you have to be wiithout your workhorse for a week while they replace it, and then what, you'll probably have the same problems a year later. (Incidentally, I have a job switch coming up wiith some tiime off - my plan was to use that time to send in the laptop for repairs theen when my clieint isn't relying on my availability, but now that plan is shot wiith the Apple Stores and malls being closed.) And yes, I can fix this by just buying the 16", but this computer was expensive and was supposed to last me at least 3-5 years. I'm supposed to iincrease my spending to Apple? A sane program would be to buy back this lemon at a heffty price so I can buy the new one and be made whole.
Marketing spin is not clear communication. It does not deserve praise. It's not mandatory, either.
None of:
> MacBook Air now features the new Magic Keyboard, first seen on the 16-inch MacBook Pro. The refined scissor mechanism with 1 mm of travel delivers a responsive, comfortable, and quiet typing experience. The inverted-T arrow keys help you fly through lines of code, spreadsheets, or game environments.
says anything about "fixing problems that people wanted fixed" or "doing what people asked for." That would read:
> We used a lower-travel keyboard with a butterfly mechanism and alternative arrow key layout on recent models, and you said you didn't like it. We listened. The MacBook Air now features a proven scissor mechanism with a return to 1mm travel and classic inverted-T arrow key layout.
They're implying that they've come up with something new, which is a lie. That's not perfect.
As far as I can tell, this is the first keyboard with 1mm of travel that Apple has made. Before the recent butterfly debacle, their keyboards had as low as 1.2mm travel (on the iMac keyboard) or more. The much-maligned butterfly design got down to 0.5mm travel.
So this scissor design with 1mm of travel seems to be something new. The inverted-T arrow keys are not new, but the scissor mechanism itself is - a refinement of previous designs.
To be completely honest, they should describe the keyboard as "a lot less bad than the previous keyboards, but not quite as good as the ones before them".
The two sentences above clearly oversell the product: "refined", "delivers", "help you fly", plus all the details for something that is already known... Just because everybody does glorified marketing, which often ends up being deceptive, doesn't mean we have to be okay with it.
Does refined not mean that they took out some of the bad parts? That's about as close to an admission of a fuckup as you can get in sales copy.
Honestly, Apple's marketing has always been worse than other companies' when it comes to selling basic things as revolutionary, life-changing progress.
IMO, when you see a car company's 2020 model get 3 more horsepower or 1 more MPG, it's better to consider how difficult that was rather than simply expecting it. Even a slightly better toothbrush is a positive life change. We have normalized progress, but even seemingly incremental progress is making the world a better place.
Further, it can take revolutionary change to maintain incremental improvements. Filling HDDs with, say, helium has a lot of knock-on effects hidden by the spec sheet.
"We've gone back to where we were three years ago after a mistake, sorry."
I don't understand what the issue is with Apple trying to sell their products. It sounds like people here are upset that marketing and sales exists, and that they use language to try to make their products seem impressive, and that consumers aren't rational when it comes to buying things.
To be fair the butterfly keyboard was pretty nice after you got used to it but unnforrtunattelyyyyy itt enndddeed iin fffaiiilurree foorrr moosst off usss. I batted mine back to the apple store after 3 weeks. Thank goodness it was the Christmas no questions asked period.
> Ok. How should they communicate this? "We've gone back to where we were three years ago after a mistake, sorry."
Yes. We expect adults to own up to their mistakes, so why don't we hold corporations to the same standard and instead just accept corporate bullshit from them?
> "We've gone back to where we were three years ago after a mistake, sorry."
Yep this is the right move, and as someone else says, this would be worthy of respect.
I think we've all just about had it with corporate bullshit -- and to be sure that says more about this moment in time than anything else.
Apple is consistently guilty of blowing smoke up our collective asses. It would be nice if they could give it a rest and simply be honest. But here we are.
"Half truths are lies". The pp is saying they're lying because they didn't admit fault. What's wrong with a company trying to sell a product by outright honestly saying "yeah, we tried to balance usability and thinness, and we went too far, we heard your complaints, and we've gone back and refined our old designs". I mean, they can tell the truth the whole truth and nothing but the truth in an even more markety way if direct honesty is too much.
This seems like nonsense. Must they admit fault in every communication for the next three years? Must they use specific words while admitting fault?
There's nothing wrong with a company saying "we screwed up." Which Apple has done already. There's also nothing wrong with a company saying, "This keyboard is great," if it is in fact great.
No, they're not saying it's innovative or amazing, they are simply calling it a "responsive, comfortable, and quiet typing experience", which I guess is true.
Their (very) old keyboards are still amazing to type on. They went from high quality mechanical switches to bad rubber domes to okish rubber domes. The current magic keyboard is actually not bad, but I would still prefer the old alps switches.
I completely agree. My 2014 MacBook Pro purchased refurbished still has the best keyboard of all the devices I use. The Lenovo X1 I use for work is a close second.
What I noticed in the image that made me excited about this device is the function keys. If the 16in MBP had function keys, I probably would have purchased one already. I do wonder if the top spec of the new MBA is going to hold up to my usage though.
The whole world seems to be celebrating that JavaScript desktop turds run 10 times slower and consume 10 times more resources than 20-year-old native applications. So I guess everything is worth celebrating.
In the olden days, we complained about apps that used all our RAM. Nowadays, we complain about Firefox or proprietary web browsers using all our RAM, when in reality they're doing the best they can. The task manager can't show users who's actually to blame, so lazy devs get the glory while hardworking browser developers cop the flak.
Except that when I use the old apps that used to use all my RAM, now they don't because I have more RAM.
I don't blame the browser vendors (except maybe that V8 made JS juuuuuust good enough to make something like Node viable). They took a thing that ran slow for everyone, and they made it faster.
I do blame application developers for writing everything for the web because it's there. There are instances of folks doing better than most in Electron/JS land, but it's still nothing close to the native or even managed Java/C# apps of yesteryear.
Nope, that's exactly what I wanted to hear before buying. Rather than just saying they improved it, they explicitly pointed out it has the keyboard you want from the 16" MBP.
>MacBook Air now features the new Magic Keyboard, first seen on the 16-inch MacBook Pro.
That saved me from having to google "hey, is the 'new magic keyboard' the thing in the 16" MBP that I've been waiting for in the Air and 13" MBP, or is it something else entirely".
Some people just seem to choose a target for life (like the Favored Enemy of a Ranger from Dungeons & Dragons) and never give it a break, no matter what.
I suspect that even if Apple comes out with the best keyboard that mankind is ever going to make, some of you are still going to be angry about how they removed optical drives 4000 years ago.
- Apple knows that at the very least, some set of vocal people don't like the previous keyboard. They also know that many of their customers had to get repairs, even if they liked the keyboard. Those customers might understand that "butterfly = bad"
- They need to tell people that they've fixed the problem but don't want to do so in a way that says "the last product was bad" (so they can't just say nothing about it)
I think we should also place some fault on other manufacturers for just blindly attempting to do what Apple does without thinking: check out the latest XPS 13. They've implemented the same arrow setup as Apple's butterfly keyboards. And yet, I haven't seen a single review online that criticizes the XPS for this choice.
So much this - I struggled with my butterfly MBP for about 18 months, all under the guise of "it's a stout machine, the keyboard isn't that bad" or "I can use an external keyboard".
Then I grew tired of the MacBook fan noise when running Windows 10 and debugging with the Touch Bar. I ordered a Surface Laptop 3 and immediately realized how important a nice keyboard is to me. It's tactile, it's got enough travel, the keys feel nice. I type with fewer errors and I work faster. Anyone want to buy a 2018 MacBook Pro with 6 cores and 32GB RAM?
I think you're underselling how truly revolutionary the inverted-T arrow keys are. I hear they help you fly through lines of code, spreadsheets, or game environments.
So is this a not-broken keyboard design? I have the (now) previous-generation Air, and need a keyboard repair.
Would be seriously tempted to just buy a new one if I was confident the keyboard wasn't absolute garbage. Typing on it was fine when it worked, but it double-spaces, and a couple of the other keys are now wonky.
Never had a keyboard die on me before this -- Mac or otherwise.
This is essentially the same type of keyboard from before the crappy design you ended up with. I've had an MBP 2014 since release with that keyboard and I love it. I can't express enough how good it feels to type on (although that's partly personal preference).
I also tried out the new version of that keyboard on the recently-released MBP. It feels almost the exact same as the old one, just a slightly more shallow depth.
Conversely, I went from a MBP 2015 to the 16" model, and I think the "fixed" keyboard is still terrible. I've used all sorts of keyboards, and it's the first one where I regularly get doubled or missing letters when typing. Maybe I'm not hammering the keys hard enough?
I mean, this has a quad-core option, too. It's a no-brainer better machine.
I have the 16-inch that has the same keyboard mechanism as this. It's 100% an evolution on the previous design. While it is early, it's a proven design and there's no chance of it having any of the issues that the butterfly keyboard did.
I feel like my 16-inch is the computer I intended to buy in 2016.
If you can afford the upgrade cost after you sell your current one, you won't regret it.
I use my laptops for my business for 3 years under AppleCare and Joint Venture protection, buy the top of the line, and migrate the old one to the rest of the family. For children, the laptops are fine for another 5-7 years. The current laptop has the old scissor keyboard, and I'm waiting for Apple Stores to open again to pick it up from depot-sent-parts Genius-repair under an AppleCare that just ran out end of February.
Unfortunately, Apple will only replace the butterfly keyboard for 4 years after initial retail purchase of the laptop.
I dread the eventual breakdown of the new butterfly keyboard after the end of February next year, when I'm on my own. I hope iFixIt will start selling DIY repair kits, or the cost to repair at Apple makes such a frequent failure a non-starter (in which case I'll turn it into a fixed-place computer in the house, with an external keyboard).
Don't hold your breath on repair kits. It's a very difficult repair and you can't separate the butterfly keyboard from the top case. You have to replace the whole top case. The best you could maybe do is not replace the battery - but it's all glued. Good luck. I wouldn't attempt that sort of thing, and Apple doesn't.
I wouldn't be surprised if Apple bent that 4-year rule, though. But also realize that Apple will eventually stop repairing all computers just due to the passage of time. And eventually, from a financial perspective, it's more cost effective to find a working used computer (after heavy depreciation) than to actually do a repair, even if that repair is doable.
I do think that their replacement keyboards have better reliability than the early models. I didn't have any problems after getting mine replaced - I believe I used the replacement keyboard for around two years. But I still wasn't willing to keep the computer longer. I went straight from the 2016 to the 2019 16-inch MacBook Pro.
In any event, it was a nice opportunity for an upgrade. My 2016 was still worth $1000, and I ended up with 4 more cores, double the storage, and an absolutely massive increase in graphics performance. Using the education store and picking up in a state with no sales tax did a lot to make that price more palatable.
If you liked the stability of the butterfly switches, but the travel and reliability of the scissor switches, the new thing is really pretty nice, and is arguably an advance over both.
It's amazing how well received the "I'm cynical and world-weary" angle plays on Hacker News.
"This is the baseline. Apple have gone from bad to OK"
Apple's scissor keyboard is pretty broadly considered the best in the industry by a country mile. Their butterfly mechanism was a bad misstep (I mean, it was almost indistinguishable from my Yoga 720's, but that's compared to prior Apple keyboards), but saying that they went from "bad" to "OK" is just nonsense.
"and they're celebrating"
Advertising doing what advertising does. So brave on HN to point out that marketing is marketing-ee. Are you also telling me that the new car isn't going to make me an adventure seeking extrovert?
> Advertising doing what advertising does. So brave on HN to point out that marketing is marketing-ee. Are you also telling me that the new car isn't going to make me an adventure seeking extrovert?
Why is that acceptable? If you lie or misrepresent the truth in almost any other field, you get criticised. But when marketers do it, they're immune. That's just weird.
False advertising is, in fact, illegal. If in your opinion they've crossed the line into actual factual inaccuracy, it's your right to take legal action against them, or request that your state's attorney general do so on your behalf.
It is progress. Apple runs a monopoly, people seem to not be able to escape (I'm on Windows) and Apple has struggled to fix an utterly broken keyboard for 3 or more years. So, finally they did this on an entry model.
Even I who gave up on Apple thought, hey what a nice machine.
Imagine a country where a terrible leader comes to power, and the nation regresses for years. Then a new leader arises and reverses course. Does the country celebrate and boast?
This is below the baseline. I still don't understand why it's important for them to make compromises on the keyboard, which is an essential part of the notebook experience. A keyboard has keys, keys have a certain height. Get over it, Apple. I've tried the latest 3 generations of keyboards and the 2015 still comes out on top. It looks as if Apple's engineers are trying to fight the keyboard. MacBook keyboards were almost unquestionably the best among notebooks. Now we're happy if they're not crap. Of course this is all personal opinion, and I'm sure Apple tests these things extensively and only releases them if they make for a significant improvement. They wouldn't release a broken keyboard and deny they're at fault for years, right?
After using the latest Macbook Pro 13" for over a year I have recently had my 2015 Macbook Pro 13" repaired. Both are max specs. It was a $600 bill, but it has been so worth it. The keyboard just works, it doesn't run hot and the fan doesn't blast under the slightest load, the performance is much better, the battery lasts longer and the external screen + keyboard and mouse are detected every single time without having to re-plug the USB-C or open and close the computer lid. Also, no dongles required to connect USB-A or SD cards. Yes, it does look a bit clunky and not as elegant as the newer one, but seriously it actually just works.
I almost can't believe how much shit I put up with on a daily basis for over a year. If you replaced a dying 2015 Macbook Pro with a new one, I very much urge you to reconsider getting it fixed at pretty much any price. It is so very worth it.
Ha-ha, reading this from the 2015 MBP 13", exact same experience. I tried to move to a Dell XPS 13 on Linux 2 years ago; it did not work for me, but with the new line of Dells I am starting to think about repeating the experiment.
Apple's 2016-2019 laptops were pretty unusable; hope they revert the Touch Bar too..
A question: has anybody tried System76 compared to Dell/Apple?
Wow, this is a common theme.
I have a 2015 13" macbook which wasn't max specced out and I feel like I'm hurting these days RAM wise and HardDrive wise. I bought a super specced out 2020 XPS 13 2 in 1 because of reservations about the latest macbook pro 13 and I kind of hate it.
I think this is personal taste, but I just don't love the build as much, the trackpad, and ubuntu is acting pretty flaky. The camera and wireless chipsets are not immediately working right.
I have both. The 2020 2-in-1 XPS 13 (32GB, 10th-gen Core) I love because it's a beautiful 2-in-1, but yes, the camera and fingerprint scanner don't work at all - though only in the 2-in-1 version; they work great in the laptop version, which seems to have a different camera. Wireless works great, though, as does the trackpad. (Kubuntu 19.10, not LTS -- LTS didn't work very well at all for me.)
I also have a lower-spec'ed 2020 XPS 13 laptop (8GB, 10th-gen Core) that works perfectly across the board (except for the fingerprint scanner, which I wouldn't use anyway).
I love the hardware on both. Absolutely love it -- more than my Macbook Air; no sharp edge where I rest my wrists, touch screen, phenomenal screen (4k, which is higher than Retina, or 1080p, which gives better battery life) etc. The 2-in-1 has a taller screen shape, so you can fit more lines in your coding windows.
If you care about camera, get the laptop. It's also a little bit cheaper -- I got it for about $950 from Costco. The only thing I don't like about the laptop is that the screen doesn't tilt all the way back. The keyboard is better on the laptop instead of the 2-in-1 as well -- more depth of travel, and the Delete key is in the right place instead of offset by the fingerprint/power sensor as on the 2-in-1.
I love the new Dell XPS 13 and think both the 2-in-1 and the laptop are better pieces of hardware than the Macbook Air, and I prefer the new Kubuntu 19.04 over Mac OSX as well. (I don't have a MBP so can't compare to the Pro.)
I felt the same way as you about my 2015 13", so I reformatted the drive with APFS and did a fresh install of Mojave. Got a ton of storage back and it runs like a new machine. One of the best things I've ever done.
I think System76, Lenovo, or Tuxedo is a better choice than Dell XPS for Linux from a reliability standpoint. Additionally, if you plan to do full disk encryption, the XPS line has all sorts of issues. https://www.dell.com/community/Linux-Developer-Systems/XPS-1...
Still using PureOS, just so I know any problems are caused by me and not some strange hardware/software issue. It's a perfectly serviceable Debian-based distro, haven't had an issue with it so far.
Weird - I've got a newer Dell XPS and am making this post with WDE enabled and have had zero problems. Anecdotal, I know, but I've run pretty much every major Linux distro (Ubuntu/Fedora/Manjaro) and am currently running Regolith Linux.
I've not noticed any hiccups or problems at all around encryption.
I've run Linux on my xps 13 since 2015. Issues at first when the hardware was brand new and not supported in the latest Ubuntu kernel but excellent ever since. Fully intend to buy another if it ever dies, no worries about reliability here.
Tuxedos aren't cheap, but the coolest thing about them is they have a way to preinstall Linux and set up full disk encryption with LUKS, and you just set the encryption key when you first turn on the machine. I've never seen anyone else offer that.
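For reference, the manual version of what they're automating is only a few commands; a minimal sketch with an example device name (normally the installer handles this for you):

    sudo cryptsetup luksFormat /dev/nvme0n1p3      # set the passphrase here
    sudo cryptsetup open /dev/nvme0n1p3 cryptroot
    sudo mkfs.ext4 /dev/mapper/cryptroot           # or hand it to LVM
    # What a vendor preinstall presumably does is ship a default passphrase
    # and have you rotate it on first boot:
    sudo cryptsetup luksChangeKey /dev/nvme0n1p3

The nice part of the vendor doing it is that the bootloader/initramfs plumbing is already wired up, which is where most DIY attempts go sideways.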
I use a MacBook and a fully loaded (with GPU) System76 laptop. I really like both, for different purposes. The System76 is much faster for large builds and is obviously much better for deep learning training. The MacBook is light and portable. I like having two laptops, at extremes of the weight/portability vs. computational power spectrum.
I had a 2014. It is a great laptop, no doubt, but I recall that before several patches I had issues with USB 2.0 port connectivity. I had the screen replaced once. So it wasn't without its own set of bugs. It's working beautifully now.
My only beef is no NVMe, and it stutters a tad. If I could find a 2015 with a 16GB/1TB spec with NVMe...
I'm hoping the glitches in the 16" can be worked out, eager to see how the 14.1 will look...
and I know Apple kremlinology is never the most accurate way to look at things, but boy were the Mansfield/Forstall years great...
I tried dual-booting an XPS15, which did not go well [0]. I understand the 13's are more Linux-compatible, but I got so badly annoyed by Dell Support I'm never buying Dell again.
[0] the drive management software didn't like GRUB, and would complain that there was no drive if GRUB was installed. Repeated reboots later, it would suddenly say "oh look, a hard drive!" and boot into Windows, removing GRUB in the process. Dell Support were worse than useless at diagnosing the problem, let alone fixing it (still unfixed, the machine is now a rather shitty games machine)
Typing this from Fedora 31 running on a dual-boot, 4-month old XPS 15 (LCD screen, not 4K OLED). The only configuration change that has to be made on a new XPS is switching the SATA mode to AHCI. Fedora signs the kernel correctly--you don't even have to disable secure boot.
Signing 3rd party kernel modules under secure boot isn't difficult, but documentation on it is sparse. So I've kept my notes for next time: https://lawler.io/scrivings/linux-cookbooks/
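For anyone who lands here before reading those notes, the rough shape of it on Fedora (paths and flags from memory, so treat this as a sketch rather than gospel):

    # one-time: create a machine owner key and enroll it (you confirm at the next boot)
    openssl req -new -x509 -newkey rsa:2048 -nodes -days 36500 \
      -subj "/CN=Local module signing/" -keyout MOK.priv -outform DER -out MOK.der
    sudo mokutil --import MOK.der
    # per kernel/module: sign the out-of-tree module with the kernel's sign-file helper
    sudo /usr/src/kernels/"$(uname -r)"/scripts/sign-file sha256 MOK.priv MOK.der \
      /path/to/module.ko

On Debian/Ubuntu the sign-file helper lives under /usr/src/linux-headers-$(uname -r)/scripts/ instead.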
I don't know if that changed recently. I gave up on it a year ago. I went through all this (though with Ubuntu not Fedora), switched modes, disabled secure boot, read every web page I could find. Nothing worked. I'm kinda tempted to see if I can get it working now, but honestly, also kinda traumatised by the whole experience and never want to touch it again.
I'm not familiar enough with Ubuntu's installer capabilities, but Fedora's has made huge advances in the past 5 years. I went in to the setup fearing the worst (that is, the struggle that was doing a secure boot, UEFI Windows 8 + Fedora 20 setup in 2014), but was shocked at how good Fedora 31's install routine was.
I shrank the 1TB SSD partition in Windows, disabled the proprietary fake-RAID, then booted Fedora's XFCE spin from a USB key. The installer handled everything else--no having to manually tinker with the UEFI partition. Post-install, the XPS's UEFI correctly booted to GRUB, which had successfully detected the Windows bootloader (again, no tinkering like with FC20) and can boot either OS.
Disabling hibernate in Windows and figuring out how to mount the encrypted Windows partition in Fedora was all that remained.
The Linux Kernel 5.5 has a bunch of improvements for the i7-9750H (and other Coffee Lake processors), which is another reason to consider giving Fedora 31 a spin. :)
I had this pleasant experience the same time I was trying to figure out how to dual-boot FC31 on a touch bar MacBook Pro. The short answer is: 80% of the MBP's hardware doesn't work in Linux. Broadcom won't release drivers that are compatible with Linux and blame FCC rules for their reason. Insane. https://github.com/Dunedan/mbp-2016-linux
I suspect this would be due to mixing legacy and UEFI booting, where grub-legacy would interfere with whatever the disk management was doing and fixing it would involve removing grub.
For grub-uefi, it is just another file on an EFI system partition.
Possibly. I couldn't get a clear answer from Dell on what their disk management system was doing, so it got really hard to work out what was going on and where the problem was. I tried everything I could think of and everything the internet could suggest.
Oddly enough, I have a System76 Ibex Pro and a 2015 MacBook Pro 13". I'm almost a year into switching over (everything except Lightroom). The System76 is much bigger, clunkier, and not as well built. But honestly it's fine, the software is updated frequently, and besides a couple of times where the keyboard didn't come back after sleep (a sleep/wake cycle brought it back), the System76 has been a trooper.
Meh, the camera was on the bottom left of the screen. On every online meeting I looked like a parent talking to kids from the corner :D
Also, I had a Windows edition, which works pretty badly with Linux. A friend of mine got a new Linux edition with the fixed camera, and I think the perfect XPS for Linux is the 15" with 32GB. That's my alternative #1.
I'm still rocking my 2015 MBP as well, and having the luxury of working on newer models through various jobs, I'm still not convinced to upgrade 5 years later. That said, the new 16" MBP has been the best Apple laptop experience I've had in a while.
I have had every single MacBook Pro update since the 2015, and each has either been returned or sold off. They were simply crap. I can't believe they have been making that crap for 5 years. Although, judging from the 7 years of the Mac Pro, I guess the MacBook Pro already got better treatment.
In the end I stuck with my 2015 MacBook Pro. Perfectly sized trackpad with zero false positives. Most people put up with a few false positives on the larger trackpad, but I don't see any reason why it has to be this way. The keyboard felt way better; even the new Magic Keyboard on the MBP 16 felt exactly the same as the butterfly to me. The Touch Bar is junk, and it sometimes freezes, yet some people put up with it thinking it's a non-issue.
The old Apple and its users used to be perfectionists. Nowadays a lot of people settle for mediocre.
I just wish they would make a MacBook Pro Classic. Just throwing in a new CPU would do.
I wish I'd gotten this advice! I just replaced my 2015 MBP 15" max spec with a mid-2019 version, and that touch bar is annoying the heck out of me. At least it has a physical escape key, I guess (I keep hitting the F1/brightness controls on the Bootcamp side of things for everything).
Hah! I'm not sure what the situation is like for the current 15" MBP. The specs are quite different compared to the 13" one, so it may not be as severe as my experience. I think I'd try to hold on to that 2015 model just in case.
I'm in a similar position, having bought a 2019 13" Pro. I've got to say, I'm really happy with my 2020 ThinkPad X1 Carbon and won't be looking to upgrade again for a long time.
Or maybe a ThinkPad, depending on your budget. I normally use a maxed-out MacBook Pro, but I didn't want to bring a $5k machine on vacation, so I bought a $600 Dell laptop last summer. The wifi range/reception was dreadful: sitting beside my wife on the couch in the hotel, she was getting perfect reception on her $1000 MacBook Air while I sat there wishing I had spent the extra $400.
I'm not sure if it was the wifi chipset or the antenna design, and maybe some of the higher end Dells with better radio chipsets would perform better, but I returned the laptop to Dell and I was pleasantly surprised by how easy that was. At least they're doing something right. After that, I picked up a $700 ThinkPad T-series for travel and the reception is great on it.
Last year I upgraded my main development machine - a 2013 11" Air (yes, seriously) - to a new-to-me maxed-out 13" 2015 MBP. I do "heavyweight" local development - IntelliJ, Java, Typescript, React, Postgres, etc.
It's faster than my Air, mostly from going 8 -> 16GB of RAM, and the high-resolution screen is great. But it still feels pretty slow and IntelliJ can get sluggish at times. The MBP has a 3.1GHz i7, so at least on paper it doesn't seem materially slower than the current gen of processors. Maybe memory speed is the issue? 1867 MHz DDR3 vs 3733MHz LPDDR4X in the new Air?
I miss the 11" form factor, but I guess that's a lost cause.
The 2015 MBP bought me (cheap) a couple years while Apple sorts out their keyboard issues, but I'm already looking forward to a replacement. It's just ok.
I very much feel your pain about the form factor. Apple seems to be moving away from the smaller form factors, which is a shame. I like the look of the new 16" MBP but there is no way I'm lugging that around everywhere. Hmpf.
You would benefit from a 4-core machine. Even the dual core current models are a good 20% faster than what you have now, but you'd be amazed at how much better the 4 core models feel.
So happy to hear this! I just dropped ~CAD 800 to have the logic board on my late-2014 MBP replaced. I figured "it still works, why spend 3k on a new one?" Plus, the new ones have those stupid bar things (I have one for work that I never open because I always use a workstation).
I recently got a 2019 MBP for work (with the ESC key!!!), and I have to say that a lot of the things you are encountering with your 2013 I encounter quite a bit with this new one.
Overall, I found the speed of this model far exceeded my 2017 MBP from my last job, but it's kinda surprising how frequently the monitors / keyboard / mouse encounter connection issues.
That being said, I feel like I'm being a little picky, especially when I think of how nightmarish it was to get any of this shit to work with my Windows machines. It's the exception rather than the rule that anything would work the first time as expected (monitors / keyboards / mice / printers / programs).
> That being said, I feel like I'm being a little picky, especially when I think of how nightmarish it was to get any of this shit to work with my Windows machines.
It would be different if it had always been like this with Apple. The older models show that it used to work pretty much flawlessly. It is such a big step backwards and just because other manufacturers are crap at it doesn't make it okay. :)
I feel like I could live with the performance issues. But I'm doing a lot of switching between offices and having external screen/mouse/keyboard not working straight away is such a major pain. Even sometimes during the day after walking off with the laptop and coming back to the desk it doesn't work. It could be my specific setup but then Apple doesn't sell any official docking setups either AFAIK.
I honestly never understood the hate for the touchbar. It allows me to be much more granular with volume and brightness, and I never really used F-keys anyway.
Volume adjustment is actually a great example of why I hate the touchbar.
- I can't adjust the volume without looking at it. Because the touchbar is flat with no haptic feedback when I land on a button, it's hard to remember the exact position of the volume 'button' without looking. Sounds trivial - but combined with point 2....
- The way the volume control expands - it actually moves the 'volume down' button AWAY from your finger, which again requires me to keep looking at the control.
This means that when a loud song comes on, it can take 2-3 seconds in total to turn the volume down. I could do that with a single keypress in half a second or less on a real keyboard, without needing to look at it.
That can also be the difference between missing a key detail from a quiet speaker on a Hangout.
Flashy, but it's a terrible user experience by every metric other than looks, I guess.
You can actually just tap the volume icon on the Touch Bar and slide your finger back and forth immediately and it works; you don't have to tap, then move your finger to the volume slider and move back and forth.
(This is clever, but basically undiscoverable unless someone tells you in, for example, a comment on Hacker News, which is how I found out.)
This is assuming the touch bar isn't asleep and you can even see where the volume button is in the first place. Often I have to touch the bar once just to wake it up, then find the button and touch and hold and slide.... ech I hate it personally.
> You can actually just tap the volume icon on the Touch Bar and slide your finger back and forth immediately
No you can't! There is a pretty long delay. If you move your finger during the delay, nothing happens. Then when it finally decides to switch modes, you have to move your finger again for it to change the volume. Hope you didn't hit the edge of the touchbar yet. Combined with the phantom button presses when using the top row of the keyboard, especially the Siri button, plus other small issues, the whole thing is bafflingly terrible.
It may be because I'm using a 2019 MBP, but I can absolutely touch and slide to change the volume immediately. Just press and slide on the icon for volume or brightness.
Also pro-tip: you can change the buttons that show up in the touch bar. Settings > Keyboard > Customize Control Strip. I swapped out Siri for a "Sleep" button, which is super convenient when I walk away from my desk.
On my 2017 with Catalina there is an animation that occurs to show the volume slider. Any sliding of your finger that occurs before the animation completes is definitively ignored. Additionally, there is a significant delay before the animation even starts.
I just timed it at ~580 milliseconds, more than half a second from finger hitting the bar to the time when it stops ignoring touch input. It's easy to slide your finger more than the entire length of the volume bar in that time. It's absurdly bad. It would be weird and pretty lame if they fixed this only on newer models.
Just go all in with BetterTouchTool + GoldenChaos-BTT (https://community.folivora.ai/t/goldenchaos-btt-the-complete...) -- I don't know why Apple hasn't bought BTT and made this the default, it's truly the best way to use the Touch Bar and the reason why I miss it when I'm using any other keyboard.
I only hate that it replaced the top row of keys. If it were an addition instead of a replacement, I'd be okay with it. It has its moments, but so do the keys it replaced.
I fully agree with ESC, as that's a universal key and used very often, but the other ones are more specific and having them be adaptable always made sense. Now that they returned the ESC key, that part is solved.
As a developer, how would I step into, step over, step out in Xcode without function keys?? (Continue being ctrl-cmd-Y is the worst shortcut ever). It truly hampers my development because I have to look at the touchbar to see where on earth those keys are (F6, F7) or step in/continue in Chrome (F10, F11).
The escape key is back on the new 16 inch, but even on older Touch Bar Macs you can tap anywhere on the left side of the Touch Bar (doesn't have to just be the escape button area) and it will still work.
Different strokes for different folks, but I've never liked using the function keys for debugging. I just click the buttons on the screen. I'm a little surprised they don't have a way to set the Touch Bar buttons up to do that in Xcode though.
I will try the "left of the escape key" trick - thanks!
Moving the mouse cursor up to the toolbar always seems like a lot of travel and swishing around if you're hovering over variables to see their contents in the source code.
I have found the auto/local/all view in Xcode to be a bit dumb and unable to properly expand some template objects in C++ so it's all just an exercise in frustration anyway!
In Intellij IDEA when debugging the touchbar has a debugging-specific menu with all those controls. I don't find myself needing to look at the touchbar all that often. My muscle memory has adjusted over the past couple years I guess.
Want granularity? Just hold alt+shift while pressing the volume buttons to adjust the volume in quarter-box increments. You can do it without looking and it's way easier than moving that slider on that gimmicky touch bar. Works for brightness, too.
Much of the hate is that the touchbar wasn't optional, at least not unless you wanted to opt out of an Apple laptop. If the touchbar had been something users could choose, Apple users wouldn't have minded so much.
Supporting more options is expensive, so it's understandable that Apple didn't want to give their customers a choice. Still, it seems like a gimmick. And it appeared at the same time as the butterfly keyboard, cementing the notion that Apple had lost its way.
I appreciate the touchbar every day (esp. with bettertouchtool) but the soft escape is horrendous as it's used in so many of my workflows and isn't 100% responsive and doesn't give any tactile feedback.
It doesn't add any benefit to my experience. I'd prefer real keys that I don't need to look at. I could hit volume up/down easily on the previous models.
Using Terminal, I use the Esc key a lot for navigating and having a touch bar Esc key is not a great experience since you also don't feel feedback that you're touching the right key.
I've also accidentally hit the touch bar a few times while hovering one of my fingers above it as I press down on one of the number keys.
You can be just as granular by using shift + alt + volume/brightness. That way your changes will be in 0.25 step increments rather than the default full steps.
Keyboards are meant to be used without looking at them. With the introduction of the touchbar, you have to look at what you're pressing. It's like a giant touch screen in a car: it works, but you have to look at it, whereas with buttons you can find what you want to do by feel/memory.
On a personal note, I've randomly refreshed webpages because I've overreached on the number row with the touch bar.
I hope I didn't get permanent damage, but I hurt myself badly with it. I was trying to turn the volume up a bit while wearing earplugs (it was very low), so I pressed the volume "up" key. I accidentally pressed a few pixels to the left of where I should have, and it went to FULL VOLUME without warning, blasting audio and hurting me badly.
This could not have happened without the touchbar. This is horrible UX and I will never trust that (work) computer again.
Because wasting energy on a 4 GHz base clock is completely pointless (and so is your criticism) if you can adjust frequency as needed. 99% of the time 1GHz is sufficient. It’s only when you launch the browser that a faster CPU is useful, not when you’re reading through a website
> Because wasting energy on a 4 GHz base clock is completely pointless
That's not how base clock works. Base clock is not min clock. Base clock is what the chip is "guaranteed" to hit under sustained load if the TDP is respected. It's the TDP clock. A 4 GHz base clock CPU will still be far, far below that when idle.
The problem is that it ramps up in 250 MHz increments over a period of several seconds, and that can be extremely noticeable in some workflows.
I went from a 6700HQ (2.6 GHz base) to a 10710U (1.1 GHz base) and the difference is definitely there; it's jarring enough that I kind of regret it. It feels like a huge step backwards, despite the latter CPU being four generations ahead.
The first thing I notice in that comparison is that one chip is rated for a 45W TDP and the other for 15-25W. While I think these cross-segment comparisons are exciting and show great progress, it's just not fair to the electrons.
This does not seem to be the case for my i7-8565U or my older m3-something. The 8565U, for example, feels just as snappy as my desktop i7-6700, except it'll throttle after about 30s.
With modern notebook CPUs the base clock is only a loose indicator of how a CPU will perform. The CPU will still be downclocked and undervolted depending on the load.
Definitely going from 2015 MacBook Air to this new one for my personal at-home coding laptop, as long as I like the keyboard when trying it out.
I had really been wanting to upgrade for Retina & better processor but I knew they would upgrade the processor and fix the keyboard if I waited for 2020... no reason to wait now.
I don't run any crazy fat Docker stacks for my own stuff at home, so this is perfect.
Why not the Pro for a few hundred dollars more? Or wait for the upgraded version in the summer? You get a noticeable performance boost, a dedicated graphics card, the touch bar, and so on.
My personal laptop is an 11" Air mid-2013 and I still use and love it. I especially love the keyboard on it because the keys have height, feel closer to a mechanical keyboard, and don't capture as much dust and dirt as the flat keys on my newer touchbar 2016 Pro work laptop.
This is good news from Apple, as I was not into any of their more recent laptops, but I'll probably upgrade to this one.
I only wish they had an 11" version, but that's not a deal breaker.
You actually make use of 4 USB-C connections? I'm genuinely curious about the use case. These days you can get 12-in-1 dongles from China that cover the last 25 years of input device standards in one USB-C connection, and it will charge the thing.
Why though? I find buying one $30 dongle that has 10 inputs for cables I already own is a better deal than buying 10 $20 USB-C cables. As a Mac user I've been used to dongles for a while; Mac laptops never seemed to have the standard AV out aside from that fluke generation with HDMI. Always some weird connector for the sake of being weird, it seemed.
I can't tell if you're joking, but I have a 2018 Mac Mini with just 8GB of RAM, and I often run Eclipse, IntelliJ, and PyCharm at the same time (along with multiple browsers and other stuff), and performance is fine.
I was actually surprised by this--when I first started using this computer, I thought for sure I would need to add more RAM, which for the 2018 model is too complicated to do yourself (at least to me it seemed too risky).
Semi-joking, but the problem is real for me. I've a 2013 13" MacBook Pro with 8GB RAM, and my system can't cope with my workflow ... tens of tabs in Safari, webapps in Chrome (YouTube, Google Docs, ...), Eclipse with Scala / Java, ... it's a huge struggle.
I was handed a 2017 MacBook Pro with 8GB of RAM at my current job while waiting for my actual laptop to be delivered, and it was a nightmare.
I keep a lot of tabs open to look things up, but nothing excessive on that machine. I also run VSCode or Pycharm and would also bring up 5-10 containers at times.
It seriously hurt not only my productivity but also my mood afterwards just by having to put up with it for weeks.
Unless you're a very basic user I don't get why you would settle for 8GB in 2020. 8 gigs of RAM cost basically nothing, it's not worth changing your workflow in the slightest to work around that artificial limitation.
It is odd because these memory issues are very real, but if you ever say "wow, devs are getting lazy and these 'desktop' apps that rebundle Chrome are really killing my machine (eg. Slack, Skype) with inordinate quantities of logic in javascript" you get shouted down.
It's bizarre. If everyone used the native toolkits we'd have far less memory usage and everyone (even the memory-constrained) would have a good experience.
Also, these memory hogs do a lot of allocation and deallocation, which is also a problem with interpreted languages. Allocation is the enemy of speed and of energy usage; it'll destroy your daily battery life as everything gets interpreted.
I remember feeling the same when I was forced to upgrade from 32 MB of RAM to 128 MB of RAM to run the combination of browser, chat, and IDE on Windows NT4, back when software moved from hand-optimized assembly to mass-produced C++.
With every layer of abstraction added to ease development, the hardware requirements go up. You can build things fast, or you can build fast things; doing both is tricky.
I think OSs in general just eat a portion of whatever memory you give them. Right now I'm puttering around with a dozen tabs in Firefox; in fact, my biggest memory hogs right now are Firefox at 3.5GB and Apple Mail at ~500MB, not really doing anything else, and somehow 12GB of 16GB are in use. Better for the UX to keep things open in memory if you have it to spare, I suppose.
When you are memory constrained, you can definitely tell: everything comes to a halt and you just twiddle your thumbs between commands. This 16GB machine I have shipped with 4GB, which was painful even 8 years ago when it was released, and I upgraded it myself to 8GB 6 months into ownership. A few years later, when JavaScript became more pervasive on the web, I hit the memory ceiling on 8GB a lot just from having tabs open in Chrome, back when it was perhaps more of a memory hog, so I opted for 16GB and haven't had issues since.
I think at 16GB you should be set for at least 5 years. Most people, even a lot of devs on company-issued equipment, are working with 8GB and complaining about it right here in this very thread.
If you have larger requirements, a lightweight, thin laptop with a teensy fan isn't for you. Even if it had the hardware specs, the physics of heat dissipation don't work for you, and you are better off spending the same money on more hardware sitting in a box under your desk. Me and my sore back are eyeing this up; all my computing is done on a cluster anyway.
Same. I have a MacBook Pro (Retina, 13-inch, Early 2015) with 8GB of RAM and I've been doing fine. Sure, there are some hiccups every now and then but it works. I have Spotify, VS Code, Slack, Kitty tmux sessions and more open 24/7.
What? It's probably swapping like a bastard, which with SSDs is probably not that horrible. Even 16GB for me is low (I do run in a Linux VM guest). I got myself a Mac Mini with 64 gigs, for great justice.
If I open up PyCharm and IntelliJ and Spotify and SourceTree and Docker and three different browsers and iTerm and Remote Desktop and a few other apps all at once, I will get an occasional hiccup, but it's really not as bad as I would have expected. I think 16GB would be nice though.
For comparison, I also have a 2012 Mac Mini at home with an SSD and 16GB RAM, and it's still chugging along pretty well too, although it's noticeably slower than the 2018 model with 8GB RAM.
I'm curious with regard to swapping if that might mean my SSD is going to wear out sooner. Maybe investing in more RAM would be worth it even if I don't feel like I need it.
Exactly. I'm deciding between 32GB and even 64GB, just to be on the safe side. Because nowadays you're running Slack, Spotify, several messengers, Firefox, Chrome, IntelliJ, Docker, and Kubernetes on your local machine.
Which is kind of horrifying, if you stop and think about it. You're wondering whether you need another 32G of RAM to run a basic working environment, a glorified text editor, and some communications software. I used a BBC that could do that in 32K of RAM in the 1980s! Obviously I'm not really suggesting the functionality today is equivalent, but the idea that ultimately you're meeting the same basic needs yet it now takes a million times as much space is... unsettling.
It's certainly amazing how much memory consumption has grown. I like to think of it in terms of economics: we could never write today's software using 80s methods. Slack in assembler? Impossible. Kubernetes in C++? Maybe, but there will be security holes, and Go is just more productive. Developers are expensive, very expensive.
Such is the accepted wisdom in much of the industry, but I'm a bit of a sceptic on this score. Of course developer time is expensive, particularly if you're in somewhere like the Bay Area where salaries are an extra 0 compared to most of the world. But we live in an era of virtualisation and outsourcing (sorry, "cloud computing") when businesses will knowingly pay many times the cost of just buying a set of servers and sticking them in a rack in order to have someone else buy a much bigger server, subdivide it into virtual servers, and lease them at a huge mark-up. All kinds of justifications have been given for this, many of which I suspect don't stand up to scrutiny anywhere other than boardrooms and maybe golf courses.
There's a nice write-up somewhere, though regrettably I can't immediately find it, of the economics of cloud-hosting an application built using modern trends. IIRC, it pitched typical auto-scaling architectures consisting of many ephemeral VMs running microservices and some sort of orchestration to manage everything against just buying a small number of highly specified machines and getting on with the job using a more traditional set of skills and tools. Put another way, it was the modern trend for making everything extremely horizontally scalable using more hardware and virtualisation against a more traditional vertical scaling approach using more efficient software to keep within the capacity of a small number of big machines. The conclusion was astonishingly bad for the modern/trendy case, to the point where doing it was looking borderline insane unless your application drops into a goldilocks zone in terms of capacity and resources required that relatively few applications will ever get near, and those that do may then move beyond it on the other side. And yet that horizontal scaling strategy is viewed almost as the default today for many new software businesses, because hiring people who are good enough to write the more efficient software is assumed to be too expensive.
We live in a world where any one-man startup thinks, and their investors hope, that they will have 10k employees by year end. Therefore, if you are going to be burning money anyway, what's another line item on the monthly outflow if it means you don't have to spend 3 months hiring someone to toil in the server room and a couple of months ordering and assembling a farm that might crash the day your startup gets linked on Hacker News?
There are technical reasons for this, like being able to handle sudden load, but mostly it's ideological. We aren't building companies, we are building stock pumps disguised as the utopian future. If you are wondering what a blue-chip company looks like in tech, they are the ones that own their own infrastructure.
Maybe there is a middle road for cash-poor companies, where you keep latent demand in-house for the sake of cost and sense, but have some sort of insurance policy with a cloud service to step in if demand surges.
We don't have the same basic needs. People are running VMs on their laptops so they can test things in an environment similar to production without having to run extra servers for every developer/sysadmin to test with. Back in the 80s your QA and production environments were very likely the same!
I'll admit that modern text editors and communication software have grown resource hungry, but a lot of that comes from being able to deliver a strong, cross platform experience. I remember desktop Java doing much of the same with just as bad resource usage. Same with applets.
> People are running VMs on their laptops so they can test things in an environment similar to production without having to run extra servers for every developer/sysadmin to test with.
Sure, but that immediately raises the next question of why those VMs are so big...
Yeah, the fixed cost of a VM context is on the order of kilobytes in the host kernel, megabytes in the guest kernel. And with VM balloon paging a guest VM acts much like a regular process in terms of memory usage. It's not VM usage that hogs memory, it's the applications, regardless of VMs.
Why does it matter when you can afford that RAM? Just buy it and forget about it; it's cheap enough. We used to land on the moon with a CPU less performant than Apple's HDMI adapter cable. It's a fun comparison, but not very useful: that's just the way things are and it's not going to change anytime soon.
I realise it's how things are today and not going to change any time soon, but it still feels like we as an industry have moved all too easily in a very wasteful direction. Sure, with RAM you can just buy more, but it's symptomatic of a wider malaise. Other capacities, particularly CPU core speeds, have long since stopped increasing on a nice exponential-looking curve to compensate for writing ever more layers of ever more bloated software in the name of (presumed) greater programmer efficiency. It just feels like we've lost the kind of clever, efficient culture that we used to have, and I'm not sure we weren't sold a bill of goods in return.
I'm not sure whether the curve is still exponential or not, but it's there. Single-thread performance is increasing a little every year, and core counts are increasing like never before. A 16-core consumer CPU is not a dream anymore.
RAM sizes slowly increase as well. 4 GB was enough 10 years ago. 8 GB was enough a few years ago. Today I would suggest 16 GB as a bare future-proof minimum, and one can buy 64 GB for a reasonable price.
We still have room for more layers. And it's not only about efficiency, it's also about security. Desktops are still not properly sandboxed; my calc.exe can still access my private ssh key.
Once performance growth really stops, we will start to optimize. Transistor density will double every few years until at least 2030, and AFAIK there are plans beyond that, so probably not soon.
I have 8GB on my work laptop with almost all of this (except Kubernetes, but I fail to understand why you would need a local Kubernetes) and it's fine; I usually have 2GB of free memory.
Don't exaggerate your memory requirements; you would be more than fine with 16GB.
That's not even close to an exaggeration. I'm running only half those things (or their competitive equivalents) right now on a Windows box. I just checked and I've got 14.8 GB in use.
Fortunately, I have a Dell XPS 15 with 32 GB of RAM, but the second I start up a single VM, one more messaging app, a small handful of Docker containers, or any IDE (of which I'm running none right now), I'm going over 16 GB.
Realistically, most of us on HN probably need around 20-24 GB, but laptops don't come in those increments.
I develop for a living. I use 6 GB including a browser, a VM and an IDE.
Some of you greatly exaggerate the needs. Some workflows require 16+ GB of RAM, but most people complaining about RAM mismanage it or do not understand that caches are not mandatory.
Right now on macOS I'm running Firefox, Outlook, 2 VSCode instances, Postman, 1 Electron chat app and another chat app and I'm under 5GB. Uptime 4 days.
I'm still using a 2015 MacBook Pro that I'm clinging to, despite the screen having developed an annoying flicker every few minutes. I know 4 others in my office with Apple laptops, and they're all 2015 MBPs because until now, nothing else has been acceptable.
This seems to be like an answer to the "Can we just have the 2015 MBP with updated hardware?" question. Pending benchmarks, this could be the light at the end of the tunnel for a lot of users that don't need more than 16GB RAM (I'm still scraping by with Docker on my 8GB Mac and a lot of SWAP).
I love my 16" MBP, aside from the dongles situation (though I rarely need peripherals at all). It has the same keyboard mentioned here; feels great to me. The Touch Bar doesn't really bother me.
Recently upgraded from my 2015 15-inch MBP to the new 16-inch MBP (32 gigs of RAM). Overall it's a big upgrade - the higher resolution and bigger screen are awesome, and my Xcode builds are compiling faster. That being said, I hate the touchbar for a few reasons:
- Having to look at the keyboard to change volume is not an upgrade in user experience
- The touchbar freezes for me once a week or so and I have to go to the command line to kill the process
- The icons don't make any sense - (Xcode dustbin, anyone?)
Also, maybe it's because I am a rock climber and my skin gets roughed up, but the fingerprint sensor has never worked for me on any Apple product, and it's a waste of space. All in all though, the faster computer and bigger screen are worth it for me.
> Recently upgraded from my 2015 15 inch MBP to the new 16 inch MBP (32 gigs of ram)
Hah, that's exactly the upgrade I made except from a 2013
> Having to look at the keyboard to change volume is not an upgrade in user experience
Agreed; I wouldn't call the touch bar an upgrade, just a very tiny downgrade that doesn't matter much with the Escape key now separated back out
> The touchbar freezes for me once a week or so and I have to go to the command line to kill the process
Yikes, can't say I've had that happen :/
> The icons don't make any sense - (Xcode dustbin, anyone?)
You can turn off the application-specific touch bar buttons in your preferences and also customize what buttons appear. I have mine configured almost exactly like the old button top-row, except with an optional volume slider and a dedicated sleep button (which is nifty). But yes, just having the physical buttons instead would still be slightly preferable.
> the finger print sensor has never worked for me on any apple products and its a waste of space
It's just part of the normal power button, though? It doesn't take up any extra space
You can also go even further and do basically whatever you want with the touch bar if you use third-party tools like BetterTouchTool. Making your own buttons with your own icons, mapping them to scripts, whatever. I haven't gone that far myself but it's an option
...and yes, it's the same layout of the arrow keys that was there 14 years ago. If they were advertising new full size arrow keys that would definitely be worth mentioning, but I'm not sure why they had to draw attention to that particular aspect... especially given that half-height arrow keys are one of the more annoying things about laptop keyboards in general.
I still mis-hit one of the arrow keys about 20% of the time with the annoying 'full size left and right' arrow key arrangement. Going back to inverted-T makes perfect sense. It was an experiment that failed.
It was an awful experiment that should have never left the lab. What could possibly have been the thinking for the entire butterfly/touchbar line, "People should be looking at the keyboard more when they type. Also, there's not nearly enough wrist pain going around. I wonder what important part of the Macbook we can ruin to address both of these problems."
They're drawing attention to it because so many people complained about the old layout. For me, the arrow key layout was the sole reason I did not buy a new MacBook Pro in 2016, and one of the reasons I stayed away for 4 years.
This is as close as Apple gets to saying "we fucked up".
Good. I'm hoping this will hurry along Dell's new XPS 13 Developer Edition configs with 32 GB of memory, a 1080p display (I prefer battery life over 4K), and Linux. I have been itching to throw money at Dell since that was announced in January, and here we are. Granted, the pandemic obviously hit their supply chain, so I can understand.
If you like the XPS 13, I also recommend to look out for their "business laptop" Latitude 7xxx series (12", 13", 14"). They tend to have more ports, physical buttons on the touchpad, and slightly better keyboards. YMMV
Be warned about the whole DisplayLink display driver drama though.
Go for one with ports doing both Thunderbolt 3 and USB-C.
Docks that use DisplayLink drivers for video are absolute junk on both Windows and Linux. I'm actually switching to an older 5450 to get back the ease of having dual displays over a PRO3X dock at my desk, plus a whole host of peripherals (keyboard, mouse, etc.).
Nice. Thanks. I do indeed want more ports than the current new XPS 13s. Looks like they come at a premium. 8th-Gen CPUs (vs. 10th in new XPS) with a few more ports for an extra few hundred bucks? I will keep looking though for their latest model. Thanks again.
No prob. Another option for an ultraportable with good ports, a 10th-gen CPU, and touchpad buttons (my personal obsession lol) is the Vaio: https://us.vaio.com
It's pricey but their SX14 is almost surely my next laptop. I'd buy one but they don't sell directly to Canada, and -- as of the border closure this morning -- I couldn't even go to the US to get one if I wanted to! Crazy times indeed.
(2019 Darter Pro) 4-6 hours of web browsing/VSCode + light Docker; it depends a bit on what all I'm running. That's been fine for me, but I wouldn't get it if you're in a scenario where you're generally unplugged for a whole work day / international flight.
4-6 hours would be doable. Thanks! I kind of wanted to stay on the 13-14" side but this looks like a nice machine. The smaller System76 models just have too small a battery for me. Sounds like you had the same concern.
Same thoughts. I have the 2018 MBA and the keyboard is a deal breaker; I regret buying it, and this is probably the last laptop from Apple I will ever buy.
I bought the 2018 MBP and also agree. The keyboard has been a disaster. I've had it replaced 4 times now and won't be looking back at their laptops anytime soon. The resale value has completely gone away for a $3500 purchase.
I was looking at this (2017 13" MBP with 512GB) - they offered me £380 for an 18-month-old machine that cost me nearly £1700!
I bought the Mac as an investment when I was made redundant, hoping to shift careers.
Sold it for £800 privately (worst loss I've taken on any item), and now I think I'd be lucky to get £700 for it, as the price plummeted further once the 16" came out and people started expecting a 14" with a fixed keyboard.
Currently using a mid-2015 15". Better than the 2017 in every way, other than appearance.
They should have offered a lot more than a price-gouging resale value for it, imo.
Your trade-in would cover over 1/3 of the price of a brand new machine. That's a fairly significant discount.
If you used it for a year would you say you got $959 of value out of it in that time?
If I purchase a new car, I can either trade it in to the dealership for the lower Kelley Blue Book value and avoid having to interact with eBay or Craigslist lurkers, or put up with strangers haggling and try to get more by selling it privately.
Yep the resale value for the current MacBooks are going to be junk, now that new ones are out with decent keyboards. I wouldn’t consider upgrading my 2017 MBP based on performance, but the keyboard literally hurts my hands so I’ll probably replace it.
There are tons of people like me who will be flooding the used market and driving down the price.
> 720p FaceTime HD camera
> 8GB 3733MHz LPDDR4X memory
I know cost usually isn't the top priority for people buying Apple products, but both of these starting specs seem pretty disappointing considering the price and for 2020. Perhaps someone realized keeping the cam resolution down is an effective method of capping FaceTime call costs. I will say, I am happy to see the transition to LPDDR4.
When the MacBook Pro 16 premiered last year (also using a 720p camera), I read a comment here on HN saying that nobody produces higher-resolution sensors that can fit into such a slim form factor (the notebook lid is slimmer than even the thinnest smartphones).
Their competition, like the Dell XPS 13 2020, also seems to still use a 720p camera.
> nobody produces higher resolution sensors that can fit into such a slim form factor
That seems like a very flimsy excuse. These seem to be the exact same cameras they've been shipping since 2011 and nobody would excuse them for using 2011 era cameras in their phones (I remember being disappointed in the quality back in 2011). Apple has added bigger and multiple camera bumps to their phones to accommodate space. That seems like it would work even better in a laptop since it doesn't have to lay flat on a table.
> Dell XPS 13 2020 also seems to still use 720p camera
It's not just the resolution; the quality and low-light performance haven't changed in Mac laptops at all. I don't think the XPS is a great comparison since they had the camera below the screen for years (I think they moved it in 2020). Having the camera positioned so low means you literally see fingers on the keyboard and get a very unflattering angle. So in my search it was eliminated very early.
I'm guessing it's just not a high priority for most purchasers even as remote working increases. The Dell XPS 13 reviews barely mention the camera and I couldn't easily find a sample from the camera. I've been working remotely, so it has been a huge priority for me. I've also been FaceTiming and find using a laptop is easier than holding a phone.
Totally with you on the 720p camera.. here's a comment that was posted when the new 16-inch MB Pro was announced concerning the same topic: https://news.ycombinator.com/item?id=21524522
Anyone here who has experience working with JS/TS development tools (vscode, yarn, webpack, Node.js etc.) on a recent MacBook Air? I wonder if the performance is noticeably worse than a 13” MacBook Pro for this type of work.
Up until late 2018, my MBA 2013 was doing ok with a stack transitioning to full js/ts. Now it’s close to unusable as our projects build multiple containers, load hundreds of node packages etc. Unfortunately, the 2019 MBA isn’t much better. I’m still wondering whether to hold until a new MBP 13/14”, or give this a try.
That's correct - not building/re-building the containers constantly. However, the initial build takes forever and some containers tend to crash (e.g., Prisma). Plus, in our case, where we're using Angular, recompiling the app on change sends the fan into overdrive and I get noticeable lag when switching applications. Again, this is on a 2013 MBA, which is almost ridiculous, but I'm that type of person. Still proudly sporting my iPhone SE. My development partner is on the 2019 MBA and has noticed an improvement, albeit a rather underwhelming one.
As an aside, I haven't taken the time to pinpoint the exact problem, but as much as I prefer to use Firefox, it doesn't seem to do a great job of managing resources for the SPAs we're building.
Make sure your node_modules directory is mapped to a local volume inside docker (not synced to your Mac). This dramatically improves build performance in Docker for Mac. If using docker compose, try this out https://stackoverflow.com/questions/29181032/add-a-volume-to...
JS build times? I guess that I haven't touched JS development in a while... My Early 2011 MBP (with SSD and RAM upgrade to 16GB) still rocks with intensive information retrieval and other big data tasks (Go, Java, Python).
This is a dumb question, but you're talking about using Docker, right? Doesn't Docker spin up an entire VM for containers? That does seem pretty heavy for an MBA of any generation?
It's almost impressive that Docker-on-macOS works so seamlessly that people are downvoting this comment. Yes, Docker containers run in a Linux VM on macOS. It's just one though, and generally seems fine on low resource workstations, unlike Vagrant setups from a few years ago that tried to run a VM for every application.
> Doesn't docker spin up an entire vm for containers?
(It generally does not. Instead it uses more lightweight isolation primitives. That has always been one of its main selling points.)
EDIT: see reply
> That does seem pretty heavy for an MBA of any generation?
It very much depends on what you are running inside the containers. I regularly have a dev environment with ~8 relatively lightweight containers running on my Early 2014 MBA without any problems.
kasey: For deployment that makes sense, but for development, that seems like a lot of work to be happening.
If you change a small bit of css, the whole container is rebuilt? That seems super inefficient to me?
My backends aren't JS, but I still couldn't imagine rebuilding a container every time I changed a piece of code. Right now I change a file, Django picks up the change and reloads. No noticeable delay.
This has got nothing to do with JS. Plenty of devs (me included) work on frontend and backend JS without containers and everything is instant for us. OP has a special use case or build pipeline necessitating such complexity.
Optimizing your local workflows is worth the effort at every team size.
I think the attempts to make local workflows mirror production 1:1 (local k8s, for example) are misguided. They miss too hard on the productivity story. Especially once you start having to swap out local debugging for remote debugging (which only a handful of languages support effectively).
This is the biggest miss for many companies striving for microservice architecture. The local development story is terrible, and it stays terrible, without a lot of opinionated decisions around what your dev workflow needs to look like past the ~3-5 services mark. At a past gig I had to change 5 different repositories (5 separate PRs, all needing approval) to send an email. In a rails app that would have been a trivial task.
> At a past gig I had to change 5 different repositories (5 separate PRs, all needing approval) to send an email. In a rails app that would have been a trivial task.
That sounds like no thought was put into the system before development. Well-thought-out scopes & contracts between services should not create a situation like that. Reminds me of a commenter on HN who was rolling back changes to all microservices if one service needed a rollback. These kinds of issues are dead giveaways that devs are paying lip service to "microservices." Also, devs need to understand that they don't need to be on microservices or on the latest fad. Use what works for the team. A 10-dev team cannot necessarily adopt the workflows of a FAANG.
5. (Something in the monolith to trigger the workflow, probably?)
This wasn't an entirely small engineering org - 300-400+ devs?
I agree, though - that is the point. I think folks don't put enough thought into the development story when going this route. I think monorepos are the only sensible way to go about it once folks have cross-service work to accomplish, and monoliths are a _good_ way to build applications for longer than folks give them credit for. That said, some monoliths are more productive than others. Slow startup/compilation can destroy productivity.
While this MBA turned out really well, current Windows with WSL2 is quite impressive too. Having a full Ubuntu or Debian on your machine is priceless. Or you develop on a remote server, in which case you just need a good terminal. So currently both macOS and Windows are good for JS dev. It's a matter of taste and price.
Do you mean the start menu? I never open it, just use Win + number for pinned apps.
Windows is not perfect, but it excels in many regards nowadays, and in the last few years Macs were really expensive and risky. They were too expensive to just have a fall-back machine lying around (if you make serious money with your gear, every hour counts). I have 3x 2.5lb notebooks here; if one needs to be sent in, so what.
Edit: just opened it but there are no ads... what do you mean?
Regardless of using the start menu or not, there's a lot of background tracking (for user behaviour and ads) that needs to be disabled by registry edits. In addition to those, there's probably more that cannot be disabled.
Replacing /Windows/System32/Drivers/etc/hosts isn't black magic and truly works wonders even for the more paranoid ;)
What I don't understand about this, however, are people being paranoid about W10 while they're happily accepting Google and friends to infiltrate their lives ¯\_(ツ)_/¯
You are of course exempt if you don't use any cloud-based services that you don't host yourself :D
> Regardless of using the start menu or not, there's a lot of background tracking (for user behaviour and ads) that needs to be disabled by registry edits.
That stuff also has a nasty tendency to ""accidentally"" get re-enabled after updates from many reports I've heard.
Last fresh Windows install I did came pre-loaded with a bunch of loot box style gaming crap, Candy Crush, Xbox stuff, Netflix, Amazon Prime, loads of other foistware that should not exist on a fresh install.
A new Windows install requires at least an hour of de-cluttering and shit removal, while for a new Mac full setup to my liking takes about 5-10 minutes and I generally don't have to uninstall much of anything. The only thing I sometimes remove are apps I never personally use like Garage Band to save a bit of disk space, and that takes like ten seconds.
Then there's the almost Android-level of obvious spyware-style telemetry going on. Yes, I know Apple has telemetry, but I trust them quite a bit more than Microsoft both not to do anything deliberately sketchy with my data and to be competent with their security. They also do a lot less telemetry, and what they do is a lot less invasive. Redirecting local search, sending file list dumps, etc. to the mothership is unforgivable unless I have explicitly opted into that kind of behavior, e.g. for tech support or debugging. That's the kind of stuff I associate with borderline malware, not an OS by a supposedly reputable company that I paid for.
Microsoft's recent behavior is pegging them at the low end of the market and ceding the high end to Apple. Then of course there's Linux. If Apple did something barking stupid that forced me to ditch the platform, that's where I'd go. I'd miss the Apple degree of trouble-free operation, but at least I'd keep my privacy and security and lack of foistware.
I asked specifically about ads because I don't have any, so I'm curious why others do. I don't need generic Windows criticism, as I have been using it without problems for what feels like an eternity, so I don't really give a hoot if somebody else doesn't like it. I have a whole bunch of Windows and Linux computers (laptops and desktops/servers) and am happy as a clam.
"Ceding the high end of the market to Apple": I won't go into much detail, but for example my gaming-grade laptop runs circles around a Mac that costs twice as much. So sure, I'd rather be a "low end" peon with decent hardware.
I just looked at mine: no ads. Anyway, I don't use the start menu to launch apps. I just hit Ctrl+Esc or the Win key and type the name of the app; 2-3 letters is normally enough to auto-fill the result and off you go. I can't remember the last time I used the mouse to hunt for apps in a menu.
Even Chrome OS is pretty great. I found Crostini to be wonderful (esp. in v82), and the Pixelbook Go feels fantastic to me haptically. The keyboard is perfection.
Webpack, a very, very large Angular project with over 2500 Jasmine tests. I'm using a 2017 Air i5 8GB (first gen of the new design); I came from a 2016 MacBook Pro 15 i7 (so the other end of the spectrum).
Node performance has gotten a lot better since some updates; there was a period where CPU use and performance were just bad. I think it was a webpack bug. But now things are pretty good! It runs Sketch and other graphics apps not too badly.
As a comparison, it seems to be much faster at compiling and running tests than running the project in Chrome OS Crostini on my Chromebook Pixel LS (higher wattage, older, i7 + 16GB RAM). I imagine the quad core in this would smoke the LS.
It's perfect from a performance point of view.
I own the last generation (so no scissor keyboard) and I work the whole day in VSCode with JS/TS, Python, Go, Rust, etc.
I have a 2017 MacBook Pro and the main project I work on brings it to its knees. This is a front-end project with a Node.js GraphQL server, TypeScript, ESLint, VS Code, and webpack. I think the linting is actually the worst CPU drain, but all in all, with the GraphQL server and front-end tooling running at once, it makes my laptop slower than running the .NET Core back end it connects to.
Kind of annoying. Working with a simple create-react-app project with TypeScript is fine though.
There's something wrong with your setup &| stack. I've run a full stack locally - a build setup with a watcher, a Django/gunicorn setup with file watching, MySQL/Redis/RabbitMQ - and I barely nudge the CPU.
I'm betting that if your CPU is constantly hot while developing, it's file watching that's the issue. I had to do some work to get things to use proper file system events on macOS instead of just polling with my legacy gulp project. Try installing fsevents if it's not already there?
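To make that concrete, here's a minimal sketch of the fsevents-vs-polling difference using chokidar (the watcher that gulp 4 and webpack-dev-server use under the hood). This is illustrative only - the paths and options are placeholders, not my actual config:

    // Minimal sketch (assumed setup): on macOS chokidar uses the native fsevents
    // backend when that optional dependency is installed; falling back to polling
    // means stat()-ing the whole tree in a loop, which is where the constant CPU
    // burn comes from on big projects.
    import chokidar from "chokidar";

    const watcher = chokidar.watch("src", {
      ignored: /node_modules/,  // never crawl or poll the dependency tree
      ignoreInitial: true,      // don't fire events for the startup scan
      usePolling: false,        // the default; rely on FSEvents/inotify, not polling
    });

    watcher.on("change", (path) => {
      console.log(`changed: ${path}`); // hook the rebuild step here
    });

Whether node_modules is excluded from the watch list is usually the next thing worth checking.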
There is something wrong. I’m running two separate Rails apps, MySQL, Oracle, RabbitMQ as a broker between the two apps, Redis, and both have Angular FEs that I use VSC with. Same year MBP, and my CPU load rarely reaches 5%.
Check your package updates! We had the SAME issue serving the app on macOS and no issue on Windows. Updating Node and updating to the latest packages produced an 80% CPU drop. When I had this issue I couldn't serve the project while running VS Code on my Air - the lag from input was way too bad - but now it's snappy.
NodeJS is, but neither modules nor backends have to be.
It's quite common to spin up a bunch of docker containers that provide databases and backends locally.
This is the point at which the MBA stops being a productive environment. I'm not saying this has to be the case for every project, but if your team uses local dev environments, it's quasi standard procedure at this point.
Right, but the GC was talking about developing JS applications on their computer, so presumably they were referring to compiling frontend JS projects, which multiple cores would not help, I think.
This announcement should mean that the current 13" Pro will soon be replaced by a 14" Pro with magic keyboard and higher power processor options, otherwise there will be too much overlap between product lines. But Apple has been content in the past to have overlapping product lines, so they might not.
True -- the next few months should be really interesting in this space. I was thinking of ditching OS X entirely for a Linux box, but I can't get over how great the Mac hardware is -- it just doesn't run Linux all that well, and it costs an arm and a leg, such that buying one just to nuke OS X seems like a waste.
Speaking of which -- how does one go about buying a QWERTY-layout XPS 13 Developer Edition with Ubuntu in Germany? Navigating the Dell website to find it is impossible -- I only find the Windows-based units.
1.1GHz dual-core Intel Core i3, Turbo Boost up to 3.2GHz, with 4MB L3 cache
Configurable to 1.1GHz quad-core Intel Core i5, Turbo Boost up to 3.5GHz, with 6MB L3 cache; or 1.2GHz quad-core Intel Core i7, Turbo Boost up to 3.8GHz, with 8MB L3 cache
The previous one ran an i5-8210Y, which was also an Apple-specific version clocked 23% higher than the more common i5-8200Y, but that was only one part; this time it would need two, because both the i5 and i7 CPUs are special, while the i3 seems to be the run-of-the-mill version. Note how weird the cadence is, because usually the frequency drops as the number of cores goes up. However, the Turbo Boost is the same 3.8 GHz, which is way too low for the 10th-gen Amber Lake Y CPUs.
Also, at first I thought the video resolutions were weird, but what's going on here is that there is a single USB-C output, and even with Thunderbolt that's only a 40Gbps bus. So the ICL-Y DP 1.4 support is somewhat less useful here. If there were two USB-C outputs, two 5K monitors could be driven easily, for example - which the previous-generation Intel CPUs couldn't do, because driving a 5K display over DP 1.2 eats up two display outputs, even if physically that's delivered on a single Thunderbolt cable. (Rough numbers below.)
If anyone can solve this mystery, it would be great to compare to other laptops like the Surface Pro, whose i7 is "Quad-core 10th Gen Intel® Core™ i7-1065G7 Processor"(1.3 / 3.9)
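To put rough numbers on the 5K point above - these are generic DisplayPort figures, ignoring blanking and DSC, and not anything Apple publishes for this machine:

    // Back-of-the-envelope pixel bandwidth, assuming 8 bits per colour channel.
    const gbps = (w: number, h: number, hz: number, bitsPerPixel = 24): number =>
      (w * h * hz * bitsPerPixel) / 1e9;

    console.log(gbps(5120, 2880, 60).toFixed(1)); // ~21.2 Gbps raw for one 5K@60 stream

    // DP 1.2 (HBR2) carries ~17.3 Gbps of payload, hence the old two-stream trick
    // for 5K; DP 1.4 (HBR3) carries ~25.9 Gbps, so a single stream fits -- but two
    // such uncompressed streams plus data won't fit on one 40 Gbps Thunderbolt 3 port.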
I travel a lot and use a MacBook 12 for development. I like to keep it light, so the Air and MacBook Pro are too heavy, and the iPad pro doesn't work yet as a development platform.
So I got a 2019 Air just before the COVID-19 emergency in my area. I was looking for a small and light machine to carry around, with a little bit of coding and browsing mostly. I would have probably picked the most basic retina model with the dual-core processor (not like my 15m load average is ever bigger than 2.5) - I don't know about the keyboard and the graphics though, they seem ok so far.
Did I screw up? I don't live in a country with an Apple Store, and it usually takes months for these to reach our market. I'm consoling myself that I got a good deal and a new machine that I needed before a long stint working from home.
If you contact them, they will likely upgrade it; Apple tends to let you bring a machine back if a new model is released within 3 months of your purchase.
I remember when I bought my last Air I skipped Apple Care b/c I knew Apple made quality products. Things have certainly changed. Might as well add another $200 to that price, b/c you know something is going to go wrong.
What's it like to develop on a MacBook Air these days? I have a 2016 13" MacBook Pro and am thinking of upgrading, but I don't know how it'll handle JS/Docker/Rails.
More CPU cores - 4 up from 2
More storage on the base model - 256GB up from 128GB; odd that the rMBA ever launched with 128GB when the 12" rMB shipped with 256GB as standard.
1mm of key travel (not "length") is what they have in the external keyboards they've been selling since 2015, and it's just fine. While this is obviously subjective, I'd argue it's the best keyboard Apple has made since the days of the Apple Extended keyboards. The 2011 laptop keyboards are the same keyboards they used up until the butterfly switches, they don't have as nice a "feel" as the Magic Keyboards, and it's not as if they have a lot more key travel, anyway -- only about 1.3mm.
I mean, okay, percentage-wise you're absolutely correct, and if you subjectively feel that the difference between 1.3mm and 1.0mm key travel is the difference between typing bliss and "might as well not be moving at all," who am I to tell you otherwise. For me, subjectively, the Magic Keyboard is the nicest-feeling "low travel" keyboard I've ever used, and as a technical writer and a novelist on the side who has a closet full of mechanical keyboards I'm pretty picky about my keys. Do I think the Matias Tactile Pro keyboard that I'm typing on at this very moment and the Vortex Race 3 with Cherry MX Clears that's usually my office keyboard are better? Yes. Do I want a laptop with them? No, in fact, I do not. Do I want a laptop with what I consider to be the nicest-feeling "low travel" keyboard I've ever used? Yeah, that sounds kinda nice. As always, your mileage may vary, and if you're willing to go back to having inch and a half thick laptops so you can get in that awesome mechanical clicky monster, more power to you.
> every year they get worse
So you are saying that the Magic Keyboard, with its 1mm of travel, has too little key travel, but is also worse than last year's butterfly keyboard with 0.7mm of key travel? The keyboard you prefer may be subjective, but the math is not.
The key travel has absolutely nothing to do with the thickness of the current MacBook, as proven by many other vendors. It was a compromise/invention for the 12" MacBook, which was lightweight and thin.
And despite the small difference, the Magic Keyboard felt the same to me as the butterfly keyboard. They are similar enough to group in the same category, while the old scissor keyboard belongs in a different one.
I just switched to Linux on a lightly used LG Gram after being on Mac for 10 years. $1300, 1TB NVMe, 24GB RAM, 15" edgeless screen, weighs less than 3lbs, and the charger is lighter. It also has a full touch screen. Do I miss the Retina display? Yes, although I get just as much on screen since it's 15". Would I go back for 2x the price? No way.
What distribution and desktop environment are you running?
I switched to Manjaro with XFCE almost 2 years ago and it's been a joy to set up and use coming from Mac/Windows. As a matter of fact, setting up a Manjaro workstation is way less hassle than I've ever had with Mac or Windows. I don't have to fight against the OS to do what I need to.
I have a MacBook Pro (Retina, 13-inch, Early 2015) with an i7 at 3.3 GHz. Is the new Air with a quad-core i7 going to be a decent CPU upgrade in practical terms compared to what I have, or should I wait for the next MacBook Pro? Consider that I don't want to go above a 13" screen size. Thanks.
Related story: when I run the Redis test on my MacBook, it manages to stall everything, as if the kernel scheduler were unable to balance between the test and the other tasks, even with just a Chrome window open and in use. If I open VirtualBox and run the test inside a virtualized Linux system, the test runs in a similar time without any usability problem for the other Mac apps. This shows how badly scheduling can go, but also how much more efficient the Linux implementation of certain system-level machinery can be.
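If you want to see the effect without the full Redis test, here's a minimal sketch (my own hypothetical reproduction, not the actual benchmark): saturate every core in one process while another process measures how late a 10 ms periodic sleep wakes up, then run it once on the macOS host and once inside a Linux VM and compare the reported lag.

    import multiprocessing as mp
    import time

    def burn(seconds):
        """CPU-bound busy loop; one per core saturates the machine."""
        end = time.monotonic() + seconds
        while time.monotonic() < end:
            pass

    def measure(seconds, period=0.010):
        """Sleep in 10 ms ticks and record how late each wake-up is."""
        lags = []
        end = time.monotonic() + seconds
        while time.monotonic() < end:
            t0 = time.monotonic()
            time.sleep(period)
            lags.append(time.monotonic() - t0 - period)
        print(f"max wake-up lag: {max(lags) * 1000:.1f} ms, "
              f"mean: {sum(lags) / len(lags) * 1000:.2f} ms")

    if __name__ == "__main__":
        workers = [mp.Process(target=burn, args=(10,)) for _ in range(mp.cpu_count())]
        for w in workers:
            w.start()
        measure(10)
        for w in workers:
            w.join()

A scheduler that protects interactive work under load should keep the wake-up lag close to zero even with all cores busy.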
The use of LPDDR4X memory with the memory clock rate at 3733MHz is pretty cool; this is enabled by the 10th-gen Intel processor [1] that's in it.
Not that memory speed matters much for perceivable differences in most user-facing workloads, but IMO this nearly 2x jump in clock rate makes it noteworthy.
Edit: I just noticed that the processors are from Intel's new Ice Lake line, built on Intel's 10nm node (currently mobile-only, as Intel is seemingly having yield issues scaling 10nm).
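For perspective, peak theoretical bandwidth is just transfer rate times bus width. Assuming the previous Air's LPDDR3-2133 and a 128-bit memory bus on both machines (an assumption on my part; real-world throughput is lower in both cases), the jump looks roughly like this:

    # Peak theoretical memory bandwidth = transfer rate (MT/s) x bus width (bytes).
    def peak_bandwidth_gbs(mt_per_s, bus_bits=128):
        return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

    lpddr3_2133  = peak_bandwidth_gbs(2133)   # previous Air: ~34 GB/s
    lpddr4x_3733 = peak_bandwidth_gbs(3733)   # new Air:      ~60 GB/s

    print(f"LPDDR3-2133  : ~{lpddr3_2133:.0f} GB/s")
    print(f"LPDDR4X-3733 : ~{lpddr4x_3733:.0f} GB/s")
    print(f"ratio        : {lpddr4x_3733 / lpddr3_2133:.2f}x")

So about a 1.75x jump in peak bandwidth -- not quite 2x, but still a big step for an ultraportable.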
If you have an external Magic Keyboard, the feel is similar. Less mushy than the old chiclet keys on the 2015-and-earlier laptops, but less stiff/unmoving than the butterfly keys on the 2016-2018 laptops.
Still puttering along with my 2012 Air, only replaced the battery once. I've definitely been waiting to upgrade and this looks promising, but I also have a company laptop again, so the pressure is lower to get my old Air replaced.
I have a 2012 Air as well (i7, 8GB) that still runs quite well. I don't do much development on it anymore, but I'm still impressed with how well it has held up.
I bought one, maxed out everything but storage -- can anybody really justify the 2TB option?
Fingers crossed it will work for lighter web dev work. I figure I'll know well within the 14-day return period. As a consultant, I work with a pretty diverse set of tools, but I figure I can at least do frontend work and throw together the odd lambda here and there. I'm also currently running a Mac mini with a couple of 4K screens as my main environment. Can anybody recommend a way to take both of these screens and switch their inputs easily between the Mini and a MacBook?
I want to replace my aging 15" MBP (2014) with possibly a Mac mini but am scared of its seemingly underpowered graphics. What Mac mini do you have and does it handle two 4K screens well?
hm, yeah. I work with a lot of graphic designers; they'll be working on multiple 20+MB Photoshop files that get saved over the course of the day as "SomeClient_2020_03_18_rev13.psd" etc. That stuff adds up fast. Adobe could help out a lot of their user base by making some kind of built-in Git-for-graphics tool…
If you're looking for a USB-C KVM, you're not going to find it. I would look into some kind of cloud sync for the Air's data to the Mini, and then just put the Air to sleep when you're at the home base or something.
Anybody know the cheapest setup you could get away with for iOS/Xcode development? Would it be a (used) MacBook Air, or can you not really run Xcode effectively on them?
It would definitely be painful to run Xcode and the iOS Simulator on a used MacBook Air, but not impossible. Price-wise, however, it's hard to beat a Hackintosh. Here, for example, is a $300 build: https://hackintosher.com/builds/cheap-hackinosh-2020-300-cat...
I remember being broke out of college and setting one up out of necessity. It's still going! I'm typing on it now. It needs a motherboard/processor upgrade, _maybe_, these days. Still one of the best cost-saving decisions I've made in my life. And it just plain feels good to not have to throw out / recycle a laptop every five years. They're terrible for the environment.
There are shops that will build the hardware for you. Setting up the bootloader is still the hardest part, but it gets easier with each iteration of the Hackintosh software ecosystem.
I developed my first few iOS apps on a 2013 MacBook Air with 4GB of RAM. On larger projects (especially Swift) the builds could be a bit slow and the syntax highlighting would be delayed, but overall it was totally manageable. When I upgraded to a 16GB MacBook Pro, I thought it would be a game changer for build speed. To my surprise, builds weren't really that much faster. It seems like the slowness of Xcode and Swift isn't constrained by resources - it's just slow no matter how much power you throw at it. So I would say that for getting into iOS and Xcode development these new MacBook Airs would be more than enough, and a used older Air would suffice.
I am surviving with a 2012 MacBook Pro in which I have upgraded the RAM (16GB) and installed a SATA3 SSD. It's just as fast as my late 2016 MacBook Pro, with better battery life despite being on the original battery (which I can replace), and it has a real Escape key, proper function keys, and a better-lit keyboard than the new models.
The only "issue" it has is the low res (I drive an external monitor for higher resolutions) and the plastic under the back of the screen being cracked in one place. Other than that, just like new.
An old MacBook Pro will work fine, as long as it has an SSD and some sensible quantity of RAM.
I've used a 2014 MacBook Air very heavily on a daily basis for years now and it still does fine on small-to-medium app development. Simulators are the biggest pain point.
I recently started dabbling in video editing a bit and that's what definitely gets the fans going.
I've been buying ASUS for some time and have nothing to complain about: very good laptops, twice the specs of a MacBook for half the price, and with the option of AMD processors.
Apple has a strong name in North America, but I find that to be mostly based on a perceived image, rooted more in a distant past (or on pure marketing) than in reality. I understand when this happens with the general public but it puzzles me when technical people buy into this.
I am so glad it doesn't have a Touch Bar. Nothing turned me away from the newer Pro models quite like the lack of physical keys. Sure the Touch Bar is nifty for timeline scrubbing and slider adjustment, but those are such small aspects when compared to how often the function keys are used.
Apple makes nice laptops, but I am not interested until they come out with a convertible option. I bought an HP Spectre a couple of years ago and I am very happy with it. The convertible form factor is very nice for traveling in coach.
What I dislike about my convertible is that you can never use it as both a tablet and a laptop at the same time.
I know it’s obvious but I didn’t realize it when I switched from a laptop + tablet to a convertible in the hopes of simplifying my setup.
For example, it’s annoying when you’re taking notes and need to look something up on the web: either put up with poorly optimized interfaces or convert the device to a laptop.
So I’d much prefer a tablet plus laptop for my particular usage as a student but am too broke to buy another new setup. What makes convertibles so desirable to you?
Why does Apple not publish the Intel CPU generation? Dell, Lenovo, etc. publish information like "8th gen Intel Core i7" for all their laptops. With Apple, the information is not that obvious.
I bought a MacBook Pro in 2017. Because of the double-typing keyboard issue I regret it every day, and there is a screen issue as well. They will probably charge $300 to fix it. It looks like the worst investment of my life.
I bought a 13-inch MacBook Pro in 2017, and within six months of normal use the spacebar and another key became intermittent. I was fortunately able to get it fixed under warranty.
However, I've been using it since then with no further need for servicing, and the keys are perfect at this moment. Various keys have become intermittent along the way, but I found a trick that has fixed it every time so far.
The key mechanism seems to be VERY robust and can take a lot of pounding. Usually the particles jamming the mechanism (e.g. food crumbs) can be broken up by repeatedly banging hard on the offending key.
I've had keys that were intermittent or that didn't even come all the way back up, and this has always left them literally as good as new. It seems that once the particles are broken up small enough, they either remain in place but are harmless, or perhaps become small enough to fall out by themselves.
> The key mechanism seems to be VERY robust and can take a lot of pounding. Usually the particles jamming the mechanism (e.g. food crumbs) can be broken up by repeatedly banging hard on the offending key.
I've done something similar by holding the base of an electric toothbrush against a misbehaving key that couldn't be cleared by some other technique. The high-frequency vibrations seem to do a decent job of breaking up or dislodging whatever's stuck.
This has worked for me as well. Whenever keys got stuck I could press them quite hard and seemingly crunch whatever is causing the issue. For example the tab key would only react intermittently. Press it, and almost massage it with lots of force and it works fine again.
The unintended benefit of this is that the battery is built into the same top-case assembly as the keyboard. So, when your battery is failing, you can take your keyboard in to get fixed under the service programme and get a new battery for free.
I wish they provided an easy way to compare it to older MacBook Air models. I have a 2018 model - I’d like to be able to easily compare them and see the exact differences and where they’re the same.
Almost AU$2k for the quad-core model. Pretty steep for a machine that still needs an upgrade to 16GB of RAM for my use, and the Y-series processor is still too low-performance.
How future-proof is it? Even if it's quad-core, it's still 1.1GHz. I mean, will a browser need multiple cores to scroll smoothly through pages 2-3 years from now?
Yes, I do, and I cannot believe they took every step of the life cycle into account, else the number would be 3-4 times higher.
That's why I was hoping for more details. But obviously this is not something we like to talk about, and it's so easy to sweep it under the carpet, so why bother?
Looking forward to trying out the keyboard. The fact that there is NOT a Touch Bar on this is a big plus. I've used a MacBook with a Touch Bar for work and it's so awful.
Just got a 2019 16-inch MacBook Pro with an i7 & 16GB of RAM at work and this thing flies.
It is definitely faster than my personal 2013 i5 MacBook Pro with 8GB of RAM, but I'm going to make that one last through this recession for as long as I can.
I am salivating over the forthcoming 14-inch MacBook Pro, though.
I am not familiar with such conspiracies. From what I've seen, he accuses Apple of incompetence a lot, and the keyboard case confirms that Apple sometimes ships stupid designs. He also accuses Apple of anti-repair practices, and that is also true; see the right-to-repair laws being debated in US states and the EU.
We do not have any emails or other internal Apple materials to know for sure why Apple kept the fact that they crippled your iPhone CPU hidden (again, for the fanboys: the issue was the HIDDEN part), but for now, presuming Apple hides things for your own good is, let's say, naive.
How long would it realistically take for Apple to pivot to AMD? If things continue as they have been, it's hard to imagine the "premium brand" sticking with Intel.
Of course they might just shift everything to A-series chips instead.
My least favorite thing about HN is that it upvotes trivial Apple hardware news to the top link. Literally no one in the world cares about Apple keyboards except Apple fans.
I said Apple fans, not just keyboard fans. There are plenty of great development machines, but somehow Apple machines are the only ones that make it to the top link.
Maybe it works for some, but 256GB of storage is just insulting to me. I also don't appreciate the focus Apple has had over the past decade on making it easier for me to shop, or on identifying my fingerprints, voice, or face, while they degrade usability.
256GB definitely isn't enough for anyone who backs up their phone to the computer; they're pushing folks to the cloud. I am pretty insistent on having a local phone backup, but I have to use a symbolic link to an external hard drive to accomplish this. It seems like a pretty extreme step to have to take just to back up to an external drive.
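For anyone curious, the symlink workaround is roughly this (a sketch only: the backup path is the usual macOS default, the external volume name is hypothetical, and make sure no backup is in progress before moving anything):

    import os
    import shutil

    backup_dir = os.path.expanduser(
        "~/Library/Application Support/MobileSync/Backup")
    external_dir = "/Volumes/ExternalHD/MobileSync/Backup"   # hypothetical drive name

    os.makedirs(os.path.dirname(external_dir), exist_ok=True)
    shutil.move(backup_dir, external_dir)   # relocate the existing backups
    os.symlink(external_dir, backup_dir)    # leave a symlink at the old location

After this, Finder/iTunes keeps writing backups to the old path, which now lives on the external drive.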