
  - Quad core processor
  - Scissor keys
  - No touch bar
MacBook Air, the new MacBook for Pros.



I honestly never understood the hate for the touchbar. It allows me to be much more granular with volume and brightness, and I never really used F-keys anyway.


Volume adjustment is actually a great example of why I hate the touchbar.

- I can't adjust the volume without looking at it. Because the touchbar is flat with no haptic feedback when I land on a button, it's hard to remember the exact position of the volume 'button' without looking. Sounds trivial - but combined with point 2....

- The way the volume control expands - it actually moves the 'volume down' button AWAY from your finger, which again requires me to keep looking at the control.

This means that when a loud song comes on, it can take 2-3 seconds in total to turn the volume down. I could do that with a single keypress in half a second or less on a real keyboard, without needing to look at it.

That can also be the difference between catching and missing a key detail from a quiet speaker on a Hangout.

Flashy, but it's a terrible user experience by every metric other than looks, I guess.


You can actually just tap the volume icon on the Touch Bar and slide your finger back and forth immediately and it works; you don't have to tap, then move your finger to the volume slider and move back and forth.

(This is clever, but basically undiscoverable unless someone tells you in, for example, a comment on Hacker News, which is how I found out.)


This is assuming the touch bar isn't asleep and you can even see where the volume button is in the first place. Often I have to touch the bar once just to wake it up, then find the button and touch and hold and slide.... ech I hate it personally.


> You can actually just tap the volume icon on the Touch Bar and slide your finger back and forth immediately

No you can't! There is a pretty long delay. If you move your finger during the delay, nothing happens. Then when it finally decides to switch modes, you have to move your finger again for it to change the volume. Hope you didn't hit the edge of the touchbar yet. Combined with the phantom button presses when using the top row of the keyboard, especially the Siri button, plus other small issues, the whole thing is bafflingly terrible.


It may be because I'm using a 2019 MBP, but I can absolutely touch and slide to change volume immediately. Just press and slide on the icon for volume or brightness.

Also pro-tip: you can change the buttons that show up in the touch bar. Settings > Keyboard > Customize Control Strip. I swapped out Siri for a "Sleep" button, which is super convenient when I walk away from my desk.


On my 2017 with Catalina there is an animation that occurs to show the volume slider. Any sliding of your finger that occurs before the animation completes is definitively ignored. Additionally, there is a significant delay before the animation even starts.

I just timed it at ~580 milliseconds, more than half a second from finger hitting the bar to the time when it stops ignoring touch input. It's easy to slide your finger more than the entire length of the volume bar in that time. It's absurdly bad. It would be weird and pretty lame if they fixed this only on newer models.


An app called Pock replaces the touch bar with a custom one that I find a lot more useful - might work for you too


Just go all in with BetterTouchTool + GoldenChaos-BTT (https://community.folivora.ai/t/goldenchaos-btt-the-complete...) -- I don't know why Apple hasn't bought BTT and made this the default, it's truly the best way to use the Touch Bar and the reason why I miss it when I'm using any other keyboard.


I only hate that it replaced the top row of keys. If it were an addition instead of a replacement, I'd be okay with it. It has its moments, but so do the keys it replaced.


I fully agree with ESC, as that's a universal key and used very often, but the other ones are more specific and having them be adaptable always made sense. Now that they returned the ESC key, that part is solved.


The volume control keys are also pretty universal.


As a developer, how would I step into, step over, step out in Xcode without function keys?? (Continue being ctrl-cmd-Y is the worst shortcut ever). It truly hampers my development because I have to look at the touchbar to see where on earth those keys are (F6, F7) or step in/continue in Chrome (F10, F11).

Also, where on earth is the escape key??


The escape key is back on the new 16 inch, but even on older Touch Bar Macs you can tap anywhere on the left side of the Touch Bar (doesn't have to just be the escape button area) and it will still work.

Different strokes for different folks, but I've never liked using the function keys for debugging. I just click the buttons on the screen. I'm a little surprised they don't have a way to set the Touch Bar buttons up to do that in Xcode though.


I will try the "left of the escape key" trick - thanks!

Moving the mouse cursor up to the toolbar always seems like a lot of travel and swishing around when you're hovering over variables to see their contents in the source code.

I have found the auto/local/all view in Xcode to be a bit dumb and unable to properly expand some template objects in C++ so it's all just an exercise in frustration anyway!


On the 16”, esc is right there to the left of the Touch Bar. I’m waiting for the next 13”, which I expect will have the same change.


In Intellij IDEA when debugging the touchbar has a debugging-specific menu with all those controls. I don't find myself needing to look at the touchbar all that often. My muscle memory has adjusted over the past couple years I guess.


Want granularity? Just hold alt+shift while pressing the volume buttons to adjust the volume in quarter-box increments. You can do it without looking, and it's way easier than moving that slider on the gimmicky touch bar. Works for brightness, too.


Much of the hate is that the touchbar wasn't optional, at least not unless you wanted to opt out of an Apple laptop. If the touchbar had been something users could choose, Apple users wouldn't have minded so much.

Supporting more options is expensive, so it's understandable that Apple didn't want to give their customers a choice. Still, it seems like a gimmick. And it appeared at the same time as the butterfly keyboard, cementing the notion that Apple had lost its way.


I appreciate the touchbar every day (esp. with BetterTouchTool), but the soft escape is horrendous: it's used in so many of my workflows, yet it isn't 100% responsive and doesn't give any tactile feedback.


It's a mandatory and expensive feature.


It occasionally freezes. I occasionally touch the buttons by accident. Mine is missing the physical escape key.


It doesn't add any benefit to my experience. I'd prefer real keys that I don't need to look at. I could hit volume up/down easily on the previous models.

Using Terminal, I use the Esc key a lot for navigating, and having a touch bar Esc key is not a great experience since you don't get any feedback that you're touching the right key.

I've also accidentally hit the touch bar a few times while hovering one of my fingers above it as I press down on one of the number keys.


It's just not worth it in terms of money vs. usefulness. To me it's going too far with tech.


It's not awful. It's just that I wish I had actual keys every time I use it.


I’ve had it locked up a couple of times, and couldn’t mute a loud sound.


You can be just as granular by using shift + alt + volume/brightness. That way your changes will be in 0.25 step increments rather than the default full steps.


Keyboards are meant to be used without looking at them. With the introduction of the touchbar, you have to look at what you're pressing. It's like a giant touch screen in a car: it works, but you have to look at it, whereas with physical buttons you can find what you want by feel/memory.

On a personal note, I've randomly refreshed webpages because I've overreached on the number row and hit the touch bar.


I hope I didn't get permanent damage, but I hurt myself badly with it. I was trying to turn the volume up a bit while wearing earphones (it was very low), so I pressed the volume-up key. I accidentally pressed a few pixels to the left of where I should have, and it went to FULL VOLUME without warning, blasting audio and hurting my ears badly.

This could not have happened without the touchbar. This is horrible UX and I will never trust that (work) computer again.


>It allows me to be much more granular with volume and brightness

I press the physical button a few more times? I find that 10 times easier.


"I don't need F-keys, so I don't understand why anyone else would ever need them"

Does that statement strike you as reasonable at all?


Actually:

  - 2 cores
  - 1.1 GHz
In 2020...


Intel’s 10nm chips have very low base clocks. They’re almost always in some sort of boost mode.


You can get a quad core i7


Which is still a 1.2 GHz base clock.


Because wasting energy on a 4 GHz base clock is completely pointless (and so is your criticism) if you can adjust frequency as needed. 99% of the time 1 GHz is sufficient. It's only when you launch the browser that a faster CPU is useful, not when you're reading through a website.


> Because wasting energy on a 4 GHz base clock is completely pointless

That's not how base clock works. Base clock is not min clock. Base clock is what it's "guaranteed" to hit under sustained load if TDP is respected. It's the TDP clock. A 4 GHz base-clock CPU will still be far, far below that when idle.


The problem is that it ramps up by 250 MHz increments, over a period of several seconds, and that can be extremely noticeable in some workflows.

I went from a 6700HQ (2.6 GHz base) to a 10710U (1.1 GHz base) and the difference is definitely there, jarring enough that I kind of regret it. It feels like a huge step backwards, despite the latter CPU being four generations ahead.


Did you make sure that your turbo button was pressed? Honestly, this is why I just stick to naturally aspirated CPUs.


The first thing I notice in that comparison is that one chip is rated for a 45W TDP and other is 15-25W. While I think these cross-segment comparisons are exciting and show great progress, it's just not fair to the electrons.


Surprising. What device are you using?

This does not seem to be the case for my i7-8565U nor my older m3-something. The 8565U for example feels just as snappy as my desktop i7-6700, except it’ll throttle after about 30s.


> The problem is that it ramps up by 250 MHz increments

Isn't this completely down to the cpu frequency governor in the OS?

My impression was that this "stepping" is customisable, at least on Linux; I even step my CPU manually on a regular basis.

I'm not sure what Apple has here, but maybe it's not what you expect.
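
For the curious, this is roughly what "stepping it manually" looks like on Linux via the cpufreq sysfs interface. A minimal Go sketch, assuming a cpufreq driver is loaded and you run it as root; the 1.4 GHz cap is just an example value, and none of this applies to macOS:

  // freqcap.go: read each core's governor and optionally cap its max clock.
  // The sysfs paths are the standard Linux cpufreq ones; values are in kHz.
  package main

  import (
      "fmt"
      "os"
      "path/filepath"
      "strings"
  )

  func main() {
      dirs, _ := filepath.Glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq")
      for _, dir := range dirs {
          gov, err := os.ReadFile(filepath.Join(dir, "scaling_governor"))
          if err != nil {
              continue
          }
          fmt.Printf("%s: governor=%s\n", dir, strings.TrimSpace(string(gov)))

          // "cap" mode: pin scaling_max_freq to 1.4 GHz (needs root).
          // The governor still moves the clock, but only below this ceiling.
          if len(os.Args) > 1 && os.Args[1] == "cap" {
              err := os.WriteFile(filepath.Join(dir, "scaling_max_freq"),
                  []byte("1400000"), 0644)
              if err != nil {
                  fmt.Fprintln(os.Stderr, "need root:", err)
              }
          }
      }
  }

Writing scaling_governor instead of scaling_max_freq is how you'd switch between e.g. powersave and performance.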


I'm on Windows 10 and I'm not aware of a built-in method to change the CPU multiplier.


How Windows does things and how other operating systems do things, in the wise words of Jayne Cobb, "ain't exactly but similar".


With modern notebook CPUs, the base clock is only a loose indicator of how a CPU will perform. The CPU will still be downclocked and undervolted depending on the load.


Don't you know? Higher is always better! 5 GHz in my laptop, plz!


And turbos up to 3.8 GHz. We'll see, but I suspect it'll spend a lot of time above 3 GHz.


At a 9W TDP the chip won't spend any more than a few seconds at boost clock.


Sure, for the same price I can get a 8-core, 3 GHz base clock in a non-Apple laptop.


With 2 hours of battery life


Which is fine.


Exactly what I was thinking when I read the spec sheet: 'I wonder how many developers will go for the maxed out version.'


Definitely going from 2015 MacBook Air to this new one for my personal at-home coding laptop, as long as I like the keyboard when trying it out.

I had really been wanting to upgrade for Retina & better processor but I knew they would upgrade the processor and fix the keyboard if I waited for 2020... no reason to wait now.

I don't run any crazy fat Docker stacks for my own stuff at home, so this is perfect.


Why not the Pro for a few hundred dollars more? Or wait for the upgraded version in the summer? You get a noticeable performance boost, dedicated graphics card, touch bar, and so on.


I really don't need that much extra memory or performance, I thoroughly dislike the touch bar, and I want a slimmer/lighter form factor.

My at-home hobby work is in Golang and Python, and not particularly compute-intensive stuff. Neither of those has a huge, heavy toolchain.

Really the main thing driving an upgrade from my minimum-spec 2015 Air is the Retina display.


My personal laptop is an 11" Air mid-2013 and I still use and love it. I especially love the keyboard on it because the keys have height, feel closer to a mechanical keyboard, and don't capture as much dust and dirt as the flat keys on my newer 2016 touchbar Pro work laptop.

This is good news from Apple, as I was not into any of their more recent laptops, but I'll probably upgrade to this one.

I only wish they had an 11" version, but that's not a deal breaker.


Same for me, although it’s a 2012 11". The 13" model here is actually somewhere in between the old 11" and 13" for size.


I probably would except for the lack of ports and weak CPU.

Still just waiting on a new 14" MBP.


You actually make use of 4 USB-C connections? I'm genuinely curious about the use case. These days you can get 12-in-1 dongles from China that cover the last 25 years of I/O standards through one USB-C connection, and it will charge the thing too.


Not a big fan of dongles.


Why though? I find buying one $30 dongle that has 10 inputs for cables I already own is a better deal than buying ten $20 USB-C cables. As a Mac user I've been used to dongles for a while; Mac laptops never seemed to have the standard AV out aside from that fluke generation with HDMI. Always some weird connector for the sake of being weird, it seemed.


I don't love coding on a tiny screen, and when I do use an external monitor I'm stuck awkwardly typing on the laptop keyboard.


Seriously considering trading my maxed out 2018 13" for this, but in the end probably not going to do it.


I am.


Except

  - 16GB RAM
Can you even run Eclipse with just 16GB?!


I can't tell if you're joking, but I have a 2018 Mac Mini with just 8GB of RAM, and I often run Eclipse, IntelliJ, and PyCharm at the same time (along with multiple browsers and other stuff), and performance is fine.

I was actually surprised by this--when I first started using this computer, I thought for sure I would need to add more RAM, which for the 2018 model is too complicated to do yourself (at least to me it seemed too risky).


Semi-joking, but the problem is real for me. I've a 2013 13" MacBook Pro with 8GB RAM, and my system can't cope with my workflow ... tens of tabs in Safari, webapps in Chrome (YouTube, Google Docs, ...), Eclipse with Scala / Java, ... it's a huge struggle.


I was handed a 2017 MacBook Pro with 8GB of RAM at my current job while waiting for my actual laptop to be delivered, and it was a nightmare.

I keep a lot of tabs open to look things up, but nothing excessive on that machine. I also run VSCode or PyCharm and would bring up 5-10 containers at times.

It seriously hurt not only my productivity but also my mood, just from having to put up with it for weeks.

Unless you're a very basic user, I don't get why you would settle for 8GB in 2020. 8 gigs of RAM costs basically nothing; it's not worth changing your workflow in the slightest to work around that artificial limitation.


It is odd, because these memory issues are very real, but if you ever say "wow, devs are getting lazy and these 'desktop' apps that rebundle Chrome (e.g. Slack, Skype) are really killing my machine with inordinate quantities of logic in JavaScript" you get shouted down.

It's bizarre. If everyone used the native toolkits we'd have far less memory usage and everyone (even the memory-constrained) would have a good experience.

These memory hogs also do a lot of allocation and deallocation, which is a problem for interpreted languages in general. And allocation is the enemy of speed and of energy usage; it'll eat into your battery life as everything gets interpreted.

Sad.


I remember feeling the same when I was forced to upgrade from 32 MB of RAM to 128 MB of RAM to run the combination of browser, chat and IDE on Windows NT4, back when most software moved from hand-optimized assembly to mass-produced C++.

With every layer of abstraction added to ease development, the hardware requirements go up. You can build things fast, or you can build fast things; doing both is tricky.


That's actually very surprising to me.

I'm running Linux with an anemic window manager, and with nothing but Chrome and Slack open (20 tabs in Chrome) I am consuming 6 GiB.

If you add Teams, PyCharm and Outlook (Electron), it consumes 9 GiB... Actually, that's also less than I expected.

Well done, PyCharm.
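
If anyone wants to reproduce that kind of number, on Linux the fair way to count is MemTotal minus MemAvailable from /proc/meminfo, so reclaimable page cache doesn't get blamed on the apps. A quick Go sketch (the field names are the standard kernel ones; purely illustrative, not what I actually ran):

  // meminfo.go: report total, cached, and "actually used" memory on Linux.
  // /proc/meminfo values are in kB; MemAvailable already accounts for
  // cache the kernel can drop on demand.
  package main

  import (
      "bufio"
      "fmt"
      "os"
      "strconv"
      "strings"
  )

  func main() {
      f, err := os.Open("/proc/meminfo")
      if err != nil {
          panic(err)
      }
      defer f.Close()

      kb := map[string]float64{}
      sc := bufio.NewScanner(f)
      for sc.Scan() {
          parts := strings.Fields(sc.Text()) // e.g. "MemAvailable: 12345678 kB"
          if len(parts) < 2 {
              continue
          }
          v, _ := strconv.ParseFloat(parts[1], 64)
          kb[strings.TrimSuffix(parts[0], ":")] = v
      }

      gib := func(v float64) float64 { return v / (1024 * 1024) }
      fmt.Printf("total %.1f GiB, cached %.1f GiB, used %.1f GiB\n",
          gib(kb["MemTotal"]), gib(kb["Cached"]),
          gib(kb["MemTotal"]-kb["MemAvailable"]))
  }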


I think OSs in general just eat a portion of whatever memory you give them. Right now I'm puttering around with a dozen tabs in Firefox; in fact my biggest memory hogs right now are Firefox at 3.5 GB and Apple Mail at ~500 MB, I'm not really doing anything else, and somehow 12 GB of 16 GB are in use. Better for the UX to keep things open in memory if you have it to spare, I suppose.

When you are memory constrained, you can definitely tell. Everything comes to a halt and you just twiddle your thumbs between commands. This 16GB machine I have shipped with 4GB, which was painful even 8 years ago when it was released, and I upgraded it myself to 8GB 6 months into ownership. A few years later, when JavaScript became more pervasive on the web, I hit memory constraints on 8GB a lot just from having tabs open in Chrome, back when it was perhaps more of a memory hog, so I opted for 16GB and haven't had issues since.

I think at 16GB you should be set for at least 5 years. Most people, even a lot of devs on company-issued equipment, are working with 8GB and complaining about it right here in this very thread.

If you have larger requirements, a lightweight, thin laptop with a teensy fan isn't for you. Even if it had the hardware specs, the physics of heat dissipation don't work for you and you are better off spending the same money for more hardware sitting in a box under your desk. Me and my sore back are eyeing this up, all my computing is done on a cluster anyway.


Same. I have a MacBook Pro (Retina, 13-inch, Early 2015) with 8GB of RAM and I've been doing fine. Sure, there are some hiccups every now and then but it works. I have Spotify, VS Code, Slack, Kitty tmux sessions and more open 24/7.


What? It's probably swapping like a bastard, which with SSDs is probably not that horrible. Even 16GB for me is low (I do run in a Linux VM guest). I got myself a Mac Mini with 64 gigs, for great justice.


If I open up PyCharm and IntelliJ and Spotify and SourceTree and Docker and three different browsers and iTerm and Remote Desktop and a few other apps all at once, I will get an occasional hiccup, but it's really not as bad as I would have expected. I think 16GB would be nice though.

For comparison, I also have a 2012 Mac Mini at home with an SSD and 16GB RAM, and it's still chugging along pretty well too, although it's noticeably slower than the 2018 model with 8GB RAM.

I'm curious whether all that swapping might mean my SSD is going to wear out sooner. Maybe investing in more RAM would be worth it even if I don't feel like I need it.


Exactly. I'm deciding between 32GB or even 64GB, just to be on the safe side. Because nowadays you're running Slack, Spotify, several messengers, Firefox, Chrome, IntelliJ, Docker and Kubernetes on your local machine.


Which is kind of horrifying, if you stop and think about it. You're wondering whether you need another 32G of RAM to run a basic working environment, a glorified text editor, and some communications software. I used a BBC that could do that in 32K of RAM in the 1980s! Obviously I'm not really suggesting the functionality today is equivalent, but the idea that ultimately you're meeting the same basic needs yet it now takes a million times as much space is... unsettling.


It's certainly amazing how much memory consumption has grown. I like to think of it in terms of economics: we could never write today's software using 80s methods. Slack in assembler? Impossible. Kubernetes in C++? Maybe, but there would be security holes, and Go is just more productive. Developers are expensive, very expensive.


> Developers are expensive, very expensive.

Such is the accepted wisdom in much of the industry, but I'm a bit of a sceptic on this score. Of course developer time is expensive, particularly if you're in somewhere like the Bay Area where salaries are an extra 0 compared to most of the world. But we live in an era of virtualisation and outsourcing (sorry, "cloud computing") when businesses will knowingly pay many times the cost of just buying a set of servers and sticking them in a rack in order to have someone else buy a much bigger server, subdivide it into virtual servers, and lease them at a huge mark-up. All kinds of justifications have been given for this, many of which I suspect don't stand up to scrutiny anywhere other than boardrooms and maybe golf courses.

There's a nice write-up somewhere, though regrettably I can't immediately find it, of the economics of cloud-hosting an application built using modern trends. IIRC, it pitched typical auto-scaling architectures consisting of many ephemeral VMs running microservices and some sort of orchestration to manage everything against just buying a small number of highly specified machines and getting on with the job using a more traditional set of skills and tools. Put another way, it was the modern trend for making everything extremely horizontally scalable using more hardware and virtualisation against a more traditional vertical scaling approach using more efficient software to keep within the capacity of a small number of big machines.

The conclusion was astonishingly bad for the modern/trendy case, to the point where doing it was looking borderline insane unless your application drops into a goldilocks zone in terms of capacity and resources required that relatively few applications will ever get near, and those that do may then move beyond it on the other side. And yet that horizontal scaling strategy is viewed almost as the default today for many new software businesses, because hiring people who are good enough to write the more efficient software is assumed to be too expensive.


We live in a world where any one-man startup thinks (and its investors hope) it will have 10k employees by year end. Therefore, if you are going to be burning money anyway, what's another line item on the monthly outflow if it means you don't have to spend 3 months hiring someone to toil in the server room and a couple more ordering and assembling a farm that might crash the day your startup gets linked on Hacker News.

There are technical reasons for this (being able to handle sudden load), but mostly it's ideological. We aren't building companies, we are building stock pumps disguised as the utopian future. If you are wondering what a blue-chip company looks like in tech, they are the ones that own their own infrastructure.

Maybe there is a middle road for cash poor companies, where you keep latent demand in house for the sake of cost and sense, but have some sort of insurance policy with a cloud service to step in if demand surges.


I'm not well-versed at all in Go and Kubernetes. Isn't the large RAM usage from the use of containers? Is Go a memory hog?


It's a GC language, so yes, it needs more than C++ but less than Java. I believe that's because structs can often live on the stack.
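
Roughly what that looks like in practice: if a value doesn't outlive its function, Go's escape analysis keeps it on the stack and the GC never sees it. A toy sketch (the function names are mine, purely illustrative); running "go build -gcflags=-m" on it shows which allocation moves to the heap:

  // escape.go: one struct stays on the stack, one escapes to the heap.
  package main

  import "fmt"

  type point struct{ x, y int }

  // p never leaves the function, so it is stack-allocated: no GC work.
  func sumOnStack() int {
      p := point{x: 3, y: 4}
      return p.x + p.y
  }

  // Returning a pointer makes p outlive the call, so it escapes to the
  // heap and the garbage collector has to track it.
  func escapes() *point {
      p := point{x: 3, y: 4}
      return &p
  }

  func main() {
      fmt.Println(sumOnStack(), escapes().x)
  }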


We don't have the same basic needs. People are running VMs on their laptops so they can test things in an environment similar to production without having to run extra servers for every developer/sysadmin to test with. Back in the 80s your QA and production environments were very likely the same!

I'll admit that modern text editors and communication software have grown resource hungry, but a lot of that comes from being able to deliver a strong, cross-platform experience. I remember desktop Java doing much of the same with just as bad resource usage. Same with applets.


> People are running VMs on their laptops so they can test things in an environment similar to production without having to run extra servers for every developer/sysadmin to test with.

Sure, but that immediately raises the next question of why those VMs are so big...


Yeah, the fixed cost of a VM context is on the order of kilobytes in the host kernel, megabytes in the guest kernel. And with VM balloon paging a guest VM acts much like a regular process in terms of memory usage. It's not VM usage that hogs memory, it's the applications, regardless of VMs.


Why does it matter when you can afford that RAM? Just buy it and forget about it; it's cheap enough. We used to land on the moon with a CPU less performant than Apple's HDMI adapter cable. It's a fun comparison, but not very useful; that's just the way things are and it's not going to change anytime soon.


I realise it's how things are today and not going to change any time soon, but it still feels like we as an industry have moved all too easily in a very wasteful direction. Sure, with RAM you can just buy more, but it's symptomatic of a wider malaise. Other capacities, particularly CPU core speeds, have long since stopped increasing on a nice exponential-looking curve to compensate for writing ever more layers of ever more bloated software in the name of (presumed) greater programmer efficiency. It just feels like we've lost the kind of clever, efficient culture that we used to have, and I'm not sure we weren't sold a bill of goods in return.


I'm not sure whether the curve is still exponential or not, but it's there. Single-thread performance increases a little every year, and core count is increasing like never before. A 16-core consumer CPU is not a dream anymore.

RAM size slowly increases as well. 4 GB was enough 10 years ago. 8 GB was enough a few years ago. Today I would suggest 16 GB as a bare future-proof minimum, and one can buy 64 GB for a reasonable price.

We still have room for more layers. And it's not only about efficiency, it's also about security. Desktops are still not properly sandboxed; my calc.exe can still access my private SSH key.

Once performance growth really stops, we will start to optimize. Transistor density will keep doubling every few years until at least 2030, and AFAIK there are plans beyond that, so probably not soon.


> Why does it matter when you can afford that RAM?

Why should everyone have to afford that RAM?


I have 8GB on my work laptop with almost all of this (except Kubernetes, but I fail to understand why you would need a local Kubernetes) and it's fine; I usually have 2GB of free memory.

Don't exaggerate your memory requirements; you would be more than fine with 16GB.


That's not even close to an exaggeration. I'm running only half those things (or their competitive equivalents) right now on a Windows box. I just checked and I've got 14.8 GB in use.

Fortunately, I have a Dell XPS 15 with 32 GB of RAM, but the second I start up a single VM, one more messaging app, a small handful of Docker containers, or any IDE (of which I'm running none right now), I'm going over 16 GB.

Realistically, most of us on HN probably need around 20-24 GB, but laptops don't come in those increments.


> of us on HN probably need around 20-24 GB

I develop for a living. I use 6 GB including a browser, a VM and an IDE.

Some of you greatly exaggerate your needs. Some workflows require 16+ GB of RAM, but most people complaining about RAM mismanage it or do not understand that caches are not mandatory.


I'm running Firefox and Mail on macOS, and 12GB is in use. The OS keeps things loaded in memory if you have it to spare.


Right now on macOS I'm running Firefox, Outlook, 2 VSCode instances, Postman, 1 Electron chat app and another chat app and I'm under 5GB. Uptime 4 days.


Cache does not count when discussing memory requirements.


I love it when people tell me my requirements.


Java people problems


"quad core" @ 1.4GHz



