Together with next-generation ML accelerators in the CPU, the high-performance GPU, and higher-bandwidth unified memory, the Neural Engine makes M4 an outrageously powerful chip for AI.
In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
Processing data at the edge also makes for the best possible user experience, because it is completely independent of network connectivity and therefore has minimal latency.
If (and that's a big if) they keep their APIs open to run any kind of AI workload on their chips, it's a strategy that I personally really, really welcome, as I don't want the AI future to be centralised in the hands of a few powerful cloud providers.
> In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
Their primary business goal is to sell hardware. Yes, they’ve diversified into services and being a shopping mall for all, but it is about selling luxury hardware.
The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
Somewhat true, but things are changing. While there are plenty of “luxury” Apple devices like the Vision Pro or fully decked-out MacBooks used for web browsing, we no longer live in a world where tech products are just lifestyle gadgets. People spend hours a day on their phones and often run their lives and businesses through them. Even with the $1000+/2-3y price tag, it’s simply not that much given how central a role the phone plays in your life. This is especially true for younger generations, who often don't have laptops or desktops at home, and also increasingly in poorer-but-not-poor countries (say, Eastern Europe). So the iPhone (their best-selling product) is far, far, far more a commodity utility than typical luxury consumption like watches, purses, sports cars, etc.
Even with the higher-end products like the MacBooks, you see a lot of professionals (engineers included) who choose them for their price-performance value, and who don’t give a shit about luxury. Especially since the M1 launched, when performance and battery life took a giant leap.
Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.
> Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.
I am the one paying for my MacBook Pro, because my company is a self-funded business. I run my entire business on this machine and I love it. I always buy the fastest CPU possible, although I don't max out the RAM and SSD.
Amusingly enough, I talked to someone recently about compilation speeds, and that person asked me why I don't compile my software (Clojure and ClojureScript) on "powerful cloud servers". Well, according to Geekbench, which always correlates very well with my compilation speeds, there are very few CPUs out there that can beat my M3 Max, and those aren't easily rentable as bare-metal cloud servers. Any virtual server will be slower.
So please, don't repeat the "MacBooks are for spoiled people who don't have to pay for them" trope. There are people for whom this is simply the best machine for the job at hand.
Incidentally, I checked my financials: a 16" MBP with M3 and 64GB RAM, amortized over 18 months (very short!) comes out to around $150/month. That is not expensive at all for your main development machine that you run your business on!
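For anyone who wants to sanity-check that figure, the arithmetic is just the purchase price (minus whatever you expect to recover on resale) divided by the amortization period. A minimal sketch, where both dollar figures are hypothetical placeholders rather than actual Apple prices:

```python
# Back-of-envelope amortization; both dollar figures are hypothetical
# placeholders, not actual Apple prices -- adjust for config and country.
purchase_price = 2700   # what you paid up front (hypothetical)
expected_resale = 0     # what you expect to recover later (hypothetical)
months = 18             # amortization period

monthly_cost = (purchase_price - expected_resale) / months
print(f"~${monthly_cost:.0f}/month")  # -> ~$150/month
```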
For a fair comparison, what about comparing against the cheapest "power cloud server"?
I mean, Hetzner has a reputation for renting bare-metal servers at the cheapest price in the market. Try the AX102, which has very close performance to an M3 Max (CPU only): https://www.hetzner.com/dedicated-rootserver/matrix-ax/
The OP's solution has a lot of advantages, like owning the device and getting a GPU included, but at least we do have cloud servers available at comparable cost.
I tried a lot to use remote servers for development when I had an Intel MacBook and I found the experience to always be so frustrating that I upgraded to the M series. Have the tools gotten any better or is vscode remote containers still the standard?
I did use them several years ago, for Clojure and ClojureScript development. Docker and docker-compose were my main tools, with syncthing helping synchronize source code in real time, Emacs as the editor. Everything worked quite well, but was never as easy and smooth as just running everything locally.
vscode remote containers are still the standard, but I find them very usable nowadays. My setup is a MBP M2 that I use to remote into a Windows WSL setup at home, a Linux desktop at work, and various servers. Nesting remote SSH + remote Docker works seamlessly, that was previously a major headache.
In your case it makes sense to get the most performant machine you can get even if it means you're paying a ton more for marginal gains. This is not usually true for the general public.
As a Clojure/ClojureScript developer myself, I just wonder what you do that makes compilation such an important part of your workflow while at the same time not needing as much RAM as possible. Yes, the MacBook Pro isn't bad at all for Clojure(Script) development. I was pretty angry that the Lenovo ThinkPad T14 Gen3 has a full channel of soldered RAM and just a single slot for expansion, since I really use a lot of RAM and would prefer to go with 64 GB full dual-channel and not a hybrid 48 GB with 32 GB being dual-channel and 16 GB being single-channel. (Yes, it does actually work.)
Most builds that I do are done asynchronously using GitHub Actions or similar. Yes, it does take some time but the build+deploy isn't that time sensitive.
In addition to the hardware, the OSX software is so much better, with flawless speed, productivity, and multitasking with gestures. Try doing desktop switching on Windows. On the flip side, I would gladly use the cloud if internet speeds improve and latency comes down to a negligible level - we developers are an impatient lot.
"Engineers" - ironically the term used in the software industry for people who never standardize anything, solve the same problem solved by other "engineers" over and over again (how many libraries do you need for arrays and vectors and guis and buttons and text boxes and binary trees and sorting, yada yada?) while making the same mistakes and learning the hard way each time, also vehemently argue about software being "art" might like OSX, but even that is debatable. Meanwhile actual Engineers (the ones with the license) the people who need CAD and design tools for building bridges and running manufacturing plants stay far away from OSX.
I did EE in college but we mostly just used Windows because the shitty semi-proprietary SPICE simulator we had to use, and stuff like that, only supported Windows. The company that makes your embedded processor might only support Windows (and begrudgingly at that).
I think engineers using software should not be seen as an endorsement. They seem to have an incredible tolerance for bad UI.
You seem to be suggesting that a chunk of the hundreds of millions of people who use a UI that you don't like, secretly hate it or are forced to tolerate it. Not a position I'd personally want to argue or defend, so I'll leave it at that.
What an oddly aggressive and hostile response to such a banal observation. Yes, millions of people use software they hate, all the time, that’s wildly uncontroversial.
Making up what? Go drop by your nearby shop.
My hair stylist constantly complains about the management software they use and the quality of the payment integration.
At work I constantly hear complaints about shitty, slow IDEs.
At the optician's, the guy was complaining about the inventory system.
People hate software that they're forced to use. Professionals are better at tolerating crapware, because there's usually sunk cost fallacy involved.
This is not a reasonable way to infer the sentiment of hundreds of millions of people in different countries, different business, different situations, etc, etc.
Disguising it as an "observation" is even more ridiculous.
Indeed I’m not ready to defend it, it is just an anecdote. I expected the experience of using crappy professional software to be so universal that I wouldn’t have to.
>They seem to have an incredible tolerance for bad UI.
Irrelevant.
Firstly, it's a tool, not a social media platform designed to sell ads and farm clicks. It needs to be utilitarian and that's it - like a power drill or a pickup truck - not look pretty, since it's not targeting consumers but solving a niche set of engineering problems.
Secondly, the engineers are not the ones paying for that software, so their individual tolerance is irrelevant: their company pays for the tools, and tolerating those tools is part of the job description and the pay.
Unless you run your own business, you're not gonna turn down lucrative employment because on site they provide BOSCH tools and GM trucks while you personally prefer the UX of Makita and Toyota. If those tools' UX slows down the process and makes the project take longer, it's not my problem; my job is to clock in at 9 and clock out at 5, that's it. It's the company's problem to provide the best possible tools for the job, if they can.
It was figurative. Obviously everyone has different working hours/patterns depending on the job market, skill set and personal situation.
But since you asked, Google is famous for low workloads. Or Microsoft. Or any other old and large slow moving company with lots of money, like IBM, Intel, SAP, ASML, Airbus, DHL, Siemens, manufacturing, aerospace, big pharma, transportation, etc. No bootstrapped "agile" start-ups and scale-ups, or failing companies that need to compete in a race to the bottom.
If you look at creative pros such as photographers and Hollywood ‘film’ editors, VFX artists, etc., you will see a lot of Windows and Linux, as people are more concerned about getting absolute power at a fair price and don’t care if it is big, ugly, etc.
Oh, I'm sure there are lots of creatives who use OSX, so I don't mean to suggest nobody does - I'll admit it was a bit in jest, to poke fun at the stereotype. I'm definitely old-school, but to me it's a bit cringe to hear "Oh, I'm an engineer..." or "As an engineer..." from people who sit at a coffee shop writing emails or doing the most basic s/w dev work. I truly think Silicon Valley people would benefit from talking to the technical people who are building bridges and manufacturing plants and cars and hardware and chips and all the stuff on r/engineeringporn that everyone takes for granted. I transitioned from s/w to hardcore manufacturing 15 years ago, and it was eye-opening and very humbling.
I’d assume a lot of this is because you can’t get the software on MacOS. Not a choice. Who is choosing to use Windows 10/11 where you get tabloid news in the OS by default? Or choosing to hide the button to create local user accounts?
Who is choosing to use macOS, where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?
I do. Because for all issues it has, it is still much better than whatever Windows has to offer.
> where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?
At least my WiFi doesn't turn off indefinitely during sleep until I power cycle whole laptop because of a shitty driver.
So what, Windows does the same. Printers [1], WiFi [2], VPN [3], Bluetooth devices [4], audio [5] - and that's just stuff I found via auto-completing "windows update breaks" on Google in under 5 minutes.
The only problem is that Apple is even worse at communicating issues than Microsoft is.
The big difference is that Microsoft - at least usually - confirms and owns the issues.
With Apple, it's usually just crickets... nothing in the release notes, no official statements, nothing. It's just trial and error for the users to see if a particular update fixed the issue.
So the same software exists on multiple platforms, there are no legacy or hardware compatibility considerations, interoperability considerations, no budget considerations, and the users have a choice in what they use?
I.e., the same functionality exists with no drawbacks and money is no object.
More choice in hardware. More flexibility in hardware. UI preferences. You can't get a Mac 2 in 1 or a Mac foldable or a Mac gaming notebook or a Mac that weighs less than a kilogram. You can't get a Mac with an OLED screen or a numpad. Some people just prefer the Windows UI too. I usually use Linux but between MacOS and Windows, I prefer the latter.
We use the sales metrics and signals available to us.
I don't know what to say except resign yourself to the fact that the world is fundamentally unfair, and you won't ever get to run the A/B experiment that you want. So yes, Windows it is!
You seem to have some romanticized notion of engineers and to be deeply offended by someone calling themselves an engineer. Why do you even care if someone sits at a coffee shop writing emails and calls themselves an engineer? Do you think it somehow dilutes the prestige of the word "engineer"? Makes it less elite, or what?
"deeply offended" - My default response to imposters is laughter. Call yourself Lord, King, President, Doctor, Lawyer whatever - doesn't matter to me. I'd suggest you to lighten up.
Not that the degree means much; I learnt 90% of what I know on the job. It certainly helped get my foot in the door through the university brand and alumni network.
You can call yourself anything you want Doctor, Lawyer, Engineer. I have the freedom to think my own thoughts too.
I always likened "engineers"[1] to "people who are proficient in calculus"; and "computers"[1] to "people who are proficient at calculations".
There was a brief sidestep from the late 1980s to the early 2010s (~2012) where the term "software engineer" came into vogue and ran completely orthogonal to "proficiency in calculus". I mean, literally 99% of software engineers never learned calculus!
But it's nice to see that ever since ~2015 or so (and perhaps even going forward) proficiency in calculus is rising to the fore. We call those "software engineers" "ML Engineers" nowadays, ehh fine by me. And all those "computers" are not people anymore -- looks like carefully arranged sand (silicon) in metal took over.
I wonder if it's just a matter of time before the carefully-arranged-sand-in-metal form factor will take over the "engineer" role too. One of those Tesla/Figure robots becomes "proficient at calculus" and "proficient at calculations" better than "people".
It looks like ever since humankind learned calculus there was an enormous benefit to applying it in the engineering of rockets, aeroplanes, bridges, houses, and eventually "the careful arrangement of sand (silicon)". Literally every one of those jobs required learning calculus at school and applying calculus at work.
Why point out Calculus as opposed to just Math?
Might be just my Eastern Europe background where it was all just "Math" and both equations (that's Algebra I guess) and simpler functions/analysis (Calculus?) are taught in elementary school around age 14 or 15.
Maybe I'm missing/forgetting something - I think I used Calculus more during electrical engineering than for computer/software engineering.
At my Central European university we learned "Real Analysis", which was far more concerned with theorems and proofs than with "calculating" something - if anything, actually calculating derivatives or integrals was a warm-up problem before the meat of the subject.
Calculus, because all of engineering depends critically on the modeling of real world phenomena using ordinary or partial differential equations.
I don’t mean to disregard other branches of math — of course they’re useful — but calculus stands out in specific _applicability_ to engineering.
Literally every single branch of engineering. All of them, from petrochemical engineering to biotech. They all use calculus as a fundamental building block of study.
Discovering new drugs using PK/PD modeling is driven by modeling the drug<->pathogen interaction as cycles using Lotka models (see the sketch below).
I'm not saying engineers don't need to learn stats or arithmetic. IMO those are more fundamental to _all_ fields - janitors or physicians or any field really. But calculus is fundamental to engineering alone.
Perhaps a begrudging exception I can make is its applications in finance.
But in every other field where people build rockets, cars, airplanes, drugs, or AI robots, you'd need proficiency in calculus just as much as you'd need proficiency in writing or arithmetic.
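To make the differential-equations point concrete, here's a minimal sketch of the classic Lotka-Volterra predator-prey system, the family of cyclic interaction models referenced above. The coefficients and starting populations are arbitrary illustrative values, not a fitted PK/PD model:

```python
# Minimal Lotka-Volterra sketch: two coupled ODEs integrated numerically.
# Coefficients and initial populations are arbitrary illustrative values,
# not a real PK/PD fit.
import numpy as np
from scipy.integrate import solve_ivp

def lotka_volterra(t, y, alpha=1.1, beta=0.4, delta=0.1, gamma=0.4):
    prey, predator = y
    dprey = alpha * prey - beta * prey * predator           # growth minus predation
    dpredator = delta * prey * predator - gamma * predator  # predation minus death
    return [dprey, dpredator]

sol = solve_ivp(lotka_volterra, (0, 50), y0=[10.0, 5.0],
                t_eval=np.linspace(0, 50, 500))
print(sol.y.shape)  # (2, 500): prey and predator trajectories over time
```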
True, we learnt calculus before college in my home country - but it was just basic stuff. I learnt a lot more of it, including partial derivatives, in the first year of engineering college.
>I think I used Calculus more during electrical engineering than for computer/software engineering.
I think that was OPs point - most engineering disciplines teach it.
Yeah computer science went through this weird offshoot for 30-40 years where calculus was simply taught because of tradition.
It was not really necessary through all of the app-developer eras. In fact, it's so much the case that many software engineers graduating from 2000-2015 or so work as software engineers without a BS degree. Rather, they could drop the physics & calculus grind and opt for a BA in computer science. They then went on to become proficient software engineers in the industry.
It’s only after the recent advances of AI around 2012/2015 did a proficiency in calculus become crucial to software engineering again.
I mean, there’s a whole rabbit hole of knowledge on why ML frameworks deal with calculating vector-Jacobian or Jacobian-vector products. Appreciating that, and their relation to the gradient, is necessary to design & debug frameworks like PyTorch or MLX.
Sure, I will concede that a sans-calculus training (BA in Computer Science) can still be sufficiently useful to working as an ML engineer in data analytics, api/services/framework design, infrastructure, systems engineering, and perhaps even inference engineering. But I bet all those people will need to be proficient in calculus the more they have to deal with debugging models.
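To make the vjp/jvp point concrete, here's a minimal sketch using PyTorch's functional autograd helpers. The toy function and the tangent/cotangent vectors are arbitrary illustrations of the primitives, not anything from a real model:

```python
# Minimal sketch of vector-Jacobian (reverse-mode) and Jacobian-vector
# (forward-mode) products in PyTorch. The toy function and vectors are
# arbitrary; frameworks compose these primitives via the chain rule.
import torch
from torch.autograd.functional import vjp, jvp

def f(x):
    # R^2 -> R^3, so the Jacobian J is a 3x2 matrix
    return torch.stack([x[0] * x[1], torch.sin(x[0]), x[1] ** 2])

x = torch.tensor([1.0, 2.0])
cotangent = torch.tensor([1.0, 0.0, 0.0])  # same shape as f(x)
tangent = torch.tensor([0.0, 1.0])         # same shape as x

_, vjp_out = vjp(f, x, cotangent)  # cotangent^T @ J  (what backprop computes)
_, jvp_out = jvp(f, x, tangent)    # J @ tangent      (forward-mode derivative)
print(vjp_out, jvp_out)            # tensors of shape (2,) and (3,)
```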
That 99% guess seems high considering calculus is generally a required subject when studying computer science (or software engineering) at most universities I know of.
You’re right it’s a total guess. It’s based on my experience in the field.
My strong “opinion” here comes from an observation that while calculus may have been a required subject of study in awarding engineering degrees, the reality is, people didn’t really study it. They just brushed through a couple of pages and wrote a few tests/exams.
In America there’s a plethora of expert software engineers who opt for a bachelor’s degree in computer science that is a BA, not a BS.
I think that’s a completely reasonable thing to do if you don’t want to grind out the physics and calculus courses. They are super hard, after all. And let’s face it, all of the _useful to humanity_ work in software doesn’t require expertise in physics or calculus, at least until now.
With AI going forward it’s hard to say. If more of the jobs shift over to model building, then yes, perhaps a back-to-basics emphasis on calculus proficiency could be required.
Most software engineering just doesn’t require calculus, though it does benefit from the understanding of functions and limit behaviors that higher math provides. But if you look at a lot of meme dev jobs, they’ve transitioned heavily away from the crypto craze of the past 5 years towards “prompt engineering” or the like to exploit LLMs, in the same way that the “Uber for X” meme of 2012-2017 exploited surface-level knowledge of JS or API integration work. Fundamentally, the tech ecosystem desires low-skill employees; LLMs are a new frontier in doing a lot with a little in terms of deep technical knowledge.
Hmm, that is an interesting take. Calculus does seem like the uniting factor.
I've come to appreciate the fact that domain knowledge has a more dominant role in solving a problem than technical/programming knowledge. I often wonder how s/w could align with other engineering practices by approaching design in a standardized way, so we can just churn out code without an excessive reliance on quality assurance. I'm really hoping visual programming is going to be the savior here. It might allow SMEs and domain experts to use a visual interface to implement their ideas.
It's interesting how Python dominated C/C++ in the case of the NumPy community. One would have assumed C/C++ to be a more natural fit for performance-oriented code. But the domain knowledge overpowered the technical knowledge, and eventually people started asking funny questions like...
there was some old commercial that had the tagline "performance is nothing without control". If you can't put the technology to work on your problems then the technology, no matter how incredible, is worthless to you.
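To make the NumPy point above concrete: the reason domain experts could stay in Python is that the vectorized call hands the inner loop to compiled kernels, so a hand-rolled element-by-element loop in the interpreter usually loses badly. A minimal sketch (exact timings will vary by machine):

```python
# Minimal sketch of why NumPy let domain experts stay in Python: the
# vectorized dot product runs in compiled (C/BLAS) code, while the
# pure-Python loop pays interpreter overhead per element.
import time
import numpy as np

a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)

t0 = time.perf_counter()
dot_loop = sum(x * y for x, y in zip(a, b))  # interpreter does the work
t1 = time.perf_counter()
dot_vec = a @ b                              # compiled kernel does the work
t2 = time.perf_counter()

print(f"loop: {t1 - t0:.3f}s   vectorized: {t2 - t1:.5f}s")
```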
This checks out. I'm a software developer who took math all through high school and my first three years of college. I barely scraped through my calculus exams, but I excelled at combinatorics, probability, matrix math, etc. (as long as it didn't veer into calculus for some reason).
I guess I just enjoy things more when I can count them.
For this kind of engineering, I think calculus is not the main proficiency enhancer you claim it to be. Linear algebra, combinatorics, probability and number theory are more relevant.
Calculus was important during the world wars because it meant we could throw shells at the enemy army better, and that was an important problem during that period.
Nowadays, calculus is just a stepping stone to more relevant mathematics.
Today's ML frameworks grapple with the problem of “Jacobian-vector products” and “vector-Jacobian products” as a consequence of the interplay between gradients and derivatives, and the application of the chain rule. All three of those concepts are fundamentally understood by being proficient in calculus.
While I’m being the hype-man for calculus, I don’t mean to say proficiency in linear algebra or statistics is in any way “less necessary” or “less useful” or “less challenging” or “less...” anything.
I’m merely stating that, historically, calculus has been the unique branch of study for engineering. Statistics has always found value in many fields — business, finance, government policy etc.
Sure Linear algebra is one of those unique fields too — I kinda like to think of it as “algebra” in general and perhaps its utility has flowed in tandem with calculus. Idk. I haven’t thought super hard about it.
From what I've heard (not an OSX user) Windows is the best operating system for multiple screens; OSX and Linux glitch way more. Most anyone doing 3D sculpture or graphics/art on a professional level will eventually move to working with 2-3 screens, and since there are no exclusively Mac design programs, OSX will be suboptimal.
There's little things too, like some people using gaming peripherals (multi-button MMO mice and left hand controllers, etc.) for editing, which might not be compatible with OSX.
And also, if you're mucking around with two 32 inch 4k monitors and a 16 inch Wacom it might start to feel a little ridiculous trying to save space with a Mac Pro.
Besides Windows having more drivers for USB adapters than Linux*, which is a reflection of the market, I find Linux has far fewer glitches with multiple screens.
Once it works, Linux is more reliable than Windows. And virtual desktops have always worked better on Linux than on Windows. So I disagree with you on that front.
* In my case, this means I had to get an Anker HDMI adapter, instead of any random brand.
I'd say a lot of engineers (bridges, circuit boards, injection mouldings) are kept far away from OSX (and Linux). Honestly, I'd just love an operating system that doesn't decide it's going to restart itself periodically!
Yes. I'm pretty sure my wife's 2014 MacBook Air has gone 6 months without a restart. My Windows 11 workstation, however, has never managed a week. I power down daily now to avoid disappointment.
IETF RFCs soon number over 10K; Java, win32, the Linux kernel syscall API are famous for backward compatibility
not to mention the absurd success of standard libraries of Python, Rust, PHP and certain "standard" projects like Django, React, and ExpressJS
> (how many libraries do you need for arrays and vectors and guis and buttons and text boxes and binary trees and sorting, yada yada?)
considering the design space is enormous and the tradeoffs are not trivial ... it's good to have libraries that fundamentally solve similar things but in different, context-dependent ways
arguably we are using too many libraries and not enough problem-specific in-situ DSLs (see the results of Alan Kay's STEPS research project at VPRI - https://news.ycombinator.com/item?id=32966987 )
I'd argue almost all NEW library development is about politics and platform ownership. Every large company wants to be the dependency that other projects tie into. And if you don't want to hitch your wagon to google or facebook or whoever, you roll your own.
Many if not most computational problems are fundamentally about data and data transformation under constraints - Throughput, Memory, Latency, etc, etc. And for the situations where the tradeoffs are non-trivial, solving this problem is purely about domain knowledge regarding the nature of the data (video codec data, real-time sensor data, financial data, etc) not about programming expertise.
The various ways to architect the overall high-level design in terms of client/server, P2P, distributed vs local, threading model, etc. are, IME, not what I would call crazy complicated. There are standard ways of implementing the various variations of the overall design, which sadly, because of an overall roll-your-own mindset, most devs are reluctant to adopt. Part of that is that we don't have a framework of knowledge that lets us build a library of these designs in our heads, from which we can just pick the one that's right for our use case.
I don't agree with your characterization of the design space as 'enormous'. I'd say most programmers just need to know a handful of design types, because they're not working on high-performance, low-latency, multi-million-endpoint scalable projects where, as you say, things can get non-trivial.
I'll give a shot at an analogy (I'm hoping the nitpickers are out to lunch). The design space for door knobs is enormous because of the various hand shapes, disability constraints, door sizes, applications, security implications, etc. And yet we've standardized on a few door knob types for most homes, which you can go out and buy and install yourself. The special cases - bank vaults, prisons and other domains - solve it their own way.
I challenge you to take the people who build bridges and have them build full software.
I am not saying whether software is engineering or not.
It is a fact, in terms of cost, that software and bridge building are, most of the time, very different activities with very different goals and cost-benefit ratios.
All those things count when making decisions about the level of standardization.
About standards... there are also lots of them, widely used, from networking protocols to data-transfer formats... with well-known strengths and limitations.
In my 30± year career I can confidently say that Software Engineers look towards standardisation by default as it makes their lives easier.
It feels to me that you're bitter or have had more than one bad experience. Perhaps you keep working with, or coming across, bad Engineers, because your generalising is inaccurate.
Maybe we need a new moniker "webgineer". The average HN/FAANG web programmer does appear to vastly overestimate the value of their contributions to the world.
When I started doing this "Internet stuff" we were called "webmasters", and the job would actually include what today we call:
- DevOps
- Server/Linux sysadmin
- DB admin
- Full stack (backend and frontend) engineer
1999 indeed! I haven't heard that term since around 1999 when I was hired as a "web engineer" and derisively referred to myself as a "webgineer". I almost asked if I could change my title to "sciencematician".
People who cobble together new printers or kettles overestimate the value of their contributions to the world too. The delineation isn't between JS devs and JPL or ASML engineers.
You can shit all you want on so-called "engineers", but they are the ones who make the CAD software you're talking about that "real engineers" use. So get off your high horse.
You're kidding yourself if you don't think that mechanical, structural or any other engineers don't do the same thing. They do.
I worked for one of the UK's leading architecture/construction firms writing software, and I'm also an amateur mechanic.
You'd be amazed at how many gasket types, nuts, bolts, fasteners, unfasteners, glues, concretes, bonding agents and so on exist... all invented for niche preferences, and most of which could be used interchangeably.
Also standards? Hah. They're an absolute shitshow in any engineering effort.
And they can typically set up their dev environment without a VM, while also getting commercial app support if they need it.
Windows requires a VM, like WSL, for a lot of people, and Linux lacks commercial support. macOS strikes a good balance in the middle that makes it a pretty compelling choice.
I was thinking more about software like the Adobe suite, Microsoft Office, or other closed-source software that hasn’t been released on Linux. Electron has made things a bit better, but there are still a lot of big gaps for the enterprise, unless the company is specifically choosing software to maintain Linux support for end users.
Sure, Wine exists, but it’s not something I’d want to rely on for a business when there are alternatives like macOS which will offer native support.
Most people don't need the Adobe suite, and the web version of M$ Office is more than OK for occasional use. Most other enterprise software is web apps nowadays too, so it's much less relevant what OS your machine is running than it was ten years ago...
Excel is essential, and in most businesses that I worked with, most of the accounting and business side runs on it. I switched to Windows from Linux just because of Excel when WSL came out. If Linux had Excel and Photoshop it would be a no-brainer to choose it, but that will never happen.
Apple fanboys like to talk about how cool and long-lasting a MacBook Air is, but a $500 Chromebook will do just as well while covering pretty much 90% of the use cases. Sure, the top-end power is much lower, but considering the base RAM/storage combo Apple gives you, that is not very relevant. If you start loading it up, the pricing moves into an entirely different category, and in my opinion the MacBook Air becomes seriously irrelevant when compared to serious computing devices in the same price range...
There's still a huge market for people who want higher end hardware and to run workloads locally, or put a higher price on privacy. For people who want to keep their data close to their chest, and particularly now with the AI bloom, being able to perform all tasks on device is more valuable than ever.
A Chromebook "does the job" but it's closer to a thin client than a workstation. A lot of the job is done remotely and you may not want that.
Yes, but for those people, if you consider the price of a fully loaded MacBook Pro, it is a rather small win given all the limitations.
If the only things you care about are battery life (and only if you plan to use it lightly on the go, because even high-end Apple Silicon draws a decent amount of power at full tilt) and privacy, I guess they are decent enough.
This is my argument: the base models are at the same time overkill and too limited considering the price and the high-end models are way too expensive for what they bring to the table.
Apple has a big relevancy problem because of how they put a stupid pricing ladder on everything, but that is just my opinion, I guess.
As long as they keep making a shit ton of cash it doesn't matter, I suppose.
But if the relevant people stop buying Macs because they make no sense, it will become apparent why it matters sooner or later...
Not at all, a Chromebook lets you run Linux apps. I can run full-blown IDEs locally without problems. And yes, that is with 8 GB of RAM; ChromeOS has superb memory management.
Well, Google developed and deployed MGLRU to Chromebooks long before upstreaming it. Plus they use some magic to check the MGLRU working-set size inside the VMs and balance everything.
Are you seriously arguing about mini-LED displays, only found in the expensive MacBook Pro, when I'm talking about a cheap $500 Chromebook?
There is at least a 4x difference in price between those machines; it is ridiculous to even pretend they are somewhat comparable.
And if we are talking about expensive high-end hardware, mini-LED is worse than the OLED found in those machines anyway, so it's not as if that would be a strong argument.
My argument isn't about Chromebooks vs any MacBook.
My argument is against a base MacBook Air that is too expensive for relatively limited added utility against something like a cheaper Chromebooks.
Sure, the MacBook Air is better built and will do some things better but those things are not extremely relevant for someone who would be satisfied by an entry level MacBook Air, because while an MBA has some very nice attributes, in the end everything is limited by its RAM/storage (and to a lesser degree, ports).
For a concrete example, in my country the cheapest MacBook Air you can get is the old M1 design at 1k€, then there is the M2 refresh at 1.2k€ and the M3 variant at 1.3k€.
But you can get an Asus Chromebook Plus for 600€ that has either the same amount of RAM and storage, or more RAM (16GB), or more storage (512GB), depending on the variant you end up with.
The display is worse (100 nits less bright and worse resolution) but slightly bigger (14") and that may matter more to many people. It has an older Intel i5 (you can find some AMD options for better efficiency) but it hardly matters for the vast majority of people who just want a laptop to do the basics (basically the target of a MacBook Air). Its battery life would be a bit worse than an MBA but not in a way that can be relevant for the vast majority of customers.
One major advantage it has over an MBA is the better port selection, with an HDMI port, a USB-A port and an SD card reader on top of the 2 Thunderbolt/USB-C ports the MBA has, allowing a dongle-free life without having to buy anything else and providing much better utility. That can be way more relevant for many people than better build quality (which I would argue doesn't even bring better longevity, since with Apple you are hostage to the software support anyway).
You see, I am not against MacBooks; in fact, for some specific use cases I would advise purchasing a MacBook Pro.
But the reality is that the entry level Apple hardware is rather compromised for its price, and if someone would be satisfied by that choice, I'm arguing that there is another choice (worse on paper, better in some ways) but at half the price (40% off minimum).
If you start loading up a MacBook Air, you end up in MacBook Pro price territory and it doesn't make a lot of sense to not add the 100-200 more to get the much better machine.
I know from experience that entry-level Apple hardware is a terrible deal, both because I made the mistake myself and because I had to help fix the issues for other people who made those choices. I have a cousin who reminds me every time how much he fucking hates his entry-level iMac (an old one with a compromised Fusion Drive and minimum RAM), even though it was rather expensive compared to most computers.
My answer is always the same: you spent a lot, but not enough, because with Apple you do not deserve a good experience if you don't go above and beyond in your spending.
In my opinion it is way more disingenuous to advocate for entry-level Apple hardware to people who would be very much satisfied with products costing half as much. The value is just not there; Apple went way too far into luxury territory, locking down everything and creating a pricing ladder that is engineered to confuse and upsell customers to extract as much money as possible.
For someone who really needs a computer to do more intense work, provided they can work around the tradeoffs of Apple Silicon/macOS and they are willing to spend a large amount of cash, Apple has some great hardware for sure.
For everyone else the value is extremely questionable, especially since they are going full steam ahead into service subscriptions and the software can be lacking in ways that require purchasing even more stuff; the total cost of ownership doesn't make sense anymore.
For example, their iPhone SE is absolutely terrible: at 500€ you pay for 6-year-old technology with a small screen relative to its footprint, terrible battery life, etc. A 500€ mid-range Android is so much better in so many ways that it is just stupid at this point...
As for OLED, I don't think burn-in is a significant concern anymore, and if it is I would argue that you are using it too much like a desktop. In my country you could buy 2 decent OLED laptops for the price of an entry-level MacBook Pro anyway so it doesn't matter as much (and replacing displays of hardware manufacturers other than Apple is much easier and cheaper, so there is that).
I think the MacBook Pros are very good for some niche applications, but at a viable minimum price of 2.23k€ (16GB RAM/512GB storage) there are a lot of good options, so it really requires a careful analysis of the actual use case. If you do things related to 3D or engineering it is probably not worth it...
You usually don't need either for software development though, and if you do the free or online alternatives are often good enough for the rare occasions you need them. If you are a software developer and you have to spend significant time using Office it means you either are developing extensions for Office or your company management is somewhat lacking and you are forced to handle things you should not (like bureaucracy for instance).
Where I’m at my email is in Outlook. Having to use the web version sounds annoying. I also end up getting a lot of information in spreadsheets. Having to move all that to the online version to open also sounds annoying. The online version is also more limited, which could lead to issues.
I could see a front end dev needing Photoshop for some things, if they don’t have a design team to give them assets.
There are also security software the company says laptops must have which isn’t available for Linux. They only buy and deploy this stuff with Windows and macOS in mind.
A couple weeks ago on HN I saw someone looking for a program to make a demo of their app (I think). The comments were filled with people recommending an app on macOS that was apparently far and away the best option, and many were disappointed by the lack of availability elsewhere. I find there are a lot of situations like this, where I might be able to get the job done on another OS, but the software I actually want to use is on macOS. Obviously this one is a matter of taste to some degree.
It’s not as big an issue as it was 20 years ago, but it’s still an issue in many environments.
I would love to buy Apple hardware, but not from Apple. I mean: an M2 13-inch notebook with access to swap/extend memory and storage, a regular US keyboard layout, and a proper desktop Linux (Debian, Alpine, Mint, PopOS!, Fedora Cinnamon) or Windows. MacOS and the Apple ecosystem just get in your way when you're trying to maintain a multi-platform C++/Java/Rust code base.
WSL for normal stuff. My co-worker is on Windows and had to setup WSL to get a linter working with VS Code. It took him a week to get it working the first time, and it breaks periodically, so he needs to do it all over again every few months.
I'm developing on Windows for Windows, Linux, Android, and web, including C, Go, Java, TSQL and MSSQL management. I do not necessarily need WSL except for C. SSH is built directly into the Windows terminal and is fully scriptable in PS.
WSL is also nice for Bash scripting, but it's not necessary.
It is a check box in the "Add Features" panel. There is nothing to install or setup. Certainly not for linting, unless, again, you're using a Linux tool chain.
But if you are, just check the box. No setup beyond VS Code, bashrc, vimrc, and your tool chain. Same as you would do on Mac.
If anything, all the Mac specific quirks make setting up the Linux tool chains much harder. At least on WSL the entire directory structure matches Linux out of the box. The tool chains just work.
While some of the documentation is in its infancy, the workflow and versatility of cross platform development on Windows, I think, is unmatched.
This. I have to onboard a lot of students to our analysis toolchain (Nuclear Physics, ROOT-based, C++). 10 years ago I prayed that the student had a Mac, because it was so easy. Now I pray they have Windows, because of WSL. The toolchain is all compiled from source. Pretty much every major version, and often also minor versions, of macOS breaks the compilation of ROOT. I had several release upgrades of Ubuntu that only required a recompile, if that, and it always worked.
Unless he is doing Linux development in the first place, that sounds very weird. You most certainly don't need to set up WSL to lint Python or say JS in VSCode on Windows.
That sounds wild; you can run bash and Unix utils on Windows with minimal fuss without WSL. Unless that linter truly needed Linux (and I mean, VS Code extensions are TypeScript...), that sounds like overkill.
> Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.
I know engineers from a FANG who picked MacBook Pros in spite of the specs, and only because of the bling/price tag. Then they spent their whole time using them as remote terminals for Linux servers, and they still complained about the things being extremely short on RAM and HD space.
One of them even tried to convince their managers to give the Vision Pro a try, even though there were zero use cases for it.
Granted, they drive multiple monitors well with a single USB-C plug, at least with specific combinations of monitors and hubs.
It's high time that the "Apple sells high end gear" shtick is put to rest. Even their macOS treadmill is becoming tiring.
The build quality of Apple laptops is still pretty unmatched in every price category.
Yes, there are 2k+ laptops from Dell/Lenovo that match and exceed a similarly priced MacBook in pure power, but usually lack battery life and/or build quality.
Apple devices also work quite seamlessly together.
iPads, for example, work great as a wireless second screen for the MBPs. I'd immediately buy a 14-inch iPad just for that, since it is so useful when not at your standard desk.
Also, copy-paste between devices, or headphones, just works...
In case Apple ever came up with the idea of using an iPad as an external compute unit, that would be amazing... just double your RAM, compute and screen with it in such a lightweight form factor... it should be possible if they want to.
Is there now a low-latency second-monitor solution for Windows? I was only aware of some software where latency is quite bad, or one company that provided a wireless HDMI/DisplayPort dongle...
Also, the nice thing about headphones within Apple is that the AirPods automatically switch to where the attention is... meaning, e.g., if I'm watching something on the laptop and pick up an iPhone call (no matter whether via the phone or any app), the AirPods automatically switch.
My 15-inch MacBook would disagree: it fried its display twice (it didn't go to sleep properly, got put in a backpack and overheated - there is no way to see that sleep didn't kick in), and then had the broken display cable problem (widespread, and Apple wanted $900 for a new display...).
For comparison: the 4K touch display on my XPS 15 that didn't survive a Diet Coke bath was <$300, including labor for a guy to show up in my office and repair it while I was watching...
> The build quality of Apple laptops is still pretty unmatched in every price category.
I owned a MacBook Pro with the dreaded butterfly keyboard. It was shit.
How many USB ports does the new MacBook Air have? The old ones had two. And shipping with 8GB of RAM? These are shit-tier specs.
The 2020 MacBook Pros had a nice thing: USB-C charging, and you could charge them from either side. Current models went back to MagSafe, only on one side. The number of USB ports is still very low.
But they are shiny. I guess that counts as quality.
I guess we can agree to disagree, but I find the 2020 rev Macbook pros have a good number of USB-C ports (2 on the left, 1 on the right -- all can do PD), a magsafe charger, headphone jack, HDMI port and SD card slot. How many USB-C ports do you need? Sometimes I wish there was ethernet but I get why it's not there.
I agree, the butterfly keyboard was shitty but I absolutely love the keyboard on the 2020 rev. It's still not as great as my mechanical desktop keyboard, but for a laptop keyboard it's serious chef's kiss. Also, I have yet to find a trackpad that is anywhere as good as the Macbook. Precision trackpads are still way way worse.
Finally, the thing that always brings me back to MBPs (vs Surface Books or Razers) is battery life. I typically get a good 10+ hours on my MBP. Battery life on my old Razer Blade and Surface Books was absolutely comically horrible.
I'm absolutely not an Apple person. Privately own zero Apple hardware.
However there are two awesome things about my work MBP I would really want from my ThinkPad:
Magsafe charger - too many close calls!
And the track pad.
I can't work properly without an external mouse on my ThinkPad. But on the MBP everything just has the right size, location, proportions and handling on the track pad. I had a mouse for the MBP too but I stopped using it!
I don’t think it’s at all unreasonable for an engineer using a device for 8+ hours every day to pay an additional, say, 0.5% of their income (assuming very conservatively $100,000 income after tax, $1,000 extra for a MacBook, 2 year product lifespan) for the best built laptop, best screen, and best OS.
I do networking stuff and macOS is on par with Windows - I can't live on it without running into bugs or very questionable behavior for longer than a week. Same as Windows.
What stuff is weird? I have so far had very good experiences with Apple (although not iOS yet). Almost everything I do on my Linux workstation works on Mac too. Windows though is beyond horrible and different in every way.
> I do networking stuff
Me too, but probably very different stuff. I’m doing p2p stuff over tcp and am affected mostly by sock options, buffer sizes, tcp options etc.
I like apple hardware, but their OS is fucking atrocious. In the year 2024 it still doesn't have a native volume mixer, or any kind of sensible window management shortcuts. Half the things on it have to be fixed with paid software. Complete joke of an OS, if it were up to me I'd stick a linux distro on top of the hardware and be happy
The OS is not a joke since it can do some stuff better than either Windows or Linux can but I completely agree that there are some serious limitations or omissions that should have been fixed.
I think they don't because they have an incentive to not do so: they get a cut on all the software you have to purchase on the App Store to make up for it.
It might not look like a lot, but if a large portion of Mac users needs to buy a 5-10 buck app to fix the window management problems, it becomes serious money at a 15-30% cut on millions of purchases...
And this is precisely the problem with Apple today. They are not honest enough to fix or improve the stuff they sell at a very high price, both because they sell it anyway and because they put in place many incentives for themselves to not do so.
There is the running meme of the iPad calculator, but macOS could also use some care: the calculator/Grapher haven't received serious attention in decades. At the price they sell their stuff, that would seem like a given, but considering they'll make money on the apps you buy to improve the situation, they'll never do it.
After using the Apple App Stores for so many years, I wish I hadn't; the convenience really isn't worth the cost down the road...
Not worth it at all. I rarely use battery power, so I'd rather have an intel or AMD chip with more cores and a higher clock speed at the expense of the battery. Oh, and an OS that can actually manage its windows, and customize keyboard settings, and not require an account to use the app store
Then why are you using a Macbook in the first place? There are plenty of Ryzen 7000 and Intel Ultra laptops with similar performance out there. The key benefit of a Macbook is the battery life and sane sleeping when portable.
Apple's hardware these days is exceptional, but the software is left wanting in comparison. MacOS feels like it's been taking two steps back for every step forward for a decade now. I run MacOS, Linux w/ i3, and Windows every day, and outside of aesthetics & Apple integration, MacOS feels increasingly like the least coherent of the 3.
The same is true of the ipad which is just a miraculous piece of hardware constrained by an impotent operating system.
This statement is completely wrong. There are millions of engineers in the world and most of them live in countries like China, India and Russia. Very few of them use MacBooks.
The vast majority of software engineers in big companies (which employ a lot more people than big tech and startups combined) who use Java and C# also have predominantly Windows laptops (as their employers can manage Windows laptops a lot more easily, have agreements with vendors like Dell to buy them at a discount, have software like AV that doesn't support macOS, etc.).
On top of that MacBooks don't have the best screens and are not the best built. Many Windows laptops have OLED screens or 4K IPS screens. There are premium Windows laptops made out of magnesium and carbon fiber.
I'm an American, so maybe the situation is different elsewhere.
Every company I've worked for during the last 12 years gives out MacBook Pros. And I've been developing using Scala / Java for the last 20 years.
Employers manage Macs just fine, this isn't 1999. There have been studies showing that Macs have lower IT maintenance costs compared to Windows.
I admit that I haven't dealt with Windows devices in a long time, and maybe there are some good ones available now, but I find your statements to be beyond belief. Apple Silicon Macs have blown the doors off the competition, outperforming all but top-end Intel laptops while using a fraction of the power (and I never even hear the fans come on).
I think relatively few corporations are offering Macs to people. It's all bog-standard POS Dells, with locked-down Windows images that often do not even allow you to change the screensaver settings or the background image, in the name of "security." I'd love to be wrong about that.
all two jobs I've worked, both as a backend dev using Go in data-storage companies, have offered Macs.
The first one, a small, badly run startup, only offered Macs. This gig, a larger company, offers Mac, Linux and Windows. I started with Linux and then switched to Mac because I was tired of stuff breaking.
Arch works fairly well on Apple Silicon now, though Fedora is easier/recommended.
Limited emulation due to the 16KB pages, and no Thunderbolt display out.
Arguably the best OS? For what? For browsing the web, video editing, etc.? Maybe. For development? Jesus, macOS doesn't even have native container support. All the devs I know with macOS then either get a second Linux laptop, or spend a lot of their time SSHd into a Linux server.
For dev (at least backend and devops), macOS is not that great.
I don't know what you are talking about, I'm a back end engineer, and every company I've worked for during the last 12 years gives out MacBook pros to all devs. Even the game company that used C# and Mono gave out MacBooks (and dual booted them, which of course you can't do any more; I never bothered with Windows since our servers were written in Scala).
Not all teams run tons of containers on personal computers. All our servers are running on AWS. I rarely ssh into anything.
I like the fact that OS X is based on UNIX, and not some half-assed bullshit bolted onto Windows. I still have bad memories of trying to use Cygwin 15 years ago. Apparently WSL is an improvement, but I don't care.
Mac runs all the software I need, and it has real UNIX shells.
Yeah, it's funny: for all the hoopla I've heard over the glory of macOS having a REAL UNIX TERMINAL, WSL works better in practice simply because it's running an actual Linux VM and thus the support is better.
Still, I just don't think it's that burdensome to get containers running on MacOS, it's just annoying that it happens to work worse than on Windows or Linux. Ignoring the hardware, the only real advantage to MacOS development is when you're targeting Apple products with what you're developing.
"best OS" is so subjective here. I'll concede that the MacBook hardware is objectively better than any laptop I've owned. But it's a huge leap to say Mac OS is objectively better than Linux IMO.
I have one and hate it with a passion. A MacBook Air bought new in the past 3 years should be able to use Teams (alone) without keeling over. Takes over a minute to launch Outlook.
My 15 year old Sony laptop can do better.
Even if Microsoft on Mac is an unmitigated dumpster fire, this is ridiculous.
I avoid using it whenever possible. If people email me, it’d better not be urgent.
Indeed, I have a 15-year-old desktop computer that is still running great on Linux. I upgraded the RAM to the maximum supported by the motherboard, which is 8 GB, and it has gone through three hard drives in its life, but otherwise it is pretty much the same. As a basic web browsing computer, and for light games, it is fantastic.
It also performs pretty well for the particular brand of web development I do, which basically boils down to running VS Code, a browser, and a lot of ssh.
It's fascinating to me how people are still attached to the hardware upgrade cycle as an idea that matters, and yet for a huge chunk of people and scenarios, basically an SSD, 8GB of RAM and an Intel i5 from a decade ago could have been the end of computing history with no real loss of productivity.
I honestly look at people who use Apple or Windows with a bit of pity, because those ecosystems would just give me more stuff to worry about.
Is it an Apple Silicon or Intel machine? Intel Macs are crazy slow - especially since the most recent few versions of macOS. And especially since developers everywhere have upgraded to an M1 or better.
You could certainly still buy new Intel MacBooks from Apple 3 years ago. Plenty of people did - particularly given a lot of software was still running through Rosetta at the time.
The M1 Air was only released in November 2020. With a bit of slop in the numbers, it's very possible the parent poster bought an Intel Mac just before the M1 launched.
Yeah, it's such a shame how much performance has been affected by recent macOS. I kept my 2019 MacBook Pro on Catalina for years because everyone else was complaining... finally upgraded directly to Sonoma and the difference in speed was night and day!
Sounds a bit like my Intel MBP, in particular after they (the company I work for) installed all the lovely bloatware/tracking crap IT thinks we need to be subjected to. Most of the day the machine runs with the fans blasting away.
Still doesn't take a minute to launch Outlook, but I understand your pain.
I keep hoping it will die, because it would be replaced with an M-series MBP and they are way, way, WAY faster than even the best Intel MBP.
I will pile on on MS Teams. I am on a Mac and periodically have to fight it because it went offline on me for some reason and I am no longer getting messages. Slightly less annoying is when my iPhone goes to sleep and Teams on my iPhone then sets my status to "Away", even though I am actively typing on Teams on my computer.
And while my particular problems might be partially because I am on MacOS, I observe Windows-using colleagues have just as many problems joining meetings (either total refusal, no audio, or sharing issues). So I think using Teams as a measure of any computer is probably not warranted.
I actually rejected a job offer when I heard I would be given a MacBook Pro.
Apple, being the most closed company these days, should be avoided as much as you can, not to mention its macOS is useless for Linux developers like me; anything else is better.
Its keyboard is dumb to me (that stupid Command/Ctrl key difference), and not even being able to mouse-select and paste is enough for me to avoid macOS at all costs.
> I actually rejected a job offer when I heard I would be given a MacBook Pro.
For what it's worth, I've had a good success rate at politely asking to be given an equivalent laptop I can put linux on, or provide my own device. I've never had to outright reject an offer due to being required to use a Mac. At worst I get "you'll be responsible for making our dev environment work on your setup".
I've had 50/50. These days I'm fairly okay with just taking the Macbook Pro. I did have one instance where I got one my first week and used my Dell XPS with Linux the entire 10 months I was at the place. I returned the Macbook basically unused.
Only one time did I interview with a place where I asked if I'd be given a choice what hardware/OS I could use. The response was "We use Windows". My response was, "no we do not. Either I will not be using Windows with you, or I will not be using Windows NOT with you". I didn't get an offer. I was cool with it.
> Its keyboard is dumb to me (that stupid Command/Ctrl key difference)
Literally the best keyboard shortcuts out of all major OSes. I don't know what weird crab hands you need to have to comfortably use shortcuts on Windows/Linux.
CMD maps PERFECTLY on my thumb.
Anything that runs Linux, even WSL2, is fine; no macOS is the key. And yes, it costs the employer about half as much as the expensive Apple devices that can't even be upgraded; their hardware is as closed as their software.
Employers typically also care about costs like “how hard is it to provision the devices” and “how long is the useful life of this” or “can I repurpose an old machine for someone else”.
Provisioning is a place where Windows laptops win hands down, though.
Pretty much everything that goes wrong with provisioning involves going extra weird on hardware (usually for a cheap supplier) and/or pushing weird third-party "security" crapware.
> "I don't even know what you mean by mouse-select and paste."
Presumably they mean linux-style text select & paste, which is done by selecting text and then clicking the middle mouse button to paste it (no explicit "copy" command).
macOS doesn't have built-in support for this, but there are some third-party scripts/apps to enable it.
On Windows these days, you get WSL, which is actual Linux, kernel and all. There are still some differences from a standalone Linux system, but they are far smaller than with macOS, where not only is the kernel completely different, but the userspace also has many rather prominent differences that you will very quickly run afoul of (like different command-line switches for the same commands).
Then there's Docker. Running amd64 containers on Apple silicon is slow for obvious reasons. Running arm64 containers is fast, but the actual environment you will be deploying to is almost certainly amd64, so if you're using that locally for dev & test purposes, you can get some surprises in prod. Windows, of course, will happily run amd64 natively.
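To make that concrete, here's a minimal sketch (wrapped in Python, shelling out to Docker; the alpine image is just an arbitrary example) of forcing an amd64 container versus letting Docker pick the native arm64 variant on an Apple silicon host:

    import subprocess

    def run_container(platform: str) -> None:
        # --platform picks the image architecture. On Apple silicon,
        # linux/amd64 runs under emulation (slow), linux/arm64 runs natively (fast).
        subprocess.run(
            ["docker", "run", "--rm", "--platform", platform, "alpine", "uname", "-m"],
            check=True,
        )

    run_container("linux/arm64")  # prints aarch64, native speed
    run_container("linux/amd64")  # prints x86_64, emulated (QEMU or Rosetta)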
> the actual environment you will be deploying to is almost certainly amd64
That's up to your team of course, but Graviton is generally cheaper than x86 instances nowadays, and AFAIK the same is true on Google and the other clouds.
Arm is an ISA, not a family of processors. You may expect Apple chips and Graviton to be wildly different, and to perform completely differently in the same scenario. In fact, most Arm CPUs also have specific extensions that are not found in other manufacturers' chips. So yes, while both recognize a base set of instructions, that's about it - expect that everything else is different.
I know, amd64 is also technically an ISA, but there you have 2 major manufacturers with very similar and predictable performance characteristics. And even then, sometimes something on AMD behaves quite differently than on Intel.
For most devs, doing crud stuff or writing high-level scripting languages, this isn't really a problem. For some devs, working on time-sensitive problems or with strict baseline performance requirements, this is important. For devs developing device drivers, emulation can only get you so far.
No, I said you won’t always be deploying on amd64. Because arm64 is now the cheapest option and generally faster than the sandy bridge vcpu unit that amd64 instances are indexed against (and really, constrained to, intentionally, by AWS).
I never said anything about graviton not being arm64.
It's not about price, it's about compatibility. Just because software compiles on a different ISA doesn't mean it behaves the same way. But if that isn't obvious to you, good for you.
M* has caused nothing but trouble for most mac user engineers I know (read: most engineers I know) who upgraded. Now not only are they building software for a different OS, they're building for a different architecture! They do all of their important compute in Docker, wasting CPU cycles and memory on the VM. All for what: a nice case? nice UI (that pesters you to try Safari)?
It looks like Apple's silicon and software is really good for those doing audio/video. Why people like it for dev is mostly a mystery to me. Though I know a few people who don't really like it but are just intimidated by Linux or just can't handle the small UX differences.
I'm an engineer that has both an Apple silicon laptop (MBP, M2) and a Linux laptop (Arch, ThinkPad X1 Yoga). I choose the Mac every day of the week and it's not even close. I'm sure it's not great for specific engineering disciplines, but for me (web, rails, SRE) it really can't be beat.
The UX differences are absolutely massive. Even after daily-driving that thinkpad for months, Gnome always felt kinda not quite finished. Maybe KDE is better, but it didn't have Wayland support when I was setting that machine up, which made it a non-starter.
The real killer though is battery life. I can work literally all day unplugged on the MBP and finish up with 40-50% remaining. When I'm traveling these days, I don't even bring a power cable with me during the day. The ThinkPad, despite my best efforts with powertop, the most aggressive frequency scaling I could get, and a bunch of other little tricks, lasts 2 hours.
There are niceties about Linux too. Package management is better and the Docker experience is _way_ better. Overall though, I'd take the Apple silicon MacBook 10 times out of 10.
Battery life followed by heat and fan noise have been my sticking points with non-mac laptops.
My first-gen ThinkPad X1 Nano would be an excellent laptop, if it weren't for the terrible battery life even in power-save mode (which, as an aside, slows it down a lot) and its need to spin up a fan to do something as trivial as driving a rather pedestrian 2560x1440 60Hz display.
It feels almost like priorities are totally upside down for x86 laptop manufacturers. I totally understand and appreciate that there are performance oriented laptops that aren’t supposed to be good with battery life, but there’s no good reason for there being so few ultraportable and midrange x86 laptops that have good battery life and won’t fry your lap or sound like a jet taking off when pushed a little. It’s an endless sea of mediocrity.
This echoes my experiences with anything that needs power management. Not just that the battery life is worse, but that it degrades quickly. In two years it's barely usable. I've seen this with non-Apple phones and laptops. The iPhone, on the other hand, is so good these days that you don't need to upgrade until EOL at ~6 years (and even if you need a new battery, it's not more expensive than any other proprietary battery). My last MacBook, from 2011, failed a couple of years ago only because of a Radeon GPU inside with a known hardware error.
> There are niceties about Linux too.
Yes! If you haven’t tried in years, the Linux desktop experience is awesome (at least close enough) for me – a dev who CAN configure stuff if I need to but find it excruciatingly menial if it isn't related to my core work. It’s really an improvement from a decade ago.
I'd like to offer a counterpoint, I have an old'ish T480s which runs linuxmint, several lxd containers for traefik, golang, python, postgres and sqlserver (so not even dockerized, but full VMs running these services), and I can go the whole morning (~4-5 hours).
I think the culprit is more likely the power hungry intel CPU in your yoga?
Going on a slight tangent; I've tried but do not like the Mac keyboards, they feel very shallow to me, which is why I'm still using my old T480s. The newer ThinkPad laptop keyboards all seem to be going that way too (getting thinner), much to my dismay. Perhaps a P14s is my next purchase, despite its bulk.
Anybody with a framework 13 want to comment on their keyboard?
I really like the keyboards on my frameworks. I have both the 13 and the new 16, and they are pretty good. Not as good as the old T4*0s I'm afraid, but certainly usable.
Interesting. I do similar (lots of Rails) but have pretty much the opposite experience (other than battery life - Mac definitely wins there). Though I use i3/Sway more than Gnome. The performance of running our huge monolith locally is much better for Linux users than Mac users where I work.
I used a Mac for a while back in 2015 but it never really stood out to me UX-wise, even compared to Gnome. All I really need to do is open a few windows and then switch between them. In i3 or Sway, opening and switching between windows is very fast and I never have to drag stuff around.
This is going to change once Arm on Linux becomes a thing with Qualcomm's new jazz. I am mostly tethered to a dock with multiple screens. I have been driving Ubuntu now for over 4 years full time for work.
In my experience as a backend services Go developer (and a bit of Scala) the switch to arm has been mostly seamless. There was a little config at the beginning to pull multi-arch Docker images (x64 and arm) but that was a one-time configuration. Otherwise I'm still targeting Linux/x64 with Go builds, and Scala runs on the JVM so it's supported everywhere anyway; they both worked out of the box.
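For the curious, that one-time setup was roughly the following (a sketch only; the image name is made up, and it assumes a buildx builder with emulation is already configured via "docker buildx create --use"):

    import subprocess

    # Hypothetical image name. Publishing both platforms under one tag produces a
    # manifest list, so arm64 laptops and x64 CI/servers pull the same tag and each
    # gets the matching architecture automatically.
    subprocess.run(
        [
            "docker", "buildx", "build",
            "--platform", "linux/amd64,linux/arm64",
            "-t", "registry.example.com/myteam/service:latest",
            "--push",
            ".",
        ],
        check=True,
    )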
My builds are faster, laptop stays cooler, and battery lasts longer. I love it.
If I was building desktop apps I assume it would be a less pleasant experience like you mention.
Interestingly enough, the trend I am seeing is all the MacBook engineers moving back to native development environments. Basically, no longer using docker. And just as expected, developers are getting bad with docker and are finding it harder to use. They are getting more and more reliant on devops help or to lean on the team member who is on Linux to handle all of that stuff. We were on a really great path for a while there in development where we were getting closer to the ideal of having development more closely resemble production, and to have developers understand the operations tools. Now we're cruising firmly in the opposite direction because of this Apple switch to arm. Mainly it wouldn't bother me so much if people would recognize that they are rationalizing because they like the computers, but they don't. They just try to defend logically a decision they made emotionally. I do it too, every human does, but a little recognition would be nice.
It's not even a problem with MacBooks as such. They are still excellent consumer devices (non-casual gaming aside). It's this weird positioning of them as the ultimate dev laptop that causes so many problems, IMO.
Because machines are tools meant to perform tasks, and part of that is being interoperable with other tools and de facto standards in the relevant field. For dev work, today, the MacBook is not good at that.
Remember, though, that the binaries deployed in production environments are not being built locally on individual developer machines, but rather in the cloud, as reproducible builds securely deployed from the cloud to the cloud.
Modern language tooling (Go, Rust et al) allows one to build and test on any architecture, and the native macOS virtualization (https://developer.apple.com/documentation/virtualization) provides remarkably better performance compared to Docker (which is a better explanation for its fading from daily use).
Your "trend" may, in fact, not actually reflect the reality of how cloud development works at scale.
And I don't know a single macOS developer that "lean(s) on the team member who is on Linux" to leverage tools that are already present on their local machine. My own development environments are IDENTICAL across all three major platforms.
Virtualization and Docker are orthogonal technologies. The reason you use Docker, especially in dev, is to have the exact same system libraries, dependencies, and settings on each build. The reason you use virtualization is to access hardware and kernel features that are not present on your hardware or native OS.
If you deploy on docker (or Kubernetes) on Linux in production, then ideally you should be using docker on your local system as well. Which, for Windows or MacOS users, requires a Linux VM as well.
It seems that you're trying to "educate" me on how containers and virtualization work, when in fact I've been doing this for a while, on macOS, Linux and Windows (itself having its own Hyper-V pitfalls).
I know you mean well, though.
There is no Docker on macOS without a hypervisor layer - period - and a VM, though there are multiple possible container runtimes not named Docker that are suitable for devops-y local development deployments (which will always, of course, be constrained in comparison to the scale of lab / staging / production environments). Some of these can better leverage the Rosetta 2 translation layer that Apple provides, than others.
I'm sorry that I came across as patronizing; I was more so trying to explain my confusion and thought process rather than to teach you about virtualization and containers.
Specifically what confused me in your comment was that you were saying Docker on Mac was superseded by their new native virtualization, which just doesn't make sense to me, for the reasons I was bringing up. I still don't understand what you were trying to say; replacing docker with podman or containerd or something else still doesn't have anything to do with virtualization or Rosetta, or at least I don't see the connection.
I should also say that I don't think anyone really means specifically docker when they talk about it, they probably mean containerization + image repos in general.
I don’t know a single engineer who had issues with M chips, and most engineers I know (me included) benefited considerably from the performance gains, so perhaps your niche isn’t that universal?
You must have an unusual setup because, between Rosetta and rosetta in Virtualization.framework VMs (configurable in Docker Desktop or Rancher Desktop), I’ve never had issues running intel binaries on my Mac
what's wrong w/ Rails on M chips? I don't recall having had much trouble with it (except w/ nokogiri bindings right when the M1 was first available, but that's a given for any new release of OSX)
We have to cross-compile anyway because now we're deploying to arm64 Linux (AWS Graviton) in addition to x86 Linux.
So even if all developers on your team are using Linux, unless you want to waste money by ignoring arm64 instances on cloud computing, you'll have to set up cross-compilation.
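For Go, at least, that setup is tiny; a sketch (binary names and the ./cmd/server path are hypothetical) that produces both deployment targets from any developer machine, Apple silicon or not:

    import os
    import subprocess

    def build(goarch: str) -> None:
        # Go cross-compiles by setting GOOS/GOARCH; disabling cgo keeps the build hermetic.
        env = {**os.environ, "GOOS": "linux", "GOARCH": goarch, "CGO_ENABLED": "0"}
        subprocess.run(
            ["go", "build", "-o", f"bin/server-linux-{goarch}", "./cmd/server"],
            env=env,
            check=True,
        )

    build("amd64")  # x86 instances
    build("arm64")  # Graviton instances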
1) Macs are by far the best hardware, and performance running Intel code is also faster than running Intel code on the previous Intel Macs: https://discourse.slicer.org/t/hardware-is-apple-m1-much-fas...
2) they should use safari to keep power usage low and browser diversity high
It is, provided that the hardware vendor has reasonably decent support for power management, and you're willing to haul around an AC adapter if not. In general, I really like AMD hardware with built-in graphics for this, or alternately, Intel Tiger Lake-U based hardware.
Asahi Linux is shockingly great on Apple Silicon hardware, though.
Apple is selling hardware and scaling AI by utilizing it is simply a smart move.
Instead of building huge GPU clusters and having to deal with NVIDIA for GPUs (Apple kicked NVIDIA out years ago because of disagreements), Apple is building mainly on existing hardware.
In other terms, this is utilizing CPU power.
On the other hand, this helps their marketing keep high price points, since Apple is now going to differentiate their CPU power, and therefore hardware prices, via AI functionality correlating with CPU power. This is also consistent with Apple stopping the MHz comparisons years ago.
Also, Siri. And consider: you're scaling AI on Apple's hardware too; you can develop your own local custom AI on it, and there's more memory available for linear algebra in a maxed-out MBP than in the biggest GPUs you can buy.
They scale the VRAM capacity with unified memory and that plus a ton of software is enough to make the Apple stuff plenty competitive with the corresponding NVIDIA stuff for the specific task of running big AI models locally.
> there’s more memory available for linear algebra in a maxed out MBP than the biggest GPUs you can buy.
But this hardly applies to 95% (if not more) of all people running Apple's hardware; the fastest CPU/GPU isn't worth much if you can't fit any even marginally useful LLM into the 8GB (or less on iPhones/iPads) of memory that your device has.
>Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury.
Most CS professionals who write code have no idea what it takes to build a desktop, so the hardware that they chose is pretty much irrelevant, because they aren't specifically choosing for hardware. The reason Apple gets bought by mostly anyone, including tech people, is the ecosystem. The truth is, nobody really cares that much about actual specs as long as it's good enough to do basic stuff, and when you are indifferent to the actual difference but all your friends are in the ecosystem, the choice is obvious.
You can easily see this yourself: ask these "professionals" about the details of the Apple Neural Engine, and there's a very high chance that they will repeat some marketing material, while failing to mention that Apple does not publish any real docs for the ANE, you have to sign your code to run on the ANE, and you basically have to use Core ML to utilize it. I.e., if they really cared about inference, all of them would be buying laptops with discrete 4090s for almost the same price.
Meanwhile, if you look at people who came from EE/ECE (who btw on average are far better coders than people with a CS background, based on my 500+ interviews in the industry across several sectors), you see a much larger skew towards Android/custom-built desktops/Windows laptops running Linux. If you lived and breathed Linux and low-level OS work, you tend to appreciate all the power and customization that it gives you, because you don't have to go learn how to do things.
Coming from both environments, I'd be wary of making some of these assertions, especially when you consider that any ecosystem that optimizes software and hardware together (from embedded devices all the way to general-purpose computing machines) is generally going to perform well, given the appropriate engineering focus. This applies regardless of (RT)OS / hardware choice, i.e., it's simply common sense.
The signing of binaries is a part of adult developer life, and is certainly required for the platforms you mention as well.
Unquestionably, battery life on 4090-based laptops sucks on a good day, and if you're working long hours, the last thing you want to have to do is park yourself next to your 350W adapter just to get basic work done.
> especially when you consider that any ecosystem that optimizes software and hardware together (from embedded devices all the way to general-purpose computing machines) is generally going to perform well, given the appropriate engineering focus.
Very much not true. Not to make this personal, but this is exactly what I'm talking about: Apple fans not understanding hardware.
Linux has been through the wringer of fighting its way to general use, thanks to its open-source nature and constant development. So in terms of working well, it has been optimized for hardware WAY further than Apple, which is why you find it on servers, personal desktops, phones, portable gaming devices, and even STM32 Cortex BLDC control boards, all of which run different hardware.
Apple doesn't optimize for general use, it optimizes for a specific business case. In the case of Apple silicon, it was purely battery life, which brings more people into the ecosystem. Single-core performance is on par with all the other chips, because the instruction set doesn't actually matter (https://chipsandcheese.com/2021/07/13/arm-or-x86-isa-doesnt-...), multi-core is behind, macOS software is still a pile of junk (Rosetta still isn't good across the board), the computers are not repairable, you have no privacy since Apple collects a shitload of telemetry for themselves, and so on.
And Apple has no incentive to make any of this better - prior to Apple Silicon, people were still buying Intel Macs with worse specs and performance for the same price, all for the ecosystem and vanity. And not only was macOS still terrible (and much slower), you also had hardware failures like plugging in the wrong USB-C hub blowing a chip and bricking your Mac, butterfly keyboards failing, and questionable decisions like the virtual Esc key.
>The signing of binaries is a part of adult developer life,
...for professional use, and the private key holder should be the person who wrote that software. I hope you understand how ridiculous it is to ask a developer to sign code using the manufacturer's key to allow them to run that code on a machine that they own.
>Unquestionably, battery life on 4090-based laptops sucks on a good day,
Well yea, but you are not buying that laptop for battery life. Also, with Ryzen cpus and 4090s, most get like 6-8 hours depending on use due to Nvidia Prime, which is pretty good for travel, especially if you have a backpack with a charging brick.
If you want portability, there are plenty of lighter weight option like Lenovo Yoga which can get 11-12 hours of battery life for things like web browsing.
Any decent laptop from the same era. My parents are currently using both HP ProBooks and Lenovo ThinkPads from that era, and they are working perfectly, and maintenance costs are lower than for same-era MacBooks...
I own a MacBook Air, I won't be buying another purely because the moment I need to upgrade anything or repair anything it's effectively ewaste.
I haven't found any good proxy that works well with Cisco VPN software. Charles and Proxyman work intermittently at best and require disconnecting from the VPN and various such dances.
> Somewhat true but things are changing. While there are plenty of “luxury” Apple devices like Vision Pro or fully decked out MacBooks for web browsing we no longer live in a world where tech are just lifestyle gadgets.
I notice your use of the weasel word "just".
We undoubtedly live in a world where Apple products are sold as lifestyle gadgets. Arguably it's more true today than it ever was. It's also a world where Apple's range of Veblen goods managed to gain footing in social circles to an extent that we have kids being bullied for owning Android phones.
Apple's lifestyle angle is becoming especially relevant because they can no longer claim they sell high-end hardware, as the difference in specs between Apple's hardware and product ranges from other OEMs is no longer noticeable. Apple's laughable insistence on shipping laptops with 8GB of RAM is a good example.
> Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury.
I don't think so, and that contrasts with my personal experience. All my previous roles offered a mix of MacBooks and Windows laptops, and new arrivals opted for MacBooks because they were seen as perks, while the particular Windows models on offer were not as impressive in comparison, even though they out-specced Apple's offering (mid-range HP and Dell). In fact, in a recent employee review the main feedback was that the MacBook Pro line was under-specced, because at best it shipped with only 16GB of RAM while the less impressive HP ones already came with 32GB. In previous years, they called for the replacement of the MacBook line due to the rate of keyboard malfunctions. Meaning, engineers were purposely picking the underperforming option for non-technical reasons.
I bought my first Apple product roughly 11 years ago explicitly because it had the best accessibility support at the time (and that is still true). While I realize you only see your slice of the world, I really cringe when I see the weasel-word "lifestyle". This "Apple is for the rich kids"-fairytale is getting really really old.
Apparently you’ve never used Apple Silicon. There’s no PC equivalent in terms of specs.
Also, I think you’re misunderstanding what a Veblen good is and the difference between “premium” and “luxury.” Apple does not create luxury or “Veblen” goods like, for example, LVMH does.
An easy way to discern the difference between premium and luxury — does the company advertise the product’s features or price?
For example, a Chanel handbag is almost entirely divorced from its utility as a handbag. Chanel doesn’t advertise features or pricing, because it’s not about the product’s value or utility, it’s what it says about your personal wealth that you bought it. That’s a Veblen good.
Apple heavily advertises features and pricing. Because they sell premium products that are not divorced from their utility or value.
Price-performance is not a thing for the vast majority of users. Sure, I'd like a $40k car but I can only afford a $10k car. It's not nice but it gets me from A to B on my min-wage salary. Similarly, I know plenty of friends and family. They can either get 4 Macs for $1000 each (mom, dad, sister, brother), so $4k, or they can get 4 Windows PCs for $250, so $1k total.
The cheap Windows PCs suck just like a cheap car sucks (ok, they suck more), but they still get the job done. You can still browse the web, read your email, watch a youtube video, post a youtube video, write a blog, etc.. My dad got some HP celeron. It took 4 minutes to boot. It still ran though and he paid probably $300 for it vs $999 for a mac. He didn't have $999.
I’m not saying one or the other is better for your family members. But MacBooks last very long. We'll see about the M series, but I for instance got the M1 Air without fans, which has the benefit of no moving pieces or air inlets, so even better. My last one, an MBP from 2011, lasted pretty much 10 years. OS updates run 8-10 years.
> The cheap Windows PCs suck […], but they still get the job done
For desktop, totally. Although I would still wipe it with Ubuntu or so because Windows is so horrible these days even my mom is having a shit time with only browsing and video calls.
A random laptop however is a different story. Except for premium brands (closer to Apple prices) they tend to have garbage battery life, infuriating track pad, massive thermal issues, and preloaded with bloatware. Apple was always better here, but now with the lower power/heat of the ARM chips, they got soooo much better overnight.
> A random laptop however is a different story. Except for premium brands (closer to Apple prices) they tend to have garbage battery life, infuriating track pad, massive thermal issues, and preloaded with bloatware. Apple was always better here, but now with the lower power/heat of the ARM chips, they got soooo much better overnight.
To the person with no budget, all that doesn't matter. They'll still get the $250 laptop and put up with the garbage battery life (find a power outlet), infuriating trackpad (buy an external mouse for $10), bloatware (most users don't know about this and just put up with it), etc.
I agree Apple is better. But if your budget is $250 and not $1k then you get what you can get for $250 and continue to feed your kids and pay your rent.
But also you don't have to buy new. If I had $250, an ancient MacBook might be better than a newer low-end windows laptop. Though for my purposes I'd probably get an oldish Chromebook and root it.
You can get a laptop with a much bigger screen and a keyboard for as little as $100 to $300, and it will be much, much easier to get work done on than an Apple phone. So I think Apple is still very much a luxury product.
Clumsily phrased. What I meant is that iPhones or similar priced smartphones are affordable and common for say middle class in countries with similar purchase power to Eastern European countries. You’d have to go to poorer countries like Vietnam or Indonesia for iPhones to be “out of reach”, given the immense value it provides.
Heck, now I see that even in Vietnam the iPhone is the #1 vendor with 28% market penetration according to StatCounter. That's more than I thought, even though I was just there...
Speaking of India, they’re at 4% there. That’s closer to being luxury.
I think US is their main market, though. The rest of the world prefers cheaper better phones and doesn't mind using WhatsApp for messaging, instead of iMessage.
As a single market, US is probably biggest. I’m seeing numbers that say that the “Americas” is a bit less than half of global revenue, and that would include Canada and all of South and Latin America. So the rest of the world is of course very important to Apple, at least financially.
> doesn't mind using WhatsApp for messaging
Well WhatsApp was super early and way ahead of any competition, and the countries where it penetrated had no reason to leave, so it’s not exactly like they settle for less. It has been a consistently great service (in the category of proprietary messaging apps), even after Zuck took over.
It's not about price-performance value at all. Macs are still the most expensive way to buy performance. And Apple is only particularly popular in the US. Android phones dominate most other markets, particularly poorer ones.
Apple is popular in the US because a) luxury brands hold sway b) they goad customers into bullying non-customers (blue/green chats) and c) they limit features and customizability in favor of simpler interfaces.
It's popular with developers because a) performance is valuable even at Apple's steep prices and b) it's Unix-based, unlike Windows, so it shares more with the Linux systems most engineers are targeting.
I have never been an Apple fanboy. Till 2022, I was on Android phones. Work issued either ThinkPad or XPS variants. However, I have owned Apple notebooks since 2004, starting from the Panther era. I sincerely believe that Apple provides the best features-and-performance combination at the given price for laptops.
Here I feel that the I-hate-Apple crowd is just stuck with this notion of a luxury overpriced brand when it is clearly not the case. Apple has superior hardware at better price points. Last time I was shopping for a laptop, I could get similar features only at a 30%-40% price premium in other brands.
I am typing this on an Apple M2 Air; try finding similar performance under 2000 USD in other brands. The responsiveness, the (mostly) sane defaults, and the superior rendering and fonts make it worth it. The OS does not matter so much as it used to in 2004, and the fact that I have a Unix terminal in 2024 is just incidental. I have turned off auto updates and I do not use much of the phone integration apart from taking backups and copying photos.
I switched to an iPhone in 2022 from a 200 US$ Samsung handset. Here, I would say that not everyone needs an iPhone. My old phone used to do all the tricks I need on this one. However, the camera is really good and the photos are really great. If I buy an iPhone next time, it would be just for the photos it takes.
> > In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
> Their primary business goal is to sell hardware.
There is no contradiction here. No need for luxury. Efficient hardware scales, Moore's law has just been rewritten, not defeated.
Power efficiency combined with shared and extremely fast RAM is still a formula for success, as long as they are able to deliver.
By the way, M-series MacBooks have crossed into bargain territory by now compared to WinTel in some specific (but large) niches, e.g. the M2 Air.
They are still technically superior in power efficiency and still competitive in performance in many common uses, be it traditional media decoding and processing, GPU-heavy tasks (including AI), single-core performance...
This is it. An M-series Air is an incredible machine for most people - people who likely won’t ever write a line of JS or use a GPU. Email, banking, YouTube, etc. on a device with incredible battery and hardware that will likely be useful for a decade is perfect. The average user hasn’t even heard of HN.
It's great for power users too. Most developers really enjoy the experience of writing code on Macs. You get a Unix based OS that's just far more usable and polished than a Linux laptop.
If you're into AI, there's objectively literally no other laptop on the planet that is competitive with the GPU memory available on an MBP.
Depends on your workload. RAM and passive cooling are the most likely issues, but AFAIK an M2/M3 with 16GiB still performs a lot better than a similarly priced x64 laptop. Active cooling doesn't mean no throttling either.
If you don't explicitly want a laptop, a 32GB M2 Pro Mac Mini would be a good choice I think.
Personally i only have used MBPs so far.
But the M-series Air are not remotely comparable to the old Intel Airs, that's for sure :)
The alternative is Google / Android devices and OpenAI wrapper apps, both of which usually offer a half baked UI, poor privacy practices, and a completely broken UX when the internet connection isn't perfect.
Pair this with the completely subpar Android apps, Google dropping support for an app about once a month, and suddenly I'm okay with the lesser of two evils.
I know they aren't running a charity, I even hypothesized that Apple just can't build good services so they pivoted to focusing on this fake "privacy" angle. In the end, iPhones are likely going to be better for edge AI than whatever is out there, so I'm looking forward to this.
No, the alternative is Android devices with everything except firmware built from source and signed by myself. And at the same time, being secure, too.
You just can't have this on Apple devices. On the Android side choices are limited too, and I don't like Google and especially their disastrous hardware design, but their Pixel line is the most approachable one able to do all of this.
Heck, you can't even build your own app for your own iPhone without buying another hardware (a Mac, this is not a software issue, this is a legal issue, iOS SDK is licensed to you on the condition of using on Apple hardware only) and a yearly subscription. How is this acceptable at all?
> No, the alternative is Android devices with everything except firmware built from source and signed by myself
Normal users will not do this. Just because many of the people here can build and sign a custom Android build doesn't mean that is a viable commercial alternative. It is great that is an option for those of us who can do it, but don't present it as a viable alternative to the iOS/Google ecosystems. The fraction of people who can and will be willing to do this is really small. And even if you can do it, how many people will want to maintain their custom built OSes?
the main reason the masses don't have privacy and security-centred systems is that they don't demand them and they will trade it away for a twopence or for the slightest increment in convenience
a maxim that seems to hold true at every level of computing is that users will not care about security unless forced into caring
with privacy they may care more, but they are easily conditioned to assume it's there or that nothing can be realistically be done about losing it
I, an engineer, am not doing this myself either. There is a middle ground though: just use a privacy-oriented Android build, like DivestOS. [1]
There are a couple caveats:
1. It is still a bit tricky for a non-technical person to install. Should not be a problem if they know somebody who can help, though. There's been some progress making the process more user friendly recently (e.g. WebUSB-based GrapheneOS installer).
2. There are some papercuts if you don't install Google services on your phone. microG [2] helps with most but some still remain. My main concern with this setup is that I can't use Google Pay this way, but having to bring my card with me every time seems like an acceptable trade off to me.
WebGPU isn't standardized yet. Hell, most of the features people complain about aren't part of any standard, but for some reason there's this sense that if it's in Chrome, it's standard - as if Google dictates standards.
I’ve been using Firefox since the Quantum version came out. It feels slightly slower than Chrome but it's negligible to me. Otherwise I can't tell a difference (except some heavy web-based Office-like solutions screaming 'Your browser is not supported!' but actually working fine).
Meanwhile, Apple has historically dictated that Google can't publish Chrome for iOS, only a reskinned Safari. People in glass-walled gardens shouldn't throw stones.
Because as you described, the only alternatives that exist are terrible experiences for basically everyone, so people are happy to pay to license a solution that solves their problems with minimal fuss.
Any number of people could respond to “use Android devices with everything except firmware built from source and signed by myself” with the same question.
The yearly subscription is for publishing your app on Apple’s store and definitely helps keep some garbage out. Running your own app on your own device is basically solved with free third-party solutions now (see AltStore, and since then a newer method I can’t recall atm).
Notice that parent never talked about publishing apps, just building and running apps on their own device. "Publishing on AltStore" (or permanently running the app on your own device in any other way) still requires a $100/year subscription as far as I'm aware.
> No, the alternative is Android devices with everything except firmware built from source and signed by myself
I wouldn't bet on this long term, since it fully relies on Google hardware, and Google's long-term strategy is to remove your freedom piece by piece, cash on it, not to support it.
The real alternative is GNU/Linux phones, Librem 5 and Pinephone, without any ties to greedy, anti-freedom corporations.
> No, the alternative is Android devices with everything except firmware built from source and signed by myself. And at the same time, being secure, too.
There are people who don't know how to use a file explorer; a new generation is growing up in a world of iPhones without ever seeing a file system. Any other bright ideas?
> Heck, you can't even build your own app for your own iPhone without buying another hardware (a Mac, this is not a software issue, this is a legal issue, iOS SDK is licensed to you on the condition of using on Apple hardware only) and a yearly subscription. How is this acceptable at all?
Because they set the terms of use of the SDK? You're not required to use it. You aren't required to develop for iOS. Just because Google gives it all away for free doesn't mean Apple has to.
Sure, as a SWE I'm not going to buy a computer unable to run my own code. A smartphone is an ergonomic portable computer, so I say no to the iPhone, and I'd like to remind others who haven't thought deeply about this to do so.
Do you have a legal right to write software or run your own software for hardware you bought?
Because it’s very easy to take away a right by erecting artificial barriers, just like how you could discriminate by race at work but pretend you are doing something else.
> Do you have a legal right to write software or run your own software for hardware you bought?
I've never heard of such a thing. Ideally I'd like that, but I don't have such freedoms with the computers in my cars, for example, or the one that operates my furnace, or even for certain parts of my PC.
So you bought "a thing' but you can't control what it does, how it does it, you don't get to decide what data it collects or who can see that data.
You aren't allowed to repair the "thing' because the software can detect you changed something and will refuse to boot. And whenever it suits the manufacturer, they will decide when the 'thing' is declared out of support and stops functioning.
I would say you are not an owner then, you (and me) and just suckers that are paying for the party. Maybe it's a lease. But then we also pay when it breaks, so it more of a digital feudalism.
> Do you have a legal right to write software or run your own software for hardware you bought?
No, obviously not. Do you have a right to run a custom OS on your PS5? Do you have a right to run a custom application on your cable set-top box? Etc. Such a right obviously doesn’t exist and most people generally are somewhere between “don’t care” and actively rejecting it for various reasons (hacking in games, content DRM, etc).
It’s fine if you think there should be, but it continues this weird trend of using apple as a foil for complaining about random other issues that other vendors tend to be just as bad or oftentimes even worse about, simply because they’re a large company with a large group of anti-fans/haters who will readily nod along.
Remember when the complaint was that the pelican case of factory OEM tools you could rent (or buy) to install your factory replacement screen was too big and bulky, meaning it was really just a plot to sabotage right to repair?
> Remember when the complaint was that the pelican case of factory OEM tools you could rent (or buy) to install your factory replacement screen was too big and bulky, meaning it was really just a plot to sabotage right to repair?
Yes, I do. That was and continues to be a valid complaint, among all other anti-repair schemes Apple have come up with over the years. DRM for parts, complete unavailability of some commonly repaired parts, deliberate kneecapping of "Apple authorized service providers", leveraging the US customs to seize shipments of legitimate and/or unlabeled replacement parts as "counterfeits", gaslighting by official representatives on Apple's own forums about data recovery, sabotaging right to repair laws, and even denial of design issues[1] to weasel out of warranty repair just to name a few.
All with the simple anti-competitive goal of making third party repair (both authorized and independent) a less attractive option due to artificially increased prices, timelines to repair, or scaremongering about privacy.
> Yes, I do. That was and continues to be a valid complaint,
No, it doesn’t - because you can simply not use the tools if you don’t want. You can just order a $2 spudger off Amazon if you want, you don’t need the tools at all.
It continues to be a completely invalid complaint that shows just how bad-faith the discussion about apple has become - it literally costs you nothing to not use the tools if you want, there is no downside to having apple make them available to people, and yet you guys still find a way to bitch about it.
Moreover, despite some “bold” proclamations from the haters… no android vendors ever ended up making their oem tooling available to consumers at all. You have to use the Amazon spudger on your pixel, and you will fuck up the waterproofing when you do your repair, because the android phone won’t seal properly against water without the tools either. IPX sixtywho!?
It’s literally a complete and total net positive: nothing was taken away from you, and you don’t need to use it, and it makes your life easier and produces a better repair if you want it. Apple went out of their way to both make the tooling available to normies who want to rent it or people who want to buy it for real. And people still bitch, and still think they come off better for having done so. Classic “hater” moment, in the Paul Graham sense. Anti-fanboys are real.
Literally, for some people - the pelican cases with the tools are too big and heavy. And that’s enough to justify the hate.
Again, great example of the point I was making in the original comment: people inserting their random hobby horse issues using apple as a foil. You don’t like how phones are made in general, so you’re using apple as a whipping boy for the issue even if it’s not really caused or worsened by the event in question etc. Even if the event in question is apple making that issue somewhat better, and is done worse by all the other vendors etc. Can’t buy tooling for a pixel at all, doing those repairs will simply break waterproofing without it, and you’re strictly better off having the ability to get access to the tooling if you decide you want it, but apple offering it is a flashpoint you can exploit for rhetorical advantage.
> Moreover, despite some “bold” proclamations from the haters… no android vendors ever ended up making their oem tooling available to consumers at all. You have to use the Amazon spudger on your pixel, and you will fuck up the waterproofing when you do your repair, because the android phone won’t seal properly against water without the tools either. IPX sixtywho!?
I think the dirty little secret here is that an iPhone is just about the only phone, apart from maybe some of the really nice Google and Samsung flagships, that anyone wants to repair, because they're bloody expensive. Which is fine and dandy but then do kindly park your endless bemoaning of the subjects of e-waste and non-repairable goods, when Android by far and away is the worse side of that equation, with absolute shit tons of low yield, crap hardware made, sold, and thrown away when the first software update renders it completely unusable (if it wasn't already, from the factory).
Could you chill with the relentless insults? I'd appreciate it.
Perhaps you haven't noticed, but once you tally up overpriced parts together with their oversized, heavy, expensive rental of tools that you don't need, you end up with a sum that matches what you would pay to have it repaired by Apple - except you're doing all of the work yourself.
A curious consumer who has never repaired a device, but might have been interested in doing so, will therefore conclude that repairing their own device is 1. Far too complicated, thanks to an intimidating-looking piece of kit that they recommend, but is completely unnecessary, and 2. Far too expensive, because Apple prices these such that the repair is made economically nonviable.
So yes, I still believe that this is Apple fighting the anti-repair war on a psychological front. You're giving them benefit of the doubt even though they've established a clear pattern of behavior that demonstrates their anti-repair stance beyond any reasonable doubt - although you dance around the citations and claim that I'm being unreasonable about Apple genuinely making the repair situation "better".
Furthermore, I'm not a fanboy or anti-fanboy of any company. The only thing I'm an anti-fanboy of is anti-consumer practices. If Apple changed some of their practices I'd go out and buy an iPhone and a MacBook tomorrow.
The fact that I pointed out that Apple is hostile against repair does not mean that I endorse Google, Samsung, or any other brand - they all suck when it comes to repair, yet you're taking it as a personal attack and calling me names for it.
Excuse me? I'm clearly not the one who crossed into flamewar, please read the previous 1-2 comments.
edit: Describing others as "bitching", "bad faith", and "hater", runs afoul of half of the guidelines on this site. That comment somehow isn't moderated, but mine is called out for crossing into flamewar?
You were both breaking the site guidelines, and I replied to both of you in the same way.
Even if I hadn't, though, we need you to follow the rules regardless of what anybody else is doing, and the same goes for every user here. Pointing a finger at the other person isn't helpful.
I understand it can be difficult to have someone point out that you're not approaching a situation in good faith, but that's not exactly "relentless insults".
It can be difficult to handle the intimation that maybe there is the mirror image of the "brainless apple sheeple" too. Haters exist - people who are not able to approach a situation fairly and are just relentlessly negative because they hate X thing too. Negative parasocial attachment is just as much of a thing as a positive one.
And when you get to the point of "literally making the tools available is bad because the pelican cases are too heavy" you have crossed that rubicon. I am being very very polite here, but you are not being rational, and you are kinda having an emotional spasm over someone disagreeing with you on the internet.
Yes, if you want OEM parts and you rent OEM tooling it's probably going to come close to OEM cost. That isn't discriminatory, if the prices are fair+reasonable, and objectively they pretty much are. $49 to rent the tools, and have them shipped both ways, etc, is not an unreasonable ask.
Not having a viable business model for your startup doesn't mean the world is wrong. It means you don't have a viable business idea. And yeah, if you are renting the tools as a one-off, and counting your personal time as having some value (or labor cost in a business), then you probably are not going to get costs that are economical with a large-scale operator with a chain operation and an assembly-line repair shop with repair people who do nothing but repair that one brand. That's not Apple's fault.
What we ultimately come down to with your argument is "apple is killing right-to-repair by being too good at repair and providing too cheap repairs such that indie shops can no longer make a margin" and I'm not sure that's actionable in a social sense of preventing e-waste. People getting their hardware repaired cheaply is good. Long parts lifetimes are good. Etc.
Being able to swap in shitty amazon knockoff parts is a whole separate discussion, of course. And afaik that is going to be forced by the EU anyway, consequences be damned. So what are you complaining about here?
Actually to be fully clear, in many cases you have an anti-right: literally not only do you not have a right, but it’s illegal to circumvent technological restrictions intended to prevent the thing you want to do.
As noxious as that whole thing is, it’s literally the law. I agree the outcome is horrifying of course… stallman was right all along, it’s either your device or it’s not.
And legally speaking, we have decided it’s ok to go with “not”.
> better for edge AI than whatever is out there, so I'm looking forward to this
What exactly are you expecting? The current hype for AI is large language models. The word 'large' has a certain meaning in that context. Much larger than can fit on your phone. Everyone is going crazy about edge AI; what am I missing?
> Everyone is going crazy about edge AI, what am I missing?
If you clone a model and then bake in a more expensive model's correct/appropriate responses to your queries, you now have the functionality of the expensive model in your clone. For your specific use case.
The resulting case-specific models are small enough to run on all kinds of hardware, so everyone's seeing how much work can be done on their laptop right now. One incentive for doing so is that otherwise your approaches to problems are constrained by the cost and security of the Q&A round trip.
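In code, that "bake in the expensive model's answers" step is roughly this kind of loop (a sketch only; teacher.complete and student_trainer.fine_tune are placeholders for whatever big-model API and fine-tuning tool you actually use):

    import json

    def build_distillation_set(queries, teacher, out_path="distill.jsonl"):
        # Ask the expensive "teacher" model for answers and store (prompt, answer)
        # pairs in the JSONL shape most fine-tuning tools accept.
        with open(out_path, "w") as f:
            for q in queries:
                answer = teacher.complete(q)  # placeholder call to the big model
                f.write(json.dumps({"prompt": q, "completion": answer}) + "\n")
        return out_path

    def distill(queries, teacher, student_trainer):
        dataset = build_distillation_set(queries, teacher)
        student_trainer.fine_tune(dataset)  # placeholder: train the small clone on the pairs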
Quantized LLMs can run on a phone, like Gemini Nano or OpenLLAMA 3B. If a small local model can handle simple stuff and delegate to a model in the data center for harder tasks and with better connectivity you could get an even better experience.
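The delegation itself can be as simple as a confidence check (a sketch; the model handles and the confidence score are placeholders, and real systems use anything from token-level log-probs to a dedicated router model):

    def answer(prompt, local_model, cloud_model, threshold=0.7):
        # Try the small on-device model first.
        draft, confidence = local_model.generate_with_confidence(prompt)
        if confidence >= threshold:
            return draft  # good enough, no network round trip
        # Hand the hard cases to the larger data-center model.
        return cloud_model.generate(prompt)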
> If a small local model can handle simple stuff and delegate to a model in the data center for harder tasks and with better connectivity you could get an even better experience.
Distributed mixture of experts sounds like an idea. Is anyone doing that?
Sounds like an attack vector waiting to happen if you deploy enough competing expert devices into a crowd.
I’m imagining a lot of these LLM products on phones will be used for live translation. Imagine a large crowd event of folks utilizing live AI translation services being told completely false translations because an actor deployed a 51% attack.
I’m not particularly scared of a 51% attack between the devices attached to my Apple ID. If my iPhone splits inference work with my idle MacBook, Apple TV, and iPad, what’s the problem there?
Using RAG, a smaller local LLM combined with local data (e.g. your emails, iMessages, etc.) can be more useful than a large external LLM that doesn't have your data.
No point asking GPT4 “what time does John’s party start?”, but a local LLM can do better.
This is why I think Apple’s implementation of LLMs is going to be a big deal, even if it’s not technically as capable. Just making Siri better able to converse (e.g. ask clarifying questions) and giving it the context offered by user data will make it dramatically more useful than silo’d off remote LLMs.
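A rough sketch of that flow (the embedding function and the local LLM call are placeholders; the point is that retrieval happens entirely over data already on the device):

    def rag_answer(question, messages, embed, local_llm, k=5):
        # messages: strings already on the device (e.g. recent iMessages/emails).
        # embed(text) -> list[float]; local_llm(prompt) -> str. Both are placeholders.
        def cosine(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
            return dot / norm

        q_vec = embed(question)
        top = sorted(messages, key=lambda m: cosine(embed(m), q_vec), reverse=True)[:k]
        prompt = "Using only this context:\n" + "\n".join(top) + "\n\nAnswer: " + question
        return local_llm(prompt)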
In the hardware world, last year’s large has a way of becoming next year’s small. For a particularly funny example of this, check out the various letter soup names that people keep applying to screen resolutions. https://en.m.wikipedia.org/wiki/Display_resolution_standards...
Google has also been working on (and provides kits for) local machine learning on mobile devices... and they run on both iOS and Android. The Gemini app does send data to Google for learning, but even that you can opt out of.
Apple's definitely pulling a "Heinz" move with privacy, and it is true that they're doing a better job of it overall, but Google's not completely horrible either.
Yeah, I was thinking of their non-local model like Gemini advanced though.
In any case, iPhones probably don't have enough memory to run a 3.25B model? E.g. the 15 Pro only has 8 GB (and Gemini Nano seems to only work on the 12 GB Pixel 8 Pro) and the 14 has only 6 GB; that hardly seems sufficient for even a small LLM if you still want to run the full OS and other apps at the same time.
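Rough numbers (weights only, ignoring KV cache and runtime overhead), which show why aggressive quantization is what makes 6-8 GB phones even plausible:

    params = 3.25e9

    def model_size_gb(bits_per_weight: float) -> float:
        # Weights only; real usage adds KV cache, activations, and runtime overhead.
        return params * bits_per_weight / 8 / 1e9

    print(f"fp16: {model_size_gb(16):.1f} GB")  # 6.5 GB
    print(f"int8: {model_size_gb(8):.1f} GB")   # ~3.2 GB
    print(f"int4: {model_size_gb(4):.1f} GB")   # ~1.6 GB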
Care to cite these subpar Android apps? The app store is filled to the brim with subpar and garbage apps.
>Google dropping support for an app about once a month
I mean if you're going to lie why not go bigger
>I'm okay with the lesser of two evils.
So the more evil company is the one that pulled out of China because they refused to hand over their users data to the Chinese government on a fiber optic silver plate?
Google operates in China albeit via their HK domain.
They also had project DragonFly if you remember.
The lesser of two evils is that one company doesn’t try to actively profile me (in order for their ads business to be better) with every piece of data it can find and forces me to share all possible data with them.
>Google operates in China albeit via their HK domain.
The Chinese government has access to the iCloud account of every Chinese Apple user.
>They also had project DragonFly if you remember.
Which never materialized.
>The lesser of two evils is that one company doesn’t try to actively profile me (in order for their ads business to be better) with every piece of data it can find and forces me to share all possible data with them.
Apple does targeted and non targeted advertising as well. Additionally, your carrier has likely sold all of the data they have on you. Apple was also sued for selling user data to ad networks. Odd for a Privacy First company to engage in things like that.
Google has been around for 26 years I believe. According to that link 60 apps were killed in that timeframe. According to your statement that Google kills an app a month that would leave you 252 apps short. Furthermore, the numbers would indicate that Google has killed 2.3 apps per year or .192 apps per month.
>As for the subpar apps: there is a massive difference between the network traffic when on the Home Screen between iOS and Android.
Not sure how that has anything to do with app quality, but if network traffic is your concern there's probably a lot more an Android user can do than an iOS user to control or eliminate the traffic.
> Google has been around for 26 years I believe. According to that link 60 apps were killed in that timeframe. According to your statement that Google kills an app a month that would leave you 252 apps short. Furthermore, the numbers would indicate that Google has killed 2.3 apps per year or .192 apps per month.
Most of the "Services" on that list are effectively apps, too:
VPN by Google One, Album Archive, Hangouts, all the way back to Answers, Writely, and Deskbar.
I didn't touch hardware, because I think that should be considered separately.
The first of 211 services on that site was killed in 2006.
The first of the 60 apps on that site was killed in 2012.
So even apps alone, 4.28 a year.
But more inclusively, 271 apps or services in 17 years is ~16/year, over one a month.
You need to remind yourself of the site guidelines about assuming the worst. Your comments just come across condescendingly.
I think it was Paul Thurrott on the Windows Weekly podcast who said that all these companies don't really care about privacy. Apple takes billions of dollars a year to direct data towards Google via the search defaults. Clearly privacy has a price. And I suspect it will only get worse with time as they keep chasing the next quarter.
Tim Cook unfortunately is so captured by that quarterly "please the shareholders" mindset that it is only a matter of time.
I do hope that those working in these companies actually building the tools do care. But unfortunately, it seems that corruption is an emergent property of complexity.
The Google payments are an interesting one; I don't think it's a simple "Google pays them to prefer them", but a "Google pays them to stop them from building a competitor".
Apple is in a position to build a competing search product, but Google pays them roughly what they would otherwise have to earn from it, and earning that much is improbable even if it means they can set their own search engine as default.
While Apple is first and foremost a hardware company, it has more or less always been about the "Apple experience". They've never "just" been a hardware company.
For as long as Apple has existed, they've done things "their way" both with hardware and software, though they tend to want to abstract the software away.
If it were merely a question of selling hardware, why does iCloud exist? Or Apple TV+, or Handoff, or iMessage, or the countless other seemingly small life improvements that somehow the remainder of the industry cannot seem to figure out how to do well?
Just a "simple" thing as switching headphones seamlessly between devices is something i no longer think about, it just happens, and it takes a trip with a Windows computer and a regular bluetooth headset to remind me how things used to be.
As part of their "privacy first" strategy, iMessage also fits in nicely. Apple doesn't have to operate a huge instant messaging network, which undoubtedly is not making a profit, but they do, because having one entry to secure, encrypted communication fits well with the Apple Experience. iMessage did so well at abstracting the ugly details of encryption that few people even think about that that's what the blue bubble is actually about, it more or less only means your message is end to end encrypted. As a side effect you can also send full resolution images (and more), but that's in no way unique to iMessage.
I can't buy a MacBook Air for less than $999, and that's for a model with 8GB RAM, an 8-core CPU and 256GB SSD. The equivalent (based on raw specs) in the PC world runs for $300 to $500.
How is something that is twice as expensive as the competition not a luxury device?
EDIT: Because there's repeated confusion in the replies: I am not saying that a MacBook Air is not objectively a better device. I'm saying it is better by metrics that fall strictly into the "luxury" category.
Better build quality, system-on-a-chip, better OS, better battery life, aluminum case—all of these are luxury characteristics that someone who is looking for a functional device that meets their needs at a decent price won't have as dealbreakers.
> How is something that is twice as expensive as the competition not a luxury device?
You can buy a version of <insert product here> from Walmart at 1/2 price of a "normal" retailer. Does that mean every "normal" retailer is actually a luxury goods dealer?
Is my diner a luxury restaurant because a burger costs twice as much as McDonald's?
When I buy a Rick Owens coat for $3k, sure it's a luxury good. It protects from the elements just the same, I know that I overpay only because it looks nice. But when I pay the same for the device I need for my work and use for 12 hours a day, it’s not luxury — it's just common sense. I've tried working with Windows and Linux, and I know that I'm paying not only for specs, but because the sum of all the qualities will result in a much better experience — which will allow me to work (and earn money) faster and with less headache.
$1000 for a laptop that will last 10 years seems crazy to call a luxury, when we have Alienware/Apple laptops that go for 2k to 5k+ and demographics that buy them yearly.
I bought a 300 euro ThinkPad that's going on 9 years now.
It was my only computer for the entire time until about a week ago, when I bought another 300 euro ThinkPad. I also didn't have a smartphone, only a basic dumbphone for a large portion of that time.
So for me, yes, MacBook Airs, like the lovely M1 I'm writing this on (I love my job!) are luxury goods.
Just because Ferraris cost half a million doesn't mean a 50k BMW isn't luxury.
> You can buy a version of <insert product here> from Walmart at 1/2 price of a "normal" retailer. Does that mean every "normal" retailer is actually a luxury goods dealer?
What percent of that retailer's products does that comparison apply to?
If it's more than half then yeah that's probably a luxury goods dealer.
> The equivalent (based on raw specs) in the PC world runs for $300 to $500.
Equivalent device?! Find me a Windows laptop in ANY price category that can match the weight, fanless design, screen quality, battery life and speaker quality of the Air.
I got a Latitude 9430 on eBay for $520. This thing is an amazing laptop and I'd put it right there with the Macs I have to work with at dayjob, as far as build quality/feel.
That's still ~twice as expensive as the items I linked to below, and that's at clearance prices.
A good deal on a luxury item still gets you a luxury item.
And if we want to compare Walmart to Walmart, this thing currently runs for $359 and has 16GB RAM, 512GB SSD, and a CPU that benchmarks slightly faster than the M2:
No brand new laptop has a 1h battery. Also, battery life in the sense of "I can work a full day unplugged from AC" is something that affects only a subset of laptop users, and mostly under specific conditions (i.e. long travel).
That's more like cheap vs middle of the road. There is no luxury space in laptops - displays, iPads, and workstations maybe but that's it (and those are more pro than luxury).
$999 amortized over 3 years is about $28/mo, which is less than what even middle-class people spend on coffee.
I doubt I am alone in saying that I would gladly pay twice the price to avoid having to use Windows. It's the most user-hostile, hand-holdy, second-guess-and-confirm-my-explicit-command-ey os I've used to date. And bloatware baked in? No thanks.
You're probably right. I am in the middle-class, maybe lower middle-class, and I live in the US. I have advantages and opportunities that many in other circumstances do not and I am sincerely grateful for them.
Oh dear. 16:10 screen with superior resolution, brightness and gamut - and it still gets superior battery life driving all those pixels. That's a headline feature that even a non-propellerhead can observe (I was honestly surprised, when I looked up that Acer screen, what a dim, narrow-gamut piece of shit it is) - notably there are ballpark-priced systems with better screens.
I think you unjustifiably downplay how much of a selling point a screen that looks great (or at least decent) on the floor is. And I know tons of devs that put up with the 45% NTSC abominations on Thinkpads that aren’t even suitable for casual photo editing or web media, just because you make do with that doesn’t automatically make a halfway decent display on a laptop a “luxury”.
Sorry, but I don't buy the "everything that isn't a $300 econo shit laptop is luxury" thesis repeated ad nauseam.
"Luxury" often includes some amount of pure status symbols added to the package, and often on what is actually a sub-par experience. The quintessential luxury tech device were the Vertu phones from just before and even early in the smartphone era - mid-range phones tech and build quality-wise, with encrusted gems and gold inserts and other such bling, sold at several thousand dollars (Edit: they actually ranged between a few thousand dollars all the way to 50,000+).
But the definition of luxury varies a lot by product category. Still, high-end and luxury are separate concepts, even when they do overlap.
You just made up the "sub-par experience" as a defining point of a luxury product.
A luxury product is defined by being a status symbol (check for all Apple devices) and especially by its price.
A luxury car like a Bentley will still bring you from point A to point B just like the cheapest Toyota.
I didn't say that sub-par experience was a requirement, I said it was often a part of luxury products. Or, more precisely, I should have said that something being of excellent build quality and offering excellent, top of the line experience is neither sufficient nor necessary for being a luxury good.
It is true though that luxury goods are, often, top of the line as well. Cars and watches are often examples of this. Clothes are a much more mixed bag, with some luxury brands using excellent materials and craftsmanship, while others use flashy design and branding with mediocre materials and craftsmanship.
Exactly where Apple sits is very debatable in my experience. I would personally say that many of their products are far too affordable and simple to be considered luxury products - the iPhone in particular. The laptops I'm less sure about.
Fair enough.
Apple is clearly not in the same luxury league as a Bentley or a yacht, but it's totally like a Mercedes, to continue with the car analogy. You get a "plus" for the extra money, but then it's open for debate whether that "plus" is worth it or not. And it's actually the source of many flamewars on the Internet.
I think the Mercedes comparison (or the more common BMW) one is also useful for getting the idea that not every manufacturer is competing for the same segments but the prices in segments are generally close. No Mercedes is as cheap as a Camry but a Lexus is similar.
This comes up so often in these flame wars where people are really saying “I do/don’t think you need that feature” and won’t accept that other people aren’t starting from the same point. I remember in the 90s reading some dude on Fidonet arguing that Macs were overpriced because they had unnecessary frills like sound cards and color displays; I wasn’t a Mac user then but still knew this was not a persuasive argument.
That would also apply to Apple products then, and especially so to their laptops. I actually bought a MacBook Air recently and the thing that I like most about it is how comfortable the keyboard and especially the trackpad is compared even to high-end ThinkPads. And, on the other hand, the trackpad on my T14s is certainly quite sufficient to operate it, so this comfort that MacBook offers is beyond the bare necessity of function.
By that definition, Zara is a luxury clothing brand, Braun is a luxury appliance maker, and Renault is a luxury car brand. I think it requires significantly more.
The Walmart variant was introduced 6 weeks ago to offload excess stocks of a four year old discontinued model. I'm not sure your argument of "at only 70% of the price of a model two generations newer" is the sales pitch you think it is.
They’re tools. This attempt to treat them as luxury goods doesn’t hold with those. It’s entirely common for even people who want to do some home repair—let alone professionals—but aren’t clueless about DIY to spend 2x the cheapest option, because they know the cheapest one is actually worth $0. More will advocate spending way more than 2x, as long as you’re 100% sure you’re going to use it a lot (like, say, a phone or laptop, even for a lot of non-computer-geeks). This is true even if they’re just buying a simple lowish-power impact driver, nothing fancy, not the most powerful one, not the one with the most features. Still, they’ll often not go for the cheapest one, because those are generally not even fit for their intended purpose.
[edit] I mean sure there are people who just want the Apple logo, I’m not saying there are zero of those, but they’re also excellent, reliable tools (by the standards of computers—so, still bad) and a good chunk of their buyers are there for that. Even the ones who only have a phone.
I didn't go for the cheapest option: I'm typing this on a laptop that I bought a few months ago for $1200. It has an aluminum case, 32GB RAM, an AMD Ryzen CPU that benchmarks similar to the M3, and 1TB SSD. I can open it up and replace parts with ease.
The equivalent from Apple would currently run me $3200. If I'm willing to compromise to 24GB of RAM I can get one for $2200.
What makes an Apple device a luxury item isn't that it's more expensive, it's that no matter what specs you pick it will always be much more expensive than equivalent specs from a non-luxury provider. The things that Apple provides are not the headline stats that matter for a tool-user, they're luxury properties that don't actually matter to most people.
Note that there's nothing wrong with buying a luxury item! It's entirely unsurprising that most people on HN looking at the latest M4 chip prefer luxury computers, and that's fine!
Huh. Most of the folks I know on Apple stuff started out PC (and sometimes Android—I did) and maybe even made fun of Apple devices for a while, but switched after exposure to them because they turned out to be far, far better tools. And not even much more expensive, if at all, for TCO, given the longevity and resale value.
Eh, I have to use a MacBook Pro for work because of IT rules and I'm still not sold. Might be because I'm a Linux person who absolutely must have a fully customizable environment, but MacOS always feels so limited.
The devices are great and feel great. Definitely high quality (arguably, luxury!). The OS leaves a lot to be desired for me.
I spent about a decade before switching using Linux as my main :-) Mostly Gentoo and Ubuntu (man, it was good in the first few releases)
Got a job in dual-platform mobile dev and was issued a MacBook. Exposure to dozens of phones and tablets from both ecosystems. I was converted within a year.
(I barely customize anything these days, fwiw—hit the toggle for “caps as an extra ctrl”, brew install spectacle, done. Used to have opinions about my graphical login manager, use custom icon sets, all that stuff)
> no matter what specs you pick it will always be much more expensive than equivalent specs from a non-luxury provider
On the phone side, I guess you would call Samsung and Google luxury providers? On the laptop side there are a number of differentiating features that are of general interest.
> The things that Apple provides are not the headline stats that matter for a tool-user, they're luxury properties that don't actually matter to most people
Things that might matter to regular people (and tool users):
- design and build for something you use all day
- mic and speakers that don't sound like garbage (very noticeable and relevant in the zoom/hybrid work era)
- excellent display
- excellent battery life
- seamless integration with iPhone, iPad, AirPods
- whole widget: fewer headaches vs. Windows (ymmv); better app consistency vs. Linux
- in-person service/support at Apple stores
It's hard to argue that Apple didn't reset expectations for laptop battery life (and fanless performance) with the M1 MacBook Air. If Ryzen has caught up, then competition is a good thing for all of us (maybe not intel though...) In general Apple isn't bleeding edge, but they innovate with high quality, very usable implementations (wi-fi (1999), gigabit ethernet (2001), modern MacBook Pro design (2001), "air"/ultrabook form factors (2008), thunderbolt (2011), "retina" display and standard ssd (2012), usb-c (2016), M1: SoC/SiP/unified memory/ARM/asymmetric cores/neural engine/power efficiency/battery life (2020) ...and occasionally with dubious features like the touchbar and butterfly keyboard (2016).)
Looking even further back in Apple laptop history, we find interesting features like rear keyboard placement (1991), 4 pound laptop with dock for desktop use (1992), and trackpad (1994). Apple's eMate 300 (1997) was a Newton laptop rather than a Mac, but it had an ARM processor, flash storage, and 20+ hour battery life, making it something of an ancestor to the Mac M1.
Once the ARM and battery-life shift occurs with Linux and Windows, they (i.e. Apple) will be on the front foot again with something new; that's the beauty of competition.
>The things that Apple provides are not the headline stats that matter for a tool-user, they're luxury properties that don't actually matter to most people.
Here lies the rub, ARE those the stats that matter? Or does the screen, touchpad, speakers, battery life, software, support services, etc. matter more?
I feel people just TOTALLY gloss over the fact that Apple is crushing the competition in terms of trackpads + speakers + battery life, which are hardly irrelevant parts of most people's computing experience. Many people hardly use their computers to compute - they mostly use them to input and display information. For such users, memory capacity and processing performance ARE frills, and Apple is a market leader where it's delivering value.
Also even in compute, apple is selling computers with a 512-bit or 1024-bit LPDDR5x bus for a lower price than you can get from the competition. Apple is also frequently leading the pack in terms of compute/watt. This has more niche appeal, but I've seen people buy Apple to run LLM inferencing 24/7 while the Mac Studio sips power.
Lenovo ThinkPad P14s (T14) Gen 4, 7840U, $1300: 2.8K 400-nit P3 OLED, 64 GB RAM, 1 TB SSD; keyboard excellent, speakers shitty (using Sony WH-1000XM4); battery (52.5 Wh) life not good, not bad - the OLED screen draws a huge amount of power; weight ~3 lb.
This spec costs 2k euro in NL. A fully specced Air (15 inch) is 2.5k euro, with arguably better everything except RAM, and is completely silent. Doesn't look that much different to me in terms of price.
Also, those things aren't even true about Apple devices. Apple fanboys have been convinced that their hardware really is way better than everything else for decades. It has never been true and still isn't.
Clean os install? You haven't used windows in a while have you?
I'm a Linux guy but am forced to use Macs and Windows every now and then.
Windows has outpaced macOS for a decade straight.
macOS looks like it hasn't been updated in years. It's constantly bugging me for passwords for random things. It is objectively the worst OS. I'd rather work on a Chromebook.
I think he has different criteria on what bothers him; that's okay though, isn't it? I get a little annoyed at anything where I have to use a touchpad, not enough to rant about it, but it definitely increases friction (haha) in my thought process.
What metrics are you using for build quality? Admittedly I don't know a ton of mac people (I'm an engineer working in manufacturing) but the mac people I know, stuff always breaks, but they're bragging about how apple took care of it for free.
My Acer Aspire lasted me for tens of thousands of hours of use and abuse by small children over 6 years until I replaced it this year because I finally felt like I wanted more power. That's the Toyota Camry of laptops.
The features that Apple adds on top of that are strictly optional. You can very much prefer them and think that they're essential, but that doesn't make it so. Some people feel that way about leather seats.
tl;dr is that Walmart is also selling an Acer for $359 that beats that device on every headline metric.
It's nice to know that I could get the old-gen model for slightly cheaper, but that's still an outrageous price if the MacBook Air isn't to be considered a luxury item.
My last Acer lasted me six years until I decided to replace it for more power (which, notably, I would have done with a MacBook by then too). They're not as well built as a MacBook, but they're well built enough for the average laptop turnover rate.
If it were actually bad value they wouldn't sell in the volumes they do and review with as much consumer satisfaction as they do.
These products may not offer you much value and you don't have to buy them. Clearly plenty of people and institutions bought them because they believed they offered the best value to them.
If people were actually rational that might be true, but they aren't. Apple survives entirely on the fact that they have convinced people they are cool, not because they actually provide good value.
Agreed. I'd definitely make the same arguments here as I would for an Audi. There's clearly a market, and that means they're not a bad value for a certain type of person.
Yes, but having all three of those things (well, specs/performance is probably just one thing, but treating them as separate as you did means that I don't have to do the heavy lifting of figuring out what a third thing would actually be) IS, in fact, a luxury.
Nobody is away from a power source for longer than 18 hours. MOST people don't need the performance that a macbook air has, their NEEDS would be met by a raspberry pi... that is, basic finances, logging into various services, online banking, things that first world citizens "rely" on.
The definition of luxury is "great comfort and extravagance", and every current Apple product fits that definition. Past Apple definitely had non-luxury products, as recently as the iPhone 5C (discontinued 10 years ago)... but Apple has eliminated all low-value options from their lineup.
When you're breaking out SSD speeds you're definitely getting into the "luxury" territory.
As I said in another comment:
The point isn't that the MacBook Air isn't better by some metrics than PC laptops. A Rolls-Royce is "better" by certain metrics than a Toyota, too. What makes a device luxury is if it costs substantially more than competing products that the average person would consider a valid replacement.
They're average. A 512GB M3 MBA gets like 3000MBps for read/write. A 1TB Samsung 990 Pro, which costs less than the upgrade from 256GB to 512GB on the Air is over twice as fast. And on base models Apple skimps and speeds are slower.
Good question. I think the answer is that even at thousands of dollars, a Windows device's battery can't hit 18-hour specs. Can someone name a Windows device, even at $2k+, that acts like an M chip? In fact the pricier Windows laptops usually mean a discrete GPU, and those have worse battery than cheap Windows machines (my 4090 is an hour or so off charge).
I am all in on Apple, to be clear. Mac Pros, multiple MBPs, Studio, Pro Display XDR, multiple Watches, phones, iPad Pro.
My experiences (multiple) with Genius Bar have been decidedly more "meh" to outright frustrating, versus "luxury", oftentimes where I know more than the Genius.
Logic Board issues where on a brand new macOS install I could reproducibly cause a kernel panic around graphics hardware. There was an open recall (finally, after waiting MONTHS) on this. It covered my Mac. But because it passed their diagnostic tool, they would only offer to replace the board on a time and materials basis.
I had a screen delamination issue. "It's not that bad - you can't see it when the screen is on, and you have to look for it". Huh. Great "luxury" experience.
And then the multiple "we are going to price this so outrageously, and use that as an excuse to try to upsell". Like the MBA that wouldn't charge due to a circuit issue. Battery fine, healthy. Laptop, fine, healthy, on AC. Just couldn't deliver current to the battery. Me, thinking sure, $300ish maybe with a little effort.
"That's going to be $899 to repair. That's only $100 less than a new MBA, maybe we should take a look at some of the new models?" Uh, no. I'm not paying $900 for a laptop that spends 99% (well, 100% now) of its life on AC power.
Is a Wendy’s burger luxury because it costs twice as much as McDonald’s?
Cost comparisons alone are stupid. And “this AMD benchmarks the same as an M2” is a useless comparison since regular people don’t buy laptops for raw compute power.
Really? You can find a laptop with the equivalent of Apple Silicon for $300 to $500? And while I haven't used Windows in ages, I doubt it runs as well with 8 GB as macOS does.
The point isn't that the MacBook Air isn't better by some metrics than PC laptops. A Rolls-Royce is "better" by certain metrics than a Toyota, too. What makes a device luxury is if it costs substantially more than competing products that the average person would consider a valid replacement.
> the average person would consider a valid replacement
But what is that, exactly? If you look at all aspects of a laptop: CPU, RAM, SSD, battery life, screen quality, build quality, touchpad, OS, and put them in order of importance for the average consumer, what would be on top? I don't think it's the tech specs.
For instance, I would be willing to bet that for a large number of consumers, battery life is far more important than the tech specs, which means that a valid replacement for their MacBook must have equivalent battery life. You also have to consider things like the expected lifespan of the laptop and its resale value to properly compare their costs. It's not simple.
Curious what criteria you're using for qualifying luxury. It seems to me that materials, software, and design are all on par with other more expensive Apple products. The main difference is the chipset, which I would argue is on an equal quality level as the Pro chips but designed for a less power-hungry audience.
Maybe for you, but I still see sales guys who refuse to work on WinTel where basically all they do is browse the internet and do spreadsheets - so mainly just because they would not look cool compared to other sales guys rocking MacBooks.
I'm not sure what your point is. My point (which I failed at) is that Apple's incentives are changing because their growth is dependent on services and extracting fees, so they will likely do things that try to make people dependent on those services and find more ways to charge fees (to users and developers).
Providing services is arguably at odds with privacy since a service with access to all the data can provide a better service than one without so there will be a tension between trying to provide the best services, fueling their growth, and privacy.
My point was that it's interesting how we can frame a service business "extracting fees" to imply wrongdoing. When it's pretty normal for all services to charge ongoing fees for ongoing delivery.
It's about the money, it's about perverse incentives and the propensity of service businesses to get away with unfair practices. We have decent laws about your rights as a consumer when you buy stuff, but comparatively little regulation of services.
There is tons of regulation of services? Everything from fraud / false advertising to disclosure of fees to length and terms of contracts. What regulation do you think is missing?
And as someone who presumably provides services for a living, what additional regulations would you like to be subject to?
So the new iPad & M4 was just some weekend project that they shrugged and decided to toss over to their physical retail store locations to see if anyone still bought physical goods eh
I have very little faith in apple in this respect.
For clarity, just install Little Snitch on your machine and watch what happens with your system. Even without being signed in with an Apple ID and everything turned off, Apple phones home all the time.
You can block 17.0.0.0/8 (the address block Apple owns) at the router, opening up only the notification servers. CDNs are a bit harder, but can be done with dnsmasq allow/deny of wildcard domains. Apple has documentation on network traffic from their devices, https://support.apple.com/en-us/101555
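If you export the connection log from Little Snitch (or anything else), a quick sanity check is to see how much of the traffic lands in that 17.0.0.0/8 block; a small sketch using Python's standard `ipaddress` module, with made-up sample addresses for illustration:

```python
# Classify logged destination IPs by whether they fall in Apple's 17.0.0.0/8 block.
# The sample addresses below are made up for illustration.
from ipaddress import ip_address, ip_network

APPLE_NET = ip_network("17.0.0.0/8")

destinations = ["17.57.144.10", "142.250.80.46", "17.253.144.10"]

for dest in destinations:
    owner = "Apple (17.0.0.0/8)" if ip_address(dest) in APPLE_NET else "other"
    print(f"{dest}: {owner}")
```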
As a privacy professional for many, many years this is 100% correct. Apple wouldn’t be taking billions from Google for driving users to their ad tracking system, they wouldn’t give the CCP access to all Chinese user data (and maybe beyond), and they wouldn’t be on-again-off-again flirting with tailored ads in Apple News if privacy was a “human right”.
(FWIW my opinion is it is a human right, I just think Tim Cook is full of shit.)
What Apple calls privacy more often than not is just putting lipstick on the pig that is their anticompetitive walled garden.
Pretty much everybody in SV who works in privacy rolls their eyes at Apple. They talk a big game but they are as full of shit as Meta and Google - and there’s receipts to prove it thanks to this DoJ case.
Apple want to sell high end hardware. On-device computation is a better user experience, hands down.
That said, Siri is utter dogshit so on-device dogshit is just faster dogshit.
At this point call your government representatives and ask for new laws, or if you live someplace with laws, actual enforcement (looking at you EU).
The idea that user behavior or consumer choice will change any of this is basically discredited in practice. It will always be cat and mouse until the point that CEOs go to jail, then it will stop.
> The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
Indeed.
Privacy starts with architectural fundamentals that are very difficult to retrofit...
If a supplier of products has not built the products this way, it would be naive to bet bank or farm on the supplier. Even if there were profound motivation to retrofit.
Add to this the general tendency of the market to exploit its customers.
>The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
They failed with their ad-business so this is a nice pivot. I'll take it, I'm not usually a cheerleader for Apple, but I'll support anyone who can erode Google's surveillance dominance.
> The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
There are a ton of us out here that consciously choose Apple because of their position on privacy. I have to imagine they know how many customers they'll lose if they ever move on this, and I want to believe that it's a large enough percentage to prevent it from happening. Certainly my circle is not a useful sample, but the Apple people in it are almost all Apple people because of privacy.
When I was at Apple for a short time, there was a small joke I hear from the ex-amazonians there who would say "What's the difference between an Apple software engineer and an Amazon software engineer? The Amazon engineer will spin up a new service on AWS. An Apple engineer will spin up a new app". Or something along those lines. I forget the exact phrasing. It was a joke that Apple's expertise is in on-device features, whereas Amazon thrives in the cloud services world.
Every company is selling one thing or another, and nothing is going to last forever. I really fail to see what, except for generic negativity, your comment adds to anything.
That is not where Apple's growth has been for quite some time; it's services. And because of that I'll be awaiting the rent-extraction strategy to arrive at any moment.
Nothing is true forever. Google wasn’t evil forever, Apple won’t value privacy forever.
Until we figure out how to have guarantees of forever, the best we can realistically do is evaluate companies and their products by their behavior now weighted by their behavior in the past.
As soon as the privacy thing goes away, I'd say a major part of their customer base goes away too. Most people use Apple so they don't get "hacked"; if Apple is doing the hacking, I'd just buy a cheaper alternative.
Maybe true for a lot of the HN population, but my teenagers are mortified by the idea of me giving them android phones because then they would be the pariahs turning group messages from blue to green.
And just to elaborate on this: it's not just snobbery about the color of the texts, for people who rely on iMessage as their primary communication platform it really is a severely degraded experience texting with someone who uses Android. We Android users have long since adapted to it by just avoiding SMS/MMS in favor of other platforms, but iPhone users are accustomed to just being able to send a video in iMessage and have it be decent quality when viewed.
Source: I'm an Android user with a lot of iPhones on my in-laws side.
I'm in Europe and everyone uses WhatsApp, and while Android does have a higher share over here, iPhone still dominates the younger demographics. I'm not denying blue/green is a factor in the US, but it's not even a thing here. It's nowhere near the only, or even a dominant, reason iPhones are successful with young people.
Interesting that some people would take that as an Apple problem and others would take it as a Google problem
Who’s at fault for not having built-in messaging that works with rich text, photos, videos, etc?
Google has abandoned more messaging products than I can remember while Apple focused on literally the main function of a phone in the 21st century. And they get shit for it
Apple get shit for it because they made it a proprietary protocol for which clients are not available on anything except their own hardware. The whole point of messaging is that it should work with all my contacts, not just those who drank the Apple-flavored Kool-Aid.
Google’s protocol is proprietary too - their encryption extension makes it inaccessible for anyone else and google will not partner or license (companies have tried).
RCS as currently implemented is iMessage but with a coat of google paint. There is no there there.
Google should get plenty of shit too for closing down GTalk in the first place. It's not an either-or. Big tech in general hates open protocols and interoperability for consumer stuff; Apple is just the most egregious offender there.
My take is that it's like a fashion accessory. People buy Gucci for the brand, not the material or comfort.
Rich people ask for the latest most expensive iPhone even if they're only going to use WhatsApp and Instagram on it. It's not because of privacy or functionality, it's simply to show off to everyone they can purchase it. Also to not stand out within their peers as the only one without it.
As another commenter said: it's not an argument, it's a fact here.
I have an iPhone so I guess I qualify as a rich person by your definition. I am also a software engineer. I cannot state enough how bogus that statement is. I've used both iPhone and Android, and recent flagships. iPhone is by far the easiest one to use. Speaking in more objective terms, iPhones have a coherent UI which maintains its consistency both throughout the OS and over the years. They're the most dumbed down phones and easiest to understand. I recommend iPhone to all my friends and relatives.
There's obviously tons of people who see iPhone as a status item. They're right, because iPhone is expensive and only the rich can buy them. This doesn't mean iPhone is not the best option out there for a person who doesn't want to extensively customize his phone and just use it.
Yes, by pure statistics you are probably rich compared to everyone else. The average software developer salary is way bigger than the average salary for the entirety of the US. Let's not even mention compared to the rest of the world.
Sure, some people pick up the iPhone because they like the specs, or the apps, or whatever else. That's why I said the majority picks it up for status, not all. But keep in mind nobody's judging the iPhone's specs or capabilities here. We're talking about why people buy it.
Ask any teenager why they want an iPhone. I'd be very surprised if even one said it's because of privacy. It's because of the stupid blue bubble, which is a proxy for status.
I'm pretty sure if Apple released the same phone again with a new name and design, people would still buy it. For the majority, it's not because of features, ease of use, specs, etc: it's status.
> iPhone and Android, and recent flagships. iPhone is by far the easiest one to use. Speaking in more objective terms, iPhones have a coherent UI
It's not about whether you've used Android, it's about whether you've been poor-ish or stingy.
To some people those are luxuries- the most expensive phone they buy is a mid-range Motorola for $300 with snapdragon 750g or whatever. They run all the same apps after all, they take photos.
It's not an argument, just ask why people lust after the latest iPhones in poor countries. They do it because they see rich people owning them. Unless you experience that, you won't really understand it.
The cheapest point of entry is absolutely not comparable. The cheapest new iPhone on apple.com is $429. The cheapest new Samsung on samsung.com is $199 (They do have a phone listed for $159, but it's button says "Notify Me").
Granted, you may have been leaning very heavily on the dictionary definition of "comparable", in that the two numbers are able to be compared. However, when the conclusion of that comparison is "More than twice the price", I think you should lead with that.
Keep in mind, the iPhone SE is using a 3 year old processor, the Samsung A15 was released 5 months ago with a brand new processor.
According to various sites, the Mediatek Dimensity 6100+ is a 6nm update to a core that was released 3 years ago (Dimensity 700 on 7nm). It's 5-10% faster, likely due to the update from 7 to 6nm, as the cores are the same and run at the same speed. It contains an updated Bluetooth chipset (from 5.1 to 5.2) and supports a larger max camera. The camera on the A15 is well below the max size of the previous chipset; however, the increased camera bandwidth should ensure that the camera feels snappier (a common complaint on low-end phones). The process improvement should increase efficiency as well, but there are no benchmarks able to test this.
It's fashion and the kids are hip. But there is an endless void of Apple haters here who want to see it burn. They have nothing in common with 99.9% of the customer base.
I was thinking about this for a while, the problem is not about apple, it’s the fact that the rest of the industry is gutless, and has zero vision or leadership. Whatever Apple does, the rest of the industry will follow or oppose - but will be defined by it.
It’s like how people who don’t like US and want nothing to do with US still discuss US politics, because it has so much effect everywhere.
(Ironically, not enough people discuss China with any coherent level of understanding.)
You're absolutely right, I'm so glad that Apple was the first company to release a phone with a touch screen, or a phone with an app store, or a smart watch or a VR headset.
Apple doesn't release new products, they wait until the actual brave and innovating companies have done the exploration and then capitalize on all of their learnings. Because they are never the first movers and they have mountains of cash, they're able to enter the market without the baggage of early adopters. They don't have to worry about maintaining their early prototypes.
Apple doesn't innovate or show leadership, they wait until the innovators have proven that the market is big enough to handle Apple, then they swoop in with a product that combines the visions of the companies that were competing.
Apple is great at what they do, don't get me wrong. And swooping in when the market is right is just good business. Just don't mistake that for innovation or leadership.
This is a prejudiced take. Running AI tasks locally on the device definitely is a giant improvement for the user experience.
But not only that, Apple CPUs are objectively leagues ahead of their competition in the mobile space. I am still using an iPhone released in 2020 with absolutely no appreciable slowdown or loss in perceived performance. Because even a 4-year-old iPhone still has specs that don't lag far behind the equivalent Android phones, I still receive the latest OS updates, and because, frankly, Android OS is a mess.
If I cared about status, I would have changed my phone already for a new one.
> I am still using an iPhone released in 2020 with absolutely no appreciable slowdown or loss in perceived performance.
My Pixel 4a here is also going strong, only the battery is slowly getting worse. I mean, it's 2024, do phones really still get slow? The 4a is now past android updates, but that was promised after 3 years. But at 350 bucks, it was like 40% less than the cheapest iPhone mini at that time.
Apple says it made these changes for other reasons, honestly, truly. And if it happened to have the same effect, then that was unfortunate, and unintended.
Only Apple really knows. But there was a slew of changes and reversals following the drama. "Oh, we'll implement notifications now", "Oh, we'll change the peak performance behavior", and "we will change and add additional diagnostics to make sure issues are battery related" certainly has a feel for a bunch of ex post facto rationalization of several things that seem, to me, that if it was truly a battery thing all along, would have been functional requirements.
>Apple CPUs are objectively leagues ahead of their competition in the mobile space
This is a lie. The latest Android SoCs are just as powerful as the A series.
>Because even a 4 years old IPhone still has specs that don't lag behind by much the equivalent Android phones, I still receive the latest OS updates, and because frankly, Android OS is mess.
Samsung and Google offer 7 years of OS and security updates. I believe that beats the Apple policy.
The last iPads to stop getting OS updates (including security, to be consistent with what Samsung and Google are pledging) got 7 and 9 years of updates each (5th gen iPad and 1st gen iPad Pro). The last iPhones to lose support got about 7 years each (iPhone 8 and X). 6S, SE (1st), and 7 got 9 and 8 years of OS support with security updates. The 5S (released in 2013) last got a security update in early 2023, so also about 9 years, the 6 (2014) ended at the same time so let's call it 8 years. The 4S, 2011, got 8 years of OS support. 5 and 5C got 7 and 6 years of support (5C was 5 in a new case, so was always going to get a year less in support).
Apple has not, that I've seen at least, ever established a long term support policy on iPhones and iPads, but the numbers show they're doing at least as well as what Samsung and Google are promising to do, but have not yet done. And they've been doing this for more than a decade now.
EDIT:
Reworked the iOS numbers a bit, down to the month (I was looking at years above and rounding, so this is more accurate). iOS support time by device for devices that cannot use the current iOS 17 (so the XS and above are not counted here) in months:
The average is 72.5 months, just over 6 years. If we knock out the first 2 phones (both have somewhat justifiable short support periods, massive hardware changes between each and their successor) the average jumps to just shy of 79 months, or about 6.5 years.
The 8 and X look like regressions, but their last updates were just 2 months ago (March 21, 2024) so still a good chance their support period will increase and exceed the 7 year mark like every model since the 5S. We'll have to see if they get any more updates in November 2024 or later to see if they can hit the 7 year mark.
>The last iPads to stop getting OS updates (including security, to be consistent with what Samsung and Google are pledging) got 7 and 9 years of updates each (5th gen iPad and 1st gen iPad Pro). The last iPhones to lose support got about 7 years each (iPhone 8 and X). 6S, SE (1st), and 7 got 9 and 8 years of OS support with security updates. The 5S (released in 2013) last got a security update in early 2023, so also about 9 years, the 6 (2014) ended at the same time so let's call it 8 years. The 4S, 2011, got 8 years of OS support. 5 and 5C got 7 and 6 years of support (5C was 5 in a new case, so was always going to get a year less in support).
These are very disingenuous numbers that don't tell the complete story. An iPhone 7 getting a single critical security patch does not take into account the hundreds of security patches it did not receive when it stopped receiving support. It received that special update because Apple likely was told or discovered it was being exploited in the wild.
Google and Samsung now offer 7 years of OS upgrades and 84 months of full security patches. Selectively patching a phone that is out of the support window with a single security patch does not automatically increase its EOL support date.
I look forward to these vendors delivering on their promises, and I look forward to Apple perhaps formalizing a promise with less variability for future products.
Neither of these hopes retroactively invalidates the fact that Apple has had a much better track record of supporting old phone models up to this point. Even if you do split hairs about the level of patching some models got in their later years, they still got full iOS updates for years longer than most Android phones got any patches at all, regardless of severity.
This is not an argument that somehow puts Android on top, at best it adds nuance to just how much better iOS support has been up to this point.
Let's also not forget that if Apple wasn't putting this kind of pressure on Google, they wouldn't have even made the promise to begin with, because it's clear how long they actually care to support products with no outside pressure.
I agree. This is the type of competition I like to see between these two companies. In the end the consumer wins regardless of which one you buy. Google has also promised 10 years of Chromebook support, so they've clearly got the message on the importance of supporting hardware much longer than a lot of people would use them for.
They made that pledge for the Pixel 8 (2023). Let's revisit this in 2030 and see what the nature of their support is at that point and how it compares to Apple's support for iPhone devices. We can't make a real comparison since they haven't done anything yet, only made promises.
What we can do today is note that Apple never made a promise, but did provide very long security support for their devices despite that. They've already met or come close to the Samsung/Google pledge (for one device) on almost half their devices, and those are all the recent ones (so it's not a downward trend of good support then bad support, but rather mediocre/bad support to improving and increasingly good support).
Another fun one:
iPhone XS was released in September 2018, it is on the current iOS 17 release. In the absolute worst case of it losing iOS 18 support in September, it will have received 6 full years of support in both security and OS updates. It'll still hit 7 years (comfortably) of security updates. If it does get iOS 18 support in September, then Apple will hit the Samsung/Google pledge 5 years before Samsung/Google can even demonstrate their ability to follow through (Samsung has a chance, but Google has no history of commitment).
I have time to kill before training for a century ride:
Let's ignore everything before iPhone 4S, they had short support periods that's just a fact and hardly worth investigating. This is an analysis of devices released in 2011 and later, when the phones had, mostly, matured as a device so we should be expecting longer support periods. These are the support periods when the phones were able to run the still-current iOS versions, not counting later security updates or minor updates but after the major iOS version had been deprecated. As an example, for the iPhone 4S it had support from 2011-2016. In 2016 its OS, iOS 9, was replaced by iOS 10. Here are the numbers:
4S - 5 years
5 - 5 years
5C - 4 years (decreased, 5 hardware but released a year later in a different case)
5S - 6 years
6 - 5 years (decreased, not sure why)
6S - 7 years (hey, Apple did it! 2015 release, lost iOS upgrades in 2022)
SE(1st) - 5 years (like 5C, 6S hardware but released later)
7 - 6 years (decreased over 6S, not sure why)
8 - 6 years
X - 6 years
The 6S is a bit of an outlier, hitting 7 years of full support running the current iOS. 5C and SE(1st) both got less total support, but their internals were the same as prior phones and they lost support at the same time as them (this is reasonable, if annoying, and does drag down the average). So Apple has clearly trended towards 6 years of full support, the XS (as noted above) will get at least 6 years of support as of this coming September. We'll have to see if they can get it past the 7 year mark, I know they haven't promised anything but the trend suggests they can.
Sure. They also pledged to support Chromebooks for 10 years. My point being is that I don't think they'll be clawing back their new hardware support windows anytime soon. Their data indicates that these devices were used well beyond their initial support window metrics so it was in their, and their users, best interest to keep them updated as long as they possibly could. 3 years of OS updates and 4 years of security updates was always the weak link in their commitment to security. And this applies to all of their devices including the A series - something I don't see other Android OEM's even matching.
BTW, my daily driver is an iPhone 13 and I was coming from an iPhone X. So I'm well aware of the incredible support Apple provides its phones. Although, I would still like to see an 8+ year promise from them.
The vast majority of people don’t. They buy because the ecosystem works. Not sure how I get status from a phone that nobody knows I have. I don’t wear it on a chain.
Apple only pivoted into the "privacy" branding relatively recently [1] and I don't think that many people came for that reason alone. In any case, most are now trapped in the walled garden and the effort to escape is likely too big. And there's no escape anyway, since Google will always make Android worse in that regard…
[1] in 2013 they even marketed their "iBeacon" technology as a way for retail stores to monitor and track their customers which…
Circa 2013 was the release of the Nexus 5, arguably the first really usable Android smartphone.
Privacy wasn’t really a concern because most people didn’t have the privacy eroding device yet. In the years following the Nexus 5 is where smartphones went into geometric growth and the slow realization of the privacy nightmare became apparent
Imho I was really excited to get a Nexus 4 at the time, just a few short years later the shine wore off and I was horrified at the smartphone enabled future. And I have a 40 year background in computers and understand them better than 99 out of 100 users – if I didn’t see it, I can’t blame them either
Define usable. Imho before Nexus 4 everything was crap, Nexus 4 barely was enough (4x1.4 GHz), Nexus 5 (4x2.2GHz) plus software at the time (post-kitkat) was when it was really ready for mainstream
I'd say from my experience the average Apple user cares less about privacy than the general public. It's a status symbol first and foremost; 99% of what people do on their phones is basically identical on both platforms at this point.
I think it will be a winning strategy. Lag is a real killer for LLMs.
I think they'll have another LLM on a server (maybe a deal for openai/gemini) that the one on the device can use like ChatGPT uses plugins.
But on device Apple have a gigantic advantage. Rabbit and Humane are good ideas humbled by shitty hardware that runs out of battery, gets too hot, has to connect to the internet to do literally anything.
Apple is in a brilliant position to solve all those things.
I run a few models (eg Llama3:8b) on my 2023 MacBook Air, and there is still a fair bit of lag and delay, compared to a hosted (and much larger) model like Gemini. A large source of the lag is the initial loading of the model into RAM. Which an iPhone will surely suffer from.
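You can see the load-vs-generate split directly if you query the local server; a sketch assuming something like Ollama's `/api/generate` endpoint on its default port (the timing fields shown are the ones its documented response reports, in nanoseconds):

```python
# Time a single request to a locally served model and split out model-load
# time from token generation time (assumes an Ollama server on localhost).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3:8b", "prompt": "Say hello in one word.", "stream": False},
    timeout=300,
)
data = resp.json()

ns = 1e9  # durations are reported in nanoseconds
print(f"load_duration:  {data.get('load_duration', 0) / ns:.2f} s  # model load into RAM")
print(f"eval_duration:  {data.get('eval_duration', 0) / ns:.2f} s  # token generation")
print(f"total_duration: {data.get('total_duration', 0) / ns:.2f} s")
```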
Humane had lag and they used voice chat, which is a bad UX paradigm. VUI is bad because it adds lag to the information within the medium. Listening to preambles and lists is always slower than the human eye's ability to scan a page of text. Their lag is not due to LLMs, which can be much faster than whatever they did.
We should remind ourselves that an iPhone can likely suffer similar battery and heat issues - especially if it’s running models locally.
Humane's lag feels down to just bad software design too. It almost feels like a two-stage thing is happening: it sends your voice or transcription up to the cloud, figures out where it needs to go to get it done, tells the device to tell you it's about to do that, then finally does it. E.g.
User: "What is this thing?"
Pin: "I'll have a look what that is" (It feels this response has to come from a server)
Pin: "It's a <answer>" (The actual answer)
We're still a bit away from an iPhone running anything viable locally; even with today's small models you can almost feel the chip creaking under the load, and the whole phone begins to choke.
I'm curious to hear more about this. My experience has been that inference speeds are the #1 cause of delay by orders of magnitude, and I'd assume those won't go down substantially on edge devices because the cloud will be getting faster at approximately the same rate.
Have people outside the US benchmarked OpenAI's response times and found network lag to be a substantial contributor to slowness?
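One way to measure it yourself: stream the response and compare time-to-first-chunk (which includes the network round trip and queuing) with total generation time. A rough sketch against an OpenAI-style streaming endpoint; the model name is just a placeholder and an API key is assumed in `OPENAI_API_KEY`:

```python
# Measure time-to-first-chunk vs. total time for a streamed chat completion.
import os
import time

import requests

start = time.monotonic()
first_chunk_at = None

with requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-4o-mini",  # placeholder model name
        "messages": [{"role": "user", "content": "Say hello."}],
        "stream": True,
    },
    stream=True,
    timeout=60,
) as resp:
    for line in resp.iter_lines():
        if line and first_chunk_at is None:
            first_chunk_at = time.monotonic()  # first streamed chunk arrived

total = time.monotonic() - start
if first_chunk_at is not None:
    print(f"time to first chunk: {first_chunk_at - start:.2f} s")
print(f"total time:          {total:.2f} s")
```

Running this from different regions (or over cellular vs. fixed broadband) would show how much of the perceived lag is network versus inference.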
Or at least, a good enough internet connection to send plaintext.
> * when you live in the USA
Even from Australia to USA is just ~300ms of latency for first token and then the whole thing can finish in ~1s. And making that faster doesn't require on-device deployment, it just requires a server in Australia - which is obviously going to be coming if it hasn't already for many providers.
There really isn't enough emphasis on the downsides of server side platforms.
So many of these are only deployed in US and so if you're say in country Australia not only do you have all your traffic going to the US but it will be via slow and intermittent cellular connections.
It makes using services like LLMs unusably slow.
I miss the 90s and having applications and data reside locally.
I wonder if BYOE (bring your own electricity) also plays a part in their long-term vision? Data centres are expensive in terms of hardware, staffing and energy. Externalising this cost to customers saves money, but also helps to paint a green(washing) narrative. It's more meaningful to more people to say they've cut their energy consumption by x than to say they have a better server obsolescence strategy, for example.
Apple has committed that all of its products will be carbon-neutral - including emissions from charging during their lifetime - by 2030. The Apple Watch is already there.
> "Apple defines high-quality credits as those from projects that are real, additional, measurable, and quantified, with systems in place to avoid double-counting, and that ensure permanence."
Apple then pledged to buy carbon credits from a company called Verra. In 2023, an investigation found that more than 90% of Verra's carbon credits are a sham. Notably, Apple made their pledge after the results of this investigation were known - so much for their greenwashing.
I wish we had gotten a resolution on whether that report from the Guardian was correct. Regardless, Apple says that credits make up only about 25% of their reduced emissions.
That is an interesting angle to look at it from. If they're gonna keep pushing this they end up with a strong incentive to make the iPhone even more energy efficient, since users have come to expect good/always improving battery life.
At the end of the day, AI workloads in the cloud will always be a lot more compute-efficient, however, meaning a lower combined footprint. On the other hand, in the server-based model there is more incentive to pre-compute (waste inference on) things to make them appear snappy on device. Analogous would be all that energy spent doing video encoding for YouTube videos that never get watched. Although that's "idle" resources for budgeting purposes.
I’m not sure it’s that (benched pointed out their carbon commitment) so much as simple logistics.
Apple doesn’t have to build the data centers. Apple doesn’t have to buy the AI capacity themselves (even if from TSMC for Apple designed chips). Apple doesn’t have to have the personnel for the data centers or the air conditioning. They don’t have to pay for all the network bandwidth.
There are benefits to the user to having the AI run on their own devices in terms of privacy and latency as mentioned by the GP.
But there are also benefits to Apple simply because it means it’s no longer their resources being used up above and beyond electricity.
I keep reading about companies having trouble getting GPUs from the cloud providers and that some crypto networks have pivoted to selling GPU access for AI work as crypto profits fall.
Apple doesn’t have to deal with any of that. They have underused silicon sitting out there ready to light up to make their customers happy (and perhaps interested in buying a faster device).
I agree with everything you said but the TSMC bit. They are quite literally competing with Nvidia et al. for fab space for customers' chips. Sure, they get the AI bits built in to existing products, but surely those chips are bigger/more expensive to manufacture and commit from TSMC because of it.
I was trying to say that there was still a cost to using their own chips for server AI because they still had to pay to have them made so they weren’t “free” because they’re Apple products as opposed to buying nVidia parts.
You’re right, there is a cost to them to put the AI stuff on end user chips too since die space isn’t free and extra circuits mean fewer chips fit per wafer.
If you offload data processing to the end user, then your data center uses less energy on paper. The washing part is that work is still being done and spending energy, just outside of the data center.
Which honestly is still good for the environment to have the work distributed across the entire electricity grid.
That work needs to be done anyways and Apple is doing it in the cleanest way possible.
What’s an alternative in your mind, just don’t do the processing? That sounds like making progress towards being green. If you’re making claims of green washing you need to be able to back it up with what alternative would actually be “green”.
I didn't make any claims, I just explained what the parent was saying. There could be multiple ways to make it more green: one being not doing the processing, or another perhaps just optimizing the work being done. But actually, no, you don't need a viable way to be green in order to call greenwashing "greenwashing." It can just be greenwashing, with no alternative that is actually green.
> Which honestly is still good for the environment to have the work distributed across the entire electricity grid.
This doesn't make any sense.
> If you’re making claims of green washing you need to be able to back it up with what alternative would actually be “green”.
Sometimes there isn't an alternative. In which case you don't get to look green, sorry. The person critiquing greenwashing doesn't need to give an alternative, why would that be their job? They're just evaluating whether it's real or fake.
Though in this case using renewable energy can help.
> Sometimes there isn't an alternative. In which case you don't get to look green, sorry. The person critiquing greenwashing doesn't need to give an alternative, why would that be their job? They're just evaluating whether it's real or fake.
Baselessly calling every greening and sustainability effort “greenwashing”, especially when there’s practically no thought put into what the alternative might be, is trite and borderline intellectually dishonest. They don’t want to have a conversation about how it could be improved, they just want to interject “haha that’s stupid, corporations are fooling all of you sheeple” from their place of moral superiority. This shit is so played out.
> Baselessly calling every greening and sustainability effort “greenwashing”, especially when there’s practically no thought put into what the alternative might be
Baseless? The foundation of this accusation is rock solid. Offloading the exact same computation to another person so your energy numbers look better is not a greening or sustainability effort.
Fake green should always be called greenwashing.
You don't need to suggest an improvement to call out something that is completely fake. The faker doesn't get to demand a "conversation".
You've seen a bunch of people be incorrectly dismissive and decided that dismissiveness is automatically wrong. It's not.
For an extreme example, imagine a company installs a "pollution-preventing boulder" at their HQ. It's very valid to call that greenwashing and walk away. Don't let them get PR for nothing. If they were actually trying, and made a mistake, suggest a fix. But you can't fix fake.
> Baseless? The foundation of this accusation is rock solid. Offloading the exact same computation to another person so your energy numbers look better is not a greening or sustainability effort.
Yes, I consider it baseless for the following reasons:
- First, consider the hardware running in data centers, and the iDevices running at the edge – the iPhones, iPads and presumably Macs. There's a massive difference in power consumption between a data center full of GPUs, and whatever the equivalent might be in iDevices. Few chips come close to Apple's M-series in power usage.
- Second, Apple's commitment to making those devices carbon neutral by 2030; I'm unaware of any commitment to make cloud compute hardware carbon neutral, but I'll admit that I don't really keep up with that kind of hardware so I could be totally wrong there.
- Third, consider that an AI compute service (I'm not sure what you call it) like OpenAI is always running and crunching numbers in its data center, while the iDevices are each individually running only when needed by the user.
- Fourth, the people who own the iDevices may charge them using more sustainable methods than would power a data center. For example, Iowa – where I live – generates 62% of its energy from wind power and nearly two-thirds of its total energy from renewable resources [1], whereas California only gets 54% of its energy from renewable resources. Of course this cuts both ways, there are plenty of states or even countries that get most of their power from coal, like Ohio.
That said, neither of us have any real numbers on these things so the best either of us can do is be optimistic or pessimistic. But I'd rather do that and have a discussion about it, instead of dismiss it out of hand like everyone else does by saying "haha dumb, get greenwashed".
You're right that improvements don't need to be suggested to have a conversation about greening/greenwashing. My irritation lies more in the fact that it's almost a trope at this point that you can click into the comments on any HN story that mentions greening/sustainability, and there will be comments calling it fake greenwashing. I don't disagree that it's easy for a company to greenwash if they want to, but it's tiring to see everything called greenwashing without applying any critical thinking. Everyone wants to be so jaded about corporations that they'll never trust a word about it.
[1] Although this two-thirds total includes the bio-fuel ethanol, so I feel like it shouldn't be included.
1. Maybe, but wouldn't Apple want to use M-series chips to do this either way?
2. That's an interesting angle.
3. It's the same total amount, and both will go idle when there's less demand.
4. I think the average data center gets cleaner energy than the average house but I can't find a proper comparison so maybe that's baseless.
Also as far as I'm aware, inference takes significantly fewer resources when you can batch it.
> but it's tiring to see everything called greenwashing without applying any critical thinking
That does sound tiring, but in this particular case I think there was sufficient critical thinking, and it was originally brought up as just a possibility.
> Which honestly is still good for the environment to have the work distributed across the entire electricity grid.
Sometimes, but parallelization has a cost. The power consumption from 400,000,000 iPhones each downloading a 2 GB LLM is not negligible, probably more than what you'd consume running it as a REST API on a remote server. Not to mention slower.
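For a sense of scale, a back-of-envelope sketch. The energy-per-GB figure is an assumption and published estimates for network transfer energy vary by an order of magnitude, so treat the output as illustrative only.

    # Back-of-envelope only; kwh_per_gb is a rough, contested assumption.
    devices = 400_000_000
    model_gb = 2
    kwh_per_gb = 0.1   # assumed network-transfer energy per GB; estimates vary widely

    total_gb = devices * model_gb            # 800,000,000 GB, i.e. ~800 PB moved
    total_gwh = total_gb * kwh_per_gb / 1e6
    print(f"{total_gb/1e6:.0f} PB transferred, ~{total_gwh:.0f} GWh under the assumed 0.1 kWh/GB")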
Yeah, it's a shame that mobile games are shit when console and PC gaming gets taken so seriously by comparison. If you want to blame that on developers and not Apple's stupid-ass policies stopping you from emulating real games, be my guest. That's a take I'd love to hear.
Keep downloadin' those ads. This is what Apple wants from you, a helpless and docile revenue stream. Think Different or stay mad.
It makes sense for desktops but not for devices with batteries. I think Apple should introduce a new device for $5-10k that has 400GB of VRAM that all Macs on the network use for ML.
If you're on battery, you don't want to do LLM inference on a laptop. Hell, you don't really want to do transcription inference for that long - but would be nice not to have to send it to a data center.
The fundamental problem with this strategy is model size. I want all my apps to be privacy first with local models, but there is no way they can share models in any kind of coherent way. Especially when good apps are going to fine tune their models. Every app is going to be 3GB+
This would be interesting but also feels a little restrictive. Maybe something like LoRa could bridge the capability gap but if a competitor then drops a much more capable model then you either have to ignore it or bring it into your app.
(assuming companies won't easily share all their models for this kind of effort)
> In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning)
I'm curious: is anyone seriously using Apple hardware to train AI models at the moment? Obviously not the big players, but I imagine it might be a viable option for AI engineers in smaller, less ambitious companies.
I like to think back to 2011 and paraphrase what people were saying:
"Is anyone seriously using gpu hardware to write nl translation software at the moment?"
"No, we should be use cheap commodity abundantly available cpus and orchestrate then behind cloud magic to write our nl translation apps"
or maybe "no we should build purpose built high performance computing hardware to write our nl translation apps"
Or perhaps in the early 70s "is anyone seriously considering personal computer hardware to ...". "no, we should just buy IBM mainframes ..."
I don't know. I'm probably super biased. I like the idea of all this training work breaking the shackles of the cloud/mainframe/servers/off-end-user-device and migrating to run on people's devices. It feels "democratic".
I remember having lunch with a speech recognition researcher who was using GPUs to train DNNs to do speech recognition in 2011. It really was thought of as niche back then. But the writing was on the wall I guess in the results they were getting.
I don't think the examples really apply, because it's more a question of being on the "cutting edge" vs personal hardware.
For example, running a local model and access to the features of a larger more capable/cloud model are two completely different features therefore there is no "no we should do x instead".
I'd imagine that a dumber local model runs and defers to a cloud model when it needs to, if the user has allowed it to go to the cloud. Apple could not compete on "our models run locally, privacy is a bankable feature" alone imo. The TikTok install base has shown us that users prefer content/features over privacy, so they'll definitely still need state-of-the-art cloud-based models to compete.
Apple are. Their “Personal Voice” feature fine tunes a voice model on device using recordings of your own voice.
An older example is the “Hey Siri” model, which is fine tuned to your specific voice.
But with regards to on device training, I don’t think anyone is seriously looking at training a model from scratch on device, that doesn’t make much sense. But taking models and fine tuning them to specific users makes a whole ton of sense, and an obvious approach to producing “personal” AI assistants.
They already do some “simple” training on device. The example I can think of is photo recognition in the photo library. It likely builds on something else, but being able to identify which face is your grandma versus your neighbor is not done in Apple's cloud. It's done when your devices are idle and plugged into power.
A few years ago it wasn't shared between devices, so each device had to do it itself. I don't know if it's shared at this point.
I agree you're not going to be training an LLM or anything. But smaller tasks, limited in scope, may prove a good fit.
Not really (I work on AI/ML Infrastructure at a well known tech company and talk regularly w/ our peer companies).
That said, inference on Apple products is a different story. There's definitely interest in inference on the edge. So far though, nearly everyone is still opting for inference in the cloud for a few reasons:
1. There's a lot of extra work involved in getting ML/AI models ready for mobile inference. And this work is different for iOS vs. Android
2. You're limited on which exact device models will run the thing optimally. Most of your customers won't necessarily have that. So you need some kind of fallback.
3. You're limited on what kind of models you can actually run. You have way more flexibility running inference in the cloud.
A cloud solution I looked at a few years ago could be replicated (poorly) in your browser today. In my mind the question has become one of determining when my model is useful enough to detach from the cloud, not whether that should happen.
Mobile can be more efficient. But you're making big tradeoffs. You are very limited in what you can actually run on-device. And ultimately you're also screwing over your user's battery life, etc.
PyTorch actually has surprisingly good support for Apple Silicon. Occasionally an operation needs to fall back to the CPU, but many applications are able to run inference entirely on the GPU cores.
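A minimal sketch of what that looks like in practice, assuming a recent PyTorch build with the MPS backend (setting PYTORCH_ENABLE_MPS_FALLBACK=1 in the environment lets unsupported ops fall back to the CPU instead of erroring):

    # Sketch: pick the MPS (Metal) device when available, otherwise the CPU.
    import torch

    device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")
    x = torch.randn(1, 3, 224, 224, device=device)
    model = torch.nn.Conv2d(3, 16, kernel_size=3).to(device)
    with torch.no_grad():
        y = model(x)
    print(device, y.shape)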
I’ve found it to be pretty terrible compared to CUDA, especially with Huggingface transformers. There’s no technical reason why it has to be terrible there though. Apple should fix that.
MLX will probably be even faster than that, if the model is already ported. Faster startup time too. That’s my main pet peeve though: there’s no technical reason why PyTorch couldn’t be just as good. It’s just underfunding and neglect
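For comparison, the MLX route looks roughly like this. This assumes the mlx-lm package and an already-converted ("ported") model from the mlx-community hub; the repo name here is a placeholder, not a recommendation.

    # Sketch: run an already-ported model via mlx-lm on Apple Silicon.
    from mlx_lm import load, generate

    model, tokenizer = load("mlx-community/Meta-Llama-3-8B-Instruct-4bit")  # placeholder repo
    text = generate(model, tokenizer,
                    prompt="Why does unified memory help local inference?",
                    max_tokens=64)
    print(text)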
Yes, it can be more cost effective for smaller businesses to do all their work on Mac Studios, versus having a dedicated Nvidia rig plus Apple or Linux hardware for your workstation.
Honestly, you can train basic models just fine on M-Series Max MacBook Pros.
A non-decked out Mac Studio is a hell of a machine for $1999.
Do you also compare cars by looking at only the super expensive limited editions, with every single option box ticked?
I'd also point out that said 3 year old $1999 Mac Studio that I'm typing this on already runs ML models usefully, maybe 40-50% of the old 3000-series Nvidia machine it replaces, while using literally less than 10% of the power and making a tiny tiny fraction of the noise.
For training, the Macs do have some interesting advantages due to the unified memory. The GPU cores have access to all of system RAM, and the system RAM is ridiculously fast: 400GB/sec when DDR4 is barely 30GB/sec. That has a lot of little fringe benefits of its own, and is part of why the Studio feels like an even more powerful machine than it actually is. It's just super snappy and responsive, even under heavy load.
The largest consumer NVidia card has 22GB of usable RAM.
The $1999 Mac has 32GB, and for $400 more you get 64GB.
$3200 gets you 96GB, and more GPU cores. You can hit the system max of 192GB for $5500 on an Ultra, albeit with the lesser GPU.
Even the recently announced 6000-series AI-oriented NVidia cards max out at 48GB.
My understanding is a that a lot of enthusiasts are using Macs for training because for certain things having more RAM is just enabling.
The huge amount of optimizations available on Nvidia and not available on Apple makes the reduced VRAM worth it, because even the most bloated of foundation models will have some magical 0.1-bit quantization technique invented by a turbo-nerd that only works on Nvidia.
I keep hearing this meme of Macs being a big deal in LLM training, but I have seen zero evidence of it, and I am deeply immersed in the world of LLM training, including training from scratch.
Stop trying to meme Apple M chips as AI accelerators. I'll believe it when unsloth starts to support a single non-Nvidia chip.
Yeah, and I think people forget all the time that inference (usually batch_size=1) is memory bandwidth bound, but training (usually batch_size=large) is usually compute bound. And people use enormous batch sizes for training.
And while the Mac Studio has a lot of memory bandwidth compared to most desktop CPUs, it isn't comparable to consumer GPUs (the 3090 has a bandwidth of ~936GBps), let alone those with HBM.
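A rough upper-bound estimate of why bandwidth matters for batch-size-1 decoding: every generated token has to stream (roughly) all of the weights through memory once. This ignores the KV cache and kernel overheads, and the 4-bit weight size is an assumption, so it's only a ceiling.

    # Back-of-envelope: bandwidth-bound token rate for single-stream decoding.
    params_b = 8e9          # 8B-parameter model
    bytes_per_param = 0.5   # ~4-bit quantization (assumption)
    weight_bytes = params_b * bytes_per_param

    for name, gbps in [("M2 Max (~400 GB/s)", 400e9), ("RTX 3090 (~936 GB/s)", 936e9)]:
        print(f"{name}: upper bound ~{gbps / weight_bytes:.0f} tokens/s")

Training with large batches amortizes that weight traffic over many samples, which is exactly why it flips from bandwidth-bound to compute-bound.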
I really don't hear about anyone training on anything besides NVIDIA GPUs. There are too many useful features like mixed-precision training, and don't even get me started on software issues.
Don't attack me, I'm not disagreeing with you that an nVidia GPU is far superior at that price point.
I simply want to point out that these folks don't really care about that. They want a Mac for more reasons than "performance per watt/dollar" and if it's "good enough", they'll pay that Apple tax.
Yes, yes, I know, it's frustrating and they could get better Linux + GPU goodness with an nVidia PC running Ubuntu/Arch/Debian, but macOS is painless for the average scientist doing AI/ML training to set up and work with. There are also known enterprise OS management solutions that business folks will happily sign off on.
Also, $7000 is chump change in the land of "can I get this AI/ML dev to just get to work on my GPT model I'm using to convince some VC's to give me $25-500 million?"
tldr; they're gonna buy a Mac cause it's a Mac and they want a Mac and their business uses Macs. No amount of "but my nVidia GPU = better" is ever going to convince them otherwise as long as there is a "sort of" reasonable price point inside Apple's ecosystem.
I am honestly shocked Nvidia has been allowed to maintain their moat with cuda. It seems like AMD would have a ton to gain just spending a couple million a year to implement all the relevant ML libraries with a non-cuda back-end.
AMD doesn’t really seem inclined toward building developer ecosystems in general.
Intel seems like they could have some interesting stuff in the annoyingly named “OneAPI” suite but I ran it on my iGPU so I have no idea if it is actually good. It was easy to use, though!
There are quite a few back and forth X/Twitter storms in teacups between George Hotz / tinygrad and the AMD management about opening up the firmware for custom ML integrations to replace CUDA but last I checked they were running into walls
I don't understand why you would need custom firmware. It seems like you could go a long way just implementing back-ends for popular ML libraries in openCL / compute shaders
I don't think this is what you meant but it matches the spec: federated learning is being used by Apple to train models for various applications, and some of that happens on device (iPhones/iPads) with your personal data before it's hashed and sent up to the mothership model anonymously.
Does one need to train an AI model on specific hardware, or can a model be trained in one place and then used somewhere else? Seems like Apple could just run their fine tuned model called Siri on each device. Seems to me like asking for training on Apple devices is missing the strategy. Unless of course, it's just for purely scientific $reasons like "why install Doom on the toaster?" vs doing it for a purpose.
It doesn’t require specific hardware; you can train a neural net with pencil and paper if you have enough time. Of course, some pieces of hardware are more efficient than others for this.
But you can get an M2 Ultra with 192GB of UMA for $6k or so. It's very hard to get that much GPU memory at all, let alone at that price. Of course the GPU processing power is anemic compared to a DGX Station A100, but the Mac is $143,000 less.
You want to buy a bunch of new equipment to do training? Yeah, Macs aren't going to make sense.
You want your developers to be able to do training locally and they already use Macs? Maybe an upgrade would make business sense. Even if you have beefy servers or the cloud for large jobs.
Yes, that would be great. But without the ability for us to verify this, who's to say they won't use the edge resources (your computer and electricity) to process data (your data) and then send the results to their data center? It would certainly save them a lot of money.
They already do this. It's called federated learning and it's a way for them to use your data to help personalize the model for you and also (to a much lesser extent) the global model for everyone, whilst still respecting your data privacy. It's not to save money, it's so they can keep your data private on device and still use ML.
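For anyone unfamiliar with what the server side of that looks like, here's a toy sketch of the federated averaging (FedAvg) step: clients train locally, only updated weights leave the device, and the server takes a size-weighted average. This is a generic textbook sketch, not Apple's actual implementation; real deployments layer secure aggregation and differential-privacy noise on top.

    # Toy sketch of federated averaging; local "training" is faked with random noise.
    import numpy as np

    def fedavg(client_ws, client_sizes):
        total = sum(client_sizes)
        return sum((n / total) * w for w, n in zip(client_ws, client_sizes))

    global_w = np.zeros(4)
    clients = [global_w + np.random.randn(4) * 0.1 for _ in range(3)]  # stand-in for local updates
    new_global = fedavg(clients, [100, 250, 50])
    print(new_global)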
When you can do all inference at the edge, you can keep it disconnected from the network if you don't trust the data handling.
I happen to think they wouldn't, simply because sending this data back to Apple in any form that they could digest it is not aligned with their current privacy-first strategies. But if they make a device that still works if it stays disconnected, the neat thing is that you can just...keep it disconnected. You don't have to trust them.
Except that's an unreasonable scenario for a smart phone. It doesn't prove that the minute the user goes online it won't be egressing data willingly or not.
I don't disagree, although when I composed my comment I had desktop/laptop in mind, as I think genuinely useful on-device smartphone AI is a ways off yet, and who knows what company Apple will be by then.
+1 The idea that it's on device, hence it's privacy-preserving is Apple's marketing machine speaking and that doesn't fly anymore. They have to do better to convince any security and privacy expert worth their salt that their claims and guarantees can be independently verified on behalf of iOS users.
Google did some of that on Android, which means open-sourcing their on-device TEE implementation, publishing a paper about it etc.
I'm all for running as much on the edge as possible, but we're not even close to being able to do real-time inference on Frontier models on Macs or iPads, and that's just for vanilla LLM chatbots. Low-precision Llama 3-8b is awesome, but it isn't a Claude 3 replacer, totally drains my battery, and is slow (M1 Max).
Multimodal agent setups are going to be data center/home-lab only for at least the next five years.
Apple isn't about to put 80GB of VRAM in an iPad for about 15 reasons.
The entire software stack is non-free and closed-source.
This means you'd be taking Apple at their word on "privacy". Do you trust Apple? I wouldn't, given their track record.
They might have been thinking of the recently discovered hardware backdoor issue, CVE-2023-38606 (see also Operation Triangulation). There was surprisingly little reporting on it.
Yes, at the end it's just some data representing the user's trained model. Is there a contractual agreement with users that Apple will never ever transfer a single byte of it, with huge penalties otherwise? If not, it's a pinky-promise piece of PR that sounds nice.
But what does that have to do with the price of milk in Turkmenistan?
Because Boeing's issues have nothing to do with privacy or security and since they are not consumer facing have no relevance to what we are talking about.
For everyone else who doesn't understand what this means, he's saying Apple wants you to be able to run models on their devices, just like you've been doing on nvidia cards for a while.
I think he's saying they want to make local AI a first class, default, capability, which is very unlike buying a $1k peripheral to enable it. At this point (though everyone seems to be working on it), other companies need to include a gaming GPU in every laptop, and tablet now (lol), to enable this.
Yes, it is completely clear. My guess is they do something like "Siri-powered shortcuts", where you can ask it to do a couple of things and it'll dynamically create a script and execute it.
I can see a smaller model trained to do that working well enough; however, I've never seen any real working examples of this. That Rabbit device is heading in that direction, but it's mostly vaporware now.
This comment is odd. I wouldn't say it is misleading, but it is odd because it borders on such definition.
> Apple's AI strategy is to put inference (and longer term even learning) on edge devices
This is pretty much everyone's strategy. Model distillation is huge because of this. This goes in line with federated learning. This goes in line with model pruning too. And parameter efficient tuning and fine tuning and prompt learning etc.
> This is completely coherent with their privacy-first strategy
Apple's marketing for their current approach is privacy-first. They are not privacy first. If they were privacy first, you would not be able to use app tracking data on their first party ad platform. They shut it off for everyone else but themselves. Apple's approach is walled garden first.
> Processing data at the edge also makes for the best possible user experience because of the complete independence of network connectivity
as long as you don't depend on graph centric problems where keeping a local copy of that graph is prohibitive. Graph problems will become more common. Not sure if this is a problem for apple though. I am just commenting in general.
> If (and that's a big if) they keep their APIs open to run any kind of AI workload on their chips
Apple does not have a good track record of this; they are quite antagonistic when it comes to this topic. Gaming on apple was dead for nearly a decade (and pretty much still is) because steve jobs did not want people gaming on macs. Apple has eased up on this, but it very much seems that if they want you to use their devices (not yours) in a certain way, then they make it expensive to do anything else.
Tbf, I don't blame apple for any of this. It is their strategy. Whether it works or not, it doesn't matter. I just found this comment really odd since it almost seemed like evangelism.
edit: weird to praise apple for on device training when it is not publicly known if they have trained any substantial model even on cloud.
The biggest players in commercial AI models at the moment - OpenAI and Google - have made absolutely no noise about pushing inference to end user devices at all. Microsoft, Adobe, other players who are going big on embedding ML models into their products, are not pushing those models to the edge, they’re investing in cloud GPU.
Where are you picking up that this is everyone’s strategy?
> Where are you picking up that this is everyone’s strategy?
Read what their engineers say in public. Unless I hallucinated years of federated learning.
Also apple isn't even a player yet and everyone is discussing how they are moving stuff to the edge lol. Can't critique companies for not being on the edge yet when apple doesn't have anything out there.
I believe at least Google is starting to do edge inference—take a look at the pixel 8 line-up they just announced. It doesn't seem to be emphasized as much, but the tensor G3 chip certainly has builtin inference.
I think this is being too charitable on the state of "everyone". It's everyone's goal. Apple is actively achieving that goal, with their many year strategy of in house silicon/features.
No. "On edge" is not a model existence limitation, it is a hardware capability/existence limitation, by definition, and by the fact that, as you point out, the models already exist.
You can already run those open weight models on Apple devices, on edge, with huge improvements on the newer hardware. Why is a distinct model required? Do the rumors appease these thoughts?
If others are making models, with no way to actually run them, that's not a viable "on edge" strategy, since it involves waiting for someone else to actually accomplish the goal first (as is being done by Apple).
It absolutely is. Model distillation will still be pertinent, and so will parameter-efficient tuning for edge training. I cannot emphasize enough how important this is. You will need your own set of weights. If Apple wants to use open weights, then sure, ignore this. It doesn't seem like they want to long-term... And even if they use open weights, they will still be behind other companies that have done model distillation and federated learning for years.
> Why is a distinct model required?
Ask apple's newly poached AI hires this question. Doesn't seem like you would take an answer from me.
> If others are making models, with no way to actually run them
Is this the case? People have been running distilled llamas on rPis with pretty good throughput.
> And even if they use open weights, they will still be behind other companies that have done model distillation and federated learning for years.
I'm sorry, but we're talking about "on edge" here though. Those other companies have no flipping hardware to run it "on edge", in a "generic" way, which is the goal. Apple's strategy involves the generic.
> If apple wants to use open weights
This doesn't make sense. Apple doesn't dictate the models you can use with their hardware. You can already accelerate LLAMA with the neural engines. You can download the app right now. You can already deploy your models on edge, on their hardware. That is the success they're achieving. You cannot effectively do this on competitor hardware, with good performance, from "budget" to "Pro" lineup, which is a requirement of the goal.
> they will still be behind other companies that have done model distillation and federated learning for years.
What hardware are they running it on? Are they taking advantage of Apple (or other) hardware in their strategy? Federated learning is an application of "on edge", it doesn't *enable* on edge, which is part of Apple's strategy.
> Ask apple's newly poached AI hires this question. Doesn't seem like you would take an answer from me.
Integrating AI in their apps/experience is not the same as enabling a generic "on edge", default, capability in all Apple devices (which they have been working towards for years now). This is the end goal for "on edge". You seem to be talking about OS integration, or something else.
> People have been running distilled llamas on rPis with pretty good throughput.
Yes, the fundamental limitation there being hardware performance, not the model, with that "pretty good" making the "pretty terrible" user experience. But, there's also nothing stopping anyone from running these distilled (a requirement of limited hardware) models on Apple hardware, taking advantage of Apples fully defined "on edge" strategy. ;) Again, you can run llamas on Apple silicon, accelerated, as I do.
> Those other companies have no flipping hardware to run it "on edge", in a "generic" way, which is the goal
Maybe? This is why I responded to:
> It's everyone's goal. Apple is actively achieving that goal
This is the issue I found disagreeable. Other organizations and individual people are achieving that goal too. Google says Gemini Nano is going to the device, and if the benchmarks are to be believed, if it runs at that level, their work so far is also actively achieving that goal. Meta has released multiple distilled models that people have already proven can run inference at the device level. It cannot be argued that Meta is not actively achieving that goal either. They don't have to release the hardware because they went a different route. I applaud Apple for the M chips. They are super cool. People are still working on using them so Apple can realize that goal too.
So when you go to the statement that started this
> Apple's AI strategy is to put inference (and longer term even learning) on edge devices
Multiple orgs also share this. And I can't say that one particular org is super ahead of the others. And I can't elevate apple in that race because it is not clear that they are truly privacy-focused or that they will keep APIs open.
> You cannot effectively do this on competitor hardware, with good performance, from "budget" to "Pro" lineup, which is a requirement of the goal
Why do you say you cannot do this with good performance? How many tokens per second do you want from a device? Is 30 T/s enough? You can do that on a laptop running a small Mixtral.
> What hardware are they running it on? Are they taking advantage of Apple (or other) hardware in their strategy?
I don't know. I have nothing indicating necessarily apple or nvidia or otherwise. Do you?
> [Regarding the rest]
Sure, my point is that they definitely have an intent for bespoke models, and that's why I raised the point that not all computation will be feasible on the edge for the time being. The question behind this particular line of inquiry is whether a pure edge experience truly enables the best user experience. It's also why I raised the point about Apple's track record with open APIs, which is why "actively achieving" is something I put doubt on. And I also cast doubt on Apple being privacy-focused. Just to emphasize, tying it back to the reason I even commented.
don't bother. apple's marketing seems to have won on here. i made a similar point only for people to tell me that apple is the only org seriously pushing federated learning.
> Processing data at the edge also makes for the best possible user experience because of the complete independence of network connectivity and hence minimal latency.
I know a shop that's doing this and it's a very promising approach. The ability to offload the costs of cloud GPU time is a tremendous advantage. That's to say nothing of the decreased latency, increased privacy, etc. The glaring downside is that you are dependent upon your users being willing and able to run native apps (or possibly WASM, I'm not sure) on bleeding-edge hardware. However, for some target markets (e.g. video production, photography, designers, etc.) it's a "safe" assumption that they will be using the latest and greatest Macs.
I've also been hearing people talk somewhat seriously about setting up their own training/inference farms using Macs because, at least for now, they're more readily available and cheaper to buy/run than big GPUs. That comes with a host of ops problems but it still may prove worthwhile for some use cases and addresses some of the same privacy concerns as edge computing if you're able to keep data/computation in-house.
I think these days everyone links their products with AI. Today even BP's CEO linked his business with AI. Edge inference and cloud inference are not mutually exclusive choices. Any serious provider will offer both, and the improvement in quality of service comes from you giving more of your data to the service provider. Most people are totally fine with that, and that will not change anytime soon. Privacy paranoia is mostly a fringe thing in consumer tech.
I agree. Apple has been on this path for a while, the first processor with a Neural Engine was the A11 in 2017 or so. The path didn’t appear to change at all.
The big differences today that stood out to me were adopting AI as a term (they used machine learning before) and repeating the term AI everywhere they could shove it in since that’s obviously what the street wants to hear.
That’s all that was different. And I’m not surprised they emphasized it given all the weird “Apple is behind on AI“ articles that have been going around.
I've been saying the same thing since the ANE and the incredible new chips with shared RAM: suddenly everyone could run capable local models. But then Apple decided to be catastrophically stingy once again, putting a ridiculous 8GB of RAM in the new iPads and the new MacBook Airs, destroying the prospect of a widespread "intelligent local Siri" because now half the new generation can't run anything.
Apple is an amazing powerhouse but also disgustingly elitist and wasteful if not straight up vulgar in its profit motives. There's really zero idealism there despite their romantic and creative legacy.
There are always some straight-up idiotic limitations in their otherwise incredible machines, with no other purpose than to create planned obsolescence, "PRO" exclusivity and piles of e-waste.
On “privacy”: If Apple owned the Search app versus paying Google, and used their own ad network (which they have for App Store today), Apple will absolutely use your data and location etc to target you with ads.
It can even be third party services sending ad candidates directly to your phone and then the on-device AI chooses which is relevant.
Privacy is a contract not the absence of a clear business opportunity. Just look at how Apple does testing internally today. They have no more respect for human privacy than any of their competitors. They just differentiate through marketing and design.
Something they should be able to do now, but do not seem to, is allow you to train Siri to recognize exactly your voice and accent. Which is to say, take the speech-to-text model that is listening and feeding the Siri integration API, and make it both 99.99% accurate for your speech and able to recognize you and only you when it comes to invoking voice commands.
It could, if it chose to, continue to recognize all voices but at the same time limit the things the non-owner could ask for based on owner preferences.
This is really easy to do: it's just an embedding of your voice, so typically 10-30 seconds of audio max to configure it. You already do a similar setup for Face ID. I agree with you; I don't understand why they don't do it.
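To illustrate what "just an embedding of your voice" means, here's a toy sketch of enrollment plus verification. The embed_voice function and the threshold are hypothetical stand-ins for whatever speaker-embedding model the OS would ship; only the cosine-similarity comparison is the real idea.

    # Illustrative only: embed_voice() is a hypothetical speaker-embedding model.
    import numpy as np

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def is_owner(enrolled, utterance, threshold=0.75):
        # threshold is made up; real systems tune it against false accept/reject rates
        return cosine(enrolled, utterance) >= threshold

    # enrolled = embed_voice(thirty_seconds_of_owner_audio)
    # utterance = embed_voice(incoming_command_audio)
    # if not is_owner(enrolled, utterance): restrict_to_guest_commands()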
"Which would be at odds with sending data up to the cloud for processing"
I don't think there is enough information here for that to be a true claim - it is possible to send the input used for inference, to the cloud, while computing the result locally. It is also possible to store the input used for inference while offline, and send that later when the device is online.
> Processing data at the edge also makes for the best possible user experience because of the complete independence of network connectivity and hence minimal latency.
That's one particular set of trade-offs, but not necessarily the best. E.g. if your network connection and server processing speed are sufficiently faster than your local processing speed, the latency would be higher for doing it locally.
Local inference can also use more battery power. And you need a more beefy device, all else being equal.
Every embedded company is pushing ML at the edge with inference engines. Check out MLPerfTiny. They’ve been benchmarking all sorts of edge AI since 2019.
If they're doing inference on edge devices, one challenge I see is protecting model weights. If you want to deploy a proprietary model on an edge AI chip, the weights can get stolen via side-channel attacks [1]. Obviously this isn't a concern for open models, but I doubt Apple would go the open models route.
Privacy can actually be reduced with on-device ai too. Now, without actually sending any data to iCloud apple can still have a general idea of what you’re doing. Imagine a state has a law that makes certain subjects illegal to discuss. They could compel apple to have their local AI detect that content and then to broadcast a ping in the AirTags network about the user and their location. No internet connection required on the target.
> This is completely coherent with their privacy-first strategy
Apple has never been privacy-first in practice. They give you the illusion of privacy but in reality it's a closed-source system and you are forced to trust Apple with your data.
They also make it a LOT harder than Android to execute your own MITM proxies to inspect what exact data is being sent about you by all of your apps including the OS itself.
You say that like open source isn't also an illusion of trust.
The reality is, there's too much to verify, and not enough interest for the "many eyeballs make all bugs shallow" argument.
We are, all of us, forced to trust, forced to go without the genuine capacity to verify. It's not great, and the best we can do is look for incentives and try to keep those aligned.
You've missed my point if you think I've made yours for you.
I'm not saying closed source is a silver bullet.
I'm saying OSS also isn't a silver bullet, it doesn't find everything because there's not enough interest in doing this work.
The Log4j example alone, given it took 8 years, is enough to demonstrate that.
Everything is an illusion of trust, nothing is perfect; all we can do is try to align the interests of those working on projects with the interests of society — which is so hard that it's an entire field of study called "politics".
I don't agree with relying on the many eyeballs argument for security, but from a privacy standpoint, I do think at least the availability of source to MY eyeballs, as well as the ability to modify, recompile, and deploy it, is better than "trust me bro I'm your uncle Steve Jobs and I know more about you than you but I'm a good guy".
If you want to, for example, compile a GPS-free version of Android that appears like it has GPS but in reality just sends fake coordinates to keep apps happy thinking they got actual permissions, it's fairly straightforward to make this edit, and you own the hardware so it's within your rights to do this.
Open-source is only part of it; in terms of privacy, being able to see what all is being sent in/out of my device is arguably more important than open source. Closed source would be fine if they allowed me to easily inject my own root certificate for this purpose. If they aren't willing to do that, including a 1-click replacement of the certificates in various third-party, certificate-pinning apps that are themselves potential privacy risks, it's a fairly easy modification to any open source system.
A screen on my wall that flashes every JSON that gets sent out of hardware that I own should be my right.
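The "flash every JSON" idea is roughly what a mitmproxy addon does, assuming the device is configured to trust the proxy's CA certificate. A sketch (run with something like mitmdump -s log_json.py); certificate-pinned apps will still refuse the proxy, which is exactly the limitation being complained about here:

    # Sketch: log every outbound JSON request body passing through the proxy.
    from mitmproxy import http

    class LogOutboundJSON:
        def request(self, flow: http.HTTPFlow) -> None:
            if "json" in flow.request.headers.get("content-type", ""):
                print(flow.request.method, flow.request.pretty_url)
                print(flow.request.get_text()[:500])  # truncate large bodies

    addons = [LogOutboundJSON()]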
> Open-source is only part of it; in terms of privacy, being able to see what all is being sent in/out of my device is is arguably more important than open source.
I agree; unfortunately it feels as if this ship has not only sailed, but the metaphor would have to be expanded to involve the port as well.
Is it even possible, these days, to have a functioning experience with no surprise network requests? I've tried to limit mine via an extensive hosts file list, but that did break stuff even a decade ago, and the latest version of MacOS doesn't seem to fully respect the hosts file (weirdly it partially respects it?)
> A screen on my wall that flashes every JSON that gets sent out of hardware that I own should be my right.
I remember reading a tale about someone, I think it was a court case or an audit, who wanted every IP packet to be printed out on paper. Only backed down when the volume was given in articulated lorries per hour.
Yeah, given that they resisted putting RCS in iMessage so long, I am a bit skeptical about the whole privacy narrative. Especially when Apple's profit is at odds with user privacy.
From my understanding, the reason RCS was delayed is because Google's RCS was E2EE only in certain cases (both users using RCS). But also because Google's RCS runs through Google servers.
If Apple enabled RCS in messages back then, but the recipient was not using RCS, then Google now has the decrypted text message, even when RCS advertises itself as E2EE. With iMessage, at least I know all of my messages are E2EE when I see a blue bubble.
Even now, RCS is available on Android if using Google Messages. Yes, it's pre-installed on all phones, but OEMs aren't required to use it as the default. It opens up more privacy concerns because now I don't know if my messages are secure. At least with the green bubbles, I can assume that anything I send is not encrypted. With RCS, I can't be certain unless I verify the messaging app the recipient is using and hope they don't replace it with something else that doesn't support RCS.
Agreed. While I have concerns regarding RCS, Apple's refusal to make iMessage an open platform due to customer lock-in is ridiculous and anti-competitive.
RCS is a net loss for privacy: it gives the carriers visibility into your social graph and doesn’t support end to end encryption. Google’s PR campaign tried to give the impression that RCS supports E2EE but it’s restricted to their proprietary client.
By what? It's impossible for a process to know for sure if the system is rooted or not. A rooted system can present itself to a process to look like a non-rooted system if it's engineered well enough.
I'd bet that most of these apps probably just check if "su" returns a shell, in which case perhaps all that's needed is to modify the "su" executable to require "su --magic-phrase foobar" before it drops into a root shell, and returns "bash: su: not found" or whatever if called with no arguments.
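For illustration, here's a sketch of that kind of naive check and why it proves so little: if "su" itself is modified to play dumb without a magic flag, the check learns nothing. This is a generic Python sketch, not how any particular app actually implements it.

    # Sketch: a naive "is this device rooted?" check that a doctored su trivially defeats.
    import shutil
    import subprocess

    def naive_root_check() -> bool:
        if shutil.which("su") is None:
            return False  # a doctored "su" could also simply hide from PATH
        try:
            out = subprocess.run(["su", "-c", "id"], capture_output=True, timeout=2, text=True)
            return "uid=0" in out.stdout
        except Exception:
            return False

    print("looks rooted" if naive_root_check() else "looks clean (or su is lying)")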
Right. The difference is that Apple has a ton of edge capacity, they’ve been building it for a long time.
Google and Samsung have been building it too, at different speeds.
Intel and AMD seem further behind (at the moment) unless the user has a strong GPU, which is especially uncommon on the most popular kind of computer: laptops.
And if you’re not one of those four companies… you probably don’t have much capable consumer edge hardware.
> This is completely coherent with their privacy-first strategy (...)
I think you're trying too hard to rationalize this move as pro-privacy and pro-consumer.
Apple is charging a premium for hardware based on performance claims, which they need to create relevance and demand for it.
There is zero demand for the capacity to run computationally demanding workloads beyond very niche applications, at least by the standard of what counts as demanding on the consumer-grade hardware sold over the past two decades.
If Apple offloads these workloads to the customer's own hardware, they don't have to provide this computing capacity themselves. This means no global network of data centers, no infrastructure, no staff, no customer support, no lawyers, nothing.
More importantly, Apple claims to be pro privacy but their business moves are in reality in the direction of ensuring that they are in sole control of their users' data. Call it what you want but leveraging their position to ensure they hold a monopoly over a market created over their userbase is not a pro privacy move, just like Apple's abuse of their control over the app store is not a security move.
>I personally really really welcome as I don't want the AI future to be centralised in the hands of a few powerful cloud providers.
Watch out for being able to run AI on your local machine while those AI services use telemetry to send your data (recorded conversations, for instance) to their motherships.
Now the subscription is $20 a month and the API pricing is accessible. What will happen when they all decide to 100x or 1000x the price of their APIs?
All the companies that got rid of people in favor of AI, might have lost the knowledge as well.
This is dangerous and might kill a lot of companies, no?
There is no guarantee that local processing is going to have lower latency than remote processing. Given the huge compute needs of some AI models (e.g. ChatGPT), the time saved by using larger compute likely dwarfs the relatively small time needed to transmit a request.
Nah. Apple doesn't have incentive to provide any more dev power. It will keep things locked down and charge people for Apple branded software products. That has been their business for the past decade.
I think there's always been a tension at Apple between keeping everything as locked down as possible and opening up parts because they need the developer driven app ecosystem. My prediction is Neural Engine is going to become more useful to third party developers. I could be wrong
My cynical view is that doing AI on the client is the only way they can try to keep selling luxury items (jewelry really) and increasing prices for what are essentially and functionally commodity devices.
... I think that the more correct assertion would be that Apple is a sector leader in privacy, if only because their competitors make no bones about violating the privacy of their customers, as it is the basis of their business model. So it's not that Apple is A+ so much as the other students are getting Ds and Fs.
Probably because they are super behind in the cloud space; it's not like they wouldn't like to sell the service. They've ignored photo privacy quite a few times in iCloud.
I hope this means AI-accelerated frameworks get better support on Mx. Unified memory and Metal are a pretty good alternative for local deep learning development.
So for hardware accelerated training with something like PyTorch, does anyone have a good comparison between Metal vs Cuda, both in terms of performance and capabilities?
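One crude way to start answering that yourself, assuming PyTorch is installed on both machines: time the same operation on each backend. Real workloads (transformers, mixed precision, memory pressure) will diverge a lot from a bare matmul benchmark, so treat this as a sketch only.

    # Sketch: compare matmul throughput on whichever backends are available.
    import time
    import torch

    def bench(device: str, n: int = 4096, iters: int = 20) -> float:
        a = torch.randn(n, n, device=device)
        b = torch.randn(n, n, device=device)
        torch.matmul(a, b)  # warm-up / lazy initialization
        if device == "cuda":
            torch.cuda.synchronize()
        elif device == "mps":
            torch.mps.synchronize()
        t0 = time.perf_counter()
        for _ in range(iters):
            torch.matmul(a, b)
        if device == "cuda":
            torch.cuda.synchronize()
        elif device == "mps":
            torch.mps.synchronize()
        return (time.perf_counter() - t0) / iters

    if torch.cuda.is_available():
        print("cuda", f"{bench('cuda') * 1000:.1f} ms per 4096x4096 matmul")
    if torch.backends.mps.is_available():
        print("mps", f"{bench('mps') * 1000:.1f} ms per 4096x4096 matmul")

Capability-wise the bigger gaps tend to be in things like mixed-precision support, fused kernels and distributed training rather than raw matmul speed.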
How is local more private? Whether AI runs on my phone or in a data center I still have to trust third parties to respect my data. That leaves only latency and connectivity as possible reasons to wish for endpoint AI.
If you can run AI in airplane mode, you are not trusting any third party, at least until you reconnect to the Internet. Even if the model was malware, it wouldn’t be able to exfiltrate any data prior to reconnecting.
You’re trusting the third party at training time, to build the model. But you’re not trusting it at inference time (or at least, you don’t have to, since you can airgap inference).
Ehhh at this point Apple’s privacy strategy is little more than marketing. Sure they’ll push stuff to the edge to save themselves money and book the win, but they also are addicted to the billions they make selling your searches to Google.
Some people really seem to be truly delusional. It's obvious that the company's "privacy" is a marketing gimmick when you consider the facts. Do people not consider the facts anymore? How does somebody appeal to the company's "privacy-first strategy" with a straight face in light of the facts? I suppose they are not aware of the advertising ID that is embedded in all Apple operating systems. That one doesn't even require login.
A "mistake" seems to be putting it lightly when the thing has been reiterated multiple times throughout the years, but yeah. Seems more like blind dogma. Obviously people don't like the facts pointed out to them either as you can tell by the down votes on my comment. If I am wrong, please tell me how in a reply.
Honestly, if they manage this, they have my money.
But to get actually powerful models running, they need to supply the devices with enough RAM - and that's definitely not what Apple like to do.
And yet Siri is super slow because it does the processing off-device, and it is far less useful than it could be because it is hobbled by restrictions.
I can't even find a way to resume playing whatever Audible book I was last playing. "Siri, play Audible" or something. As far as I know, this is impossible to do.
> This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
I feel like people are being a bit naïve here. Apple's "Privacy First" strategy was a marketing spin developed in response to being dead-last in web-development/cloud computing/smart features.
Apple has had no problem changing their standards by 180 degrees and being blatantly anti-consumer whenever they have a competitive advantage to do so.
Having worked at Apple I can assure you it's not just spin. It's nigh on impossible to get permission to even compare your data with another service inside of Apple, and even if you do get permission, the user IDs and everything are completely different so there's no way to match up users. Honestly it's kind of ridiculous the lengths they go to, and it makes development an absolute PITA.
That could very well be true, but I also think it could change faster than people realize. Or that Apple has the ability to compartmentalize (kind of like how Apple can advocate for USB C adoption in some areas and fight it in others).
I'm not saying this to trash Apple - I think it's true of any corporation. If Apple starts losing revenue in 5 years because their LLM isn't good enough because they don't have enough data, they are still going to take it and have some reason justifying why theirs is privacy focused and everyone else is not.
Of course! The difference is that, for the time being, my incentives are aligned with theirs in regards to preserving my privacy.
The future is always fungible. Anyone can break whatever trust they've built very quickly. But, like the post you are replying to, I have no qualms about supporting companies that are currently doing things in my interest and don't have any clear strategic incentive to violate that trust.
Edit: that same incentive structure would apply to NVIDIA, afaik
I can't agree with your comment. Apple has every incentive to monetize your data; that's the whole value of Google and Meta. And they are already heading into the ad business, earning billions last I checked. Hardware isn't selling as much as before, and this isn't going to change for the better in the foreseeable future.
The logic is exactly the same as what, e.g., Meta claims: we will pseudoanonymize your data, so technically your specific privacy is just yours, see, nothing changed. But you are in various target groups for ads, plus we know how "good" those anonymization efforts are when money is at play and corporations are only there to earn as much money as possible. The rest is PR.
I'll disagree with your disagreement - in part at least. Apple is still bigger than Meta or Google. Even if they had a strong channel to serve ads or otherwise monetize data, the return would represent pennies on the dollar.
And Apple's privacy stance is a moat against these other companies making money off of their customer base. So for the cost of pennies on the dollar, they protect their customer base and ward off competition. That's a pretty strong incentive.
Don't bother, the fanboys think Apple can't do anything wrong/malicious. At this point it's closer to a religion than ever.
You would be amazed at the response of some of them when I point out some shit Apple does that makes their products clearly lacking for the price; the cognitive dissonance is so strong they don't know how to react in any other way than lying or pretending it doesn't matter.
If you’re annoyed about quasi-religious behavior, consider that your comment has nothing quantifiable and contributed nothing to this thread other than letting us know that you don’t like Apple products for non-specific reasons. Maybe you could try to model the better behavior you want to see?
How do you even come to the conclusion that I don't like Apple products?
I have a phone, watch and computer from them. It's not like I hate the products.
I have very specific reasons to be annoyed, but they are far too many to list in a simple post. I was working with and buying Apple stuff before the turn of the millennium, and I have worked as an Apple technician and helped way more Apple users than I care to list.
This is my experience and your comment precisely illustrates that.
My comment was about the delusion of privacy first marketing bullshit that they came up with to excuse the limitations of some of their stuff.
But since Apple can't do anything wrong, we are not going anywhere. Whatever, keep on believing.
Considering you commented on another one of my comments about the Apple "special sauce magic" RAM, I can see how you could think that.
I'll check myself thanks, but you should check your allegiance to a trillion dollar corp and its bullshit marketing, that's really not useful to anyone but them.
Yes, it’s possible that they’ll change in the future but that doesn’t make it inevitable. Everything you describe could have happened at any point in the last decade or two but didn’t, which suggests that it’s not “waiting for the right MBA” but an active effort to keep the abusive ones out.
One thing to remember is that they understand the value of long-term investments. They aren’t going to beat Google and Facebook at advertising and have invested billions in a different model those companies can’t easily adopt, and I’m sure someone has done the math on how expensive it would be to switch.
"With these improvements to the CPU and GPU, M4 maintains Apple silicon’s industry-leading performance per watt. M4 can deliver the same performance as M2 using just half the power. And compared with the latest PC chip in a thin and light laptop, M4 can deliver the same performance using just a fourth of the power."
That's an incredible improvement in just a few years. I wonder how much of that is Apple engineering and how much is TSMC improving their 3nm process.
Apple usually massively exaggerates their tech spec comparison - is it REALLY half the power use of all times (so we'll get double the battery life) or is it half the power use in some scenarios (so we'll get like... 15% more battery life total) ?
IME Apple has always been the most honest when it makes performance claims. Like when they said a MacBook Air would last 10+ hours and third-party reviewers would get 8-9+ hours. All the while, Dell or HP would claim 19 hours and you'd be lucky to get 2, e.g. [1].
As for CPU power use, of course that doesn't translate into doubling battery life because there are other components. And yes, it seems the OLED display uses more power so, all in all, battery life seems to be about the same.
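To make that concrete, here is a rough back-of-the-envelope sketch in Python with entirely made-up power figures (not Apple's numbers), showing why halving SoC power is far from doubling battery life:

```python
# Back-of-the-envelope sketch with invented numbers (not Apple's figures):
# if the SoC is only part of total platform power, halving SoC power
# improves battery life by far less than 2x.
soc_w = 4.0        # hypothetical average SoC draw under a light workload, watts
display_w = 3.0    # hypothetical display draw
other_w = 1.0      # wifi, SSD, speakers, etc.

before = soc_w + display_w + other_w          # 8.0 W total
after = soc_w / 2 + display_w + other_w       # 6.0 W total

print(f"Battery life gain: {before / after:.2f}x")  # ~1.33x, not 2x
```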
I'm interested to see an M3 vs M4 performance comparison in the real world. IIRC the M3 was a questionable upgrade. Some things were better but some weren't.
Overall the M-series SoCs have been an excellent product however.
> IME Apple has always been the most honest when it makes performance claims.
Okay, but your example was about battery life:
> Like when they said a MacBook Air would last 10+ hours and third-party reviewers would get 8-9+ hours. All the while, Dell or HP would claim 19 hours and you'd be lucky to get 2, e.g. [1]
And even then, they exaggerated their claims. And your link doesn't say anything about HP or Dell claiming 19 hour battery life.
Apple has definitely exaggerated their performance claims over and over again. The Apple Silicon parts are fast and low power indeed, but they've made ridiculous claims like comparing their chips to an Nvidia RTX 3090 with completely misleading graphs.
This is why you have to take everything they say with a huge grain of salt. Their chip may be "twice" as power efficient in some carefully chosen unique scenario that only exists in an artificial setting, but how does it fare in the real world? That's the question that matters, and you're not going to get an honest answer from Apple's marketing team.
M1 Ultra did benchmark close to 3090 in some synthetic gaming tests. The claim was not outlandish, just largely irrelevant for any reasonable purpose.
Apple does usually explain their testing methodology and they don’t cheat on benchmarks like some other companies. It’s just that the results are still marketing and should be treated as such.
Outlandish claims notwithstanding, I don’t think anyone can deny the progress they achieved with their CPU and especially GPU IP. Improving performance on complex workloads by 30–50% in a single year is very impressive.
It did not get anywhere close to a 3090 in any test when the 3090 was running at full power. They were only comparable at specific power usage thresholds.
Different chips are generally compared at similar power levels, ime. If you ran 400 watts through an M1 Ultra and somehow avoid instantly vaporizing the chip in the process, I'm sure it wouldn't be far behind the 3090.
Ok, but that doesn't matter if you can't actually run 400 watts through an M1 Ultra. If you wanna compare how efficient a chip is, sure, that's a great way to test. But you can't claim that your chip is as good as a 3090 if the end user is never going to see the performance of an actual 3090.
You're right, it's not 19 hours claimed. It was even more than that.
> HP gave the 13-inch HP Spectre x360 an absurd 22.5 hours of estimated battery life, while our real-world test results showed that the laptop could last for 12 hours and 7 minutes.
The absurdity was the difference between claimed and actual battery life. 19 vs 2 is more absurd than 22.5 vs 12.
> Speaking of the ThinkPad P72, here are the top three laptops with the most, er, far out battery life claims of all our analyzed products: the Lenovo ThinkPad P72, the Dell Latitude 7400 2-in-1 and the Acer TravelMate P6 P614. The three fell short of their advertised battery life by 821 minutes (13 hours and 41 mins), 818 minutes (13 hours and 38 minutes) and 746 minutes (12 hours and 26 minutes), respectively.
Dell did manage to be one of the top 3 most absurd claims though.
Dell and IBM were lying about battery life before OSX was even a thing and normal people started buying MacBooks. Dell and IBM will be lying about battery life when the sun goes red giant.
Reviewers and individuals like me have always been able to get 90% of Apple’s official battery times without jumping through hoops to do so. “If you were very careful” makes sense for an 11% difference. A ten hour difference is fucking bullshit.
So you are saying that a Dell with an Intel CPU could get longer battery life than a Mac with an M1? What does that say about the quality of Apple engineering? Their marketeering is certainly second to none.
Maybe for battery life, but definitely not when it comes to CPU/GPU performance. Tbf, no chip company is, but Apple is particularly egregious. Their charts assume best case multi-core performance when users rarely ever use all cores at once. They'd have you thinking it's the equivalent of a 3090 or that you get double the frames you did before when the reality is more like 10% gains.
I don't think "less honest" covers it, and I can't believe anything their marketing says after the 3090 claims. Maybe it's true, maybe not. We'll see from the reviews. Well, assuming the reviewers weren't paid off with an "evaluation unit".
I've never known a time when Dell, IBM, Sony, Toshiba, Fujitsu, Alienware weren't lying through their teeth about battery times.
What time period are you thinking about for Apple? I’ve been using their laptops since the last G4 which is twenty years. They’ve always been substantially more accurate about battery times.
The problem with arguing about battery life this way is that it's highly dependent on usage patterns.
For example, I would be surprised if there is any laptop which is sufficiently fast for my usage whose battery life is more than 2-3 hours, tops. Heck, I have several laptops and all of them die in one to one and a half hours. But of course, I never optimized for battery life, so who knows. So in my case, all of them are lying equally. I haven't even checked battery life for 15 years now. It's a useless metric for me, because all of them are shit.
But of course, for people who don't need to use VMs, run several "micro"services at once, have constant internet transfer and have 5+ IntelliJ projects open at the same time caching several million LOC while a gazillion web pages are open, maybe there is a difference; for me it doesn't matter whether it's one or one and a half hours.
You should try a MacBook Pro someday. It would still last all day with that workload. I had an XPS at work and it would last 1.5 hrs. My Apple laptop with the same workload lasts 6-8 hours easily. I never undocked the Dell because of the performance issues. I undock the Mac all the time because I can trust it to last.
Nothing too crazy I don't think. A bunch of standard Electron applications, a browser, a terminal - that's pretty much it. Sometimes Docker, but I always kill it when I'm done.
Controlling the OS is probably a big help there. At least, I saw lots of complaints about my zenbook model’s battery not hitting the spec. It was easy to hit or exceed it in Linux, but you have to tell it not to randomly spin up the CPU.
I had to work my ass off on my Fujitsu Lifebook to get 90% of the estimate, even on Linux. I even worked on a kernel patch for the Transmeta CPU, based on unexploited settings in the CPU documentation, but it came to no or negligible difference in power draw, which I suppose is why Linus didn’t do it in the first place.
This is why Apple can be slightly more honest about their battery specs: they don't have the OS working against them. Unfortunately most Dell XPSes will be running Windows, so it is still misleading to provide specs based on what the hardware could do if not sabotaged.
Archlinux, mitigations (spectre alike) off, X11, OpenBox, bmpanel with only CPU/IO indicator. Light theme everywhere. Opera in power save mode. `powertop --auto-tune` and `echo 1 | sudo tee /sys/devices/system/cpu/intel_pstate/no_turbo` Current laptop is Latitude 7390.
Right, so you are disabling all performance features and effectively turning your CPU into a low-end, low-power SKU. Of course you'd get better battery life. It's not the same thing though.
> echo 1 | sudo tee /sys/devices/system/cpu/intel_pstate/no_turbo
Isn't that going to torch performance? My i9-9900 has a base frequency of 3.6 GHz and a turbo of 5.0 GHz. Disabling the turbo would create a 28% drop in performance.
I suppose if everything else on the system is configured to use as little power as possible, then it won't even be noticed. But seeing as CPUs underclock when idle (I've seen my i9 go as low as 1.2 GHz), I'm not sure disabling turbo makes a significant impact except when your CPU is being pegged.
That's the point. I have no performance bottleneck with no_turbo. My i5 tends to turn on turbo mode and increase power demand (heat leaks) even when it's not needed. For example, with no_turbo the laptop is always cold and the fan basically stays silent. With turbo it easily gets 40°C warm while watching YT or doing my developer stuff, building Docker containers and so on.
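For anyone who wants to observe the same behaviour on their own machine, here is a small Python sketch; it assumes a Linux box with the intel_pstate driver, and the sysfs paths may differ on other drivers or distros:

```python
# Rough sketch for a Linux box with the intel_pstate driver; paths may differ
# on other drivers/distros. Reads the no_turbo flag and the current core clocks.
from pathlib import Path

no_turbo = Path("/sys/devices/system/cpu/intel_pstate/no_turbo").read_text().strip()
print("turbo disabled:", no_turbo == "1")

for f in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*/cpufreq/scaling_cur_freq")):
    khz = int(f.read_text())  # scaling_cur_freq reports kHz
    print(f.parent.parent.name, f"{khz / 1_000_000:.2f} GHz")
```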
I get 20 minutes from my Dell (not the XPS), with Vim. When it was brand new, I got 40 minutes. A piece of hot garbage, with an energy-inefficient Intel CPU.
> IME Apple has always been the most honest when it makes performance claims
Yes and no. They'll always be honest with the claim, but the scenario for the claimed improvement will always be chosen to make the claim as large as possible, sometimes with laughable results.
Typically something like "watch videos for 3x longer <small>when viewing 4k h265 video</small>" (meaning they updated the previous gen's silicon, which could only handle h264).
> IME Apple has always been the most honest when it makes performance claims
That's just laughable, sorry. No one is particularly honest in marketing copy, but Apple is for sure one of the worst, historically. Even more so when you go back to the PPC days. I still remember Jobs on stage talking about how the G4 was the fastest CPU in the world when I knew damn well that it was half the speed of the P3 on my desk.
A year later I was doing bonkers (for the time) Photoshop work on very large compressed TIFF files, and my G4 laptop running at 400 MHz was more than 2x as fast as the PIIIs on my bench.
Was it faster all around? I don't know how to tell. Was Apple as honest as I am in this commentary about how it mattered what you were doing? No. Was it a CPU that was able to do some things very fast vs others? I know it was.
Funny you mention that machine; I still have one of those lying around.
It was a very cool machine indeed with a very capable graphics card but that's about it.
It did some things better/faster than a Pentium III PC, but only if you went for a bottom-of-the-barrel unit and crippled the software support (no MMX, just like another reply mentioned).
On top of that, Intel increased frequency faster than Apple could handle. And after the release of the Pentium 4, the G4s became non-competitive so fast that one would question what could save Apple (later, down the road, it turned out to be Intel).
They tried to salvage it with the G5s, but those came with so many issues that even their dual-processor water-cooled models were just not keeping up. I briefly owned one of those after repairing it for "free" out of 3 supposedly dead units; the only thing worth a damn in it was the GPU. Extremely good hardware in many ways, but also very weak for so many things that it had to be used only for very specific tasks, otherwise a cheap Intel PC was much better.
Which is precisely why, right after that, they went with Intel - after years of subpar laptop performance because they were stuck on the G4 (not even at high frequency).
Now I know from your other comments that you are a very strong believer, and I'll admit that there were many reasons to use a Mac (software related), but please stop pretending they were performance competitive, because that's just bonkers. If they were, the Intel switch would never have happened in the first place...
It's just amazing that this kind of nonsense persists. There were no significant benchmarks, "scientific" or otherwise, at the time or since showing that kind of behavior. The G4 was a dud. Apple rushed out some apples/oranges comparisons at launch (the one you link appears to be the bit where they compared a SIMD-optimized tool on PPC to generic compiled C on x86, though I'm too lazy to try to dig out the specifics from stale links), and the reality distortion field did the rest.
While certainly misleading, there were situations where the G4 was incredibly fast for the time. I remember being able to edit video in iMovie on a 12" G4 laptop. At that time there was no equivalent x86 machine.
Have any examples from the past decade? Especially in the context of how exaggerated the claims are from PC and Android brands they are competing with?
Fair point. Based on their first sentence, I mischaracterized how “laughable” was used.
Though the author also made clear in their second sentence that they think Apple is one of the worst when it comes to marketing claims, so I don’t think your characterization is totally accurate either.
Yeah, that was hilarious; my basic workload borders on the 8GB limit without even pushing it.
They have fast swap but nothing beats real ram in the end, and considering their storage pricing is as stupid as their RAM pricing it really makes no difference.
If you go for the base model, you are in for a bad time, 256GB with heavy swap and no dedicated GPU memory (making the 8GB even worse) is just plain stupid.
This is what the Apple fanboys don't seem to get: their base models at a somewhat affordable price are deeply incompetent, and if you start to load them up the pricing just does not make a lot of sense...
> If you go for the base model, you are in for a bad time, 256GB with heavy swap and no dedicated GPU memory (making the 8GB even worse) is just plain stupid ... their base models at a somewhat affordable price are deeply incompetent
I got the base model M1 Air a couple of years back and whilst I don't do much gaming I do do C#, Python, Go, Rails, local Postgres, and more. I also have a (new last year) Lenovo 13th gen i7 with 16GB RAM running Windows 11 and the performance with the same load is night and day - the M1 walks all over it whilst easily lasting 10hrs+.
Note that I'm not a fanboy; I run both by choice. Also both iPhone and Android.
The Windows laptop often gets sluggish and hot. The M1 never slows down and stays cold. There's just no comparison (though the Air keyboard remains poor).
I don't much care about the technical details, and I know 8GB isn't a lot. I care about the experience and the underspecced Mac wins.
I don't know about your Lenovo and how your particular workload is handled by Windows.
And I agree that in pure performance, the Apple Silicon Macs will kill it; however, I am really skeptical that an 8GB model would give you a better experience overall.
Faster for long compute operations, sure, but then you have to deal with all the small slowdowns from constant swapping. Unless you stick to a very small number of apps and a very small number of tabs at the same time (which is rather limiting), I don't know how you do it.
I don't want to call you a liar, but maybe you are too emotionally attached (just like I am sometimes) to the device to realize it, or maybe the various advantages of the Mac make you ignore the serious limitations that come with it.
Everyone has their own set of tradeoffs, but my argument is that if you can deal with the 8GB Apple Silicon devices, you are very likely to be well served by a much cheaper device anyway (like half the price).
All I can say is I have both and I use both most days. In addition to work-issued Windows laptops, so I have a reasonable and very regular comparison. And the comparative experience is exactly as I described. Always. Every time.
> you have to deal with all the small slowdowns from constant swapping
That just doesn't happen. As I responded to another post, though, I don't do Docker or LLMs on the M1 otherwise you'd probably be right.
> Unless you stick to a very small number of apps and a very small number of tabs at the same time
It's really common to have approaching 50+ tabs open at once. And using Word is often accompanied by VS Code, Excel, Affinity Designer, DotNet, Python, and others due to the nature of what I'm doing. No slowdown.
> maybe you are too emotionally attached
I am emotionally attached to the device. Though as a long-time Mac, Windows, and Linux user I'm neither blinkered nor tribal - the attachment is driven by the experience and not the other way around.
> maybe the various advantages of the Mac make you ignore the serious limitations that come with it
There are indeed limitations. 8GB is too small. The fact that for what I do it has no impact doesn't mean I don't see that.
> if you can deal with the 8GB Apple Silicon devices, you are very likely to be well served by a much cheaper device anyway (like half the price)
I already have better Windows laptops than that, and I know that going for a Windows laptop that's half as cheap as the entry level Air would be nothing like as nice because the more expensive ones already aren't (the Lenovo was dearer than the Air).
---
To conclude, you have to use the right tool for the job. If the nature of the task intrinsically needs lots of RAM then 8GB is not good enough. But when it is enough it runs rings around equivalent (and often 'better') Windows machines.
Not individually, no. Though it's often done simultaneously.
That said you're right about lots of RAM in that I wouldn't bother using the 8GB M1 Air for Docker or running LLMs (it can run SD for images though, but very slowly). Partly that's why I have the Lenovo. You need to pick the right machine for the job at hand.
You know that the RAM in these machines is quite different from the "RAM" in a standard PC? Apple's SoC RAM is more or less part of the CPU/GPU and is super fast. And for obvious reasons it cannot be added to.
Anyway, I manage a few M1 and M3 machines with 256/8 configs and they all run just as fast as 16 and 32 machines EXCEPT for workloads that need more than 8GB for a process (virtualization) or workloads that need lots of video memory (Lightroom can KILL an 8GB machine that isn't doing anything else...)
The 8GB is stupid discussion isn't "wrong" in the general case, but it is wrong for maybe 80% of users.
> EXCEPT for workloads that need more than 8GB for a process
Isn't that exactly the upthread contention: Apple's magic compressed swap management is still swap management that replaces O(1) fast(-ish) DRAM access with thousands+ cycle page decompression operations. It may be faster than storage, but it's still extremely slow relative to a DRAM fetch. And once your working set gets beyond your available RAM you start thrashing just like VAXen did on 4BSD.
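To put rough numbers on that argument, here is a toy cost model in Python; the latencies are invented purely for illustration and are not measurements of Apple's compressor:

```python
# Toy cost model with invented latencies, just to illustrate the argument:
# compressed "swap" is much faster than disk, but still far slower than DRAM,
# so effective access time degrades quickly once the hit rate drops.
DRAM_NS = 100            # rough DRAM access cost (illustrative)
DECOMPRESS_NS = 5_000    # rough cost to fault in and decompress a page (illustrative)

def effective_ns(hit_rate: float) -> float:
    return hit_rate * DRAM_NS + (1 - hit_rate) * DECOMPRESS_NS

for hr in (1.0, 0.99, 0.95, 0.90):
    print(f"hit rate {hr:.2f}: {effective_ns(hr):7.0f} ns per access")
# Even a 5% miss rate makes the average access roughly 3.5x slower than pure DRAM.
```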
Exactly!
Load a 4GB file and welcome the beach ball spinner any time you need to context switch to another app.
I don't know how they don't realize that because it's not really hard to get there.
But when I was enamored with Apple stuff in my formative years, I would gladly ignore that or brush it off so I can see where they come from, I guess.
It's not as different as the marketing would like you to think. In fact, for the low-end models even the bandwidth/speed isn't as big of a deal as they make it out to be, especially considering that bandwidth has to be shared for the GPU needs.
And if you go up in specs the bandwidth of Apple silicon has to be compared to the bandwidth of a combo with dedicated GPU. The bandwidth of dedicated GPUs is very high and usually higher than what Apple Silicon gives you if you consider the RAM bandwidth for the CPU.
It's a bit more complicated but that's marketing for you. When it comes to speed Apple RAM isn't faster than what can be found in high-end laptops (or desktops for that matter).
Not sure what windows does but the popular method on e.g. fedora is to split memory into main and swap and then compress swap. It could be more efficient the way Apple does it by not having to partition main memory.
I do admit the "reliance on swap" thing is speculation on my part :)
My experience is that I can still tell when the OS is unhappy when I demand more RAM than it can give. MacOS is still relatively responsive around this range, which I just attributed to super fast swapping. (I'd assume memory compression too, but I usually run into this trouble when working with large amounts of poorly-compressible data.)
In either case, I know it's frustrating when someone is confidently wrong but you can't properly correct them, so you have my apologies.
I suggest you go and look at HOW it is done in Apple Silicon Macs, and then think long and hard about why this might make a huge difference. Maybe the Asahi Linux guys can explain it to you ;)
I understand that it can make a difference to performance (which is already baked into the benchmarks we look at), I don't see how it can make a difference to compression ratios, if anything in similar implementations (ex: console APUs) it tends to lead to worse compression ratios.
If there's any publicly available data to the contrary I'd love to read it. Anecdotally I haven't seen a significant difference between zswap on Linux and macOS memory compression in terms of compression ratios, and on the workloads I've tested zswap tends to be faster than no memory compression on x86 for many core machines.
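If anyone wants to eyeball the macOS side of that comparison themselves, one crude approach is to compare vm_stat's "Pages stored in compressor" (logical pages held compressed) with "Pages occupied by compressor" (physical pages backing them). A sketch, with the caveat that the exact label wording may vary between macOS versions:

```python
# Crude sketch: estimate the macOS memory compressor's ratio from vm_stat output.
# "Pages stored in compressor"   = logical pages held compressed
# "Pages occupied by compressor" = physical pages actually used for them
# Label strings are from memory -- double-check against your vm_stat output.
import re
import subprocess

out = subprocess.run(["vm_stat"], capture_output=True, text=True, check=True).stdout

def pages(label: str) -> int:
    m = re.search(rf"{label}:\s+(\d+)\.", out)
    return int(m.group(1)) if m else 0

stored = pages("Pages stored in compressor")
occupied = pages("Pages occupied by compressor")
if occupied:
    print(f"approx compression ratio: {stored / occupied:.2f}:1")
else:
    print("compressor not in use (or labels differ on this macOS version)")
```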
I remember that Apple used to wave around these SIMD benchmarks showing their PowerPC chips trouncing Intel chips. In the fine print, you'd see that the benchmark was built to use AltiVec on PowerPC, but without MMX or SSE on Intel.
You can claim Apple is dishonest for a few reasons.
1) Graphs are often unannotated.
2) Comparisons are rarely against latest-generation products. (Their argument for that has been that they do not expect people to upgrade yearly, so it's showing the difference over their intended upgrade path.)
3) They have conflated performance with performance per watt.
However, when it comes to battery life, performance (for a task) or specification of their components (screens, ability to use external displays up to 6k, port speed etc) there are almost no hidden gotchas and they have tended to be trustworthy.
The first wave of M1 announcements were met with similar suspicion as you have shown here; but it was swiftly dispelled once people actually got their hands on them.
*EDIT:* Blaming a guy who's been dead for 13 years for something they said 50 years ago, and primarily, it seems, for internal use, is weird. I had to look up the context, but it seems it was more about internal motivation in the '70s than relating to anything today, especially when referring to concrete claims.
Apple marketed their PPC systems as "a supercomputer on your desk", but it was nowhere near the performance of a supercomputer of that age. Maybe similar performance to a supercomputer from the 1970s, but that was their marketing angle in the 1990s.
From https://512pixels.net/2013/07/power-mac-g4/: the ad was based on the fact that Apple was forbidden to export the G4 to many countries due to its “supercomputer” classification by the US government.
It seems that the US government was buying too much into tech hype at the turn of the millennium. Around the same period PS2 exports were also restricted [1].
> Apple marketed their PPC systems as "a supercomputer on your desk"
It's certainly fair to say that twenty years ago Apple was marketing some of its PPC systems as "the first supercomputer on a chip"[^1].
> but it was nowhere near the performance of a supercomputer of that age.
That was not the claim. Apple did not argue that the G4's performance was commensurate with the state of the art in supercomputing. (If you'll forgive me: like, fucking obviously? The entire reason they made the claim is precisely because the latest room-sized supercomputers with leapfrog performance gains were in the news very often.)
The claim was that the G4 was capable of sustained gigaflop performance, and therefore met the narrow technical definition of a supercomputer.
You'll see in the aforelinked marketing page that Apple compared the G4 chip to UC Irvine’s Aeneas Project, which in ~2000 was delivering 1.9 gigaflop performance.
This chart[^2] shows the trailing average of various subsets of super computers, for context.
This narrow definition is also why the machine could not be exported to many countries, which Apple leaned into.[^3]
> Maybe similar performance to a supercomputer from the 1970's
What am I missing here? Picking perhaps the most famous supercomputer of the mid-1970s, the Cray-1,[^4] we can see performance of 160 MFLOPS, which is 160 million floating point operations per second (with an 80 MHz processor!).
The G4 was capable of delivering ~1 GFLOP performance, which is a billion floating point operations per second.
>That was not the claim. Apple did not argue that the G4's performance was commensurate with the state of the art in supercomputing.
This is marketing we're talking about, people see "supercomputer on a chip" and they get hyped up by it. Apple was 100% using the "supercomputer" claim to make their luddite audience think they had a performance advantage, which they did not.
> The entire reason they made the claim is
The reason they marketed it that way was to get people to part with their money. Full stop.
In the first link you added, there's a photo of a Cray supercomputer, which makes the viewer equate Apple = Supercomputer = I am a computing god if I buy this product. Apple's marketing has always been a bit shady that way.
And soon after that period Apple jumped off the PPC architecture and onto the x86 bandwagon. Gimmicks like "supercomputer on a chip" don't last long when the competition is far ahead.
> This is marketing we're talking about, people see "supercomputer on a chip" and they get hyped up by it.
That is also not in dispute. I am disputing your specific claim that Apple somehow suggested that the G4 was of commensurate performance to a modern supercomputer, which does not seem to be true.
> Apple was 100% using the "supercomputer" claim to make their luddite audience think they had a performance advantage, which they did not.
This is why context is important (and why I'd appreciate clarity on whether you genuinely believe a supercomputer from the 1970s was anywhere near as powerful as a G4).
In the late twentieth and early twenty-first century, megapixels were a proxy for camera quality, and megahertz were a proxy for processor performance. More MHz = more capable processor.
This created a problem for Apple, because the G4 ran at lower clock speeds even though its SPECfp_95 (floating point) benchmarks crushed the Pentium III.
PPC G4 500 MHz - 22.6
PPC G4 450 MHz - 20.4
PPC G4 400 MHz - 18.36
Pentium III 600 MHz – 15.9
For both floating point and integer benchmarks, the G3 and G4 outgunned comparable Pentium II/III processors.
You can question how this translates to real world use cases – the Photoshop filters on stage were real, but others have pointed out in this thread that it wasn't an apples-to-apples comparison vs. Wintel – but it is inarguable that the G4 had some performance advantages over Pentium at launch, and that it met the (inane) definition of a supercomputer.
> The reason they marketed it that way was to get people to part with their money. Full stop.
Yes, marketing exists to convince people to buy one product over another. That's why companies do marketing. IMO that's a self-evidently inane thing to say in a nested discussion of microprocessor architecture on a technical forum – especially when your interlocutor is establishing the historical context you may be unaware of (judging by your comment about supercomputers from the 1970s, which I am surprised you have not addressed).
I didn't say "The reason Apple markets its computers," I said "The entire reason they made the claim [about supercomputer performance]…"
Both of us appear to know that companies do marketing, but only you appear to be confused about the specific claims Apple made – given that you proactively raised them, and got them wrong – and the historical backdrop against which they were made.
> In the first link you added, there's a photo of a Cray supercomputer
That's right. It looks like a stylized rendering of a Cray-1 to me – what do you think?
> which makes the viewer equate Apple = Supercomputer = I am a computing god if I buy this product
The Cray-1's compute, as measured in GFLOPS, was approximately 6.5x lower than the G4 processor.
I'm therefore not sure what your argument is: you started by claiming that Apple deliberately suggested that the G4 had comparable performance to a modern supercomputer. That isn't the case, and the page you're referring to contains imagery of a much less performant supercomputer, as well as a lot of information relating to the history of supercomputers (and a link to a Forbes article).
> Apple's marketing has always been a bit shady that way.
All companies make tradeoffs they think are right for their shareholders and customers. They accentuate the positives in marketing and gloss over the drawbacks.
Note, too, that Adobe's CEO has been duped on the page you link to. Despite your emphatic claim:
> Apple was 100% using the "supercomputer" claim to make their luddite audience think they had a performance advantage, which they did not.
The CEO of Adobe is quoted as saying:
> “Currently, the G4 is significantly faster than any platform we’ve seen running Photoshop 5.5,” said John E. Warnock, chairman and CEO of Adobe.
How is what you are doing materially different to what you accuse Apple of doing?
> And soon after that period Apple jumped off the PPC architecture and onto the x86 bandwagon.
They did so when Intel's roadmap introduced Core Duo, which was significantly more energy-efficient than Pentium 4. I don't have benchmarks to hand, but I suspect that a PowerBook G5 would have given the Core Duo a run for its money (despite the G5 being significantly older), but only for about fifteen seconds before thermal throttling and draining the battery entirely in minutes.
My iBook G4 was absolutely crushed by my friends' Wintel laptops, which they bought for half as much.
Granted, it was more portable and had somewhat better battery life (and it needed it, given how much longer everything took), but performance really was not a good reason to go with Apple hardware, and that still holds true as far as I'm concerned.
That is a long time – bet it felt even longer to the poor PowerBook DRI at Apple who had to keep explaining to Steve Jobs why a G5 PowerBook wasn't viable!
It's funny you say that, because this is precisely the time I started buying Macs (I was gifted a Pismo PowerBook G3 and then bought an iBook G4). And my experience was that, for sure, if you put as much money into a PC as into a Mac you would get MUCH better performance.
What made it worth it at the time (I felt) was the software. Today I really don't think so; software has improved overall in the industry and there are not a lot of "Mac specific" things that make it a clear-cut choice.
As for the performance, I can't believe all the Apple Silicon hype. Sure, it gets good battery life provided you use strictly Apple software (or software heavily optimized for it), but in mixed workload situations it's not that impressive.
Using the M2 MacBook Pro of a friend I figured I could get maybe 4-5 hours out of its best case scenario which is better than the 2-3 hours you would get from a PC laptop but also not that great considering the price difference.
And when it comes to performance it is extremely unequal and very lackluster for many things. Like there is more lag launching Activity Monitor on a $2K+ MacBook Pro than launching Task Manager on a $500 PC. This is a small, somewhat stupid example, but it does tell the overall story.
They talk a big game but in reality, their stuff isn't that performant in the real world.
And they still market games when one of their $2K laptops plays Dota 2 (a very old, relatively resource-efficient game) worse than a cheapo PC.
> Using the M2 MacBook Pro of a friend I figured I could get maybe 4-5 hours out of its best case scenario which is better than the 2-3 hours you would get from a PC laptop but also not that great considering the price difference.
Yes, but I stopped caring about avoiding Electron apps some time ago. You can't just drop or ignore useful software to satisfy Apple marketing.
Just like you can't just ignore Chrome for Safari to satisfy the battery-life claims, because Chrome is much more useful and better at quite a few things.
I went the way of only Apple and Apple optimized software for quite a while but I just can't be bothered anymore, considering the price of the hardware and nowadays the price of subscription software.
And this is exactly my argument: if you use the hardware in a very specific way, you get there, but it is very limiting, annoying and unacceptable considering the pricing.
It's like saying that a small city car gets more gas mileage when what one needs is actually a capable truck. It's not strictly wrong but also not very helpful.
I think the Apple Silicon laptops are very nice if you can work within the limitations, but the moment you start pushing on those you realize they are not really worth the money.
Just like the new iPad Pro they released: completely awesome hardware, but how many people can actually work within the limitations of iPadOS to make the price not look like a complete ripoff? Very few, I would argue.
Oh those megahertz myths! Their marketing department is pretty amazing at their spin control. This one was right up there with "it's not a bug; it's a feature" type of spin.
Before macOS was based on NeXTSTEP, it was practically a different company. I've been using Apple hardware for 21 years, from when they got a real operating system. Even the G4 did better than the laptop it replaced.
Yeah, the assumption seems to be that using less battery by one component means that the power will just magically go unused. As with everything else in life, as soon as something stops using a resource something else fills the vacuum to take advantage of the resource.
Quickly looking at the press release, it seems to have the same comparisons as in the video. None of Apple's comparisons today are between the M3 and M4. They are ALL comparing the M2 and M4. Why? It's frustrating, but today Apple replaced a product with an M2 with a product with an M4. Apple always compares product to product, never component to component when it comes to processors. So those specs are far more impressive than if we could have numbers between the M3 and M4.
Apple was comparing the power envelope (already a complicated concept) of their GPU against a 3090. Apple wanted to show that the peak of their GPU's performance was reached with a fraction of the power of a 3090. What was terrible was that Apple was cropping their chart at the point where the 3090 was pulling ahead in pure compute by throwing more watts at the problem. So their GPU was not as powerful as a 3090, but a quick glance at the chart would completely tell you otherwise.
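To illustrate the cropping trick with invented numbers (these are not measurements of either chip), a tiny Python sketch:

```python
# Invented numbers purely to illustrate the chart-cropping point -- these are
# NOT measurements of the M1 Ultra or the RTX 3090.
chip_a = {30: 45, 60: 80, 100: 95, 120: 100}              # plateaus at low power
chip_b = {100: 40, 150: 70, 200: 95, 250: 115, 320: 140}  # keeps scaling with watts

crop_at = 200  # pretend the chart's power axis stops here
for watts in sorted(set(chip_a) | set(chip_b)):
    if watts <= crop_at:
        print(f"{watts:4d} W  A={chip_a.get(watts, '-'):>4}  B={chip_b.get(watts, '-'):>4}")
# Within the cropped window the two look comparable; the points where chip B
# pulls well ahead (250-320 W) simply never appear on the chart.
```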
Ultimately we didn't see one of those charts today, just a mention about the GPU being 50% more efficient than the competition. I think those charts are beloved by Johny Srouji and no one else. They're not getting the message across.
Plenty of people on HN thought that M1 GPU is as powerful as 3090 GPU, so I think the message worked very well for Apple.
They really love those kinds of comparisons - e.g. they also compared M1s against really old Intel CPUs to make the numbers look better, knowing that news headlines won't care about the details.
They compared against really old Intel CPUs because those were the last ones they used in their own computers! Apple likes to compare device to device, not component to component.
Yes, can't remember the precise combo either, there was a solid year or two of latent misunderstandings.
I eventually made a visual showing it was the same as claiming your iPhone was 3x the speed of a Core i9: sure, if you limit the power draw of your PC to a battery the size of a post-it pad.
Similar issues came up when on-device LLMs happened; thankfully it has quieted down since then (the last egregious thing I saw was stonk-related wishcasting that Apple was obviously turning its Xcode CI service into a full-blown AWS competitor that'd wipe the floor with any cloud service, given the 2x performance).
I don't know who would prefer to do music or video editing on a smaller display, without a keyboard for shortcuts, without a proper file system, and with problematic connectivity to external hardware. Sure, it's possible, but why? Ok, maybe there's some use case on the road where every gram counts, but that seems niche.
I’m vaguely considering this but entirely for the screen. The chip has been irrelevant to me for years, it’s long past the point where I don’t notice it.
The A-series was definitely not good enough. Really depends on what you're using it for. Netflix and web? Sure. But any old HDR tablet that can maintain 24Hz is good enough for that.
These are 2048x2732, 120Hz displays that support 6K external displays. Gaming and art apps push them pretty hard. For the iPad user in my house, going from the 2020 non-M* iPad to a 2023 M2 iPad made a huge difference for the drawing apps. Better latency is always better for drawing, and complex brushes (especially newer ones), selections, etc., would get fairly unusable.
For gaming, it was pretty trivial to dip well below 60Hz with a non-M* iPad with some of the higher-demand games like Fortnite, Minecraft (high view distance), Roblox (it ain't what it used to be), etc.
But, the apps will always gravitate to the performance of the average user. A step function in performance won't show up in the apps until the adoption follows, years down the line. Not pushing the average to higher performance is how you stagnate the future software of the devices.
You’re right, it’s good enough for me. That’s what I meant but I didn’t make that clear at all. I suspect a ton of people are in a similar position.
I just don’t push it at all. The few games I play are not complicated in graphics or CPU needs. I don’t draw, 3D model, use Logic or Final Cut or anything like that.
I agree the extra power is useful to some people. But even there we have the M1 (what I’ve got) and the M2 models. But I bet there are plenty of people like me who mostly bought the pro models for the better screen and not the additional grunt.
Well, the obvious answer is that those with older machines are more likely to upgrade than those with newer machines. The market for insta-upgraders is tiny.
edit: And perhaps an even more obvious answer: there are no iPads that contained the M3, so the comparison would be more useless. The M4 was just launched today exclusively in iPads.
They know that anyone who has bought an M3 is good on computers for a long while. They're targeting people who have M2 or older Macs. People who own an M3 are basically going to buy anything that comes down the pipe, because who needs an M3 over an M2 or even an M1 today?
I’m starting to worry that I’m missing out on some huge gains (M1 Air user.) But as a programmer who’s not making games or anything intensive, I think I’m still good for another year or two?
You're not going to be missing out on much. I had the first M1 Air and recently upgraded to an M3 Air. The M1 Air has years of useful life left and my upgrade was for reasons not performance related.
The M3 Air performs better than the M1 in raw numbers but outside of some truly CPU or GPU limited tasks you're not likely to actually notice the difference. The day to day behavior between the two is pretty similar.
If your current M1 works you're not missing out on anything. For the power/size/battery envelope the M1 Air was pretty awesome, it hasn't really gotten any worse over time. If it does what you need then you're good until it doesn't do what you need.
I have a 2018 15" MBP, and an M1 Air and honestly they both perform about the same. The only noticeable difference is the MBP takes ~3 seconds to wake from sleep and the M1 is instant.
I have an M1 Air and I test drove a friend's recent M3 Air. It's not very different performance-wise for what I do (programming, watching video, editing small memory-constrained GIS models, etc)
I wanted to upgrade my M1 because it was going to swap a lot with only 8 gigs of RAM and because I wanted a machine that could run big LLMs locally. Ended up going 8G macbook air M1 -> 64G macbook pro M1. My other reasoning was that it would speed up compilation, which it has, but not by too much.
The M1 air is a very fast machine and is perfect for anyone doing normal things on the computer.
Doesn't seem plausible to me that Apple will release an "M3 variant" that can drive "tandem OLED" displays. So it's probably logical to package whatever chip progress (including process improvements) there is into "M4".
And it can signal that "We are serious about iPad as a computer", using their latest chip.
Logical alignment to progress in engineering (and manufacturing), packaged smartly to generate marketing capital for sales and brand value creation.
Wonder how the newer Macs will use these "tandem OLED" capabilities of the M4.
> I like the comparison between much older hardware with brand new to highlight how far we came.
That's ok, but why skip the previous iteration then? Isn't the M2 only two generations behind? It's not that much older. It's also a marketing blurb, not a reproducible benchmark. Why leave out comparisons with the previous iteration even when you're just hand-waving over your own data?
In this specific case, it's because iPads never got the M3. They're literally comparing it with the previous model of iPad.
There were some disingenuous comparisons throughout the presentation going back to A11 for the first Neural Engine and some comparisons to M1, but the M2 comparison actually makes sense.
I wouldn't call the comparison to A11 disingenuous, they were very clear they were talking about how far their neural engines have come, in the context of the competition just starting to put NPUs in their stuff.
I mean, they compared the new iPad Pro to an iPod Nano, that's just using your own history to make a point.
Fair point—I just get a little annoyed when the marketing speak confuses the average consumer and felt as though some of the jargon they used could trip less informed customers up.
Yes, kinda annoying. But on the other hand, given that Apple releases a new chip every 12 months, we can grant them some slack here, considering that from AMD, Intel or Nvidia we usually see a 2-year cadence.
And yet they seem to be the only people picking the apparently "Low Hanging Fruit" in ARM land. We'll see about Qualcomm's Nuvia-based stuff, but that's been "nearly released" for what feels like years now, but you still can't buy one to actually test.
And don't underestimate the investment Apple made - it's likely at a similar level to the big x86 incumbents. I mean AMD's entire Zen development team cost was likely a blip on the balance sheet for Apple.
They don't care as much for the ARM stuff because software development investment vastly outweighs the chip development costs.
Sure, maybe they can do better but at what cost and for what? The only thing Apple does truly better is performance per watt which is not something that is relevant for a large part of the market.
x86 stuff is still competitive performance-wise, especially in the GPU department, where Apple's attempts are rather weak compared to what is on offer from the competition.
The Apple Silicon switch cost a large amount of developer effort for optimisation, and in the process a lot of software compatibility was lost. It took a long time for even the most popular software to get properly optimized, and some software houses even gave up on supporting macOS because it just wasn't worth the man-hour investment considering the tiny market.
This is why I am very skeptical about the Qualcomm ARM stuff, it needs to be priced extremely well to have a chance, if consumers do not pick it up in droves, no software port is going to happen in a timely manner and it will stay irrelevant.
Considering the only thing much better than the current x86 offering is the performance per watt, I do not have a lot of hope, but I may be pleasantly surprised.
Apple aficionados keep raving about battery life, but it's not really something a lot of people care about (apart from smartphones, where Apple isn't doing any better than the rest of the industry).
Speaking of which, whatever happened to Qualcomm's bizarre assertion that ARM was pulling a sneak move in all its new licensing deals to outlaw third-party IP entirely and force ARM-IP-only?
There was one quiet "we haven't got anything like that in the contract we're signing with ARM" from someone else, and then radio silence. And you'd really think that would be major news, because it's massively impactful on pretty much everyone, since one of the major use cases of ARM is as a base SoC to bolt your custom proprietary accelerators onto...
Seemed like obvious bullshit at the time from a company trying to "publicly renegotiate" a licensing agreement they probably broke...
Again, not saying that they are easy (or cheap!) problems to solve, but that there are more relatively easy problems in the ARM space than the x86 space.
That’s why Apple can release a meaningfully new chip every year where it takes several for x86 manufacturers
> We'll see about Qualcomm's Nuvia-based stuff, but that's been "nearly released" for what feels like years now, but you still can't buy one to actually test.
That's more bound by legal than technical reasons...
I’ve not seen discussion that Apple likely scales performance of chips to match the use profile of the specific device it’s used in. An M2 in an iPad Air is very likely not the same as an M2 in an MBP or Mac Studio.
Surprisingly, I think it is: I was going to comment that here, then checked Geekbench, and single-core scores match for the M2 iPad/MacBook Pro/etc. at the same clock speed. I.e. M2 "base" = M2 "base", but core count differs, and with the desktops/laptops you get options for M2 Ultra Max SE bla bla.
Yeah, but the difference is that you usually don't get people arguing that it's the same thing or that it can be performance competitive in the long run.
When it comes to Apple stuff, people say some irrational stuff that is totally bonkers...
I don't know, but the M3 MBP I got from work already gives the impression of using barely any power at all. I'm really impressed by Apple Silicon, and I'm seriously reconsidering my decision from years ago to never ever buy Apple again. Why doesn't everybody else use chips like these?
I have an M3 for my personal laptop and an M2 for my work laptop. I get ~8 hours if I'm lucky on my work laptop, but I have attributed most of that battery loss to all the "protection" software they put on my work laptop that is always showing up under the "Apps Using Significant Power" category in the battery dropdown.
I can have my laptop with nothing on screen, and the battery still points to TrendMicro and others as the cause of heavy battery drain while my laptop seemingly idles.
I recently upgraded my personal laptop to the M3 MacBook Pro and the difference is astonishing. I almost never use it plugged in because I genuinely get close to that 20-hour reported battery life. Last weekend I played a AAA video game through Xbox Cloud Gaming (awesome for Mac gamers btw) and with essentially max graphics (rendered elsewhere and streamed to me of course), I got sucked into a game for like 5 hours and lost only 8% of my battery during that time, while playing a top-tier video game! It really blew my mind. I also use the GoLand IDE on there and have managed to get a full day of development done using only about 25-30% battery.
So yeah, whatever Apple is doing, they are doing it right. Performance without all the spyware that your work gives you makes a huge difference too.
Over the weekend, I accidentally left my work M3 unplugged with caffeinate running (so it doesn't sleep). It wasn't running anything particularly heavy, but still, on Monday, 80% charge left.
That's mindblowing. Especially since my personal laptop is a Thinkpad X1 Extreme. I can't leave that unplugged at all.
Apple quotes 18h of Apple TV playback or 12h of web browsing, so I will call a large amount of bullshit on that.
Even taking the marketing at face value, in the best-case scenario you would be looking at between 27% and 41% battery consumption for 5h of runtime.
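For concreteness, the arithmetic behind those percentages, taking Apple's quoted runtimes at face value:

$$\frac{5\ \text{h}}{18\ \text{h}} \approx 27.8\%, \qquad \frac{5\ \text{h}}{12\ \text{h}} \approx 41.7\%$$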
The actual number will be lower than that because you probably don't use the MBP at the low brightness they use for marketing benchmarks and game streaming constantly requires power for the wifi chip (video can buffer, hence the lower consumption).
There is no way to say this nicely, but can you stop lying?
For the AAA video game example, I mean, it is interesting how far that kind of tech has come… but really that's just video streaming (maybe slightly more difficult because latency matters?) from the point of view of the laptop, right? The quality of the graphics there has more or less nothing to do with the battery.
I think the market will move to using chips like this, or at least have additional options. The new Snapdragon SOC is interesting, and I would suspect we could see Google and Microsoft play in this space at some point soon.
> is it REALLY half the power use of all times (so we'll get double the battery life)
I'm not sure what you mean by "of all times" but half the battery usage of the processor definitely doesn't translate into double the battery life since the processor is not the only thing consuming power.
It's not just games. There is in fact not a lot of stuff that Apple Silicon can run well.
In theory you get great battery life, but only if you stick to software nobody wants to use, or you accept that the stuff that doesn't run well takes longer.
The problem is two-fold, first the marketing bullshit does not match the reality and second the Apple converted will lie without even thinking about it to justify the outrageous price.
There are a lot of things I like about Apple hardware but the reality is that they can charge so much because there is a lot of mythology around their products and it just doesn't add up.
Now if only they could be bothered to actually make the software great (and not simpleton copies of what already exists), there would be an actual valid reason to unequivocally recommend their stuff, but they can't be bothered since they already make too much money as it is.
I mean, I remember Apple comparing the M1 Ultra to Nvidia's RTX 3090. While that chart was definitely putting a spin on things to say the least, and we can argue from now until tomorrow about whether power consumption should or should not be equalised, I have no idea why anyone would expect the M1 Pro (an explicitly much weaker chip) to perform anywhere near the same.
Also what games are you trying to play on it? All my M-series Macbooks have run games more than well enough with reasonable settings (and that has a lot more to do with OS bugs and the constraints of the form factor than with just the chipset).
That is the fault of the devs, because optimization for dedicated graphics cards is either integrated in the game engine or they just ship a version for RTX users.
Apple might use simplified and opaque plots to drive their point, but they all too often undersell the differences. Independent reviews, for example, find that they not only hit the mark Apple mentions for things like battery but often do slightly better...
Well battery life would be used by other things too right? Especially by that double OLED screen. "best ever" in every keynote makes me laugh at this point, but it doesn't mean that they're not improving their power envelope.
You wouldn’t necessarily get twice the battery life. It could be less than that due to the thinner body causing more heat, a screen that utilizes more energy, etc
If there is any dishonesty, I would wager it is a case of "it can double the battery life in low-power scenarios": it can go twice as long when doing word processing, for instance, and can potentially idle a lot lower.
Actually, TSMC's N3E process is somewhat of a regression on the first-generation 3nm process, N3. However, it is simpler and more cost-efficient, and everyone seems to want to get out of that N3 process as quickly as possible. That seems to be the biggest reason Apple released the A17(M3) generation and now the M4 the way they did.
The N3 process is in the A17 Pro, the M3, M3 Pro, and M3 Max. The A17 Pro name seems to imply you won't find it trickling down to the regular iPhones next year. So we'll see that processor only this year in phones, since Apple discontinues their Pro range of phones every year; only the regular phones trickle downrange, lowering their prices. The M3 devices are all Macs that needed an upgrade due to their popularity: the MacBook Pro and MacBook Air. They made three chips for them, but they did not make an M3 Ultra for the lower-volume desktops. With the announcement of an M4 chip in iPads today, we can expect to see the MacBook Air and MacBook Pro upgraded to M4 soon, with the introduction of an M4 Ultra to match later. We can now expect those M3 devices to be discontinued instead of going downrange in price.
That would leave one device with an N3-process chip: the iMac. At its sales volume, I wouldn't be surprised if all the M3 chips that will go into it are made this year, with the model staying around for a year or two running on fumes.
N3E still has a +9% logic transistor density increase on N3 despite a relaxation to design rules, for reasons such as introduction of FinFlex.[1] Critically though, SRAM cell sizes remain the same as N5 (reversing the ~5% reduction in N3), and it looks like the situation with SRAM cell sizes won't be improving soon.[2][3] It appears more likely that designers particularly for AI chips will just stick with N5 as their designs are increasingly constrained by SRAM.
SRAM has really stalled. I don't think 5nm was much better than 7nm. On ever smaller nodes, SRAM will take up a larger and larger percentage of the entire chip. But the cost is much higher on the smaller nodes even if the performance is not better.
I can see why AMD started putting the SRAM on top.
It wasn't immediately clear to me why SRAM wouldn't scale like logic. This[1] article and this[2] paper sheds some light.
From what I can gather the key aspects are that decreased feature sizes lead to more variability between transistors, but also to less margin between on-state and off-state. Thus a kind of double-whammy. In logic circuits you're constantly overwriting with new values regardless of what was already there, so they're not as sensitive to this, while the entire point of a memory circuit is to reliably keep values around.
Alternate transistor designs such as FinFET, gate-all-around and such can mitigate some of this, say by reducing transistor-to-transistor variability by some factor, but they can't get around the root issue.
There is also the fact that we currently have an iPhone generation where only the Pro models got updated to chips on TSMC 3nm.
The next iPhone generation is said to be a return to form with all models using the same SOC on the revised version of the 3nm node.
> Code from the operating system also indicates that the entire iPhone 16 range will use a new system-on-chip – t8140 – Tahiti, which is what Apple calls the A18 chip internally. The A18 chip is referenced in relation to the base model iPhone 16 and 16 Plus (known collectively as D4y within Apple) as well as the iPhone 16 Pro and 16 Pro Max (referred to as D9x internally)
If they maintain that pace, it will start compounding incredibly quickly. If we round to 2 years vs 2.5 years, after just a decade you're an entire doubling ahead.
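A quick back-of-the-envelope sketch in Python (purely illustrative, assuming one performance doubling per process cycle):

    # Hypothetical: one doubling per cycle, 2-year vs 2.5-year cadence.
    years = 10

    doublings_fast = years / 2.0   # 5 doublings in a decade
    doublings_slow = years / 2.5   # 4 doublings in a decade

    print(2 ** doublings_fast)     # 32.0 -> 32x the starting performance
    print(2 ** doublings_slow)     # 16.0 -> 16x the starting performance
    # The faster cadence ends the decade exactly one doubling (2x) ahead.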
We've seen a slow march over the last decade towards the unification of iOS and macOS. Maybe not a "it runs macOS", but an eventual "they share all the same apps" with adaptive UIs.
Unfortunately I think "they share all the same apps" will not include a terminal with root access, which is what would really be needed to make iPad a general purpose computer for development
It's a shame, because it's definitely powerful enough, and the idea of traveling with just an iPad seems super interesting, but I imagine they will not extend those features to any devices besides macs
I mean, it doesn't even have to be true "root" access. Chromebooks have a containerized linux environment, and aside from the odd bug, the high end ones are actually great dev machines while retaining the "You spend most of your time in the browser so we may as well bake that into the OS" base layer.
I actually do use a Chromebook in this way! Out of all the Linux machines I've used, that's why I like it. Give me a space to work and provide an OS that I don't have to babysit or mentally maintain.
Been a while since I've used a chromebook but iirc there's ALSO root access that's just a bit more difficult to access, and you do actually need to access it from time to time for various reasons, or at least you used to.
You're thinking of Crouton, the old method of using Linux on a Chromebook (which involved disabling boot protection and setting up a second Linux install in a chroot, with a keybind that allowed you to toggle between the two environments).
Crostini is the new containerized version that is both officially supported and integrated into ChromeOS
The writing was on the wall with the introduction of Swift, IMO. Since then it's been over-complicating the iPad and dumbing down the macOS interfaces to attain this goal. So much wasted touch/negative space in macOS since Catalina to compensate for fingers and adaptive interfaces; so many hidden menus and long taps squirreled away in iOS.
They probably saw the debacle that was Windows 8 and thought merging a desktop and touch OS is a decade-long gradual task, if that is even the final intention.
Unlike MS that went with the Big Bang in your face approach that was oh-so successful.
At this point, there are two fundamentally different types of computing that will likely never be mergeable in a satisfactory way.
We now have 'content consumption platforms' and 'content creation platforms'.
While attempts have been made to try and enable some creation on locked-down touchscreen devices, you're never going to want to try and operate a fully-featured version of Photoshop, Maya, Visual Studio, etc on them. And if you've got a serious workstation with multiple large monitors and precision input devices, you don't want to have dumbed-down touch-centric apps forced upon you Win8-style.
The bleak future that seems likely is that the 'content creation platforms' become ever more niche and far more costly. Barriers to entry for content creators are raised significantly as mainstream computing is mostly limited to locked-down content consumption platforms. And Linux is only an option for as long as non-locked-down hardware is available for sensible prices.
Kinda weird to exclude Procreate, Affinity, Final Cut, Logic, etc. from your definition of content creation. The trend has clearly been more professional and creative apps year over year and ever more capable devices to run them on. I mean, you're right that nobody wants to use Photoshop on the iPad, but that's because there are better options.
Honestly, the biggest barrier to creativity is thinking you need a specific concept of a "serious workstation" to do it. Plenty of people are using $2k+ desktops just to play video games.
In these cases, it still seems that tablet-based tools are very much 'secondary tools', more of a sketchpad to fiddle with ideas while on the move, rather than 'production tools'.
Then there's the whole dealing with lots of files and version control side of things, essential for working as part of a team. Think about creating (and previewing, and finally uploading) a very simple web page, just HTML and a couple of images, entirely on an iPad. While it's probably quite possible these days, I suspect the workflow would be abysmal compared to a 'proper computer' where the file system isn't hidden from you and where you're not constantly switching between full-screen apps.
And that's before you start dealing with anything with significant numbers of files in deep directory structures, or doing more technical image creation (e.g. dealing with alpha channels). And of course, before testing your webpage on all the major browsers. Hmm...
There are so many artists who exclusively work on their iPad. It does seem cumbersome for a whole studio to use iPads, but they can be a powerhouse for an individual
But nobody is using iPads as a sole production tool. It's part of the production tooling, but it's not an essential part, nor one that can't be dropped or replaced easily, unlike a "real" computer.
It's rather disingenuous to pretend that an iPad can be sufficient. At its price tag it is still a rather extremely expensive accessory and people pretending otherwise are just full of it. There are enough reviews/testimonies saying as much (even from the diehard fans) for it to be an obvious fact.
> At this point, there's two fundamentally different types of computing that will likely never be mergeable in a satisfactory way.
This is a completely artificial creation by Apple and Google to extract more money from you. Nothing technical prevents one from using a full OS on a phone today.
On the other hand, a $4000 mid-range MacBook doesn’t have a touchscreen and that’s a heresy. Granted, you can get the one with the emoji bar, but why interact using touch on a bar when you could touch the screen directly?
Maybe the end game for Apple isn’t the full convergence, but just having a touch screen on the Mac.
Why would you want greasy finger marks on your Macbook screen?
Not much point having a touchscreen on a Macbook (or any laptop really), unless the hardware has a 'tablet mode' with a detachable or fold-away keyboard.
Mouse and keyboard is still a better interface for A LOT of work. I have yet to find a workflow for any of my professional work that would be faster or easier if you gave me a touchscreen.
There are plenty of laptops that do have touchscreens, and it has always felt more like a gimmick than a useful hardware interface.
> Barriers to entry for content creators are raised significantly as mainstream computing is mostly limited to locked-down content consumption platforms. And Linux is only an option for as long as non-locked-down hardware is available for sensible prices.
Respectfully, I disagree partially. It has never been easier or more affordable to get into creating content. You can create cinema-grade video with used cameras that sell for a few hundred dollars. You can create Pixar-level animation on open source software and a pretty cheap computer. A computer that can edit 4K video costs less than the latest iPhone. There are people that create plenty of content with just a phone. Simply put, it is orders of magnitude cheaper and easier to create content than it was less than two decades ago, which is why we are seeing so much content getting made. I used to work for a newspaper, and it used to be a lot harder and more expensive to produce audio-visual media.
My strong feeling is that the problem of content being locked into platforms has precious little to do with consumption-oriented hardware, and more to do with the platforms. Embrace -> extinguish -> exclusivity -> enshittify seems to be the model behind basically anything that hosts user content these days.
People have complained about why Logic Pro / Final Cut wasn't ported to the iPad Pro line. The obvious answer is that getting those workflows done properly takes time.
You're right about the reason but wrong about the timeline: Jobs saw Windows XP Tablet Edition and built a skunkworks at Apple to engineer a tablet that did not require a stylus. This was purely to spite a friend[0] of his that worked at Microsoft and was very bullish on XP tablets.
Apple then later took the tablet demo technology, wrapped it up in a very stripped-down OS X with a different window server and UI library, and called it iPhone OS. Apple was very clear from the beginning that Fingers Can't Use Mouse Software, Damn It, and that the whole ocean needed to be boiled to support the new user interface paradigm[1]. They even have very specific UI rules specifically to ensure a finger never meets a desktop UI widget, including things like iPad Sidecar just not forwarding touch events at all and only supporting connected keyboards, mice, and the Apple Pencil.
Microsoft's philosophy has always been the complete opposite. Windows XP through 7 had tablet support that amounted to just some affordances for stylus users layered on top of a mouse-only UI. Windows 8 was the first time they took tablets seriously, but instead of just shipping a separate tablet OS or making Windows Phone bigger, they turned it into a parasite that ate the Windows desktop from the inside-out.
This causes awkwardness. For example, window management. Desktops have traditionally been implemented as a shared data structure - a tree of controls - that every app on the desktop can manipulate. Tablets don't support this: your app gets one[2] display surface to present their whole UI inside of[3], and that surface is typically either full-screen or half-screen. Microsoft solved this incongruity by shoving the entire Desktop inside of another app that could be properly split-screened against the new, better-behaved tablet apps.
If Apple were to decide "ok let's support Mac apps on iPad", it'd have to be done in exactly the same way Windows 8 did it, with a special Desktop app that contained all the Mac apps in a penalty box. This is so that they didn't have to add support for all sorts of incongruous, touch-hostile UI like floating toolbars, floating pop-ups, global menus, five different ways of dragging-and-dropping tabs, and that weird drawer thing you're not supposed to use anymore, to iPadOS. There really isn't a way to gradually do this, either. You can gradually add feature parity with macOS (which they should), but you can't gradually find ways to make desktop UI designed by third-parties work on a tablet. You either put it in a penalty box, or you put all the well-behaved tablet apps in their own penalty boxes, like Windows 10.
Microsoft solved Windows 8's problems by going back to the Windows XP/Vista/7 approach of just shipping a desktop for fingers. Tablet Mode tries to hide this, but it's fundamentally just window management automation, and it has to handle all the craziness of desktop. If a desktop app decides it wants a floating toolbar or a window that can't be resized[4], Tablet Mode has to honor that request. In fact, Tablet Mode needs a lot of heuristics to tell what floating windows pair with which apps. So it's a lot more awkward for tablet users in exchange for desktop users having a usable desktop again.
[0] Given what I've heard about Jobs I don't think Jobs was psychologically capable of having friends, but I'll use the word out of convenience.
[1] Though the Safari team was way better at building compatibility with existing websites, so much so that this is the one platform that doesn't have a deep mobile/desktop split.
[2] This was later extended to multiple windows per app, of course.
[3] This is also why popovers and context menus never extend outside their containing window on tablets. Hell, also on websites. Even when you have multiwindow, there's no API surface for "I want to have a control floating on top of my window that is positioned over here and has this width and height".
[4] Which, BTW, is why the iPad has no default calculator app. Before Stage Manager there was no way to have a window the size of a pocket calculator.
Clip Studio is one Mac app port I’ve seen that was literally the desktop version moved to the iPad. It uniquely has the top menu bar and everything. They might have made an exception because you’re intended to use the pencil and not your fingers.
Honestly, using a stylus isn't that bad. I've had to support floor traders for many years and they all still use a Windows-based tablet + a stylus to get around. Heck, even Palm devices were a pleasure to use. Not sure why Steve was so hell bent against them, it probably had to do with his beef with Sculley/Newton.
Even with the advantage of time, I don't think Microsoft would have been able to do it. They can't even get their own UI situated, much less adaptive. Windows 10/11 is this odd mishmash of old and new, without a consistent language across it. They can't unify what isn't even cohesive in the first place.
I will settle for: you can connect 2 monitors to the iPad and select which audio device sound goes through. If I could run IntelliJ and compile Rust on the iPad, I'd promise to upgrade to the new iPad Pro as soon as it is released, every time.
Agreed, this will be the way forward in the future. I've already seen one of my apps (Authy) say "We're no longer building a macOS version, just install the iPad app on your mac".
That's great, but you need an M-series chip in your Mac for that to work, so backwards compatibility only goes back a few years at this point, which is fine for corporate upgrade cycles but might be a bit short for consumers right now. But it will be fine in the future.
Michelin-starred restaurants not only have top-tier chefs. They have buyers who negotiate with food suppliers to get the best ingredients they can at the lowest prices they can. Having a preferential relationship with a good supplier is as important to the food quality and the health of the business as having a good chef to prepare the dishes.
Apple has top-tier engineering talent but they are also able to negotiate preferential relationships with their suppliers, and it's both those things that make Apple a phenomenal tech company.
Qualcomm is also with TSMC and their newer 4nm processor is expected to stay competitive with the M series.
If the magic comes mostly from TSMC, there's a good chance for these claims to be true and to have a series of better chips coming on the other platforms as well.
“Stay” competitive implies they’ve been competitive. Which they haven’t.
I’m filing this into the bin with all the other “This next Qualcomm chip will close the performance gap” claims made over the past decade. Maybe this time it’ll be true. I wouldn’t bet on it.
Point taken. I used "stay" as in, their next rumored/leaked chip wouldn't be a single anomalous success but the start of a trend that could continue with the X2, X3 Elite chips coming after.
Basically we'd need some basis to believe they'll be progressively improving at more or less the same pace as Intel's or Apple's chips to get on board with ARM laptops for Windows/linux.
Otherwise I don't see software makers care enough to port their build to ARM as well.
Sadly, this is only processor power consumption; you need to put power into a whole lot of other things to make a useful computer. The display backlight and the system's RAM come to mind as particular offenders.
Backlight is now the main bottleneck for consumption-heavy uses. I wonder what advancements are happening there to optimize the wattage.
If the use cases involve working on dark terminals all day or watching movies with dark scenes, or if the general theme is dark, maybe the new OLED display will help reduce display power consumption too.
AMD GPUs have "Adaptive Backlight Management", which reduces your screen's backlight but then tweaks the colors to compensate. For example, my laptop's backlight is set at 33%, but with ABM it reduces the backlight to 8%. Personally I don't even notice it is on; my screen seems just as bright as before, but when I first enabled it I did notice some slight difference in colors, so it's probably not suitable for designers/artists. I'd 100% recommend it for coders though.
Strangely, Apple seems to be doing the opposite for some reason (color accuracy?): dimming the display doesn't seem to reduce the backlight as much, and they appear to use software dimming on top of it, even at "max" brightness.
Evidence can be seen when opening iOS apps, which seem to glitch out and reveal the brighter backlight [1]. Notice how #FFFFFF white isn't the same brightness as the white in the iOS app.
The max brightness of the desktop is going to be lower than the actual max brightness of the panel, because the panel needs to support HDR content. That brightness would be too much for most cases.
This was a photo of my MBA 15" which doesn't have an HDR capable screen afaik. Additionally, this artifacting happens at all brightness levels, including the lowest.
It also just doesn't seem ideal that some apps (iOS) appear much brighter than the rest of the system. HDR support in macOS is a complete mess, although I'm not sure if Windows is any better.
Dang, yeah, this is the opposite of what I had in mind
I was thinking, like, a couple hundred dollar Kindle the size of a big iPad I can plug into a laptop for text-editing out and about. Hell, for my purposes I'd love an integrated keyboard.
Basically a second, super-lightweight laptop form-factor I can just plug into my chonky Macbook Pro and set on top of it in high-light environments when all I need to do is edit text.
Honestly not a compelling business case now that I write it out, but I just wanna code under a tree lol
I think we're getting pretty close to this. The Remarkable 2 tablet is $300, but can't take video input and software support for non-notetaking is near non-existent. There's even a keyboard available. Boox and Hisense are also making e-ink tablets/phones for reasonable prices.
If that existed as a drop-in screen replacement for the Framework laptop, with a high-refresh-rate color Gallery 3 panel, then I'd buy it at that price point in a heartbeat.
I can't replace my desktop monitor with eink because I occasionally play video games. I can't use a 2nd monitor because I live in a small apartment.
I can't replace my laptop screen with greyscale because I need syntax highlighting for programming.
Maybe the $100 nano-texture screen will give you the visibility you want. Not the low power of an epaper screen though.
Hmm, emacs on an epaper screen might be great if it had all the display update optimization and "slow modem mode" that Emacs had back in the TECO days. (The SUPDUP network protocol even implemented that at the client end and interacted with Emacs directly!)
QD-OLED is an engineering improvement, i.e. combining existing researched technology to improve the result product. I wasn't able to find a good source on what exactly it improves in efficiency, but it's not a fundamental improvement in OLED electrical→optical energy conversion (if my understanding is correct.)
In general, OLED screens seem to have an efficiency around 20-30%. Some research departments seem to be trying to bump that up [https://www.nature.com/articles/s41467-018-05671-x], which I'd be more hopeful about…
…but, honestly, at some point you just hit the limits of physics. It seems internal scattering is already a major problem; maybe someone can invent pixel-sized microlasers and that'd help? More than 50-60% seems like a pipe dream at this point…
…unless we can change to a technology that fundamentally doesn't emit light, i.e. e-paper and the likes. Or just LCD displays without a backlight, using ambient light instead.
Is the iPad Pro not yet on OLED? All of Samsung's flagship tablets have had OLED screens for well over a decade now. It eliminates the need for backlighting, has superior contrast and is pleasant to use in low-light conditions.
The iPad that came out today finally made the switch. iPhones made the switch around 2016. It does seem odd how long it took for the iPad to switch, but Samsung definitely switched too early: my Galaxy Tab 2 suffered from screen burn in that I was never able to recover from.
I'm not sure how OLED and backlit LCD compare power-wise exactly, but OLED screens still need to put off a lot of light, they just do it directly instead of with a backlight.
I don't expect an M4 macbook to last any longer than an M2 macbook of otherwise similar specs; they will spend that extra power budget on things other than the battery life specification.
Unfortunately Apple only ever thinks about battery life in terms of web surfing and video playback, so we don't get official battery-life figures for anything else. Perhaps you can get more battery life out of your iPad Pro web surfing by using dark mode, since OLEDs should use less power than IPS displays with darker content.
Yeah double the PPW does not mean double the battery, because unless you're pegging the CPU/SOC it's likely only a small fraction of the power consumption of a light-use or idle device, especially for an SOC which originates in mobile devices.
Doing basic web navigation with some music in the background, my old M1 Pro has short bursts at ~5W (for the entire SoC) when navigating around, a pair of watts for mild webapps (e.g. checking various channels in discord), and typing into this here textbox it's sitting happy at under half a watt, with the P-cores essentially sitting idle and the E cores at under 50% utilisation.
With a 100Wh battery that would be a "potential" of 150 hours or so. Except nobody would ever sell it for that, because between the display and radios the laptop's actually pulling 10~11W.
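To put rough numbers on that, here's a back-of-the-envelope sketch in Python using my own observed figures above (not official specs):

    # ~0.66 W for the SoC alone while typing, ~10-11 W for the whole
    # laptop once the display and radios are counted. 100 Wh battery.
    battery_wh = 100.0
    soc_only_w = 0.66
    whole_laptop_w = 10.5

    print(battery_wh / soc_only_w)      # ~150 h of "potential" SoC-only runtime
    print(battery_wh / whole_laptop_w)  # ~9.5 h for the actual machine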
On my M1 air, I find for casual use of about an hour or so a day, I can literally go close to a couple weeks without needing to recharge. Which to me is pretty awesome. Mostly use my personal desktop when not on my work laptop (docked m3 pro).
No, they have a "battery budget". If the CPU power draw goes down, that means the budget goes up and you can spend it on other things, like a nicer display or some other feature.
When you say "up to 10 hours" most people will think "oh nice that's an entire day" and be fine with it. It's what they're used to.
Turning that into 12 hours might be possible but are the tradeoffs worth it? Will enough people buy the device because of the +2 hour battery life? Can you market that effectively? Or will putting in a nicer fancy display cause more people to buy it?
We'll never get significant battery life improvements because of this, sadly.
The OLED likely adds a fair bit of draw; they're generally somewhat more power-hungry than LCDs these days, assuming like-for-like brightness. Realistically, this will be the case until MicroLEDs are available for non-completely-silly money.
OLED will use less for a screen of black and LCD will use less for a screen of white. Now, take whatever average of what content is on the screen and for you, it may be better or may be worse.
White background document editing, etc., will be worse, and this is rather common.
It's not weird when you consider that browsing the web or watching videos has the CPU idle or near enough, so 95% of the power draw is from the display and radios.
That's because M2 was on the same TSMC process generation as M1. TSMC is the real hero here. M4 is the same generation as M3, which is why Apple's marketing here is comparing M4 vs M2 instead of M3.
Actually, M4 is reportedly on a more cost-efficient TSMC N3E node, where Apple was apparently the only customer on the more expensive TSMC N3B node; I'd expect Apple to move away from M3 to M4 very quickly for all their products.
Saying TSMC is a hero ignores the thousands of suppliers that improved everything required for TSMC to operate. TSMC is the biggest, so they get the most experience on all the new toys the world’s engineers and scientists are building.
It's almost as if every part of the stack -- from the uArch that Apple designs down to the insane machinery from ASML, to the fully finished SoC delivered by TSMC -- is vitally important to creating a successful product.
But people like to assign credit solely to certain spaces if it suits their narrative (lately, Apple isn't actually all that special at designing their chips, it's all solely the process advantage)
Saying TSMC's success is due to their suppliers ignores the fact that all of their competitors failed to keep up despite having access to the same suppliers. TSMC couldn't do it without ASML, but Intel and Samsung failed to do it even with ASML.
In contrast, when Apple's CPU and GPU competitors get access to TSMC's new processes after Apple's exclusivity period expires, they achieve similar levels of performance (except for Qualcomm because they don't target the high end of CPU performance, but AMD does).
TSMC being the biggest let them experiment at 10x the rate. It turns out they had the right business model that Intel didn’t notice was there; it just requires dramatically lower margins, higher volumes and far lower-paid engineers.
That doesn't seem to be reflected in the battery life of these devices; they have the exact same battery life. Does that mean the claim isn't entirely accurate? Since they don't list the battery capacity in their specs, it's hard to confirm.
I haven't paid too much attention today, but what I did see with the iPad Pro was that they're using an OLED display (maybe even some kind of double layer OLED for increased brightness if I'm understanding the marketing jargon?).
I believe that OLED is much more power hungry than the previous display type (LED backlit LCD of some type?). I could be wrong, but in TV land that's the case...
Could explain, at least partly, why run time isn't greatly increased.
They made the battery on the 13" 5% smaller than the previous generation. They also write that they tested the device with auto-brightness disabled and brightness set at 50%. Not sure how the brightness slider works on the new iPads, since the iPhones don't get max brightness unless auto-brightness is enabled. So 50% might be 1000/2=500 nits on the M4 iPad Pro and 600/2=300 nits on the M2 iPad Pro, or they might both be about 300 nits.
I have a minimum of 64GB on my all my main developer machines (home, work, laptop), but I have a spare laptop with only 8GB of RAM for lightweight travel.
Despite the entire internet telling me it would be "unusable" and a complete disaster, it's actually 100% perfectly fine. I can run IDEs, Slack, Discord, Chrome, and do dev work without a problem. I can't run a lot of VMs or compile giant projects with 10 threads, of course, but for typical work tasks it's just fine.
And for the average consumer, it would also be fine. I think it's obvious that a lot of people are out of touch with normal people's computer use cases. 8GB of RAM is fine for 95% of the population and the other 5% can buy something more expensive.
But why did you configure 3 machines with 64+ GB, if 8 GB RAM are "100% perfectly fine" for typical work tasks?
For me personally 16 or 32 GB are perfectly fine, 8 GB was too little (even without VMs) and I've never needed 64 or more. So it's curious to see you are pretty much exactly the opposite.
I have the base M2 Air with 8GB RAM, and it's really been perfect for working on. The only time things have become an issue is when two user accounts are logged in at the same time, which is very preventable.
Half my organisation runs on 8GB Chromebooks. We were testing one of our app changes the other day and it performed better on the Chromebook than it did on my i7 machine with 32GB.
> 8GB of RAM is fine for 95% of the population and the other 5% can buy something more expensive.
This argument is self defeating in the context of the M4 announcement. "Average consumers" who don't need 16 GB of RAM don't need an M4 either. But people who do need an M4 chip probably also need 16 GB of RAM.
I think actually more people need 16 GB of RAM rather than a top M4 chip. Having only 8 GB can be a serious limitation in some memory heavy circumstances, while having (say) an M2 SoC rather than an M4 SoC probably doesn't break any workflow at all, it just makes it somewhat slower.
For me personally, it’s not an issue of being out of touch. I did, in fact, use a 2014 Macbook with an i5 CPU and 16 GB of RAM for nearly a decade and know how often I hit swap and/or OOM on it even without attempting multicore shenanigans which its processor couldn’t have managed anyway.
It’s rather an issue of selling deliberately underpowered hardware for no good reason other than to sell actually up-to-date versions for a difference in price that has no relation to the actual availability or price of the components. The sheer disconnect from any kind of reality offends me as a person whose job and alleged primary competency is to recognize reality then bend it to one’s will.
I don't think we were ever at a point in computing where you could buy a high-end laptop (even entry-level MacBooks have high-end pricing) with the same amount of RAM as you could 10 years earlier.
Hmm, unexpected. I was quite sure my partner's 2015 mbp was sitting at 4gb, but you win this one! ;)
Edit: I confirmed that I was indeed wrong, but the payoff isn't great anyway because that just means that yes in fact they've kept the exact same ram floor for 10 years. Insane.
Fine just doesn't cut it for a premium machine you expect to last a few years at least. It's honestly just marketed so you want to spend extra and upgrade. Let's be real.
I bought a second-hand office-grade PC about a year ago. It was about $10 to $15, had no disks (obviously) and just 2 GB of DDR3 RAM, plus an integrated GPU and some low-grade Intel CPU (a Pentium, if I’m not wrong). The generation isn’t even current; it’s about a decade old, maybe a bit more.
I put in a spare 120 GB SSD, a cheap no-name brand that was just lying around for testing purposes. I also found a similar off-the-shelf 2 GB DDR3 RAM stick; I thought it was faulty, but it turned out to be in working condition, so I put it in as well.
I need the computer for basic so-called office work (a browser, some messengers, email client and a couple of other utilities). I thought I’d buy at least two 4GB RAM sticks after I test it, so you know, 8 GB is just the bare imaginable minimum! I have my 16 GBs everywhere since, idk, maybe 2012 or something.
And you know what?! It works very well with 4 GB of RAM and default Fedora (it’s 40 now, but I started with 38, iirc). It has the default Gnome (also, 46 now, started with 44, iirc). And it works very well!
It doesn’t allow me to open a gazillion of browser tabs, but my workflow is designed to avoid it, so I have like 5 to 10 open simultaneously.
Before throwing Fedora at the PC, I thought I would just install a minimal Arch Linux with swaywm and be good. But I decided I don’t want to bother, and I’ll just buy 8 GB later on, and be done with it.
And here I am, having full-blown Gnome and just 4 GB of RAM. I don’t restrict myself too much, the only time I notice it’s not my main PC is when I want to do some heavy web-browsing (e.g. shopping on some different heavy websites with many tabs opened). Then it slows down significantly, till I close the unnecessary tabs or apps. All the software is updated and current, so it’s not like it’s some ancient PC from 00’s.
Also, I have my iPad Pro 12,9 1st Gen with just 4 GB of RAM too, and I never feel it’s slow for me.
I understand that some tasks would require a lot of RAM, and it’s not for everyone. Having a lot of RAM everywhere, I’m quite used to not thinking of it at all for a significant part of my career (for over a decade now), so I may have something opened for weeks that I don’t have any need for.
So, it’s 2024, and I’m surprised to say that 4 GB of RAM is plenty when you’re focused on some tasks and don’t multitask heavily, which is never productive for me anyway. I even noticed that I enjoy my low-memory PC even more, as it reminds me with its slowdowns that I’m entering the multitasking state.
I use swaywm on my Arch Linux laptop, and most of the time it uses less than 3–4 GB (I have 16 GB).
This comes up frequently. 8GB is sufficient for most casual and light productivity use cases. Not everyone is a power user, in fact, most people aren’t.
At this point I don't think the frustration has much to do with the performance but rather RAM is so cheap that intentionally creating a bottleneck to extract another $150 from a customer comes across as greedy, and I am inclined to agree. Maybe the shared memory makes things more expensive but the upgrade cost has always been around the same amount.
It's not quite in the same ballpark as showing apartment or airfare listings without mandatory fees but it is at the ticket booth outside of the stadium.
The bigger problem is when you need a new machine fast, the apple store doesn't have anything but the base models in stock. In my org we bought a machine for a new developer who was leaving town, and were forced to buy an 8gb machine because the store didn't have other options (it was going to be a 2 week wait). As you can imagine, the machine sucked for running Docker etc and we had to sell it on facebook marketplace for a loss.
I've never encountered an actual Apple Store not having specced up machines on hand (maybe not EVERY possible configuration, but a decent selection). If you go to a non-Apple retailer, afaik, they are limited to the base spec machines (RAM wise), it's not even a matter of them being out of stock. If you want anything other than 8GB (or whatever the base amount is for that model) of RAM you need to go through Apple directly. This was the case, at least in Canada a few years ago, correct me if I'm wrong/things have changed.
Mine is 8GB M1 and it is not fine. But the actual issue for me isn't RAM as much as it is disk space, I'm pretty confident if it wasn't also the 128 GB SSD model it would handle the small memory just fine.
I'm still getting at least 16 GB on my next one though.
Yeah personally I find cheaping out on the storage far more egregious than cheaping out on the RAM. Even if you have most things offloaded onto the cloud, 128 GB was not even enough for that, and the 256 GB is still going to be a pain point even for many casual home users, and at the price point of Apple machines it's inexcusable to not add another $25 of flash
Both are disgusting for the price asked. It would be a lot easier to excuse all the other compromises if the base was 16/512, which would cost Apple like 50 bucks tops per machine.
But greed is unlimited, I guess.
> What’s the price difference between 8 and 16? Like $3 in wholesale prices.
Your estimates are not even close. You can't honestly think that LPDDR5 at leading edge speeds is only $3 per 64 Gb (aka 8GB), right?
Your estimate is off by an order of magnitude. The memory Apple is using is closer to $40 for that increment, not $3.
And yes, they include a markup, because nobody is integrating hardware parts and selling them at cost. But if you think the fastest LPDDR5 around only costs $3 for 8GB, that's completely out of touch with reality.
GP said "LPDDR5" and that Apple won't sell at component prices.
You mention DIMMs and component prices instead. This is unhelpful.
See https://www.digikey.com/en/products/filter/memory/memory/774... for LPDDR5 prices. You can get a price of $48/chip at a volume of 2000 chips. Assuming that Apple got a deal of $30-40-ish at a few orders of magnitude larger order is quite fair. Though it certainly would be nicer if Apple priced 8GB increments not much above $80-120.
I am aware that there are differences, I just took RAM DIMMs as a reference because there is a >0% chance that anyone reading this has actually ever bought a comparable product themselves.
As for prices, the prices you cited are not at all comparable. Apple is absolutely certainly buying directly from manufacturers without a middleman since we're talking about millions of units delivered each quarter. Based on those quantities, unit prices are guaranteed to be substantially lower than what DigiKey offers.
Based on what little public information I was able to find, spot market prices for LPDDR4 RAM seem to be somewhere in the $3 to $5 range for 16GB modules. Let's be generous and put LPDDR5 at triple the price, at $15 for a 16GB module. Given that the upgrade price for going from 8 to 16GB is 230 EUR, Apple is surely making a huge profit on those upgrades alone by selling an essentially unusable base configuration for a supposed "Pro" product.
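To put a rough number on the markup, here's a ballpark sketch in Python built entirely on my own guesses above (spot prices, EUR treated as roughly USD, no confirmed BOM figures):

    # Guessed spot price: ~$15 per 16 GB of LPDDR5 (see caveats above).
    price_per_gb = 15 / 16                     # ~$0.94 per GB
    extra_gb = 8                               # the 8 GB -> 16 GB upgrade
    component_cost = price_per_gb * extra_gb   # ~$7.50 of additional RAM

    upgrade_price = 230                        # EUR, treated as ~USD here
    print(upgrade_price / component_cost)      # ~30x markup on the increment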
DDR5 DIMMs and LPDDR chips as in the MacBooks are not the same beasts at all.
A DIMM is 8 or 16 chips (9/18 if ECC), while the LPDDR part is a single chip for the same capacity. The wild difference in per-chip density (512MB or 1GB vs 8GB) makes a huge difference, and explains how a stick can be sold at retail for less than the bare LPDDR chip costs in volume.
Programming has a weird way of requiring basically nothing sometimes, but other times you need to build the latest version of your toolchain, or you are working on some similarly huge project that takes ages to compile.
I was using my 4GB RAM PineBook Pro on public transport yesterday, and decided to turn off all cores except for a single Cortex-A53 to save some battery. I had no problems with my use case of a text editor plus a shell to compile in, for doing some SIMD programming.
The number of tabs you have doesn’t correlate to the number of active web views you have, if you use any browser that unloads background tabs while still saving their state.
Those are the "I'll go back to it later" type. The workflow on modern browsers is broken: instead of leveraging the bookmark functionality to improve the UX, we have users keeping 50+ tabs open, because they can. It takes quite a bit of discipline to close tabs down to a more manageable number.
It's really not weird. The more you charge for the base product and its upgrades, the less acceptable serving the bare minimum becomes. It also doesn't help that the 4GB base models from years past aged super quickly compared to their higher-end cousins.
The point you're missing is that it's about the future. I generally agree, but it's obvious everything becomes more RAM-intensive as time goes on. Hell, even games can take more than 8 GB of VRAM alone these days.
The MacBook Air starts at 1,199 euro. For insane battery life, amazing performance, a great screen and one of the lightest chassis. Find me a comparable laptop, I’ll wait.
The screen is the killer. You can have a nice-ish second corporate laptop with a decent, swappable battery on which you can install a decent OS (non-Windows) and get good mileage, but the screen is something else.
Asking for a machine with "insane battery life, amazing performance, great screen and one of the lightest chassis" and oh, it must be completely silent is a loaded set of demands. Apple in the current market is essentially the only player that can actually make a laptop that can meet your demands, at least without doing a bunch of research into something that's equivalent and hoping the goal posts don't move again.
this is extremely funny in the context of the protracted argument up-thread about what you could reasonably be comparing the macbook air against.
like, the $359 acer shitbox probably doesn't do all the exact same thing as the MBA either, but that's actually ok and really only demonstrates the MBA is an unaffordable luxury product, basically the same as a gold-plated diamond-encrusted flip-phone.
Not your circus, not your clowns, but this is sort of the duality of apple: "it's all marketing and glitz, a luxury product, there's no reason to buy it, and the fact that they have a better product only PROVES it" vs "of course no PC manufacturer could possibly be expected to offer a top-notch 120hz mini-LED screen, a good keyboard, great trackpad, good speakers, and good SOC performance in a thin-n-light..."
Race car drivers. They are pros. Professional drivers. They definitely know how to drive a car much more efficiently than I do, or anyone that’s just into cars. I assume the race car engineers are the pros at rebuilding engines.
And as for the parent comment’s point, being into cars doesn’t mean you’re as good as a professional race car driver.
If the iPad could run Mac apps when docked to Magic Keyboard like the Mac can run iPad apps then there may be a worthwhile middle ground that mostly achieves what people want.
The multitasking will still be poor but perhaps Apple can do something about that when in docked mode.
That said, development likely remains a non-starter given the lack of unix tooling.
iOS has become such a waste of great hardware, especially in the larger form factor of the iPad.
M1 chips, great screens, precise pencil input and keyboard support, but we still aren't permitted a serious OS on it, to protect the monopolistic store.
App Stores have been around long enough to prove that they're little more than laboratories in which to carry out accelerated enshittification experiments. Everything so dumbed down and feature-light, yet demanding subscriptions or pushing endless scammy ads. And games that are a shameless introduction to gambling addiction, targeted at kids.
Most of the 'apps' that people actually use shouldn't need to be native apps anyway, they should be websites. And now we get the further enshittification of trying to force people out of the browser and into apps, not for a better experience, but for a worse one, where more data can be harvested and ads can't so easily be blocked...
I feel sorry you got downvoted so much. But it looks like nowadays people don't like to hear the truth, at least when it comes to Apple stuff.
I feel bad about all this because I advised my mother to buy an iPad Pro, as it was sold as a laptop replacement, but it never materialized anywhere near expectations and ended up being an expensive mistake that can only offer compromised workflows, with limitations that do not even come close to the MacBook she was used to (even in Apple's own apps, even something like Pages, yes).
I think the iPad would be a fine device, if only they didn't try to upsell it and stopped pretending it is a "Pro" thing.
If the whole lineup got OLED displays, more RAM across the board and decent base storage for media at a more honest price, it would be an easy recommendation.
As it is, the entry level is lackluster because of the display (if you only buy for content consumption it is better to go with an Android OLED option) and the high-end variants are just stupidly expensive for what they will actually allow you to do.
But too many Apple customers have drunk the Kool-Aid so they don't have to care much about the reality and we just get expensive bullshit.
These specific model of tablets won't ever get MacOS support. Apple will tell you when you're allowed to run MacOS on a tablet, and they'll make you buy a new tablet specifically for that.
You haven't seen the size of my music, TV, or movie collection then. We have the technology to put it all on my phone and it should be cheaper, but it's not because of an absurd, money-and-design-only consumerist monoculture from the head bean counter who drifted far afield of the cool practicality and leadership of SJ.
Couldn't agree more. Cook deserves so much hate for what he did to Apple. I'm astounded people still worship Apple like nothing has changed at all.
All the changes under the Cook leadership are appalling and reveal what is definitely one of the worst characters you can get in a leader. At this point even Bill Gates looks like a good guy in comparison.
The fundamental problem is TC and most of his immediate subordinates lack creative vision and the boldness to experiment and move beyond past wins that SJ shepherded. Apple needs a new leader who is both cool and interested in daring to take greater leaps of enriching the lives of users. First steps should be to give access to PCB circuit diagrams like computers of the 1970's, access to individual components for purchase by anyone through an "Amazon"-like supply chain, and access to security chip purchase and "recalibration" for verified owners. The works of art used as workhorses should be like an old Mercedes: able to keep going for years and treasured rather than fragile and disposable.
To be fair the ipod classic used a platter drive and ipads are high speed SSD storage. That being said, it's been years of the same storage options and at those prices it should be much higher, along with their iCloud storage offerings.
I can do it without even trying. I can't fit even one of my collections (music, shows, documentaries, books, or movies) on any {i,iPad,tv,watch,vision}OS device ever made.
If 15 years of technological progress can’t find it cost effective to fit more than 256gb of solid state storage in a $1000 device, then what are we even doing here?
A 1 TB consumer-oriented SSD is about $50 today. At Apple’s manufacturing scale, do you have any doubt that the cost to them is nearly negligible?
I don't think it's strictly for price gouging/segmentation purposes.
On the MacBooks (running macOS), RAM has been used as a data cache to speed up read/write performance until the actual SSD storage operation completes. It makes sense for Apple to account for this with a higher RAM spec on the 1TB/2TB configurations.
I'm writing this from memory, so some details may be wrong but: most high end ssds have dram caches on board, with a capacitor that maintains enough charge to flush the cache to flash in case of power failure. This operates below the system page cache that is standard for all disks and oses.
Apple doesn't do this, and instead uses their tight integration to perform a similar function with system memory. So there is some technical justification, I think. They are 100% price gougers though.
Even on the Atari ST you would use a "RAM disk" when working with "large" data before manually flushing it to a floppy. Some people would use the trashcan icon to emphasise the need to manually flush... Not quite a cache, but the concept was there.
With hardware where power-off is only controlled by software, battery life is predictable, and large amounts of data like raw video are being persisted, they might have a very aggressive version of page caching, and a large amount of storage may imply that a scale-up of RAM would be necessary to keep all the data juggling on a happy path. That said, there’s no non-business reasons why they couldn’t extend that large RAM to smaller storage systems as well.
People without the "large amount of storage model" need to record video from the camera too.
The justifications I see are to reduce the number of models needed to stock and to keep the purchasing decision simple for customers. These are very good reasons.
Shouldn't the required cache size depend on throughput more so than on disk size? It does not necessarily seem like you'd need a bigger write cache just because the disk is bigger; people who have a 2TB drive don't read/write twice as much in a given time as those with a 1TB drive. Or am I missing something?
IIRC SSD manufacturers are likely to store a mapping table of LBAs (logical block addresses) to PBAs (physical block addresses) in the DRAM or Host Memory Buffer.
Some calculation like:
total storage size / page size per LBA (usually 512 B or 4 KiB) * size of each mapping entry
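For a flat logical-to-physical table, that works out roughly as in this minimal Python sketch (assuming one 4-byte entry per 4 KiB logical page, the usual rule of thumb; real controllers use more elaborate, often cached or compressed, structures):

    def mapping_table_bytes(capacity_bytes, page_size=4096, entry_size=4):
        # one mapping entry per logical page, entry_size bytes each
        return capacity_bytes // page_size * entry_size

    one_tb = 10 ** 12
    print(mapping_table_bytes(one_tb) / 10 ** 9)  # ~0.98, i.e. roughly 1 GB of DRAM per 1 TB of flash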
> SSD manufacturers are likely to store a mapping table of LBAs (logical block addresses) to PBAs (physical block addresses) in the DRAM or Host Memory Buffer.
Are LBAs a thing on SSDs nowadays? I thought they were a legacy of spinning rust.
SSDs operate on pages of the flash memory, and page management is a complicated affair that is also entirely opaque to the host operating system due to the behind-the-scenes page remapping. Since flash memory is less durable (in the long term), SSDs come overprovisioned and the true capacity is always more than advertised (up to double, if my memory serves me well). The SSD controller also runs an embedded RTOS that monitors failures in the flash chips and proactively evacuates and remaps ailing pages onto healthy ones. Owing to this behaviour, the pages the SSD controller reports back to the operating system sit behind another, entirely hidden, layer of indirection.
Yep, LBAs are the primary addressing scheme in the NVMe spec, written into every single IO command. I would imagine there could be a better way, but NVMe & OS support still carries some baggage from SATA HDDs -> SATA SSDs -> NVMe SSDs.
As you mentioned, over-provisioning and other NAND flash memory health management techniques like garbage collection and wear leveling are needed for usable modern SSDs. Modern SSD controllers are complex beasts having 3-7 microprocessor cores (probably double digit core counts now with PCIe 5.0), encryption engines, power & thermal management, error correction, multiple hardware PHYs, etc.
If I'm understanding your point correctly, that wouldn't prevent them from offering higher RAM specs for the lower-storage (e.g. 512 GB) Macs. So it seems like it is just price gouging.
If that were the case why do they bother with an iPad, iPad Air, iPad Pro, iPhone SE, iPhone, iPhone Pro, iPhone Pro Max, ... each with their own number of colors and storage variations.
iPad Air was created to make a new price category.
The iPhone SE exists because there is a market for this form factor. If you look at the specs, you would notice it uses hardware previously used by more expensive models.
> iPhone, iPhone Pro, iPhone Pro Max
Again, different customers for different form-factors. These phones differ more than just SoC in them.
You understand that having N different colors of iPads is different from having N different SoCs for the same model of an iPad.
Do people not understand that Apple's 'price gouging' is about UX? A person who has the money to buy a 1TB iPad is worth more than average customer. A 16GB RAM doubtlessly results in a faster UX and that person is more likely to continue purchasing.
> Do people not understand that Apple's 'price gouging' is about UX? A person who has the money to buy a 1TB iPad is worth more than average customer. A 16GB RAM doubtlessly results in a faster UX and that person is more likely to continue purchasing.
And that decision somehow turns into making budget conscious people's UX shittier? How is that a reason not to make 16gb RAM, which is almost a bare minimum in 2024, available to everyone?
It's fairly absurd that they're still selling a 256gb "Pro" machine in the first place.
That said, Apple's policy toward SKUs is pretty consistent: you pay more money and you get more machine, and vice versa. The MacBooks are the only product which has separately configurable memory / storage / chip, and even there some combinations aren't manufactured.
I have a 256G iPhone. I think I’m using like 160G. Most stuff is just in the cloud. For an iPad it wouldn’t be any different, modulo media cached for flights. I could see some cases like people working on audio to want a bunch stored locally, but it’s probably in some kind of compressed format such that it wouldn’t matter too much.
I don't know about 'concern' necessarily, but it seems to me that 512GB for the base Pro model is a more realistic minimum. There are plenty of use cases where that amount of storage is overkill, but they're all served better by the Air, which come in the same sizes and as little as 128GB storage.
I would expect most actual users of the Pro model, now that 13 inch is available at the lower tier, would be working with photos and video. Even shooting ProRes off a pro iPhone is going to eat into 256 pretty fast.
Seems like that model exists mainly so they can charge $1500 for the one people are actually likely to get, and still say "starts at $1299".
Then again, it's Apple, and they can get away with it, so they do. My main point here is that the 256GB model is bad value compared to the equivalent Air model, because if you have any work where the extra beef is going to matter, it's going to eat right through that amount of storage pretty quick.
I think you're underestimating the number of people who go in to buy an iPad and gravitate to the Pro because it looks the coolest and sounds like a luxury thing. For those people, who are likely just going to use it for web browsing and streaming videos, the cheapest configuration is the only one they care about.
That type of buyer is a very significant % of sales for iPad Pros. Despite the marketing, there are really not that many people (as a % of sales) who will be pushing these iPads anywhere even remotely close to their computational/storage/spec limits.
If that were the goal (I don't think it is), they'd be better off shipping enough storage to push people into the 2TB tier, which is $11 vs. $3 a month for 200GB.
I said this in a sibling comment already, but I think it's just price anchoring so that people find the $1500 they're actually going to pay a bit easier to swallow.
Real creative pros will likely be using a 10G Thunderbolt NIC to a SAN; local video editing is not advised unless it’s only a single project at a time.
Because before you know it you're Dell and you're stocking 18 different variants of "Laptop, 15 inch screen, 16GB RAM, 512GB SSD" and users are scratching their heads trying to figure out WTF the difference is between a "Latitude 3540" a "Latitude 5540" and a "New Latitude 3550"
Yes, the additional $600 they make off of users who just want extra RAM is just an unfortunate side effect of the unavoidable process of not being Dell. Couldn't be any other reason.
Apple already fixed this with the Mac: they stock a handful of configurations most likely to sell, and then everything else is a custom order shipped direct from China. The reason why Apple has to sell specific RAM/storage pairs for iPads is that they don't have a custom order program for their other devices, so everything has to be an SKU, and has to sell in enough quantity to justify being an SKU.
I can't tell whether this is serious or not. Surely adding independently configurable memory/storage combinations won't confuse the user, any more than having configurable storage options makes the user confused about which iPhone to get?
Configuring your iPhone storage is something every consumer has a concept of, it's some function of "how many pictures can I store on it"? When it comes to CPU/GPU/RAM and you're having to configure all three, the average person is absolutely more likely to be confused.
It's anecdotal, but 8/10 people that I know over the age of 40 would have no idea what RAM or CPU configurations even theoretically do for them. This is probably the case for most iPad purchasers, and Apple knows this, so why would they provide expensive/confusing configurability options just for the handful of tech-y people who may care? There are still high/mid/low performance variations that those people can choose from, and the number of people for whom this would sour a sale is vanishingly small; they would likely not even be looking at Apple in the first place.
Margins and profit. Fewer variations in production make for higher efficiency. Segmenting the product line can push consumers to purchase higher tiers of product. It's iOS anyway, and the people who know enough to care how much RAM they are getting are self-selecting into those higher product tiers.
They push you to buy the more expensive model with higher margins.
This is what they did when I was buying iPad Air - it starts with actually problematically low 64GB of Storage... and the 256GB model is the next one with massive price jump.
It's the same kind of "anchoring" (marketing term) that car dealers use to lure you into deciding on their car based on the cheapest $29,999 model, which with the "useful" equipment ends up costing you something like $45,000.
Honest question: What data do you store on an iPad Air? On a phone you might have some photos and videos but isn't a tablet just a media consumption device? Especially on iOS where they try to hide the filesystem as much as possible.
No data, but iOS apps have gotten massive, caches have gotten massive and install a game or two and 64GB is gone.
Not to mention that it's occasionally nice to have a set of downloaded media available for vacation/travel, and 64GB isn't enough to download a week's worth of content from Netflix.
This is why this is so annoying - you're right, I don't need 512GB or 256GB. But I'd still like to have more than "You're out of space!!" amount.
I've had the original iPad Pro with 64gb since it first released and have somehow never run out of storage. Maybe my problem is that I don't download games. I'd suggest using a USB drive for downloaded media though if you're planning to travel. All of the media apps I use (Netflix, YouTube, Crunchyroll, etc.) support them. That's worked well for me and is one reason I was comfortable buying the 64gb model.
Sorry, I thought I had done this with Netflix but I tried it just now and couldn't find the option. Then I googled it and it looks like it was never supported, I must've misremembered Netflix being an option.
Yes. However, applications have to be specifically written to use external storage, which requires popping open the same file picker you use to interact with non-Apple cloud storage. If they store data in their own container, then that can only ever go on the internal storage, iCloud, or device backups. You aren't allowed to rugpull an app and move its storage somewhere else.
I mean, what would happen if you yanked out the drive while an app was running on it?
As he said, you buy excess storage so that you don't have to think about how much storage you are using. Meanwhile, if you barely have enough, you're going to have to play data tetris. You can find 256GB SSDs that sell for as low as 20€. How much money is it worth to not worry about running out of space? Probably more than the cost of the SSD at these prices.
We use PLEX for long trips in the car for the kids. Like 24 hour drives. We drive to Florida in the winter and the iPads easily run out of space after we’ve downloaded a season or two of Adventure Time and Daniel Tiger.
I could fit more if I didn’t insist on downloading everything 1080p I guess.
> but isn't a tablet just a media consumption device
In my sphere, everyone with an iPad uses it for the Apple Pencil and/or video editing. Raw files for drawings get surprisingly big, once you get up into the many tens of layers, considering an artist can draw a few a day.
Games. You can put maybe three or four significant games on an iPad Air before it maxes out. (MTG Arena is almost 20GB all on its own, Genshin Impact is like 40+ GB)
1. Having more SKUs is expensive, for everything from planning to inventory management to making sure you have enough shelf space at Best Buy (which you have to negotiate for). Chances are good that stores like Best Buy and Costco would only want 2 SKUs anyway, so the additional configs would be a special-order item for a small number of consumers.
2. After a certain point, adding more options actually decreases your sales. This is confusing to people who think they'd be more likely to buy if they could get exactly what they wanted, but what you're not seeing is the legions of casual consumers who are thinking about maybe getting an iPad, but would get overwhelmed by the number of options. They might spend days or weeks asking friends which model to get, debating about whether to spend extra on this upgrade or that, and eventually not buying it or getting an alternative. If you simplify the lineup to the "cheap one" and the "high end one" then people abandon most of that overhead and just decide what they want to pay.
The biggest thing tech people miss is that they're not the core consumers of these devices. The majority go to casual consumers who don't care about specifying every little thing. They just want to get the one that fits their budget and move on. Tech people are secondary.
> What do they lose by allowing slightly more freedom in configurations?
More costs everywhere in the chain; limiting SKUs is a big efficiency from manufacturing to distribution to retail to support, and it is an easy way (for the same reason) to improve the customer experience, because it makes it a lot easier to not be out of or have delays for a customer’s preferred model, as well as making the UI (online) or physical presentation (brick and mortar) for options much cleaner.
Of course, it can feel worse if you are a power user with detailed knowledge of your particular needs in multiple dimensions and you feel like you are paying extra for features you don't want, but the efficiencies may make that feeling an illusion: with more freedom, you would be paying for the additional costs that freedom created, so the same options would cost more, and the particular combination you would prefer might cost just as much as, or more than, the bundled configuration with features you don't need. Though that counterfactual is impossible to test.
Logistical efficiencies mostly. It ends up being a lot of additional SKUs to manage, and it would probably discourage people from moving up a price tier if they would have otherwise. So from Apple’s perspective they’re undergoing more hassle (which costs) for the benefit of selling you lower margin products. No upside for them besides maybe higher customer satisfaction, but I doubt it would have moved the needle on that very much.
It's not obvious to me that Apple does make a significant amount of money by selling upgrades. Almost everyone buys the base model. The other models are probably little more than a logistical pain in the butt from Apple's perspective. Apple has to offer more powerful systems to be credible as a platform, but I wouldn't be surprised if the apparently exorbitant price of the upgrades reflects the overall costs associated with complicating the production and distribution lines.
It’s not about the price of upgrades though, it’s about their bundling together and the ridiculously stingy base specs that often make the upgrade non-optional. People who buy a base MacBook Air probably aren’t thinking about keeping it for 8 years or using it for heavy workloads.
Sure, but bundling them together reduces the supply chain complexity and reduces Apple's costs. If the options were more fine grained, Apple would sell even less of each model and it would be even less worth their while.
Also, I have seen lots of people on HN complain about the price itself, even if it's not what you yourself object to.
The GP comment can be misleading because it suggests Apple is tying storage to ram. That is not the case (at least not directly).
The RAM and system-on-chip are tied together as part of the system-on-package. The SoP is what enables M chips to hit their incredible memory bandwidth numbers.
This is not an easy thing to allow configuration. They can’t just plug a different memory chip as a final assembly step before shipping.
They only have two SoPs as part of this launch: 9-core CPU with 8gb, and 10-core CPU with 16gb. The RAM is unified for cpu/gpu (and I would assume neural engine too).
Each new SoP is going to reduce economies of scale and increase supply chain complexity. The 256/512gb models are tied to the first package, the 1/2tb models are tied to the second. Again, these are all part of the PCB, so production decisions have to be made way ahead of consumer orders.
Maybe it’s not perfect for each individual’s needs, but it seems reasonable to assume that those with greater storage needs also would benefit from more compute and RAM. That is, you need more storage to handle more video production so you are probably more likely to use more advanced features which make better use of increased compute and RAM.
I'm not defending Apple's absurd stinginess with RAM (though I don't think it's much of an issue on an iPad given how gimped the OS is), but I've never understood why high-end Android phones have 12/16+ GB RAM.
What needs that amount on a phone? 8GB on a desktop is...well it's not great, but it's usable, and usable for a damn sight more multi-tasking than you would ever do on a smartphone. Is it just because you need to look better on the spec sheet, like the silly camera megapixel wars of the 2010s?
I think that is very obvious. You have a browser window and a few apps open -- messaging, youtube, email, podcast etc. When you switch between apps and eventually back to your browser, you don't want the page to reload due to other apps eating up the memory. As simple as that. It's about having a good experience.
I don't know how they didn't realize that. Maybe they just don't use their phone as much as they think.
I freaking hate my iPhone Mini for that: load a YouTube video, a few tabs and a note, and boom, you're in for a reload every third context switch. It's disgusting for a phone that launched at a price too close to 1Keur for my liking.
And no, I'm not going to drop over 1keur on a stupid smartphone just to be able to load more than 5 tabs and a note without constant reloading when I'm just trying to do research in a shop to inform a purchasing decision.
I bought it solely for the form factor in the first place, but I really regret considering all the shortcomings that came with the price.
But if Apple thinks it's being a smartass, I'll have the last laugh by not buying their hardware again. My next smartphone is definitely going to have over 8GB of RAM, just not from Apple.
We've reached a point where their chips have become so amazing they have to introduce "fake scarcity" and "fake limits" to sell their pro lines, dividing their customers into haves and have-nots while actively stalling the entire field for the masses.
You could, alternatively, read less malice into the situation and realize that the majority of people buying an iPad pro don't even need 8gb of RAM to do what they want to do with the device (web browsing + video streaming).
Phones and tablets are effectively single-tasking. They don’t need much more ram than that in practice.
I use my iPad Pro constantly for heavy stuff and I don’t know how much ram is in it; it has never been something I needed to think about. The OS doesn’t even expose it.
Is a Pixel 8 a mid-range Android phone? Mine shipped with 8GB of RAM, and even with my frankly insane Chrome habits (I'm currently sitting at around 200 tabs) I'm only using about five and a half gigabytes, and it runs a hell of a lot smoother than other phones with more RAM that I've used.
There's absolutely nothing a mobile device does that should require that much memory. That shitty OEMs bloat the hell out of their ROMs and slap on more memory to match isn't a good thing or something to emulate in my opinion.
Honestly though, that's basically every tablet: you can't change the RAM, you get what you get and that's it. Maybe they should call them by different names, like Pro Max for the ones with 16GB, in order to make it more palatable? Small psychological hack.
The Samsung tablets at least still retain the SD card slot, so you can focus more on the desired amount of RAM and not worry too much about the built-in storage size.
Just like I don't want an umbilical cord hanging out of me just to perform the full extent of my bodily functions, I also wouldn't want a dongle hanging off my tablet for it to be deemed usable.
It would be cool if regulators mandated that companies like Apple are obligated to provide models of devices with SD card slots and a seamless way to integrate this storage into the OS/applications.
That combined with replaceable batteries would go a long way to reduce the amount of ewaste.
And then people would stick alphabet-soup SD cards into their devices and complain about performance and data integrity, it's enough of a headache in the Android world already (or has been before Samsung and others finally decided to put in enough storage for people to not rely on SD cards any more).
In contrast, Apple's internal storage to my knowledge always is very durable NVMe, attached logically and physically directly to the CPU, which makes their shenanigans with low RAM size possible in the first place - they swap like hell but as a user you barely notice it because it's so blazing fast.
Yeah jackasses are always gonna jackass. There's still a public interest in making devices upgradable for the purpose of minimizing e-waste.
I'd just love to buy a device with a moderate amount of unupgradeable SSD and an SD slot so that I can add more storage later and the device can last longer.
Agreed but please with something other than microSD cards. Yes, microSD Express is a thing, but both cards and hosts supporting it are rare, the size format doesn't exactly lend itself to durable flash chips, thermals are questionable, and even the most modern microSD Express cards barely hit 800 MB/sec speed, whereas Apple's stuff has hit twice or more that for years [2].
The fact that TSMC publishes their own metrics and target goals for each node makes it straightforward to compare the transistor density, power efficiency, etc.
The most interesting aspect of the M4 is simply that it's debuting on the iPad lineup, whereas historically new chips have always debuted on the iPhone (for A-series) and MacBook (for M-series). Makes sense given the low expected yields on the newest node for one of Apple's lower-volume products.
I'm very curious how much iPad Pros sell. Out of all the products in Apple's lineup, the iPad Pro confuses me the most. You can tell what a PM inside Apple thinks the iPad Pro is for, based on the presentation: super powerful M4 chip! Use Final Cut Pro, or Garageband, or other desktop apps on the go! Etc etc.
But in reality, who actually buys them, instead of an iPad Air? Maybe some people with too much money who want the latest gadgets? Ever since they debuted, the general consensus from tech reviewers on the iPad Pro has been "It's an amazing device, but no reason to buy it if you can buy a MacBook or an iPad Air"
Apple really wants this "Pro" concept to exist for iPad Pro, like someone who uses it as their daily work surface. And maybe some people exist like that (artists? architects?) but most of the time when I see an iPad in a "pro" environment (like a pilot using it for nav, or a nurse using it for notes) they're using an old 2018 "regular" iPad.
Ding ding ding ding ding! The iPad Pro is useful primarily for those people. Or at least it was. The original selling point of the Pro was that it had[0] the Apple Pencil and a larger screen to draw on. The 2021 upgrade gave the option to buy a tablet with 16GB of RAM, which you need for Procreate as that has very strict layer limits. If you look at the cost of dedicated drawing tablets with screens in them, dropping a grand on an iPad Pro and Pencil is surprisingly competitive.
As for every other use case... the fact that all these apps have iPad versions now is great, for people with cheaper tablets. The iPad Air comes in 13" now and that'll satisfy all but the most demanding Procreate users anyway, for about the same cost as the Pro had back in 2016 or so. So I dunno. Maybe someone at Apple's iPad division just figured they need a halo product? Or maybe they want to compete with the Microsoft Surface without having to offer the flexibility (and corresponding jank) of a real computer? I dunno.
[0] sold separately, which is one of my biggest pet peeves with tablets
What’s sad about the Air is that it’s only a 60hz screen. I’m spoilt now with 120hz on the first gen iPad Pro, the iPad needs it even more than phones (and they need it).
So I’m not a demanding user in all other ways but the Air is not satisfying to me, yet.
I use an iPad Pro as a teacher. I value the pencil and the screen size. Much of what I do ultimately involves A4 paper, so the screen size is a good match.
A lot of teachers now project their iPad screen wirelessly in class, sometimes almost to the exclusion of any other teaching method.
I value the high performance both for everyday ease of use and specifically for screen recordings.
It is not a laptop replacement; it is a wonderful complement.
Totally agree about "Pro". Imagine if they gave it a real OS. Someone yesterday suggested to dual-boot. At first I dismissed that idea. But after thinking about it, I can see the benefits. They could leave ipadOS alone and create a bespoke OS. They certainly have the resources to do so. It would open up so many new sales channels for a true tablet.
I presume the sequence of events was: some developer at Apple thought it would be a great idea to port hypervisor support to iPad and their manager approves it. It gets all the way into the OS, then an exec gets wind of it and orders its removal because it allows users to subvert the App Store and Apple Rent. I doubt it’s ever coming back.
This is everything wrong with the iPad Pro in a nutshell. Fantastic hardware ruined by greed.
EU and US regulators are slowly eroding that service monopoly.
> Fantastic hardware
Hopefully Apple leadership stops shackling their hardware under the ho-hum service bus.
It's been rumored for years that a touch-optimized version of macOS has been in development for use in iOS VMs. With the launch of M4 1TB 16GB iPad Pros for $2K (the price of two MacBook Airs), Apple can sell developers the freedom to carry one device instead of two, without loss of revenue, https://news.ycombinator.com/item?id=40287922
I bet that touch-optimized macOS will never see the light of day, or if it does it will be insanely crippled. Too much of an existential threat to Apple’s stock price.
Apple is in the midst of a cold war with regulators now. Every new feature will be scrutinized to check that it offers no threat to their golden goose if regulators force them to open it up. Allowing one type of VM means that regulators could force them to allow any type of VM.
Apple currently has 5 major build trains: macOS, iOS, watchOS, tvOS (which also runs HomePod), and visionOS. Huge amounts of the code are already the same between them: they literally just build the same stuff with different build settings… except for the UI. The UI has actually unique stuff in each train.
This has become more true over time… teams are likely sick of not having certain dependencies on certain trains, so they’re becoming more identical at the foundation/framework level every release.
Saying they’ll make a macOS with a touch UI is like saying Honda is finally going to make a motorcycle with four wheels and a full car frame. The UI is the differentiating factor in the OS’s. Everything else has already converged or is rapidly doing so.
If the goal is to support macOS apps on iOS then there’s a dilemma: how do you suddenly make apps that are designed from the ground up for a mouse, good for touch? The answer is you don’t: you just make the rest of the system identical (make the same APIs available everywhere) and ask developers to make the UI parts different.
I could almost believe that they’d make a macOS VM available for use with a keyboard and mouse within iOS. But to me it’d make more sense to do a sort of reverse version of how iOS apps are supported on macOS… where macOS apps are run natively on the iPad, but rendered with the iPad’s window management (modulo whatever multitasking features they still need to implement to make this seamless) and strictly require a keyboard and mouse to be in this mode. There’s just no reason to make a VM if you’re doing this: you can just run the binary directly. The kernel is the same, the required frameworks are the same. No VM is needed.
VMs are needed by professional developers who want to run CLI tools and services (e.g. web server, database) without the security restrictions of iOS, while retaining the OS integrity of the iPad Pro device.
Even if a macOS VM had only a CLI terminal and a few core apps made by Apple, using a Swift UI framework that was compatible with a touch interface, it would be a huge step forward for iPad owners who are currently limited to slow and power-expensive emulation (iSH, ashell). Apple could create a new app store or paid upgrade license entitlement for iOS-compatible macOS apps, so that users can pay ISVs for an app version with iOS touch input.
What you’re talking about sounds great but it’s not “a touch optimized version of macOS”. You’re describing a CLI environment in a sandbox.
Apple will never ever take macOS and change its UI to be optimized for touch. Or at least if they do, it’s time to sell the stock. They already have a touch UI, and it’s called iOS. They’re converging the two operating systems by making the underlying frameworks the same… the UI is literally the only thing they shouldn’t converge.
The mythical convertible iPad Pro "docking" to a "MBP Base" to use it as a touchscreen. ;)
I like the fact that a number of iPad and iPhone apps now run on macOS without a simulator or any ceremony. While they are touch-optimized, they're easy enough to use with a pointing device. The gotcha to such mythical OS convergence is the inverse is untrue since a desktop UI is unusable 1:1 on a tablet with the coarser granularity of tapping and less keyboard access.
Perhaps OS-level AI in the future will be able to automatically follow design guidelines and UX rules and generate a usable UI (Storyboards or such View parts) on any platform given a description of data, its importance, and a description of what it should try to look like.
> Microsoft/HP/Dell/Lenovo Arm laptops with M3-competitive performance are launching soon, with mainline Linux support.
I have been seeking someone who’ll be willing to put money on such a claim. I’ll bet the other way. Perchance you’re the person I seek, if you truly believe this?
perf >= M3 while power consumption <= M3, booted into Linux, on a workload that is, say, 50% streaming a video on youtube.com over wifi and 50% compiling some C project in a loop from and to the internal SSD, all at minimum brightness.
At Qualcomm SoC launch, OSS Linux can't possibly compete with the deep pockets of optimized-shenanigan Windows "drivers" or vertically integrated macOS on Apple Silicon.
But the incumbent landscape of Arm laptops for Linux is so desolate, that it can only be improved by the arrival of multiple Arm devices from Tier 1 PC OEMs based on a single SoC family, with skeletal support in mainline Linux. In time, as with Asahi reverse engineering of Apple firmware interfaces, we can have mainline Linux support and multiple Linux distros on enterprise Arm laptops.
One risk for MS/Asus/HP/Dell/Lenovo devices based on Qualcomm Nuvia/Oryon/EliteX is that Qualcomm + Arm licensing fees could push device pricing into "premium" territory. The affordable Apple MacBook Air, including used M1 devices, will provide price and performance competition. If enterprises buy Nuvia laptops in volume, then Linux will have a used Arm laptop market in 2-3 years.
So.. your test case might be feasible after a year or two of Linux development and optimization. Until then, WSL2 on Windows 11 could be a fallback. For iPad Pro users desperate for portable Linux/BSD VM development with long battery life, Qualcomm-based Arm laptops bring much needed competition to Apple Silicon. If Nuvia devices can run multiple OSS operating systems, it's already a win for users, making possible the Apple-impossible. Ongoing performance improvements will be a bonus.
Since the hardware already exists and has been benchmarked privately, this is less of a bet and more of an information asymmetry. So let's assume you would win :) Next question is why - is it a limitation of the SoC, power regulators, motherboard design, OS integration, Arm licensing, Apple patents, ..?
With Logic Pro for iPad they now have applications for all their traditional Mac use cases on iPad. If anything, it feels like Apple is pushing for a switch from low-tier Macs to iPad Pro.
And they surely can sell more gadgets and accessories for an iPad than for a laptop.
While it is true that the claimed performance for M4 is better than for the current Intel Meteor Lake and AMD Hawk Point, it is also significantly lower (e.g. around half) than the AI performance claimed for the laptop CPU+GPU+NPU models that both Intel and AMD will introduce in the second half of this year (Arrow Lake and Strix Point).
The point is that this is the very near future, only a few months away.
Apple is also bragging very hyperbolically that the NPU they introduce right now is faster than all the older NPUs.
So, while what Apple says, "The Most Powerful Neural Engine Ever", is true now, it will be true for only a few months. Apple has done a good job, so, as is normal, at launch their NPU is the fastest. However, this does not deserve any special praise; it is just as normal as the fact that the next NPU launched by a competitor will be faster.
Only if the new Apple NPU had been slower than the older models would that have been a newsworthy failure. A newsworthy success would have required the new M4 to have at least triple the performance it has, so that the competitors would have needed more than a year to catch up with it.
Is this the first time you're seeing marketing copy? This is an entirely normal thing to do. Apple has an advantage with the SoC they are releasing today, and they are going to talk about it.
I expect we will see the same bragging from Apple's competitors whenever they actually launch the chips you're talking about.
Apple has real silicon shipping right now. What you're talking about doesn't yet exist.
> A newsworthy success would have been only if the new M4 would have had at least a triple performance than it has, so that the competitors would have needed more than a year to catch up with it.
So you decide what's newsworthy now? Triple? That's so arbitrary.
I certainly better not see you bragging about these supposed chips later if they're not three times faster than what Apple just released today.
I said triple, because the competitors are expected to have double the speed in a few months.
If M4 were 3 times faster than it is, it would have remained faster than Strix Point and Arrow Lake, which would have been replaced only next year, giving supremacy to M4 for more than a year.
If M4 were twice as fast, it would have continued to share the first position for more than a year. As it is, it will be the fastest for one quarter, after which it will have only half of the top speed.
And then Apple will release M5 next year, presumably with another increase in TOPS that may well top their competitors. This is how product releases work.
I can’t tell what you’re criticizing. Yes, computers get faster over time, and future computers will be faster than the M4. If release cycles are offset by six months then it makes sense that leads only last six months in a neck-and-neck race. I’d assume after Arrow Lake and Strix Point the lead will then go back to M5 in six months, then Intel and AMD’s whatever in another six, etc. I guess that’s disappointing if you expected a multi-year leap ahead like the M1, but that’s just a bad expectation, it never happens and nobody predicted or claimed it.
Don’t worry. It’s Intel we’re talking about. They may say that it’s coming out in 6 months, but that’s never stopped them from releasing it in 3 years instead.
AMD is the one that has given more precise values (77 TOPS) for their launch, their partners are testing the engineering samples and some laptop product listings seem to have been already leaked, so the launch is expected soon (presentation in June, commercial availability no more than a few months later).
There's no Taiwanese silicon industrial complex, there's TSMC. The rest of Taiwanese fabs are irrelevant. Intel is the clear #3 (and looks likely-ish to overtake Samsung? We'll see).
My M2 Pro is already more powerful than I can use. The screen is too small for big work like using a DAW or doing video editing, and the Magic Keyboard is uncomfortable, so I stopped writing on it. All that processing power, and I don't know what it will be used for on a tablet without even a good file system. Lousy ergonomics.
Yeah, I'm broadly aware and have seen a few of the papers, though I definitely don't try and track the state of the art here closely.
My impression and experience trying low bit quants (which could easily be outdated by now) is that you are/were better off with a smaller model and a less aggressive quantization (provided you have access to said smaller model with otherwise equally good training). If that's changed I'd be interested to hear about it, but definitely don't want to make work for you digging up papers.
LLMs are parameterized by a ton of weights, when we say something like 400B we mean it has 400 billion parameters. In modern LLMs those parameters are basically always 16 bit floating point numbers.
It turns out you can get nearly as good results by reducing the precision of those numbers, for instance by using 4 bits per parameter instead of 16, meaning each parameter can only take on one of 16 possible values instead of one of 65536.
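As a rough back-of-the-envelope sketch (using the 400B example above and ignoring the small extra overhead of quantization scales/zero-points), the memory saving is the whole point:

```python
# Back-of-the-envelope weight sizes for a hypothetical 400B-parameter model.
params = 400e9

bytes_fp16 = params * 2     # 16 bits = 2 bytes per parameter
bytes_int4 = params * 0.5   # 4 bits = 0.5 bytes per parameter

print(f"fp16: {bytes_fp16 / 1e9:,.0f} GB")  # ~800 GB
print(f"int4: {bytes_int4 / 1e9:,.0f} GB")  # ~200 GB
```

That's the difference between needing a rack of accelerators just to hold the weights and fitting them on a much smaller machine.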
Most claims of "nearly as good results" are massively overblown.
Even the so called "good" quants of huge models are extremely crippled.
Nothing is ever free, and even going from 16 to 8bit will massively reduce the quality of your model, no matter whatever their hacked benchmarks claim.
No, it doesn't help because of "free regularization" either. Dropout and batch norm were also placebo BS that didn't actually help, back in the day when they were still being used.
Interestingly enough, Llama3 suffers more performance loss than Llama2 did at identical quantizations. https://arxiv.org/abs/2404.14047
There's some speculation that a net trained for more epochs on more data learns to pack more information into the weights, and so does worse when weight data is degraded.
Quantization is reducing the number of bits to store a parameter for a machine learning model.
Put simply, a parameter is a number that determines how likely it is that something will occur, i.e. if the number is < 0.5 say "goodbye", otherwise say "hello".
Now, if the parameter is a 32-bit (unsigned) integer it can take values from 0 to 4,294,967,295, i.e. 4,294,967,296 distinct values.
If you were using this 32-bit value to represent physical objects, then you could represent 4,294,967,296 objects (each object gets its own number).
However, a lot of the time in machine learning you find after training that not nearly so many different "things" need to be represented by a particular parameter. Say you were representing types of fruit with this parameter (Google says there are over 2,000 types of fruit, but let's just say there are exactly 2,000). In that case 4,294,967,296/2,000 ≈ 2.1 million, so roughly 2.1 million distinct values map to each fruit, which is a huge waste! The ideal case would be to use a number that only needs to represent 0-2,000, in the smallest encoding that does the job.
This is where quantization comes in: the size of the number used to represent a parameter is reduced, saving memory at the expense of a small hit to model accuracy - it's known that many models don't take a large accuracy hit from this, meaning that the way the parameter is used inside the model doesn't really need, or take advantage of, being able to represent so many values.
So what we do is reduce that 32-bit number to 16, 8, or 4 bits. We go from being able to represent billions or millions of distinct values/states to maybe 16 (with 4-bit quantization), and then we benchmark the model's performance against the larger version with 32-bit parameters - often finding that whatever training decided to use that parameter for doesn't really need an incredibly granular value.
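If it helps to see it concretely, here's a minimal sketch of that idea in Python/NumPy, assuming plain symmetric linear quantization (real quantizers such as GPTQ or AWQ use calibration data and per-group scales, but the principle is the same; the weights here are just random stand-ins):

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(0, 0.02, size=4096).astype(np.float32)  # stand-in for one weight tensor

def quantize(w, bits):
    levels = 2 ** bits                          # 16 levels at 4-bit, 256 at 8-bit
    scale = np.abs(w).max() / (levels / 2 - 1)  # map the observed range onto the integer grid
    q = np.clip(np.round(w / scale), -(levels // 2), levels // 2 - 1)
    return q.astype(np.int8), scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

for bits in (8, 4):
    q, scale = quantize(weights, bits)
    err = np.abs(weights - dequantize(q, scale)).mean()
    print(f"{bits}-bit: mean abs reconstruction error {err:.6f}")
```

Running it shows the reconstruction error growing as the bit width shrinks, which is exactly the accuracy-vs-memory trade-off described above.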
An NVIDIA RTX 4090 generates 73 TFLOPS. This iPad gives you nearly half that. The memory bandwidth of 120 GBps is roughly 1/10th of the NVIDIA hardware, but who’s counting!
The 4090 costs ~$1800 and doesn't have dual OLED screens, doesn't have a battery, doesn't weigh less than a pound, and doesn't actually do anything unless it is plugged into a larger motherboard, either.
When you line it up like that it's kinda surprising the 4090 is just $1800. They could sell it for $5,000 a pop and it would still be better value than the highest end Apple Silicon.
Comparing these directly like this is problematic.
The 4090 is highly specialized and not usable for general purpose computing.
Whether or not it's a better value than Apple Silicon will highly depend on what you intend to do with it. Especially if your goal is to have a device you can put in your backpack.
I'm not the one making the comparison, I'm just providing the compute numbers to the people who did. Decide for yourself what that means, the only conclusion I made on was compute-per-dollar.
And Nvidia annihilates those scores with cuBLAS. I'm going to play nice and post the OpenCL scores since both sides get a fair opportunity to optimize for it.
Yeah where are the bfloat16 numbers for the neural engine? For AMD you can at least divide by four to get the real number. 16 TOPS -> 4 tflops within a mobile power envelope is pretty good for assisting CPU only inference on device. Not so good if you want to run an inference server but that wasn't the goal in the first place.
What irritates me the most though is people comparing a mobile accelerator with an extreme high end desktop GPU. Some models only run on a dual GPU stack of those. Smaller GPUs are not worth the money. NPUs are primarily eating the lunch of low end GPUs.
Why are we running these high end CPUs on tablets without the ability to run pro apps like Xcode?
Until I can run Xcode on an iPad (not Swift Playgrounds), it's a pass for me. Hear me out: I don't want to bring both an iPad and Macbook on trips, but I need Xcode. Because of this, I have to pick the Macbook every time. I want an iPad, but the iPad doesn't want me.
Didn't have to look long to find a comment mirroring how I feel about these devices. To me it feels like they're just adding power to an artificially castrated device I can barely do anything with. See no reason to upgrade from my original iPad Pro that's not really useful for anything. Just an overpowered device running phone software.
I feel the same way. I just can't justify upgrading from my 10.5" Pro from years ago. It's got pro motion and runs most apps fine. Sure, the battery isn't great after all these years, but it's not like it's getting used long enough to notice.
Something has changed with how the iPads behave at rest. When I got my first iPad in 2010 I could leave it for weeks, pick it up, and it would hardly use any battery at all. Today, it seems like my iPad mini will eat 10% or more per day just sitting on a table untouched. I don't like leaving it plugged in all the time, but with it being dead every time I go to pick it up, I simply stop picking it up.
Even a good battery isn’t that good. That seems to be a software problem.
My only theory is it’s turning on the screen every time it gets a notification. However, I have a case that covers the screen, which should keep the screen off in my opinion. I have thought about disabling 100% of the notification, but without a global toggle that seems pretty annoying to do.
My guess is something to do with Find My/ offline finding. That would cause it to wake up all the time, maybe Apple thought it was worth the trade off.
I’ll have to play more with it for the other things. I haven’t invested much time in troubleshooting, since it seemed like that’s the way iPads just are now. Hopefully that’s not actually true.
When I looked at the top battery consumers in the past there wasn’t anything that stood out. I think home screen was at the top. It wasn’t one or two apps killing it with background activity.
Since the biggest battery consumption associated with home screen is the display, and users are only briefly on the home screen, before using it to navigate elsewhere, home screen should be near the bottom (1%) of power consumption.
I've been saying this for years, I would love to get a desktop Mac and use an iPad for the occasional bit of portable development I do away from a desk, like when I want to noodle on an idea in front of the TV.
I'm very happy with my MacBook, but I don't like that the mega expensive machine I want to keep for 5+ years needs to be tied to a limited-life lithium battery that's costly and labour intensive to replace, just so I can sometimes write code in other rooms in my house. I know there's numerous remote options but...the iPad is right there, just lemme use it!
I've been giving some thought to this. I wonder if an iPad would suffice in front of the tv and just ssh into a Mac Mini for dev work. I'd love an iPad but I can't justify it either because of the limitation of hardware capabilities. I also don't really want to purchase two machines just for dev tasks and travel. But, I think having that kind of lifestyle will be expensive no matter the approach.
I’ve had success using cloud dev environments with an iPad - the key for me was also using a mouse and keyboard - after things weren’t _that_ different
Visual Studio Code running from remote servers seemed like it was making great progress right until the AI trendiness thing took over... and hasn't seemed to advance much since. Hopefully the AI thing cools down and the effort on remote tooling/dev environments continues onwards.
If we're going down that route then what's the point in putting good hardware in the device? It might as well just be a thin client. Having the same SoCs as their laptops and desktops but then relegating the iPad to something that needs to be chained to a "real" computer to do anything useful in the development space seems like a tremendous waste of potential.
iPadOS is still mainly a fork of iOS, a glorified mobile interface. They should really switch to a proper macOS system, now that the specs allow for it.
WWDC is a month away. I’m hoping for some iPadOS updates to let people actually take advantage of the power they put in these tablets. Apple has often released new hardware before showing off new OS features to take advantage of it.
I know people have been hoping for that for a long time, so I’m not holding my breath.
In all seriousness, you're right. Sandboxing Xcode but making it fully featured is surely a nightmare engineering problem for Apple. However, I feel like some kind of containerized macOS running in the app sandbox could be possible.
“…but the iPad doesn’t want me” is exactly it; I used iPad from the very first one - that chunky, hard-edged aluminum and glass slate, and remained a heavy user up until a few years ago. For half a decade, the iPad was my only computer on the go. I spent two years abroad with a 12.9” Pro.
The conclusion I came to was that I loved the hardware but found the software a huge letdown for doing real work; I tried SSHing into VPSs and the like, but that wasn’t enough.
But man, the power in these thin, elegant devices is huge, and greater with the M4 chips. If Asahi ran on the M4 iPads I’d probably give it a go! – in an alternate dream universe, that is…
I love and hate Apple like almost everyone else and have an iPad for 'consumption' only (reading, browsing, video), but on Android you have IDEs for game dev (Godot), a real Android app IDE (through F-Droid), and Python, Java and C/C++ IDEs (through the Android store), which are close enough to the Linux way...
So the iPad devices could handle that too if Apple allowed it...
Once Apple complies with the European Union requirement to allow 'sideloading' on the iPad, maybe we will be able to have nice things on it too.
That could also be a good thing for Apple itself. A lot of people in Europe have a bad opinion of Apple (partly?) because of the closed (walled) garden of iPad/iOS and other technology/IP that sets their portable devices apart from the Android ecosystem.
> M4 has Apple’s fastest Neural Engine ever, capable of up to 38 trillion operations per second, which is faster than the neural processing unit of any AI PC today.
I always wonder what crazy meds Apple employees are on. Two RTX 4090s are quite common for hobbyist use, and that is 1321 TOPS each, making the pair over 69 times more than what Apple claims to be the fastest in the world. That performance is literally less than 1% of a single H200.
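For what it's worth, checking that arithmetic with the quoted figures (the 4090 number is NVIDIA's sparse INT8 marketing figure cited above; the H200 figure is an assumption based on published H100/H200 specs, and neither is measured the same way as Apple's 38 TOPS claim):

```python
# Rough ratio check on the quoted figures; none of these are directly
# comparable precisions/workloads, so treat it as order-of-magnitude only.
m4_tops  = 38
rtx_4090 = 1321
h200     = 3958   # assumed sparse-TOPS spec figure

print(f"two 4090s vs M4: {2 * rtx_4090 / m4_tops:.1f}x")  # ~69.5x
print(f"M4 vs one H200: {m4_tops / h200:.2%}")            # ~0.96%
```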
They said they had the fastest NPU in a PC. Not the fastest on earth (one of the nVidia cards, probably). Not the fastest way you could run something (probably a 4090 as you said). Just the fastest NPU shipping in a PC. Probably consumer PC.
It’s marketing, but it seems like a reasonable line to draw to me. It’s not like when companies draw a line like “fastest car under $70k with under 12 cylinders but available in green from the factory”.
Of course a GPU from Nvidia is also a NPU. People are spending billions each month on Nvidia, because it's a great NPU.
The fact is that a GPU from Nvidia is a much faster NPU than a CPU from Apple.
It is marketing as you say, but it's misleading marketing, on purpose. They could have simply written "the fastest integrated NPU of any CPU" instead. This is something Apple often does on purpose, and people believe it.
A GPU does other things. It’s designed to do something else. That’s why we call it a GPU.
It just happens to be it’s good at neural stuff too.
There’s another difference too. Apple’s NPU is integrated in their chip. Intel and AMD are going the same. A 4090 is not integrated into a CPU.
I’m somewhat guessing. Apple said NPU is the industry term, honestly I’d never heard it before today. I don’t know if the official definition draws a distinction that would exclude GPUs or not.
I simply think the way Apple presented things seemed reasonable. When they made that claim the fact that they might be comparing against a 4090 never entered my mind. If they had said it was the fastest way to run neural networks I would have questioned it, no doubt. But that wasn’t the wording they used.
> It just happens to be it’s good at neural stuff too.
No, it's no coincidence. Nvidia has been focusing on neural nets, same as Apple.
> There’s another difference too. Apple’s NPU is integrated in their chip.
The neural processing capabilities of Nvidia products(Tensor Cores) are also integrated in the chip.
> A 4090 is not integrated into a CPU.
Correct, but nobody ever stated that. Apple stated that M4 was faster than any AI PC today, not that it's the fastest NPU integrated into a CPU. And by the way, the M4 is also a GPU.
> I don’t know if the official definition draws a distinction that would exclude GPUs or not.
A NPU can be a part of a GPU, a CPU or it's own chip.
> If they had said it was the fastest way to run neural networks I would have questioned it,
They said fastest NPU, neural processing unit. It's the term Apple and a few others use for their AI accelerator. The whole point of an AI accelerator is performance and efficiency. If something does a better job at it, then it's a better AI accelerator.
NVidia GPUs basically have an NPU, in the form of Tensor units. They don't just happen to be good at matmul, they have specific hardware designed to run neural networks.
There is no actual distinction. A GPU with Tensor cores(=matmul units) really does have an NPU just as much as a CPU with an NPU (=matmul units).
You know G in GPU stands for Graphics, right? So if you want to play a game of words, NVidia's device dedicated to something else is 30 times faster than "fastest" Apple's device dedicated specifically to neural processing.
Both Intel's and AMD's laptop CPUs include NPUs, and they are indeed slower than M4.
Nevertheless, Apple's bragging is a little weird, because both Intel and AMD have already announced that in a few months they will launch laptop CPUs with much faster NPUs than Apple M4 (e.g. 77 TOPS for AMD), so Apple will hold the first place for only a very short time.
It is NOT the fastest available today. I have a 1-year-old PC below my desk which does faster neural network processing than the M4. It has an Intel CPU and an NVIDIA 4090. It runs Llama 7B models MUCH faster than any Apple chip. And it is a PC. I am sorry, but the sentence "It's the fastest available today." is a straightforward lie...
Why do you believe that? Announcements about future releases by Intel and AMD are not facts yet. If they deliver, then fine, but you speak as if they're already factual.
But I can't just say I have "the world's fastest GXZ", when GXZ is just some marketing phrase. If we're willing to accept a GPU ~= NPU then it's just a meaningless claim.
It's almost 70x more powerful. A 4-year-old 3070 laptop was cheaper when it came out and has about 200 TOPS, more than five times as much. It's just factually incorrect to call it "faster than any AI PC"; it's far slower than a cheaper laptop from 4 years ago.
In that context it seems fair to make the comparison between a MacBook and the PC version which is closest on perf/watt rather than absolute performance on a space heater.
You need to understand that the presentation, and product, wasn't made for the perf/watt crowd. That’s also why they’re so successful with consumer electronics: 9X% of the population is not part of the perf/watt crowd.
I understand the product and the use cases. 90+% of people don't care about perf/watt, because it's an irrelevant implementation detail. People care about speed and autonomy.
The average user who is comparing this against a PC for AI stuff would prefer a faster PC with a (much!) larger battery that gives the same autonomy at the end of the day for that task but is much faster every time. If I need to do on-device AI workloads intensive enough for efficiency to matter, and on battery, why would I take an iPad with a 30Wh battery, 2x perf/watt, and 1x speed over a laptop with a 65Wh battery, 1x perf/watt, and 10x speed? The latter will give me a better experience; it will be more responsive and last longer.
So at the end of the day, Apple's comparison is extremely misleading at best, and a straightforward lie at worst.
> the average user who is comparing this against a PC for AI stuff
If your primary use case is local AI on battery, you're probably into some small hundredths of a percent of the population here, who would be better served on some budget cloud compute (self deployed even) rather than heating up your lap.
> why would I take an iPad with a 30Wh battery...
Because it fits between the pages of the book in your backpack, has a nice pencil, and can *also* do ok with local AI.
> If your primary use case is local AI on battery, you're probably into some small hundredths of a percent of the population here, who would be better served on some budget cloud compute (self deployed even) rather than heating up your lap.
There is no other use case where the power efficiency of the NPU matters, so you should direct this criticism at Apple.
> Because it fits between the pages of the book in your backpack, has a nice pencil, and can also do ok with local AI.
So if the only advantage is the form factor, why did Apple decide to compare it against PCs?
> The average user who is comparing this against a PC for AI stuff would prefer a faster PC
I'm saying this would not be an average user. The average user would not be comparing AI capabilities, they open an app that uses AI, with a high chance of not even being aware of it.
> So if the only advantage is the form factor, why did Apple decide to compare it against PCs?
What should they have compared it to? There's only one other tablet on the market, with an old chipset and a fraction of the performance, that just received its first real video editor late last year [1].
I agree though, they could/should have made clearer comparisons, along with Intel, AMD, Samsung, Nvidia, and everyone else. But, as with all general audience marketing presentations from all tech companies, vested technically literate users will look to the benchmarks, after release.
> I'm saying this would not be an average user. The average user would not be comparing AI capabilities, they open an app that uses AI, with a high chance of not even being aware of it
What is the point here? If the average user doesn't care about comparing AI capabilities, then clearly Apple wasn't targeting that user in their AI capability comparison.
> What should they have compared it to? There's only one other tablet on the market, with an old chipset and a fraction of the performance, that just received its first real video editor late last year [1].
It's very clear that Apple is trying to make the case that the iPad Pro is a replacement for a laptop. In fact, that was their core marketing claim when it came out: "What's a computer?" [1], and arguably that's the point of their latest ads where they market it as a general purpose computer. Given the price point and the marketing, yes, they should be comparing it with a computer when making the case it's a good replacement for a computer. And when they do, they should not be misleading to the point of falsehood.
If they thought that peak performance didn't matter, they wouldn't quote peak performance numbers in their comparison, and yet they did. Peak performance clearly matters, even in battery powered devices: many workloads are bursty and latency matters then, and there are workloads where you can be expected to be plugged in. In fact, one such workload is generative AI which is often characterized by burst usage where latency matters a lot, which is exactly what these NPUs are marketed towards.
But the word they used isn't "NPU" or "neural engine" but "AI PC"??? If I build a PC with a ton of GPU power with the intention of using that compute for machine learning then that's an "AI PC"
On paper you’re absolutely correct. AI PC is marketing rubbish out of Wintel. Apple’s doing a direct comparison to that marketing rubbish and just accepting that they’ll probably have to play along with it.
So going by the intended usage of this marketing rubbish, the comparison Apple is making isn’t to GPUs. It’s to Intel’s chips that, like Apple’s, integrate CPU, GPU, and NPU. They just don’t name-drop Intel anymore when they don’t have to.
If they literally just said that the iPad's NPU is faster than the NPU of any other computer it'd be fine, I would have no issue with it (though it makes you wonder, maybe that wouldn't have been true? Maybe Qualcomm or Rockchip have SoCs with faster NPUs, so the "fastest of any AI PC" qualifier is necessary to exclude those?)
An AI PC is a PC suited to be used for AI... Dual 4090 is very suited for small scale AI.
It might be a marketing term by Microsoft, but that is just dumb, and it has nothing to do with what Apple says. If this was in relation to Microsoft's "AI PC" then Apple should have written "Slower than ANY AI PC." instead, as the minimum requirement for an "AI PC" by Microsoft seems to be 45 TOPS, and the M4 is too slow to qualify by the Microsoft definition.
Are you heavily invested in Apple stock or something? When a company clearly lies and tries to mislead people, call them out on it, don't defend them. Companies are not your friend. Wtf.
> Are you heavily invested in Apple stock or something? When a company clearly lies and tries to mislead people, call them out on it, don't defend them. Companies are not your friend. Wtf.
I don't own any Apple stock, at least not directly. I'm not defending Apple, just trying to understand what their claim is. Apple does plenty of consumer un-friendly things but they aren't dumb and they have good lawyers so they tend not to directly lie about things in product claims.
Fair enough. You are correct that Apple aren't dumb, but they do mislead as much as they can in marketing, and by their own words from previous court cases, a "reasonable person" wouldn't take their marketing claims as fact.
In this case they do straight up lie, without a question. There is no reasonable explanation for the claim. If they had some absurd meaning behind it then they should have put a footnote on it.
Microsoft/Intel have been trying to push this "AI-enabled PC" or whatever for a few months now, to obsolete laptops that don't have an NPU stuffed into unused I/O die space on the CPU. Apple weaponized that in this instance.
“AI PC” is what Microsoft and the industry has deemed SoCs that have an NPU in it. It’s not a term that Apple made up. It’s what the industry is using.
Of course, Apple has had an NPU in their SoC since the first iPhone with FaceID.
When they made up that term, they also made up a TOPS requirement that is higher than what the M4 has. It's not by much, but technically the M4 is not even fast enough to qualify as an AI PC.
Ah. I guess you could argue that that's technically not directly false. That's an impressive level of being dishonest without being technically incorrect.
By comparing against the non-existent neural engine in your typical PC, you could claim that the very first SoC with an "NPU" is infinitely faster than the typical AI PC.
The phrase AI PC used by Intel and AMD is about having an NPU like in the Intel Ultra chips. These are ML only things, and can run without activating the GPU.
It’s the term Microsoft, Intel, AMD, and Qualcomm decided to rally around. No need to get upset at Apple for using the same term as reference for comparison.
Ps. Nvidia also doesn’t like the term because of precisely what you said. But it’s not Apple that decided to use this term.
If you want to adopt this new terminology, remember that Intel and Microsoft have a requirement of 40 TOPS for "AI PCs". How can the M4 be faster than any AI PC if it's too slow to even qualify as one?
Source? IIRC that 40 TOPS was just Microsoft saying that was the requirement for "next gen" AI PCs, not a requirement for any AI PC to be classed as one.
I’ve never heard anyone refer to an NPU before. I’ve heard of GPU and TPU. But in any case, I don’t know the right way to compare Apple’s hardware to a 4090.
All of the tech specs comparisons were extremely odd. Many things got compared to the M1, despite the most recent iPad having the M2. Heck, one of the comparisons was to the A11 chip that was introduced nearly 7 years ago.
I generally like Apple products, but I cannot stand the way they present them. They always hide how it compares against the directly previous product.
Also the 38 TOPS figure is kind of odd. Intel had already shown laptop CPUs with 45 TOPS NPU[1] though it hasn't shipped, and Windows 12 is rumored to require 40 TOPS. If I'm doing math right, (int)38 falls short of both.
I have a MacBook M2 and a PC with a 4090 ("just" one of them) - the VRAM barrier is usually what gets me with the 4090 when I try to run local LLMs (not train them). For a lot of things, my MacBook is fast enough, and with more RAM, I can run bigger models easily. And, it's portable and sips battery.
The marketing hype is overblown, but for many (most? almost all?) people, the MacBook is a much more useful choice.
Apple always had the fastest, the biggest, the best. Or at least they have to claim that to justify the price premium.
Previous iPads had the best screens in a tablet even if they weren't OLEDs. Now that they finally use OLEDs, they have the best OLED screens and the best screens in a tablet.
This is standard Apple advertising: the best whatever in the world, which is really the same as some standard thing under a different name. Apple is like clothing makers that "vanity size" their clothes. If you don't know, that basically means what's labeled a size 20 is really a 30, a 21 is really a 31, and so on.
Neural processing unit is basically a made up term at this point so of course they can have the fastest in the world.
Seeing an M-series chip launch first in an iPad must be the result of some mad supply chain and manufacturing-related hangovers from COVID.
If the iPad had better software and could be considered a first class productivity machine then it would be less surprising but the one thing no one says about the iPads is “I wish this chip were faster”
Watch a 20-year old creative work on an iPad and you will quickly change your mind. Watch someone who has, "never really used a desktop, [I] just use an iPad" work in Procreate or LumaFusion.
The iPad has amazing software. Better, in many ways, than desktop alternatives if you know how to use it. There are some things they can't do, and the workflow can be less flexible or full featured in some cases, but the speed at which some people (not me) can work on an iPad is mindblowing.
I use a "pro" app on an iPad and I find myself looking around for how to do something and end up having to Google it half the time. When I watch someone who really knows how to use an iPad use the same app they know exactly what gesture to do or where to long tap. I'm like, "How did you know that clicking on that part of the timeline would trigger that selection," and they just look back at you like, "What do you mean? How else would you do it?"
There is a bizarre and almost undocumented design language of iPadOS that some people simply seem to know. It often pops up in those little "tap-torials" when a new feature rolls out, which I either ignore or forget… but other people internalize them.
Yeah, apparently some people book flights from their phones!? Nah man, that's a laptop activity. I'd never spend more than a couple hundred dollars on my phone. Haha
So, the product may not fit your use case or preference. That's ok. Others love it. That's also ok.
What's silly is thinking that a device that makes everyone happy is somehow trivial, or that a device intentionally made with a specific type of interface is bad because of that intent. If things were as bad or as trivial as some people suggest, someone else would have made a successful competing product by now, rather than the string of failures across the industry from those who have tried.
Just because people love it doesn't mean it can't be better. It also doesn't mean that the current way of doing things is efficient.
> What's silly is thinking that a device that makes everyone happy is somehow trivial, or that a device intentionally made with a specific type of interface is bad because of that intent.
Add a proper native terminal, a proper virtualization framework à la what we have on the Mac, sideloading, and third-party browser support with plugins, and you'll shut up 99% of the complaining users here.
> If things were as bad or as trivial as some people suggest, someone else would have made a successful competing product by now, rather than the string failures across the industry, from those who have tried.
Right, let me use my couple billion in spare R&D change to create an iPad-compatible system with all that I've mentioned before.
> Just because people love it, doesn’t mean that it can’t be better.
Better for whom? They sell like hotcakes, 50+ million of them per year.
According to Google there are about 25M software devs in the world. Even if half of them (extremely generous) bought a new iPad every single year (extremely generous) because Apple implemented what you ask, it wouldn't change much for Apple; iPads are roughly 10% of their revenue. So at best we're talking about a ~3% revenue increase.
> 99% of complaining users here.
99% of not much is nothing for Apple; they're already printing money faster than they know what to do with it.
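For what it's worth, here is the back-of-envelope arithmetic from the comment above spelled out; all the inputs (25M devs, 50M iPads a year, iPad at ~10% of revenue) are the rough figures quoted in this thread, not official numbers.

```swift
import Foundation

// Rough sketch of the arithmetic above; every input is a loose estimate
// quoted in this thread, not an official Apple figure.
let devsWorldwide = 25_000_000.0
let devsBuyingYearly = devsWorldwide * 0.5        // "extremely generous"
let iPadsSoldPerYear = 50_000_000.0
let iPadShareOfRevenue = 0.10

// Assume every one of those dev purchases is incremental (it wouldn't be).
let extraUnitShare = devsBuyingYearly / iPadsSoldPerYear      // 0.25
let revenueUplift = extraUnitShare * iPadShareOfRevenue       // ~0.025

print(String(format: "Best-case revenue uplift: ~%.1f%%", revenueUplift * 100))
// ~2.5% -- the same low-single-digit ballpark as the comment above.
```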
What you are describing is certainly some kind of professional, but we should look from a higher vantage point. A professional in any field will ultimately want to customize their tools or make their own tools, and that is not possible with iPads, or even really with software on iPads. It is a massive step backwards to sit in a walled, proprietary garden and claim that these people are productive professionals as if they are comparable to the previous generation of professionals. They may be in some sense, but from a more historical, higher viewpoint they all seem herded into tightly controlled, corporate tools that will be taken from them whenever it is convenient for someone else. iPad users are effectively like VMware users who are just waiting for a Broadcom moment. The price hike will always come eventually; the support will always drop at some point. It is all borrowed time, tools someone else controls and makes. It might be necessary in a world where we all need to make money, but to positively support it is something else entirely.
> A professional in any field will ultimately want to customize their tool or make their own tools
I suspect you’re a programmer. This is not the perspective or reality of most professional users. Most professional apps are not, themselves, customizable. Most professional users do not make their own tools, or want to. If you’re a programmer, you’ll understand this, because that’s why you’re employed: they want you to do it.
Sorry to be pedantic, but I feel the distinction is important: that seems more like a UX job than a programming job. Too often, UX, UI, coding, documentation, etc. are thrown together, viewed as tasks that can be handled by the same people interchangeably, and it rarely yields great results, in part because programmers often start out with expectations that differ from those of the vast majority of users.
Also, "most" and "any" aren't all too helpful in this discussion (not directed at anyone in particular, these can be read in comments throughout this thread) because there are going to be countless examples in either direction, but from my limited experience, I have seen professionals in various spaces, some which very much prefer a default workflow and others that heavily customize. I know talented professional programmers doing great work in the out-of-the-box setup of VSCode combined with GitHub Desktop, etc. but also have seen graphic designers, video editors, and even people focused purely on writing text that have created immensely impressive workflows, stringing macros together and relying heavily on templates and their preferred folder structures. Even on iPad OS, people can have their custom-tailored workflow regarding file placement, syncing with cloud storage, etc., just in a restricted manner and for what it's worth, I sometimes prefer using Alight Motion for certain video editing tasks on my smartphone over grabbing my laptop.
I have seen and feel strongly that any professional from any field can have a customized workflow and can benefit from the ability to customize their toolset, even those outside programming. But I also feel equally strongly that sane defaults must remain, and the "iPad way of doing things", as much as I in my ancient mid-twenties will never fully adapt to it, must remain for people who prefer and thrive in that environment.
It has nothing to do with programming. I know mechanics who complain about car models where the manuals cost massive amounts of money, if they are even allowed to get them at all, and it takes weeks to order them. This should be a familiar story in every field. Do artists not have ateliers full of custom brushes, things they found work for them and customized? Not to mention that artists these days are Maya, Autodesk, and Photoshop users. Is that pen really powerful enough? Because a pen is really close to a mouse pointer anyway, so why even stick to an iPad then; you can simply buy a pen and tablet for the desktop computer. This is not about whether I am a programmer or not; this is about why some praise and use Apple devices for professionals even though they are not the best choice.
> A professional in any field will ultimately want to customize their tool or make their own tools and that is not possible with ipads or even really software on iPads.
I was responding to this main point.
Every professional drawing app, on the iPad, allows you to make your own brushes. There's no limitation there. That's just a fundamental requirement of a drawing app. They're not customizing the workflow or tool/apps itself, which is what I thought you were referring to.
> make their own tools
This requires programming, does it not? Do you have some examples?
> why some praise and use Apple devices for professionals even though they are not the best choice.
Especially for drawing, I think it would be best to ask the professionals why they chose a ~1lb iPad in their backpack with a pixel perfect stylus, over a desktop computer and mouse. The answers might surprise you.
Sure, which most professionals don't use. If you think the average professional can program, or use scripting engines, it's because you're a HN user, and probably a programmer, not an average professional, and less likely one that uses an iPad.
But, there's nothing technically stopping an app developer from implementing any of this, including desktop level apps. Compute, keyboard/mouse, and stylus is ready. I think the minuscule market that would serve is what's stopping them.
It's actually come a long way. The workflow is still… sub-optimal, but there are some really nice terminal apps (LaTerminal, Prompt, ShellFish, iSH) which are functional too. Working Copy is pretty dope for working with git once you adapt to it.
I do most of my dev on a Pi5 now, so actually working on the iPad is not that difficult.
If they ever release Xcode for iPadOS that would be a true gamechanger.
"Prevented from coding" is just not true. There are many python IDEs, Swift Playgrounds, etc. Pythonista, and the like, are neat because you get full access to all the iPhone sensors.
And maybe that's fine? Look at it from the opposite side. All those artists complaining about how terrible macbooks are because you can't draw on them.
This has nothing to do with age. I have an iPad Pro that I barely use because it has been designed for use cases that I just don't have.
I don't do any digital art, don't take handwritten notes, and don't need to scan and/or mark up documents very often. I don't edit photos or videos often enough to need a completely separate device for the task.
I mostly use my computers for software development, which is impossible on an iPad. I tried running my dev tools inside an iSH session, and also on a remote Linux box that I could SSH into. It wasn't a great experience. Why do this when I could just run VS Code or WebStorm on a Mac?
I also write a lot -- fiction, blog posts, journal entries, reading notes -- which should technically be possible to do well on the iPad. In practice there just aren't enough powerful apps for serious long-form writing on a tablet. Microsoft Word on iPad lacks most of the features of its desktop counterpart, Scrivener doesn't support cloud sync properly, iA Writer is too limited if you're writing anything over a few thousand words, and Obsidian's UI just doesn't work well on a touch device. The only viable app is Ulysses, which is ... okay, I guess? If it floats your boat.
I sometimes do music production. This is now possible on the iPad via Logic Pro. I suppose I could give it a try, but what does that get me? I already own an Ableton license and a couple of nice VSTs, none of which transfers over to the iPad. I can also download random third-party apps to manipulate audio on my Mac, or mess around with Max4Live, or use my Push 2 to make music. Again, this stuff doesn't work on an iPad and it never will, because the APIs to enable these things simply don't exist.
There are tons of people who use Windows because they need to use proprietary Windows-based CAD software. Or people who need the full desktop version of Excel to do their jobs. Or academics and researchers who need Python/R to crunch data. All of these people might LOVE to use something like these new iPads, but they can't because iPadOS just can't meet their use cases.
I really like the idea of a convertible tablet that can support touch, stylus, keyboard, and pointer input. The iPad does a great job at being this device as far as hardware and system software are concerned. But unfortunately, it's too limited for the kinds of workflows people using laptops/desktops need.
Oh, I'm with you. But the funny thing is, they won't even want it.
I have two iPads and two pencils—that way each iPad is never without a pencil—and yet I rarely use the pencil. I just don't think about it. But then when I do, I'm like, "Why don't I use this more often? It's fantastic."
I have tried and tried to adapt and I cannot. I need a mouse, keyboard, separate numpad, and two 5K displays to mostly arrive at the same output that someone else can produce with a single 11" or 13" screen and a bunch of different spaces that can be flicked through.
I desperately wanted to make the iPad my primary machine and I could not do it. But, honestly, I think it has more to do with me than the software. I've become old and stubborn. I want to do things my way.
The existence of vscode.dev always makes me wonder why Microsoft never released an iOS version of VSCode to get more users into its ecosystem. Sure, it's almost as locked down as the web environment, but there's a lot of space in that "almost" - you could do all sorts of things like let users run their code, or complex extensions, in containers in a web view using https://github.com/ktock/container2wasm or similar.
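The shell the parent comment imagines really would be tiny. A minimal, purely hypothetical sketch in SwiftUI, assuming you just point a WKWebView at vscode.dev and let the heavier pieces (extensions, WASM containers via container2wasm) run inside the page; nothing here is an actual Microsoft product:

```swift
import SwiftUI
import WebKit

// Hypothetical minimal wrapper: vscode.dev hosted inside a WKWebView.
// Everything heavy (extensions, in-browser containers) would run in the
// page itself, which is roughly the premise of the comment above.
struct EditorWebView: UIViewRepresentable {
    func makeUIView(context: Context) -> WKWebView {
        let webView = WKWebView()
        webView.load(URLRequest(url: URL(string: "https://vscode.dev")!))
        return webView
    }
    func updateUIView(_ uiView: WKWebView, context: Context) {}
}

@main
struct EditorShellApp: App {
    var body: some Scene {
        WindowGroup { EditorWebView().ignoresSafeArea() }
    }
}
```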
I watched someone do some incredibly impressive modelling on an iPad Pro via shapr3D, and yeah, it was a young person.
I’m into the idea of modelling like this, or drawing, but the reality is I spend most of my time and money on a desktop work station because the software I need most is there. I’m totally open to iPads being legit work machines, but they’re still too limited (for me) to make the time and cash investment for the transition.
You’re definitely right though. People are doing awesome work on them without the help of a traditional desktop or laptop computer.
It's not a matter of being young or old, it's that iPadOS is not tooled to be a productive machine for software developers, but IS tooled to be productive machine for artists.
The examples given are always artists, whose jobs are actively on the chopping block due to AI models and systems which, checks notes, don't even run that effectively on Apple hardware yet!
Of course, SWE jobs are on the chopping block for the same reasons, but I claim that AI art models are ahead of AI coding models in terms of quality and flexibility.
To be honest, I wish my iPad's chip was slower! I can't do anything other than watch videos and use drawing programs on an iPad, why does it need a big expensive power hungry and environmentally impactful CPU when one 1/10 the speed would do?
If I could actually do something with an iPad there would be a different discussion, but the operating system is so incredibly gimped that the most demanding task it's really suited for is .. decoding video.
> Why does it need a big expensive power hungry and environmentally impactful CPU when one 1/10 the speed would do?
Well, it's not. Every process shrink improves power efficiency. For watching videos, you're sipping power on the M4. For drawing... well, if you want low latency while drawing, which, generally speaking, people do, you want the processor and display to ramp up to compensate and render strokes as fast as possible.
Obviously if your main concern is the environment, you shouldn't upgrade and you should hold onto your existing model(s) until they die.
From what I can tell, the 2020 iPad has perfectly fine latency while drawing, and Apple hasn't been advertising lower latencies for each generation; I think they pretty much got the latency thing nailed down. Surely you could make something with the peak performance of an A12Z use less power on average than an M4?
As for the environmental impact, whether I buy or don't buy this iPad (I won't, don't worry, my 2020 one still works), millions of people will. I don't mind people buying powerful machines when the software can make use of the performance, but for iPad OS..?
The M4 is built on the newest, best, most expensive process node (right?). They've got to amortize out those costs, and then they could work on something cheaper and less powerful. I agree that they probably won't, and that's a shame. But still, the M4 is most likely one of the best options for the best use of this new process node.
>To be honest, I wish my iPad's chip was slower! I can't do anything other than watch videos and use drawing programs on an iPad, why does it need a big expensive power hungry and environmentally impactful CPU when one 1/10 the speed would do?
A faster SoC can finish the task with better "work done/watt". Thus, it's more environmentally friendly. Unless you're referring to the resources dedicated to advancing computers such as the food engineers eat and the electricity chip fabs require.
A faster and more power-hungry SoC can finish the task with better work done per joule only if it is fast enough to offset the extra power consumption. It is my understanding that this is often not the case. See, e.g., efficiency cores compared to performance cores in these heterogeneous designs; the E-cores can get more done per joule, AFAIU. If my understanding is correct, then removing the P-cores from the M4 chip would let it get more work done per joule.
Regardless, the environmental impact I'm thinking about isn't mainly power consumption.
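To make the work-per-joule point above concrete, here's a toy calculation. The wattages and speed-ups are invented for illustration, not measured M4 figures; the point is just that a faster core only wins on energy if its speed-up at least matches its extra power draw.

```swift
import Foundation

// Toy energy-per-task comparison. All numbers are invented for illustration.
struct Core {
    let name: String
    let watts: Double          // average power while active
    let relativeSpeed: Double  // throughput relative to the E-core
}

let eCore = Core(name: "E-core", watts: 0.5, relativeSpeed: 1.0)
let pCore = Core(name: "P-core", watts: 4.0, relativeSpeed: 3.0)

// Energy = power x time; time shrinks with speed.
func joulesPerTask(_ core: Core, baselineSeconds: Double = 10) -> Double {
    core.watts * (baselineSeconds / core.relativeSpeed)
}

for core in [eCore, pCore] {
    print(String(format: "%@: %.1f J per task", core.name, joulesPerTask(core)))
}
// E-core: 5.0 J, P-core: ~13.3 J -- at 4 W the P-core would need to be
// about 8x faster than the E-core to break even on energy per task.
```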
I don't know the details of iOS's scheduler or how it decides which tasks should go on which kind of core, but the idea is to put tasks which benefit from high performance on the P-cores, right?
I'm under the impression that this CPU is faster AND more efficient, so if you do equivalent tasks on the M4 vs an older processor, the M4 should be less power hungry, not more. Someone correct me if this is wrong!
It's more power efficient than the M3, sure, but surely it could've been even more power efficient if it had worse performance simply from having fewer transistors to switch? It would certainly be more environmentally friendly at the very least!
The most environmentally friendly thing to do is to keep your A12Z for as long as you can, ignoring the annual updates. And when the time comes that you must do a replacement, get the most up to date replacement that meets your needs. Change your mindset - you are not required to buy this one, or the next one.
Of course, I'm not buying this one or any other until something breaks. After all, my current A12Z is way too powerful for iPadOS. It just pains me to see amazing feats of hardware engineering like these iPads with M4 be completely squandered by a software stack which doesn't facilitate more demanding tasks than decoding video.
Millions of people will be buying these things regardless of what I'm doing.
They are for those tasks, where you do need high performance. Where you would wait for your device instead. A few tasks require all cpu power you can get, so that is what the performance cores are for. But most of the time, it will consume a fraction of that power.
My whole point is that iPadOS is such that there's really nothing useful to do with that performance. No task an iPad can do requires CPU power (except for maybe playing games but that'll throttle the M4 to hell and back anyway).
It has 6 efficiency cores. Every single one of them is extremely power efficient, yet still faster than an iPad from 2-3 generations back. So unless you go full throttle, an M4 is going to be by far the most efficient CPU you can have.
If I had to ballpark estimate this, your iPad probably uses less energy per year than a strand of incandescent holiday lights does in a week. Maybe somebody can work out that math.
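Taking up the invitation to work out the math, here is one heavily hedged back-of-envelope version. The battery size, charging habits, and light-strand wattage are all assumptions, so treat the result as order-of-magnitude only.

```swift
import Foundation

// All inputs are rough assumptions, not measurements.
let batteryWh = 38.0            // roughly an iPad Pro-class battery
let chargesPerDay = 1.0         // heavy daily use
let chargerEfficiency = 0.8
let iPadKWhPerYear = batteryWh * chargesPerDay * 365 / chargerEfficiency / 1000
// ~17 kWh per year

let strandWatts = 300.0         // a long strand of old C9 incandescent bulbs
let hoursPerDay = 8.0
let strandKWhPerWeek = strandWatts * hoursPerDay * 7 / 1000
// ~17 kWh per week

print(String(format: "iPad: ~%.0f kWh/yr, light strand: ~%.0f kWh/wk",
             iPadKWhPerYear, strandKWhPerWeek))
```

With these (generous, incandescent-era) assumptions the two land in the same ballpark; swap in a modern LED strand and the claim no longer holds, so it really does hinge on the "incandescent" part.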
How do you suggest they make new iPads for people who want them? Someone has to make new CPUs and if you can improve perf/W while you're doing so you might as well.
They already released all their macbooks and latest iphone on N3B which is the worst-yielding 3nm from TSMC. I doubt yields are the issue here.
It's suspected that the fast release for M4 is so TSMC can move away from the horrible-yielding N3B to N3E.
Unfortunately, N3E is less dense. Paired with a couple more little cores, an increase in little core size, 2x larger NPU, etc, I'd guess that while M3 seems to be around 145mm2, this one is going to be quite a bit larger (160mm2?) with the size hopefully being offset by decreased wafer costs.
I think this is the most likely explanation. Lower volume for the given product matches supply better, and since it's clocked down and has a lower target for GPU cores it has better yields.
Yeah, I don't think about charging my iPad throughout the day, and I constantly use it. Maybe it's in the low 20s late at night, but it never bothered me.
My current assumption is that this has to do with whatever "AI" Apple is planning to launch at WWDC. If they launched a new iPad with an M3 that wasn't able to sufficiently run on-device LLMs or whatever new models they are going to announce in a month, it would be a bad move. The iPhones in the fall will certainly run some new chip capable of on-device models, but the iPads (being announced in the Spring just before WWDC) are slightly inconveniently timed since they have to announce the hardware before the software.
You probably have people (like myself) trying to keep up with the latest MacBook Air who get fatigued having to get a new laptop every year (I just upgraded to the M3 not too long ago, from the M2, and before that... the M1... is there any reason to? Not really...), so now they are trying to entice people who don't have iPads yet / who are waiting for a reason to do an iPad upgrade.
For $1,300 configured with the keyboard, I have no clue what I'd do with this device. They very deliberately are keeping iPadOS + MacOS separate.
The only reason I upgraded is my wife “stole” my M1 air. I bought a loaded M3 MBP and then they came out with a 15” Air with dual monitor capabilities. Kinda wish I had the air again. It’s not like I move it around much but the form factor is awesome.
I love the air form factor. I do serious work on it as well. I have used a pro, but the air does everything I need without breaking a sweat, and it's super convenient to throw in a bag and carry around the house.
Same here, my M1 Air still looks and feels like a brand new computer. Like, I still think of it as "my new MacBook". It's my main machine for dev work and some hobby photography and I'm just so happy with it.
I'm sort of "incentivized" to by Apple because as soon as they release a new one, the current device you have will be at "peak trade in value" and deteriorate over time.
It's a negligible amount of money. It's like, brand new $999, trade in for like $450. Once a year... $550 remainder/12 months is $45.75/mo to have the latest and greatest laptop.
How much is a 2-year old laptop worth? Because if you buy a new laptop every two years and don't even sell the old one, you're only spending $500 a year, which is less than you are now.
You really shouldn't trade-in your laptop on the basis of trying to maximise its trade-in value, that doesn't make economic sense.
You should be incentivised by trying to minimise depreciation. You incur the greatest amount of depreciation closest to the date of purchase, so the longer you go between purchases, the less depreciation you'll realise.
Say I expected to get $450 after 1 year and $250 after 2 years. By trading in every 2 years I'm getting a laptop that's a bit older, but I'm also saving about $14.50/month in depreciation. If the year after that it drops to $150, the saving grows to about $22/month. Whether the price is worth it is subjective; I'm just saying that chasing maximal trade-in value doesn't really make sense, since you save more money the lower the trade-in value you eventually accept.
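A quick sketch of the depreciation arithmetic above, using the same hypothetical resale values ($999 new; $450, $250, and $150 after one, two, and three years):

```swift
import Foundation

// Hypothetical figures from the comment above, not real trade-in quotes.
let newPrice = 999.0
let tradeInValueAfterYears: [Int: Double] = [1: 450, 2: 250, 3: 150]

for years in tradeInValueAfterYears.keys.sorted() {
    let depreciation = newPrice - tradeInValueAfterYears[years]!
    let perMonth = depreciation / Double(years * 12)
    print(String(format: "Trade in every %d year(s): ~$%.2f/month", years, perMonth))
}
// 1 year:  ~$45.75/month
// 2 years: ~$31.21/month (about $14.50/month less)
// 3 years: ~$23.58/month (about $22/month less than the 1-year cycle)
```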
Still using my M1 Air and had no interest in updating to M3. Battery life has dropped a fair amount, but still like 8+ hours. That's going to be the trigger to get a new one. If only batteries lasted longer.
I'm wondering if it's because they're hitting the limits of the architecture, and it sounds way better to compare M4 vs M2 as opposed to vs M3, which they'd have to do if it launched in a Macbook Pro.
Eh, they compared the M3 to the M1 when they launched it. People grumbled and then went on with their lives. I don't think they'd use that as a reason for making actual product decisions.
My guess is that the M4 and M3 are functionally almost identical so there's no real reason for them to restrict the iPad M4 launch until they get the chip into the MacBook / Air.
I get frustrated seeing this go into the iPad and knowing that we can't get a shell, and run our own binaries there. Not even as a VM like [UserLAnd](https://userland.tech). I could effectively travel with one device less in my backpack but instead I have to carry two M chips, two displays, batteries, and so on...
It's great to see this tech moving forward but it's frustrating to not see it translate into a more significant impact in the ways we work, travel and develop software.
> A leaker has claimed that Apple is working on a version of macOS exclusive for the M2 iPad Pro ... the exclusivity to M2 iPad Pro could be a marketing push. If the feature is only available on that iPad, more people would buy it.
Based on the M4 announcement, vMacOS could be exclusive to the 1TB/2TB iPad Pro, whose 16GB of RAM would be helpful for VMs.
GPU passthrough for VMs is not supported on Apple Silicon, period, AFAIK. There may be some "native" renderer built on top of Metal, but Apple doesn't support SR-IOV or "headless passthrough".
OTOH no, it is not "more or less [automatic]" on other hardware either. SR-IOV has been on the enthusiast wishlist for a ridiculously long time now because basically nobody implements it (or they restrict it to the most datacenter-y of products).
Intel iGPUs from the HD/UHD Intel Graphics Technology era have a concept called GVT-g, which isn't quite SR-IOV but generally does the thing. Newer Xe-based iGPUs do not support this, nor do the discrete graphics cards.
AMD's iGPUs do not have anything at all afaik. Their dGPUs don't even implement reset properly, which is becoming a big problem with people trying to set up GPU clouds for AI stuff - a lot of times the AMD machines will need a hard power reset to come back.
NVIDIA GPUs do work properly, and do implement SR-IOV properly... but they only started letting you do passthrough recently, and only 1 VM instance per card (so, 1 real + 1 virtual).
Curious what you're using (I'm guessing intel iGPU or nvidia dGPU) but generally this is still something that gets Wendell Level1techs hot and bothered about the mere possibility of this feature being in something without a five-figure subscription attached.
It does suck that Apple refuses to implement vulkan support (or sign graphics drivers), I think that's de-facto how people interact with most "hardware accelerated graphics" solutions in vmware or virtualbox, but SR-IOV is actually quite a rare feature, and "passthrough" is not sufficient here since the outer machine still needs to use the GPU as well. The feature point is SR-IOV not just passthrough.
Call me crazy, but I want all that power in a 7" tablet. I like 7" tablets most because they feel less clunky to carry around and take with you. Same with 13" laptops, I'm willing to sacrifice on screen real estate for saving myself from the back pain of carrying a 15" or larger laptop.
Some of this is insanely impressive. I wonder how big the OS ROM (or whatever) is with all these models. For context, even if the entire OS is about 15GB, getting some of these features locally just for an LLM on its own takes about 60GB or more for something ChatGPT-esque, which would otherwise require me to spend thousands on a GPU.
Apologies for the many thoughts, I'm quite excited by all these advancements. I always say I want AI to work offline and people tell me I'm moving the goalpost, but it is truly the only way it will become mainstream.
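For a sense of where figures like "60GB or more" come from, here is a rough sizing sketch; the parameter counts and precisions are generic examples, not anything Apple has announced.

```swift
// Rough memory footprint of LLM weights: parameters x bytes per parameter.
// The parameter counts and precisions below are generic examples.
func weightGB(paramsInBillions: Double, bytesPerParam: Double) -> Double {
    paramsInBillions * 1e9 * bytesPerParam / 1e9
}

print(weightGB(paramsInBillions: 7,  bytesPerParam: 2.0))   // 7B  @ fp16  -> ~14 GB
print(weightGB(paramsInBillions: 7,  bytesPerParam: 0.5))   // 7B  @ 4-bit -> ~3.5 GB
print(weightGB(paramsInBillions: 70, bytesPerParam: 2.0))   // 70B @ fp16  -> ~140 GB
print(weightGB(paramsInBillions: 70, bytesPerParam: 0.5))   // 70B @ 4-bit -> ~35 GB
// KV cache and runtime overhead come on top of the raw weights.
```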
I'd love to see them add something to that form factor.
I do see a lot of iPad Minis out there, but usually, as part of dedicated systems (like PoS, and restaurant systems).
On the other hand, I have heard rumblings that Apple may release an even bigger phone, which I think might be overkill (but what do I know. I see a lot of those monster Samsung beasts, out there).
Not sure that is for me. I still use an iPhone 13 Mini.
I suspect that my next Mac will be a Studio. I guess it will be an M4 Studio.
I wanted to buy a Mini, but they had not updated its processor when I was buying, and it cost way more than a regular iPad at the time; I wanted to be budget-conscious. I still sometimes regret not just going for the Mini, but I know I'll get one sooner or later.
You know what's even funnier: when the Mini originally came out, I made fun of it. I thought it was a dumb concept. Oh, my ignorance.
I have access to multiple iPad sizes and I personally only use the Mini. It's almost perfect. In the last year of its long life cycle you start to feel the age of the processor, but it's still better than holding the larger devices. Can't wait for it to be updated again.
I wish they would stop doing this weird release cycle where some of their tablets don't get the updated chips. It's really frustrating. Makes me hesitant to buy a tablet if I feel like it could get an upgrade a week later or whatever.
It certainly seems less than ideal for pro/prosumer buyers who care about the chips inside.
I would guess that Apple doesn't love it either; one suspects that the weird release cycle is at least partially related to availability of chips and other components.
I probably would have pulled the trigger on a price drop, but at €600+ for an old version, I'm just not that into it, as I really expect it to last many years.
If your back is hurting from the ~1lb extra going from 13" to 15", I would recommend some body weight exercises. Your back will thank you, and you'll find getting older to be much less painful.
Regarding a small iPad, isn't that the iPad mini? 8" vs 7" is pretty close to what you're asking for.
I highly recommend doing pull-ups for your posture and health. It was shocking to me how much the state of my spine improved after doing pull-ups as a daily exercise.
I just have a bar in my apartment in a doorway. Sometimes when I walk by I do 3 - 8 pull ups, then go on my way. Do that a few times a day and you’re doing pretty good. Sometimes I’ll do a few L pull ups as well.
If I’m doing pull ups in the gym I’ll do 3 sets of 7. That’s the most I can do at the moment.
The average HN based-boy apple user has almost negative arm strength. You're asking them to start with pull ups? They need to be able to do a real push-up first!
> Call me crazy, but I want all that power in a 7" tablet. I like 7" tablets most because they feel less clunky to carry around and take with you.
The iPhone Pro Max screen size is 6.7", and the upcoming iPhone 16 Pro Max is rumored to be 6.9" with 12GB of RAM. That's your 7" tablet right there.
The thing is - You're an extreme edge case of an edge case. Furthermore, I'm guessing if Apple did roll out a 7" tablet, you'd find some other thing where it isn't exactly 100% perfectly meeting your desired specifications. For example, Apple is about to release a high powered 6.9" tablet-like device (the iPhone 16 Pro Max) but I'm guessing there's another reason why it doesn't fit your needs.
Which is why companies like Apple ignore these niche use cases and focus on mainstream demands. The niche demands always gain a lot of internet chatter, but when the products come out they sell very poorly.
I’ve got an iPad mini. The main issue is the screen scratches. The other main issue is the screen is like a mirror, so it can’t be used everywhere to watch videos (which is the main thing the iPad is useful for). The third main issue is that videos nowadays are way too dark and you can’t adjust brightness/gamma on the iPad to compensate.
How does one differentiate between cheap and good products? How do I know they actually work? How does one know they don't destroy the coating on Apple devices?
Apple doesn't even sell an official matte screen protector, last I checked.
Currently building an AI creative studio (make stories, art, music, videos, etc.) that runs locally/offline (https://github.com/bennyschmidt/ragdoll-studio). There is a lot of focus on the cloud with LLMs, but I can't see how the cost will make much sense for involved creative apps like video creation. Present-day users might not have high-end machines, but I think they all will pretty soon - this will make them buy them the way MMORPGs made everyone buy more RAM. Especially the artists and creators. Remember, Photoshop was once pretty difficult to run; you needed a great machine.
I can imagine offline music/movies apps, offline search engines, back office software, etc.
I am not a large person by any means, yet I have no problem carrying a 16" MBP... But then I have a backpack and not a messenger-style bag, which, I would agree, would be a pain to carry.
> or Apple’s silicon game is racing far ahead of what I considered possible
Gruber's strange assumption here is that a new number means some major improvements. Apple has never really been consistent about sticking to patterns in product releases.
He spills Apple's secrets. Gruber had him on his podcast once and called him a supervillain in the Apple universe, or something like that. It was cringeworthy.
Wasn't it relatively well known that the M3 is on an expensive process and quickly getting to an M4 on a cheaper/higher yield process would be worth it?
Only the iPad Pro has the M4? Once upon a time during the personal computer revolution of the 1980s, little more than a decade after man walked on the moon, humans had sufficiently advanced technologically that it was possible to compile and run programs on the computers we bought, whether the computer was an Apple (I, II, III, Mac), PC, Commodore, Amiga, or whatever. But these old ways were lost to the mists of time. Is there any hope this ancient technology will be redeveloped for the iPad Pro within the next 100 years? Specifically within Q4 of 2124, when Prime will finally offer deliveries to polar Mars colonies? I want to buy an iPad Pro M117 for my great-great-great-great-granddaughter, but only if she can install a C++ 212X compiler on it.
I've got a Mac Pro paperweight because the motherboard went. It's going to the landfill. I can't even sell it for parts because I can't erase the SSD. If they didn't solder everything to the board you could actually repair it. When I replace my current Dell laptop, it will be with a repairable framework laptop.
It will be easy to break in time. Eventually you'll just be able to use a tool that shines a laser at the right bit and breaks the rate limiting. We've already seen similar attacks on hardware wallets previously thought invulnerable.
I don't think any cryptography has stood the test of time. It's unlikely anything today will survive post-quantum.
Even "repairable" only buys you a few years of repairability that actually makes sense. For example, something similar happened to me: I lost the Mac mobo on a pre-solder-everything model. The only thing is, guess how much a used mobo costs for an old Mac: nearly as much as the entire old Mac in working shape. It makes no sense to repair it once the computer hits a certain age, between the prices of OEM parts and the depreciation of computers.
ok but now get this: what if we started a program where people prepay part of the repair with an initial fee, and then for a couple years they can have their laptop repaired at a reduced, fixed price? That helps secure the supply chain. You could then partner with a retail computer store (or start your own!) and have a network of brick-and-mortar stores with subject-matter experts to perform the repairs as well as more minor troubleshooting etc. It’d basically be like healthcare, but for your computer!
I think if you partnered with a major computer brand, that kind of thing could really be huge. Maybe someone like framework perhaps. Could be a big brand discriminator - bring that on-site service feel to average consumers.
That's basically AppleCare+ already. You pay like $100 upfront to get things like your phone screen fixed for $29 instead of $129. So it works out in your favor if you go through a few phone screens per device. For my past couple of phones I've averaged under one screen per phone's life, so it works in my favor not to get AppleCare+ and just pay out of pocket the few times I do go through a screen.
Get a heat gun and remove the NAND. Then sell the rest of it to a local repair store, or just give it away for free if it's an old Mac Pro. The parts in your Mac Pro are something someone can reuse to restore their own Mac Pro instead of it going to a landfill. Not every part is security related. Also, Apple may take the Mac Pro itself and give you store credit, because they do recycle them.
I don't think you can do that. There was just a video on here last week of a repair shop drilling the memory out, as that was the only way to remove it without damaging the motherboard.
They accept Apple branded computers for recycling if it has no trade in value (they'll try to get you an offer if it has any value). I have recycled damaged apple computers at the store before without trading in.
Now you claim you "went there" and discovered they do accept recycling but only if you mail it.
One of those is necessarily false, since I doubt you went to the Apple Store in between your comments.
However, I suspect both your claims are wrong, because Apple stores absolutely accept old devices to recycle directly. (They also provide mail-in options for people who don't have one they can visit directly.)
From your many comments, it seems like you have an ideological axe to grind that somehow your device can't be recycled, despite abundant evidence to the contrary and lots of people here trying to help you.
If they took it at their store, fine. But if they want me to take an hour to go print a label (I don't have a printer) and then another hour to package it up and ship it, I'll pass.
They also say to erase the data before shipping it - which I can't do.
Unfortunately, that's mostly my conclusion as well. There is also some nonzero share of the time/money/hassle cost that Apple and local municipalities shift onto the owner.
Depending on where you are, a lot of electronics "recyclers" are actually resellers. Some of them are even cheeky enough to turn away electronics they know they can't resell (if they're manned... many are cage drops in the back of, e.g., Staples).
Just because you lack the skills to fix it, doesn't mean it's not repairable. People desolder components all the time to fix phones and ipads and laptops.
That stuff makes it more difficult to work on, but it doesn't make it impossible for Apple to sell replacement motherboards... nor does making a "thin desktop" require soldering on SSDs, M.2 SSDs are plenty thin for any small form factor desktop use case.
It is being optimized, it's just that the optimization is geared towards vacuuming money from brainwashed pockets instead of making a product that's worth the money.
In this case, you need to find a working motherboard without soldered parts to be able to fix it cost-efficiently. Otherwise you need to buy the factory component (at extra cost, with soldered components...).
> M4 makes the new iPad Pro an outrageously powerful device for artificial intelligence.
Isn't there a ToS prohibition on "custom coding" in iOS? Like, the only way you can ever use that hardware directly is as a developer who goes through the Apple Developer Program, which, last time I heard, was a bitter lemon? Tell me if I'm wrong.
Well, this is the heart of the "appliance" model. iPads are appliances. You wouldn't ask about running custom code on your toaster or your blender, so you shouldn't ask about it for your iPad. Also, all the common reasons apply: security and privacy, quality control, platform stability and compatibility, and an integrated user experience. All of these things are harmed when you are allowed to run custom code.
(disclaimer: My personal opinion is that the "appliance" model is absurd, but I've tried to steel-man the case for it)
That may be your personal preference, but you should accept that 99% of people don't care about programming their toaster, so you're very unlikely to ever make progress in this fight.
Could apply this for anything complex and packaged.
I’m annoyed that I can’t buy particular engines off the shelf and use them in my bespoke approach, why dont car manufacturers give the approach that crate engine providers do?
Then I wish you the best of luck in your fight. In the meantime, don't drag me down or tell me that I'm wrong just because you, personally, don't want something that I want that also doesn't harm you in the slightest.
> If you sell me a CPU, I want the power to program it, period.
Uhhh, there are CPUs in your frickin' wires now, dude! There are several CPUs in your car to which you generally don't have access. Ditto for your fridge. Your microwave. Your oven. Even your toaster.
We're literally awash in CPUs. You need to update your thinking.
Now, if you said something like "if you sell me a general-purpose computing device, then I want the power to program it, period" then I would fully agree with you. BTW, you can develop software for your own personal use on the iPad. It's not cheap or easy (doesn't utilize commonly-used developer tooling), but it can be done without having to jump through any special hoops.
Armed with that, we can amend your statement to "if you sell me a general-purpose computing device, then I want the power to program it using readily-available, and commonly-utilized programming tools."
I think that statement better captures what I presume to be your intent.
> but it can be done without having to jump through any special hoops.
You are really stretching the definition of "special hoops" here. On Android, sideloading is a switch hidden in your settings menu; on iOS it's either a jurisdiction-limited feature or a paid benefit of their developer program.
Relative to every single other commercial, general-purpose operating system I've used, I would say yeah, Apple practically defines what "special hoops" look like online.
I do actually want the ability to program the CPUs in my car the same way I'm able to buy parts and mods for every mechanical bit in there down to the engine. In fact we have laws about that sort of thing that don't apply to the software.
I mean this sincerely, are you really an Apple customer then? I feel exactly the same as you, and for that reason I don't buy Apple products. They are honest about what they sell, which I appreciate.
Ever notice people don't build their own cars anymore? They used to even up through the 60's. I mean ordering a kit or otherwise purchasing all the components and building the car. Nowadays it's very rare that people do that.
I'm old enough to remember when people literally built their own computers, soldering iron in hand. People haven't done that since the early 80's.
Steve Jobs' vision of the Mac, released in 1984, was for it to be a computing appliance - "the computer for the rest of us." The technology of the day prevented that. Though they pushed that as hard as they could.
Today's iPad? It's the fulfillment of Steve Jobs' original vision of the Mac: a computing appliance. It took 40 years, but we're here.
If you don't want a computing appliance then don't buy an iPad. I'd go further and argue don't buy any tablet device. Those that don't want computing appliances don't have to buy them. It's not like laptops, or even desktops, are going anywhere anytime soon.
> If you don't want a computing appliance then don't buy an iPad.
If you do want a computing appliance, then there's nothing wrong with having a machine that could be reprogrammed that you simply choose not to reprogram. Please stop advocating for a worse world for the rest of us when it doesn't benefit you in the slightest to have a machine that you don't control.
Stop being so damned melodramatic. I'm not advocating for a "worse world for the rest of us." There are a plethora of choices for machines that aren't appliances. In fact, the overwhelming majority of machines are programmable. Apple thinks the market wants a computing appliance. The market will decide. Meanwhile, you have lots of other choices.
Agree completely. I think it's absurd that they talk about technical things like CPU and memory in these announcements. It seems to me like an admission that it's not really an "appliance" but trying to translate Apple marketing into logical/coherent concepts can be a frustrating experience. I just don't try anymore.
I appreciate the steel-man. A strong counter argument for me is that you actually can run any custom code on an iPad, as long as it's in a web-browser. This is very unlike an appliance where doing so is not possible. Clearly the intention is for arbitrary custom code to run on it, which makes it a personal computer and not an appliance (and should be regulated as such).
That's a fair point, although (steel-manning) the "custom code" in the browser is severely restricted/sandboxed, unlike "native" code would be. So from that perspective, you could maybe expand it to be like a toaster that has thousands of buttons that can make for hyper-specific stuff, but can't go outside of the limits the manufacturer built in.
As with any Apple device — or honestly, any computing device in general — my criteria of evaluation would be the resulting performance if I install Linux on it. (If Linux is not installable on the device, the performance is zero. If Linux driver support is limited, causing performance issues, that is also part of the equation.)
NB: those are my criteria of evaluation. Very personally. I'm a software engineer, with a focus on systems/embedded. Your criteria are yours.
(But maybe don't complain if you buy this for its "AI" capabilities only to find out that Apple doesn't let you do anything "unapproved" with it. You had sufficient chance to see the warning signs.)
There's the potential option of Swift Playgrounds which would let you write / run code directly on the iPad without any involvement in the developer program.
You're not wrong. It's why I don't use apple hardware anymore for work or play. On Android and Windows I can build and install whatever I like, without having to go through mother-Apple for permission.
Generally, I feel that telling a company how to handle a product line as successful as the iPads doesn't make much sense (what does my opinion matter vs. their success), but I beg you: please make Xcode available on iPadOS, or provide an optional and separate macOS mode similar to DeX on Samsung tablets. Being totally honest, I don't like macOS that much in comparison to other options, but we have to face the fact that even with the M1, the iPad's raw performance was far beyond the vast majority of laptops and tablets in a wide range of use cases, yet the restrictive software made that all for naught. Consider that the "average" customer is equally happy with, and due to pricing generally steered towards, the iPad Air, which is a great device that covers the vast majority of use cases essentially as well as the Pro.
Please find a way beyond local transformer models to offer a true use case that differentiates the Pro from the Air (ideally development). The second that gets announced, I'd order the 13-inch model straight away. As it stands, Apple's stance is at least saving me from spending $3.5k, as I've resigned myself to accepting that the best hardware in tablets simply cannot be used in any meaningful way. Xcode would be a start, macOS a bearable compromise (unless they start to address the instability and bugs I deal with on my MBP, which would make macOS more than just a compromise), Asahi a ridiculous yet beautiful pipe dream. Fedora on an iPad: the best of hardware and software, at least in my personal opinion.
"The M4 is so fast, it'll probably finish your Final Cut export before you accidentally switch apps and remember that that cancels the export entirely. That's the amazing power performance lead that Apple Silicon provides." #AppleEvent
The export progress dialog says "Keep Final Cut Pro open until the export is complete", and the standard iPadOS limitation is that background tasks are killed after either 10 minutes or whenever some foreground app wants more RAM. So it's not instantly cancelled, but it's a precarious workflow compared to a Mac.
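For context on why that's precarious: roughly speaking, the general-purpose lever an iPadOS app has is to ask for a short grace period when it gets backgrounded, sketched below with the standard UIKit API (simplified). The system, not the app, decides how long that window is, and it's nowhere near the length of a long export.

```swift
import UIKit

// Simplified sketch of requesting extra background execution time on iPadOS.
// The system decides how long "extra" is and can reclaim it early under
// memory pressure -- hence the precarious export workflow described above.
final class ExportController {
    private var backgroundTask: UIBackgroundTaskIdentifier = .invalid

    func beginExport() {
        backgroundTask = UIApplication.shared.beginBackgroundTask(withName: "export") {
            // Expiration handler: time's up, checkpoint/stop and release the task.
            self.cancelExport()
            self.endBackgroundTaskIfNeeded()
        }
        // ... kick off the actual export work here ...
    }

    func exportFinished() {
        endBackgroundTaskIfNeeded()
    }

    private func cancelExport() { /* stop or checkpoint the export */ }

    private func endBackgroundTaskIfNeeded() {
        guard backgroundTask != .invalid else { return }
        UIApplication.shared.endBackgroundTask(backgroundTask)
        backgroundTask = .invalid
    }
}
```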
The M1 Max that I have is easily the greatest laptop I've ever owned.
It is fast and handles everything I've ever thrown at it (I got 32 GB of RAM). It never, ever gets hot, and I've never heard a fan in 2+ years (maybe a very soft fan if you put your ear next to it). And the battery life is so incredible that I often use it unplugged.
It's just been a no-compromise machine. And I was thinking of upgrading to an M3 but will probably upgrade to an M4 instead at the end of this year when the M4 maxes come out.
Unlike the PC industry, Apple is/was able to move their entire ecosystem to a completely different architecture, essentially one developed exactly for low power use. Windows on ARM efforts will for the foreseeable future be plagued by application support and driver support. It's a great shame, as Intel hardware is no longer competitive for mobile devices.
M1 Max here too. I don't see the point of upgrading if/when the M4 Max comes out, because these updates are too frequent and too small in performance gains to be justified. None of speed, RAM, storage, or features is a limiting factor. They're going to have to create more "only works on X" iDevice lock-in killer features to get people to upgrade artificially and unnecessarily.
That's surprising that you haven't heard the fans. Must be the use case. There are a few games that will get it quite hot and spool up the fans. I have also noticed it's got somewhat poor sleep management and remains hot while asleep. Sometimes I pick up the computer for the first time that day and it's already very hot from whatever kept it out of sleep with a shut lid all night.
Not sure what app you’ve installed to make it do that, but I’ve only experienced the opposite. Every Windows 10 laptop I’ve owned (4 of them) would never go to sleep and turn my bag into an oven if I forgot to manually shut down instead of closing the lid. Whereas my M1 MBP has successfully gone to sleep every lid close.
The Windows 10 image my employer uses for our Dell shitboxes has sleep completely disabled for some reason I cannot possibly comprehend. The only options in the power menu are Shut Down, Restart, and Hibernate.
If I forget to hibernate before I put it in my bag it either burns through its battery before the next day, or overheats until it shuts itself down. If I'm working from home and get up to pee in the night, I often walk past my office and hear the fans screaming into an empty room, burning god knows how much electricity. Even though the only thing running on it was Slack and an editor window.
It's an absolute joke of a machine and, while it's a few years old now, its original list price was equivalent to a very well specced MacBook Pro. I hope they were getting a substantial discount on them.
Now that there's the 13 inch iPad I am praying they remove the display notch on the Macbooks. It's a little wacky when you've intentionally cut a hole out of your laptop screen just to make it look like your phones did 2 generations ago and now you sell a tablet with the same screen size without that hole.
I really hate the notch[0], but I do like that the screen stretches into the top area that would otherwise be empty bezel. It's unsightly, but we did gain from it.
[0] Many people report that they stop noticing the notch pretty quickly, but that's never been the case for me. It's a constant eyesore.
What I've done is use a wallpaper that is black at the top. On the MBP's mini-LED screen that means the black bezel blends almost perfectly into the now-black menu bar. It's pretty much a perfect solution, but the problem it's solving is ridiculous IMO.
I do the same, I can’t see the notch and got a surprise the other day when my mouse cursor disappeared for a moment.
I don’t get the hate for the notch tho. The way I see it, they pushed the menus out of the screen and up into their own dedicated little area. We get more room for content.
It’s like the touchbar for menus. Oh, ok, now I know why people hate it. /jk
> The way I see it, they pushed the menus out of the screen and up into their own dedicated little area. We get more room for content.
Exactly - laughed at first but it quickly made sense if they are prioritizing decent webcam quality. Before my M1 Pro 14 I had a Dell XPS 13 that also had tiny bezels but squeezed the camera into the very thin top bezel. The result was a terrible webcam that I gladly traded for a notch and a better camera quality.
However, that Dell did still fit Windows Hello (face unlock) capability into that small bezel, so the absence of FaceID despite having the notch is a bit shit.
Will be interesting to see how it holds up on devices that get fingerprints and could be scratched though. Sort of wish Apple would offer it as a replaceable screen film.
Apple claimed the M3 was 1.35x the speed of the M2, so an M4 vs. M3 comparison wouldn't look that impressive. Certainly not bad by any means; just pointing out why it's compared to the M2 here.
The other reason it is compared to the M2 is that there are no iPads with M3s in them, so it makes sense to compare to the processor used in the previous generation product.
People make this comment after every single M-series release. It's true for Intel too, worse even: changes between, say, the 8th, 9th, and 10th gen were basically nil, a small clock bump and even the same iGPU.
I was hoping they would come out and say "and now developers can develop apps directly on their iPads with our new release of Xcode" but yeah, no. Don't know if the M4 with just 16GB of memory would be very comfortable for any pro workload.
As an engineer, I find it extremely frustrating to read Apple's marketing speak. It almost sounds like ChatGPT and Star Trek technobabble. Engineers cannot stomach reading the text, and non-engineers won't bother reading it anyway.
What's wrong with plain old bullet points and sticking to the technical data?
The target audience is neither engineers nor general public; these announcements are meant for tech journalists/youtubers etc to refer to when writing or talking about it.
The only real issue is that, aside from the screen eventually wearing out (it already has a bit of flex), I can't imagine a reason to upgrade. It's powerful enough to do anything you'd use an iPad for. I primarily make music on mine; I've made full songs with vocals and everything (although without any mastering - I think that's possible in Logic on iPad).
It's really fun for quick jam sessions, but I can't imagine what else I'd do with it. The I/O is really bad for media creation: you have a single USB-C port (this bothers me the most; the moment that port dies it becomes e-waste), no headphone jack...
I wish it had two USB C ports, one on the bottom and one on the side. Even if they only really were one internally at least you’d have more mounting options.
120 GB/s memory bandwidth. The M4 Max will probably top out at 4x that and the M4 Ultra at 2x that again. The M4 Ultra will be very close to 1TB/s of bandwidth. That would put the M4 Ultra in line with the 4090.
Rumours are that the Mac Studio and Mac Pro will skip M3 and go straight to M4 at WWDC this summer, which would be very interesting. There has also been some talk about an M4 Extreme, but we've heard rumours about the M1 Extreme and M2 Extreme without any of those showing up.
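Spelling out that projection (the Max/Ultra multipliers are guesses based on how earlier M-series tiers scaled, not announced specs):

```swift
// Speculative scaling, mirroring how earlier M-series tiers have scaled.
let m4BandwidthGBs = 120.0
let m4MaxGuess = m4BandwidthGBs * 4   // ~480 GB/s
let m4UltraGuess = m4MaxGuess * 2     // ~960 GB/s
let rtx4090GBs = 1008.0               // published RTX 4090 spec, for comparison

print("M4 Max (guess): \(m4MaxGuess) GB/s")
print("M4 Ultra (guess): \(m4UltraGuess) GB/s vs RTX 4090: \(rtx4090GBs) GB/s")
```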
Well, at the end of the day the processor is bottlenecked by its OS. What real value does an iPad bring that a typical iPhone + Mac combo misses? (Other than being a digital notebook…)
I wound up getting a 2019 iPad Pro for 50% off, so $500 or so. Thought I would use it as a work/play hybrid.
Surprisingly (at least to me), I feel that I've more than gotten my money's worth out of it, despite it being almost strictly a consumption device.
I tote it around the house so I can watch or listen to things while I'm doing other things. It's also nice to keep on the dining room table so I can read the news or watch something while we're eating. I could do every single one of these things with my laptop, but... that laptop is my primary work tool. I don't like to carry it all over the place, exposing it to spills and dust, etc.
The only real work-related task is serving as a secondary monitor (via AirPlay) for my laptop when I travel.
$500 isn't pocket change, but I've gotten 48 months of enjoyment and would expect at least another 24 to 36 months. That's about $6 a month, or possibly more like $3-4 per month if I resell it eventually.
Yeah I had the same experience and ultimately got rid of an M1 iPad Pro for an M3 MacBook Air. I still have the ease and portability for watching videos while doing the dishes or on an airplane, with the added benefit of a keyboard and OS for productivity in case the muse visits.
My wife has a new iPad for grad school, and I'm convinced it's mainly an extra category for some customers to spend more money on if they already have a Mac and iPhone. The school supplied it, then she spent $400+ on the keyboard and other damn dongles to bring the hardware sorta up to par with a laptop, hoping to replace her 2013 MBP.
In the end, she still has to rely on the MBP daily because there's always something the iPad can't do. Usually something small like a website not fully working on it.
This.
If you’re anything other than a digital artist/someone who genuinely prefers writing over typing, an iPad is just an extra tool for you to waste your money on.
I had one of the earlier versions and this was pretty much its only use case…
Maybe I'm getting blasé about the ever improving technical capabilities, but I find the most astounding thing is that the M4 chip, an OLED screen on one side, and aluminium case on the other can fit in 5.1mm!
Pretty much. We've reached peak technical capabilities and hype. Perhaps new sensors, make it sprout propellers and turn into a UAV that follows you around the room and can never be dropped, or at least make it waterproof to 4m.
>> And with AI features in iPadOS like Live Captions for real-time audio captions, and Visual Look Up, which identifies objects in video and photos, the new iPad Pro allows users to accomplish amazing AI tasks quickly and on device.
iPad Pro with M4 can easily isolate a subject from its background throughout a 4K video in Final Cut Pro with just a tap, and can automatically create musical notation in real time in StaffPad by simply listening to someone play the piano. And inference workloads can be done efficiently and privately...
These are really great uses of AI hardware. All of them benefit the user, where many of the other companies doing AI are somehow trying to benefit themselves. AI as a feature vs AI as a service or hook.
The new document-scanning functionality that the camera bump helps enable is really nice. That previously required third-party apps, which started out great (SwiftScan) but then got greedy and turned into monthly subscriptions. I will happily enjoy Apple erasing entire categories of simple apps that turned one-time purchases into monthly subscriptions.
I understand that they delayed the announcement of these iPads until the M4 was ready; otherwise there is nothing interesting to offer to those who have an M2 iPad Pro. I don't see the point of having a MacBook M3 and an iPad M4. If I can't run Xcode on an M4 iPad, the MacBook is the smarter option; it has a bigger screen and more memory, and if you complement it with an iPad Air, you don't miss out on anything.
> while Apple touts the performance jump of the 10-core CPU found inside the new M4 chip, that chip variant is exclusive to the 1 TB and 2 TB iPad Pro models. Lower storage iPads get a 9-core CPU. They also have half the RAM
You're probably correct about it being hard to make a decent iOS app in Swift Playgrounds, but it's definitely not a toy
I use it for work several times per week. I often want to test out some Swift API, or build something in SwiftUI, and for some reason it's way faster to tap it out on my iPad in Swift Playgrounds than to create a new project or playground in Xcode on my Mac — even when I'm sitting directly in front of my Mac
The iPad just doesn't have the clutter of windows and communication open like my mac does that makes it hard to focus on resolving one particular idea
I have so many playground files on my iPad, a quick glance at my project list: interactive gesture-driven animations, testing out time and date logic, rendering perceptual gradients, checking baseline alignment in SF Symbols, messing with NSFilePresenter, mocking out a UI design, animated text transitions, etc
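For a flavor of what those look like, here's a sketch (purely illustrative, not one of the actual playgrounds above) of about the size of experiment that's faster to tap out in Swift Playgrounds than to spin up a fresh Xcode project for:

    import SwiftUI

    // A quick gesture-driven spring animation, small enough to tap out
    // in Swift Playgrounds on an iPad. (Illustrative example only.)
    struct DraggableCard: View {
        @State private var offset: CGSize = .zero

        var body: some View {
            RoundedRectangle(cornerRadius: 24)
                .fill(.blue)
                .frame(width: 160, height: 220)
                .offset(offset)
                .gesture(
                    DragGesture()
                        .onChanged { value in offset = value.translation }
                        .onEnded { _ in
                            // Spring back to the origin when the drag ends.
                            withAnimation(.spring(response: 0.4, dampingFraction: 0.6)) {
                                offset = .zero
                            }
                        }
                )
        }
    }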
Why would you need it? Modern iPads have thunderbolt ports (minimally USB-C) and already allow keyboards, network adapters, etc. to be connected. It would be like an iMac without the stand and an option to put it in a keyboard enclosure. Sounds awesome.
It might work if running in Mac mode required a reboot (no on the fly switching between iOS and macOS) and a connected KB+mouse, with the touch part of the screen (aside from Pencil usage) turning inert in Mac mode.
Otherwise yes, desktop operating systems are a terrible experience on touch devices.
> It might work if running in Mac mode required a reboot (no on the fly switching between iOS and macOS) and a connected KB+mouse, with the touch part of the screen (aside from Pencil usage) turning inert in Mac mode.
Sounds like a strictly worse version of a MacBook. Might be useful for occasional work, but I expect people who would use this mode continuously would just switch to a MacBook.
The biggest market would be for travelers who essentially want a work/leisure toggle.
It’s not too uncommon for people to carry both an iPad and MacBook for example, but a 12.9” iPad that could reboot into macOS to get some work done and then drop back to iPadOS for watching movies or sketching could replace both without too much sacrifice. There’s tradeoffs, but nothing worse than what you see on PC 2-in-1’s, plus no questionable hinges to fail.
This is what I want, but with an iPhone (with an iPad would be cool, too). Sell me some insanely expensive dock with a USB-C display port output for a monitor (and a few more for peripherals) and when the phone is plugged in, it becomes macOS.
Maybe this June there'll be an announcement, but like Lucy with the football, I'm not expecting it. I would instabuy if this was the case, especially with a cellular iPad.
I just went to store.apple.com and specced out a 13" iPad Pro with 2TB of storage, nano-texture glass, and a cell modem for $2,599.
MacBook Pros start at $1,599. There's an enormous overlap in the price ranges of the mortal-person models of those products. It's not like the iPad Pro is the cheap alternative to a MBP. I mean, I couldn't even spec out a MacBook Air to cost as much.
> M4 has Apple’s fastest Neural Engine ever, capable of up to 38 trillion operations per second, which is faster than the neural processing unit of any AI PC today.
How useful is this for free libraries? Can you invoke it from your Python or C++ code in a straightforward manner? Does it rely on proprietary drivers only available for their own OS?
It's a piece of hardware, so it'll have its own driver, but you can access it through the OS APIs just like any other piece of hardware. What's new is the improvement in speed; Apple has exposed its NPUs for years. You can, for example, run Stable Diffusion on them.
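For anyone wondering what "access it through the OS APIs" looks like in practice, here is a minimal Core ML sketch; "MyModel.mlmodelc" is a placeholder path, and Core ML still decides per layer whether the ANE, GPU, or CPU actually executes the work:

    import CoreML

    // Minimal sketch: ask Core ML to prefer the Neural Engine.
    // "MyModel.mlmodelc" is a placeholder for a compiled model;
    // Core ML decides per layer where the work actually runs.
    let config = MLModelConfiguration()
    config.computeUnits = .cpuAndNeuralEngine   // or .all to let Core ML choose freely

    let modelURL = URL(fileURLWithPath: "MyModel.mlmodelc")
    let model = try MLModel(contentsOf: modelURL, configuration: config)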
So long as it lets me play some of the less-intense 00's-10's era PC games in some sort of virtualization framework at decent framerates one day, and delivers great battery life as a backend web dev workstation-on-the-go the next, it's a good chip. The M2 Pro does.
Does anyone know how much of this "giant leap" in performance, as Apple puts it, is really useful and perceptible to end users of the iPad? I am thinking of gaming and art applications on the iPad. What other major iPad use cases out there need this kind of performance boost?
Making music. The iPad is much better for performing than a computer. There is a huge range of instruments, effects, sequencers, etc. available on the iPad. Things like physical modeling and chained reverb can eat up processor cycles so more performance is always welcomed.
Both Final Cut Pro and DaVinci Resolve can also use as much power as you can give them, though it isn't clear to me why you'd use an iPad instead of a Mac. They also announced a crazy multicam app for iPads and iPhones that allows remote control of a bunch of iPhones at the same time.
I have a 3rd gen iPad Pro 12.9 for reading and other light activity. I haven't found any reason to upgrade for the past few years. I don't see myself getting another iPad unless this one dies or if Apple actually unlocks the potential of the hardware.
I'm guessing that the "ML accelerator" in the CPU cores means one of ARM's SME extensions for matrix multiplication. SME in ARM v8.4-A adds dot product instructions. v8.6-A adds more, including BF16 support.
Apple has the NPU (also called the Apple Neural Engine), which is dedicated hardware for running inference. It can't be used for LLMs at the moment, though maybe the M4 will be different. They also have a vector processor attached to the performance cluster of the CPU; they call its instruction set AMX. I believe that one can be leveraged for faster LLM inference.
Clever wording on their part: 2x performance per watt over M2. Took me a minute; I had to reason through the fact that this is their 2nd-generation 3nm chip, so it wasn't simply from a die shrink, and then go spelunking.
This claim can only be evaluated in the context of a specific operating point. I can 6x the performance per watt of the CPU in this machine I am using by running everything on the efficiency cores and clocking them down to 1100MHz. But performance per watt is not the only metric of interest.
It surprised me that they called it an M4 rather than an M3-something. The display engine seems to be the largest change, though I don't know what that looked like on previous processors. Completely hypothesizing, but it could be a significant efficiency improvement if it's offloading display work.
I'd rather they just keep counting up than some companies where they get into wonky product line naming convention hell.
It's OK whether 3 to 4 is or isn't a big jump; that it's the next one is really all I want to know. If I need to peek at the specs, the name won't tell me anything anyhow and I'll be on a webpage.
All I want is more memory bandwidth at lower latency. I've learnt that's the vast majority of felt responsiveness today. I couldn't care less about AI and Neural Engine party tricks, stuff I might use once a day or week.
Haven't they been announcing Pros and Max's around December? I don't remember. If they're debuting them at WWDC I'll definitely upgrade my M1. I don't even feel the need to, but it's been 2.5 years.
So is the iPad mini abandoned because the profit margins are too small, or what? I wish they'd just make it clear so I could upgrade without worrying that a mini replacement will come out right after I buy something. And I don't really understand why there are so many different iPads now (Air/Pro/Standard). It just feels like Apple is slowly becoming like Dell: offer a bunch of SKUs and barely differentiated products. I liked it when Apple had fewer products, but each had a more distinct purpose.
They are talking about iPad Pro as the primary example of M4 devices. But iPads don't really seem to be limited by performance. Nobody I know compiles Chrome or does 3D renders on an iPad.
It’s all marketing toward people who aspire to be these creative types. Very few people actually need it, but it feels good, and the iPad Air is missing a few key features that push you to the Pro.
More practically, it should help with battery life. My understanding is that energy usage scales non-linearly with demand: a more powerful chip running at 10% may be more battery-efficient than a less powerful chip running at 20%.
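A toy "race to idle" calculation shows the shape of that argument (every number below is invented for illustration, not a measurement of any real chip):

    // Toy "race to idle" comparison: energy = power x time.
    // All numbers are made-up illustrations, not measured figures.
    struct Chip {
        let activePower: Double   // watts while working
        let idlePower: Double     // watts while idle
        let throughput: Double    // work units per second
    }

    func energy(for chip: Chip, work: Double, window: Double) -> Double {
        let busy = work / chip.throughput                        // seconds spent working
        let idle = max(0, window - busy)                         // seconds spent idle
        return chip.activePower * busy + chip.idlePower * idle   // joules
    }

    let slow = Chip(activePower: 4, idlePower: 0.3, throughput: 50)
    let fast = Chip(activePower: 6, idlePower: 0.3, throughput: 150)

    // Same workload over the same 10-second window:
    print(energy(for: slow, work: 200, window: 10))  // 4*4 + 0.3*6 = 17.8 J
    print(energy(for: fast, work: 200, window: 10))  // 6*1.33 + 0.3*8.67 = ~10.6 J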
~38 TOPS at fp16 is amazing, if the quoted number is fp16 (the ANE is fp16 according to this [1], but that honestly seems like a bad choice when people are going smaller and smaller even on the higher-end datacenter cards, so I'm not sure why Apple would use it instead of fp8 natively).
Interesting; it seems Apple knows they are up against the wall where there isn't really much more their devices NEED to do. The cases Apple gives for using these devices' incredible compute are very marginal. Their phones, computers, and iPads are fantastic and mainly limited by physics. I have basically one of everything, love it all, and do not feel constrained by the devices of the last 3-4 years. Vision and Watch still leave room for improvement, but those are small lines. Limited opportunity to innovate, hence the Vision being pushed to market without a real path forward for customer need/use. Very few people read about the latest M4 iPad and think "oh wow, my current iPad can't do that..."
Curious what steps they will take, or if they shouldn't just continue returning large amounts of cash to shareholders.
I got somewhat accustomed to new outrageous specs every year, but reading near the end that by 2030 Apple plans to be "carbon neutral across the entire manufacturing supply chain and life cycle of every product" makes me hope one day my devices are not just an SUV on the data highway.
I don't think they included it in the video, but https://www.apple.com/ipad-pro/specs/ says it's 8 GB of RAM in 256/512 GB models, 16 GB RAM in 1/2 TB ones.
If you are not in a hurry, you should almost never buy new hardware, as the next generation will be around the corner. On the other hand, it could be up to 12 months until the M4 is available across the line. And for most tasks, an M3 is great value too. One might watch how many AI features that would benefit from an M4 are presented at WWDC. But then, the next macOS release won't be out before October.
The MacBook Airs with M3 launched two months ago. Two months is really not that long ago, even in the Apple universe. For sure, I'm waiting to see what happens at WWDC!
Given that recent Apple laptops already have solid all-day battery life, with such a big performance per watt improvement, I wonder if they'll end up reducing how much battery any laptops ship with to make them lighter.
No, because battery life isn't just about the CPU. The CPU sits idle most of the time and when it's not idle, it's at workloads like 20% or whatever. It's the screens that eat batteries because they're on most or all of the time and sucking juice. Look at Apple's docs and you'll see the battery life is the exact same as the previous model. They have a battery budget and if they save 10% on CPU, they give that 10% to a better screen or something. They can't shrink the battery by half until they make screens twice as efficient, not CPUs which account for only a small fraction of power draw.
I'm just venting that their processor strategy doesn't make much sense. The iPad gets the M4, but the Mini and Studio and Mac Pro are still on M2 and the MacBooks are on M3.
They've essentially undercut every Mac they currently sell by putting the M4 in the iPad and most people will never use that kind of power in an iPad.
If you are going to spend $4k on a Mac don't you expect it to have the latest processor?
Probably 80%+ of the population can do everything they need or want to do for the next 5 (maybe even 8) years on an M2 Air available for less than $1,500.
I write this on a $1,000 late 2015 Intel MacBook Air.
Honestly, the only reason I want a Studio is that I run several monitors and my Mac Mini can't drive all of them unless I use DisplayLink, which doesn't allow me to play any HDCP-protected content and is just glitchy and hacky in general.
For the past 10 years I think you'd be correct, but we are currently entering the AI age. The base M4 has 38 TOPS (trillion operations per second), and without a more recent chip we aren't going to be able to run, on device and with low latency, the AI models they will surely be releasing this summer. So I don't think things are as future-proof as they used to be.
But that’s not really the point. The point is that I don’t want to spend $4k to buy a Mac Studio with an M2 chip while the M3 MacBook Pro has on-par performance and the iPad has an M4. Apple should come up with a better update strategy than randomly updating devices based on previous update cycles.
People with a MacBook. You use the MacBook for work and the iPad to play, read, watch movies, draw, etc. Plus you can use it as a second monitor for the MacBook.
I'm most interested in what this means for the next Vision device.
Half the power budget could well translate to a very significant improvement in heat on the device, battery size and other benefits. Could they even dispense with the puck and get the battery back onto the headset for a consumer version that runs at slightly lower resolution and doesn't have the EyeSight feature?
If they could do that for $2000 I think we'd have a totally different ball game for that device.
If the M4 Max could finally break the 400 GB/s limit of the past few years and hit 600 GB/s, it would be huge for local AI since it could directly translate into inference speedups.
The M4 has increased the bandwidth from 100 to 120 GB/s. The M4 Max would probably be 4x that at 480 GB/s, but the M4 Ultra would be 960 GB/s compared to M2 Ultra at 800 GB/s.
Dang. +20% is still a nice difference, but not sure how they did that. Here's hoping M4 Max can include more tech, but that's copium.
960 GB/s is 3090 level, so that's pretty good. I'm curious whether the MacBooks right now are actually more compute-limited due to tensor throughput being relatively weak; not sure about real-world performance.
> M4 has Apple’s fastest Neural Engine ever, capable of up to 38 trillion operations per second, which is faster than the neural processing unit of any AI PC today.
I didn't even realize there was other PC-level hardware with AI-specific compute. What's the AMD and Intel equivalent of the Neural Engine? (Not that it matters, since it seems the GPU is where most of the AI workload is handled anyway.)
I own M1, A10X and A12X iPad Pros. I have yet to see any of them ever max out their processor or get slow. I have no idea why anyone would need an M4 one. Sure, it's because Apple no longer has M1s being fabbed at TSMC. But seriously, who would upgrade?
Put macOS on the iPad Pro, then it gets interesting. The most interesting things my iPad Pros do are look at security cameras and read OBD-II data from my vehicle. Hell, they can't even maintain an SSH connection correctly. Ridiculous.
I see Apple always show videos of people editing video on their iPad Pro. Who does that??? We use them for watching videos (kids). One is in a car as a mapping system, which is a solid use case. One I gave my Dad and he didn't know what to do with it, so it's collecting dust. And one lives in the kitchen doing recipes.
Functionally, a 4 year old Chromebook is 3x as useful as a new iPad Pro.
Seems like we are clearly in the _“post peak Apple”_ era now. This update is just for the sake of update; iPad lineup is (even more) confusing; iPhone cash-cow continues but at slower growth rate; new product launches - ahem! Vision Pro - have seen low to negligible adoption; marginal improvements to products so consumers are holding out longer on to their devices.
What's the endgame with iPads though? I mainly use mine for consumption, taking notes, and jotting annotations on PDFs. Well, it's a significant companion for my work, but I can't see any reason to upgrade from the iPad Air 5, especially given the incompatibility with the 2nd-gen Pencil.
The only reason I'm buying a new iPad Pro is the screen and because the battery on my 2021 iPad Pro is slowly dying.
I couldn't care less that the M4 chip is in the iPad Pro ... all I use it for is browsing the web, watching movies, playing chess, and posting on Hacker News (and some other social media as well).
> the latest chip delivering phenomenal performance to the all-new iPad Pro
What a joke. They have M4 and they still run iOS? Why can't they run MacOS instead?
To take it a bit deeper: if an iPad had a keyboard, a mouse and macOS, it would basically be a 10/12-inch MacBook.
Built into an SoC*, that is. E.g. a 2060 from 5 years ago had double that in its tensor cores; it just wasn't part of an SoC. A great improvement for its type/application, but a dubious marketing claim with that wording.
And it's really not that great, even though it's a welcome improvement. Microsoft is pushing 45 TOPS as the baseline; you just can't get that in an APU yet (well, at least for the next couple of months).
I dread using a Mac for any serious work, you lose a lot of the advantages of Linux (proper package and window management, native containers, built-in drivers for every hardware out there, excellent filesystems support, etc). And you get what exactly?
I don't need to fiddle with settings and as a developer all my tools are there (git, clang, ...). I don't need to care about the version of my kernel and can be worry free to click update.
Worry-free updates have not been a thing with macOS in ages. I dread any OS update, since it will take ages at the very least and is very likely to bring more bugs than the ones it was supposed to fix.
And sometimes they outright remove support for useful stuff (subpixel rendering, hello) or obsolete still-useful software just because they decided to (32-bit apps, hello).
I don't know how people can still say things like that about the Mac; even Windows is better these days.
All this powerful hardware in a laptop computer is like driving a Ferrari at 40 mph. It is begging for better use. If Apple ever releases an AI robot, that's going to change everything. A long way to go, but when it arrives, it will be ChatGPT x100.
It's a 10-core CPU + 10-core GPU + 16-core "NPU" (neural processing unit) for AI all on a consumer handheld. It's like a Ferrari engine in a Honda Civic - all we know is it's going to be fast and hopefully it doesn't catch on fire.
When this arrives in MacBooks, what would that mean in practice? Assuming base M4 config (not max, not ultra - those were already powerful in earlier iterations), what kind of LLM could I run on it locally?
Anything up to the memory configuration it is limited to. So for the base-model M4, that likely means you have 8 GB of memory with 4-5 GB of it realistically usable.
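A rough rule of thumb for what that fits (a sketch: the weights alone take roughly parameters times bits-per-weight divided by 8, ignoring KV cache and runtime overhead, so treat these as lower bounds):

    // Rough weight footprint of a quantized LLM: parameters * bits / 8.
    // Ignores KV cache, activations, and runtime overhead.
    func weightGB(paramsBillions: Double, bitsPerWeight: Double) -> Double {
        paramsBillions * 1e9 * bitsPerWeight / 8 / 1e9
    }

    print(weightGB(paramsBillions: 7,  bitsPerWeight: 4))   // ~3.5 GB: tight but plausible with 4-5 GB free
    print(weightGB(paramsBillions: 13, bitsPerWeight: 4))   // ~6.5 GB: doesn't fit an 8 GB device comfortably
    print(weightGB(paramsBillions: 7,  bitsPerWeight: 16))  // ~14 GB: needs a higher-memory configuration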
Probably they had some contractual commitments with TSMC and had to use up their N3B capacity somehow. But as soon as N3E became available it’s a much better process overall.
Ramping up production on a new die also takes time. The lower volume and requirements of the M4 as used in the iPad can give them time to mature the line for the Macs.
They don't care because they can't make money off it. They know very well that, if they supported Vulkan, Steam would outsell their App Store offering so greatly (with good reason) that they don't see the point.
They push their Metal API not just for the hardware matchup but because it allows them to trap developers into optimizing for their technology only and getting them to sell their games in the App Store.
Because the minute devs start looking into options for cross-platform dev/deploy their Metal/App Store combo is just not very enticing compared to the alternative.
But they would rather lie about caring for this use case, and they would need to have better GPUs in the first place; at least it looks like that should be improved with the M4.
For all the big talk they give, their GPU in the entry level Apple Silicon chip doesn't even match the power of a current gen console (either PS5 or Xbox series X have better GFLOPS).
The Max versions are better but they only come in devices starting at over 2.5k€ starting price (still with lots of compromises most likely) and those get absolutely trounced by hardware at this price point that features dedicated GPUs, obviously.
But they will very much remind you about how great battery life is, because it is all that matters, right? right? (Desktops, hello?)
Unless you can justify one of the Max variants for other things, there is not much point bothering with it. If you really want a Mac, it will still be better, faster and cheaper to just buy the most basic Mac you can get away with and get a console or a custom PC with the rest of the money.
It is very sad, but this is the state of things. Gaming on Apple was never very good, but at least before Apple Silicon there were some variants of iMacs/MBPs that were decent enough and not breaking the bank too much (with Windows support for even better performance).
Why is Apple hurrying to push the M4 before the A18 Pro?
Who can support the hypotheses below?
1) M3 follows M2 and A16 Pro in part, and
2) M4 follows M2 and A17 Pro.
I always wonder how constraining it is to design these chips subject to thermal and energy limitations. I paid a lot of money for my hardware and I want it to go as fast as possible. I don't want my fans to be quiet, and I don't want my battery life to be 30 minutes longer, if it means I get more raw performance in return. But instead, Apple's engineers have unilaterally decided to handicap their own processors for no real good reason.
Thermal load has been a major limiting design factor in high end CPU design for two decades (remember Pentium 4?).
Apart from that, I think you might be in a minority if you want a loud, hot iPad with a heavy battery to power all of this (for a short time, because physics). There are plenty of Windows devices that work exactly like that, though, if that's really what makes you happy. Just don't expect great performance either, because of the diminishing returns of using higher power and also because the chips in these devices usually suck.
Most of what Apple sells goes into mobile devices: phone, tablet, laptop. In their prior incarnation, they ran up real hard against the thermal limits of what they could put in their laptops with the IBM PowerPC G5 chip.
Pure compute power has never been Apple's center of gravity when selling products. The Mac Pro and the XServe are/were minuscule portions of Apple's sales, and the latter product was killed after a short while.
> Apple's engineers have unilaterally decided to handicap their own processors for no real good reason
This is a misunderstanding of what the limiting factor is in Apple products' capability. The mobile devices all have the battery as the limiting factor. The processors being energy-efficient in compute-per-watt isn't a handicap, it's an enabler. And it's a very good reason.
> I don't want my fans to be quiet, and I don't want my battery life to be 30 minutes longer
I agree with you. I don’t want fans to be quiet, I want them completely gone. And with battery life too, not 30 minutes, but 300 minutes.
Modern chips are plenty fast, developers need to optimize their shit instead of churning crapware.
Yeah, one of my biggest frustrations as a person who likes keeping around both recent-ish Mac and Windows/Linux laptops is that x86 laptop manufacturers seem to have a severe allergy to building laptops that are good all-rounders… they always have one or multiple specs that are terrible, usually heat, fan noise, and battery life.
Paradoxically this effect is the worst in ultraportables, where the norm is to cram in CPUs that run too hot for the chassis with tiny batteries, making them weirdly bad at the one thing they’re supposed to be good at. Portability isn’t just physical size and weight, but also runtime and if one needs to bring cables and chargers.
On that note, Apple really needs to resurrect the 12” MacBook with an M-series or even A-series SoC. There’d be absolutely nothing remotely comparable in the x86 ultraportable market.
Part of me wonders if the reason Apple went with an unusual double jump in processor generations is that they fear, or are at least trying to delay, comparison with other desktop-class ARM processors. I wonder if the Mac lineup will get the M4 at all or start with the M4 Pro or something. We'll see.
Doesn't add up: if you fear that something about to launch will outdo your product in a given segment, then you push your launch earlier, not later, so your competitor isn't the first to market.
Yeah. It basically specialises in fast matrix operations, so it's focused on AI and ML. Vaguely like a GPU minus the graphics output and the pipeline needed for that.
It's really saying something about how the tech sector has shifted due to the recent AI wave that Apple is announcing a chipset almost entirely apart from a product.
This has never happened, to my knowledge, in this company's history? I could be wrong though; even the G3/G4s were launched as PowerMacs.
Why would it be? They announced the same for the A17 in the iPhone. Turns out it was a gimmick that caused over 11W of power draw. Raytracing is a brute force approach that cannot be optimized to the same level as rasterization. For now at least, it is unsuitable for mobile devices.
Now if we could use the RT units for Blender that'd be great, but it's iPad OS...
It is kind of crazy to look back on. In the future we might look forward to path tracing and more physically accurate renderers. (Or perhaps all the lighting will be hallucinated by AI...?)
> M4 makes the new iPad Pro an outrageously powerful device for artificial intelligence
Yeah, well, I'm an enthusiastic M3 user, and I'm sure the new AI capabilities are nice, but hyperbole like this is just asking for snark like "my RTX4090 would like a word".
Other than that: looking forward to when/how this chipset will be available in Macbooks!
No, when using wording like "outrageously powerful", that's exactly the comparison you elicit.
I'd be fine with "best in class" or even "unbeatable performance per Watt", but I can absolutely guarantee you that an iPad does not outperform any current popular-with-the-ML-crowd GPUs...
This is true, but that is only an advantage when running a model larger than the VRAM. If your models are smaller, you'll get substantially better performance in a 4090. So it all comes down to which models you want to run.
It seems like 13B was running fine on the 4090, but all the more fun or intelligent ones I tried became very slow and would have performed better on the M3.
Yes, M3 chips are available with 36GB unified RAM when embedded in a MacBook, although 18GB and below are the norm for most models.
And even though the Apple press release does not even mention memory capacity, I can guarantee you that it will be even less than that on an iPad (simply because RAM is very battery-hungry and most consumers won't care).
So, therefore my remark: it will be interesting to see how this chipset lands in MacBooks.
Everyone seems as confused as I am about Apple's strategy here. I wasn't sure the M4 existed, and now it can be bought in a format no one wants. How will this bring in a lot of revenue?
Is it just me or is there not a single performance chart here? Their previous CPU announcements have all had perf-per-watt charts, and that's conspicuously missing here. If this is an improvement over previous gens, wouldn't they want to show that off?
Since Intel->M1 the performance gains haven't been the headliners they once were, although the uplifts haven't been terrible. It also lets them hide behind the more impressive sounding multiplier which can reference something more specific but not necessarily applicable to broader tasks.
“The next-generation cores feature improved branch prediction, with wider decode and execution engines for the performance cores, and a deeper execution engine for the efficiency cores. And both types of cores also feature enhanced, next-generation ML accelerators.”
I thought the same thing when I read it, but considering the attacks were known when this improvement was made, I hope it was taken into consideration during the design.
I'm hoping the higher efficiency gains and improved thermals offset that. The efficiency cores tend to have more impact on the Macs where multitasking is heavier.
Glad I held out. The M4 is going to put downward pressure on prices across all previous generations.
edit: nvm, AMD is coming out with twice the performance of the M4 in two months or less. If the M2s become super cheap I will consider one, but the M4 came far too late. There are just way better alternatives now and very soon.
> AMD is coming out with twice the performance of M4 in two months or less
M4 Pro/Max/Ultra variants with double or more the performance, just by scaling cores, are probably also going to be announced at WWDC in a month, when they also announce their AI roadmap.
Why does this feel like it was hastily added at the last minute? Developing a chip like the M4 presumably takes years, so hastily incorporating 'AI' to meet the hype and demand could inevitably lead to problems.
To me, cutting wattage in half is not interesting, but doubling performance is interesting. So performance per watt is actually a pretty useless metric since it doesn't differentiate between the two.
Of course efficiency matters for a battery-powered device, but I still tend to lean towards raw power over all else. Others may choose differently, which is why other metrics exist, I guess.
Huh, never considered cooling. I suppose that contributes to the device's incredible thinness. Generally thin-and-light has always been an incredible turnoff for me, but tech is finally starting to catch up to thicker devices.
Thin and light is easier to cool. The entire device is a big heat sink fin. Put another way, as the device gets thinner, the ratio of surface area to volume goes to infinity.
If you want to go thicker, then you have to screw around with heat pipes, fans, etc, etc, to move the heat a few cm to the outside surface of the device.
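To put numbers on it: for a slab with footprint area A and thickness t, the two faces dominate, so surface area is roughly 2A while volume is A*t, giving a ratio of about 2/t that grows without bound as the device gets thinner. A quick sketch with assumed dimensions (the footprint and thicknesses below are rough guesses for illustration):

    // Surface-area-to-volume ratio of a thin slab: SA/V ~ 2A / (A*t) = 2/t.
    func surfaceToVolume(footprint area: Double, thickness t: Double) -> Double {
        2 * area / (area * t)   // simplifies to 2 / t
    }

    print(surfaceToVolume(footprint: 0.28 * 0.22, thickness: 0.0051))  // ~392 per metre at 5.1 mm
    print(surfaceToVolume(footprint: 0.28 * 0.22, thickness: 0.016))   // ~125 per metre at 16 mm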
That's not why thin-and-light bothers me. Historically, ultrabooks and similarly thin-and-light focused devices have been utterly insufferable in terms of performance compared to something that's even a single cm thicker. But Apple Silicon seems extremely promising, it seems quite competitive with thicker and heavier devices.
I never understood why everyone [looking at PC laptop manufacturers] took thin-and-light to such an extreme that their machines became basically useless. Now Apple is releasing thin-and-light machines that are incredibly powerful, and that is genuinely innovative. I hadn't seen something like that from them since the launch of the original iPhone; that's how big I think this was.
That's not exactly true; just the other day Snazzy Labs complained about an M3 Max MBP throttling and making a lot of fan noise.
Those are barely competitive with the heavier but more powerful gaming/creation laptops Apple's aficionados keep deriding (if it has a 4090 it's not even competitive).
They have focused on mobility (power consumption and size) at the cost of everything else.
For this exact reason their desktop offering is really not competitive with offerings at the same prices in the PC world. The only things they do better are size (considering you can build a 3-4L top-of-the-line PC, the Mac Studio isn't even impressive) and power consumption. But who actually cares? Even if you use a desktop heavily, its consumption is dwarfed by most other uses in a typical house, so whatever.
Thin and light are indeed a small use case overall and people who care about that have a ton of decently good options in the PC world already. It's not like the performance per watt benefits of Apple Silicon is really that relevant to most potential customers. If one is content enough with such a laptop, using it for the typical light task, thin and light PC laptops are just small enough, silent enough and have good enough battery life for the most part.
It means a lot to me, because cutting power consumption in half for millions of devices means we can turn off power plants (in aggregate). It’s the same as lightbulbs; I’ll never understand why people bragged about how much power they were wasting with incandescents.
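For what it's worth, the back-of-envelope version of that claim (every input below is an assumption for illustration, not a measurement):

    // Aggregate savings = devices * average watts saved per device.
    // Both inputs are made-up assumptions to show the shape of the argument.
    let devices = 100_000_000.0      // assumed installed base benefiting from the new chip
    let avgWattsSaved = 1.0          // assumed average continuous power saved per device
    let savedMW = devices * avgWattsSaved / 1_000_000
    print("Aggregate savings: \(savedMW) MW")   // 100 MW under these assumptions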
>cutting power consumption in half for millions of devices means we can turn off power plants
It is well known that software inefficiency doubles every couple of years; that is, the same scenario will take 2x as much compute across the entire software stack (not a disembodied algorithm, which will indeed be faster).
The extra compute will be spent on a more abstract UI stack or on new features, unless forced by physical constraints (e.g. inefficient batteries of early smartphone), which is not the case at present.
That's weird - if software gets 2x worse every time hardware gets 2x better, why did my laptop in 2010 last 2 hours on battery while the current one lasts 16 doing much more complex tasks for me?
Elsewhere in the comments, it is noted Apple's own estimates are identical despite allegedly 2x better hardware.
As an aside, 2 hours is very low even for 2010. There's a strong usability advantage in going to 16. But going from 16 to 128 won't add as much. The natural course of things is to converge on a decent enough number and "spend" the rest on more complex software, a lighter laptop, etc.
I have dimmable LED strips around my rooms, hidden by cove molding, reflecting off the whole ceiling, which becomes a super diffuse, super bright “light”.
I don’t boast about power use, but they are certainly hungry.
For that I get softly diffuse lighting with a max brightness comparable to outdoor clear-sky daylight. Working from home, this is so nice for my brain and depression.
First, only CPU power consumption is reduced, not that of other components; second, I doubt tablets contribute significantly to global power consumption. So I think no power plants will be turned off.
Intel to M1 is an entire architectural switch where even old software couldn't be run and had to be emulated.
This is a small generational upgrade that doesn't necessitate an event.
Other companies started having events like this because they were copying Apple's amazing events. Apple's events now are just parodies of what Apple was.
I think they are over-engineering it. I have never liked gestures because it's difficult to discover something you can't see. A button would have been better than an invisible squeeze gesture.
I used Android phones forever until the iPhone 13 came out, and I switched to iOS because I had to de-Google my life completely after they (for no reason at all, "fraud" that I did not commit) blocked my Google Play account.
The amount of things I have to google to use the phone how I normally used Android is crazy. So many gestures required with NOTHING telling you how to use them.
I recently sat around a table with 5 of my friends trying to figure out how to do that "Tap to share contact info" thing. Nobody at the table, all long-term iOS users, knew how to do it. I thought that if we tapped the phones together it would give me some popup on how to finish the process. We tried all sorts of tapping variations until we realized we had to unlock both phones.
And one of the people there with the same phone as me (13 pro) couldn't get it to work at all. It just did nothing.
And the keyboard. My god is the keyboard awful. I have never typoed so much, and I have no idea how to copy a URL out of Safari to send to someone without using the annoying Share button which doesn't even have the app I share to the most without clicking the More.. button to show all my apps. Holding my finger over the URL doesn't give me a copy option or anything, and changing the URL with their highlight/delete system is terrible. I get so frustrated with it and mostly just give up. The cursor NEVER lands where I want it to land and almost always highlights an entire word when I want to make a one letter typo fix. I don't have big fingers at all. Changing a long URL that goes past the length of the Safari address bar is a nightmare.
I'm sure (maybe?) that's some option I need to change but I don't even feel like looking into it anymore. I've given up on learning about the phones hidden gestures and just use it probably 1/10th of how I could.
Carplay, Messages and the easy-to-connect-devices ecosystem is the only thing keeping me on it.
Oh, I see what I'm doing wrong now. If I hold it down it highlights it and gives me the "edit letter/line" thing, and then if I let go I get the Copy option. I guess in the past I've seen it highlight the word and just stopped before it got to that point.
No mention of battery life. They keep making stuff thin, and unupgradeable. What's the point of buying an apple device that is going to wear out in 5 years?
That will also be true for any laptop that is not bottom of the barrel and got taken care of correctly by its owner.
I have non-techie friends using Windows laptops that have greatly outlived any Mac laptop (because of software support, anyway), and as long as they are content with the performance they still work just fine.
MacBooks lasting longer is just a meme at this point; any laptop of comparable price/performance will last just as long if you don't trash it for some reason.
Are these M-class chips available to be purchased on Digi-Key and Mouser? Do they have data sheets and recommended circuitry? I’d love to play with one just to see how difficult it is to integrate compared to, say, an stm8/32 or something.
As a professional EE, I know that ARM Cortex-M4 is not a chip. It's an embedded processor that is put into an SOC (which is a chip), such as the STM32-family from ST.