It’s really a shame to put such a good CPU in such an artificially limited phone. Apple already has all the elements required for effective convergence but would rather have us lug around gimped devices out of greed. The current situation is so sad for actual innovation.
I don’t want a MacBook with an A19, I want to use the A19 I already have connected to a screen and a keyboard with a proper software stack.
The vast majority of the market didn't need an iPhone at all until it was introduced. If you enable compelling enough use cases people will find ways for them to fit their workflows, and/or come up with brand new ones.
The biggest hurdle standing in the way is that making a single one of iPhone/iPad/Mac too powerful (in terms of usability, not just raw processing power) will take away sales from the other two.
> The vast majority of the market didn't need an iPhone at all until it was introduced. If you enable compelling enough use cases people will find ways for them to fit their workflows, and/or come up with brand new ones.
It was a compelling value proposition to the CrackBerry users of the era; Apple clearly did market research before flinging it to the masses. Danger Inc / Google were already converging on this with their first Android.
> The biggest hurdle standing in the way is that making a single one of iPhone/iPad/Mac too powerful (in terms of usability, not just raw processing power) will take away sales from the other two.
Cannibalising their product line is probably the most plausible explanation, but I'm sceptical. I reckon it's just that they haven't found a "killer" use case for it, and it's probably at the bottom of the list of ideas for improving the product (if it even made a list).
Yeah, the appetite for something iPhone like had been clear for quite some time, Apple was just the first to pull off a really solid version. Getting the touch interface right was a big factor.
But for earlier examples, I had a Palm VII back in 1999. I was working for a CRM reseller and we got one to play with as a potential solution for a client project.
It was super limited but being able to browse the web while on the go was immediately obvious as a very big deal. BlackBerries didn't get web until a couple years later but I'm sure users of that would say something similar.
There have been numerous attempts at dockable phones over the years.
> The biggest hurdle standing in the way is that making a single one of iPhone/iPad/Mac too powerful (in terms of usability, not just raw processing power) will take away sales from the other two.
No it won’t. Nobody is cross-shopping a full size laptop with screen and keyboard against a phone with a tiny screen and no keyboard.
The most recent attempt with Linux is pretty good, but it's held back by slow hardware and poor driver support.
If the PinePhone had enough power to record video, and maybe better Waydroid integration, I would use it. I like the convenience of using the same apps as on my laptop, and being able to develop on the same platform that I am targeting. That is a unique selling point.
Apple has the funding to do this, but they choose not to. It would damage their whole market segmentation scheme. It is a penetration strategy.
> I like the convenience of using the same apps as on my laptop, and being able to develop on the same platform that I am targeting. That is a unique selling point.
And almost all of my apps have iPhone, iPad and Mac versions with cloud syncing of the data between them.
It’s just a flag for developers to allow iPad apps to run on ARM based Macs without any modifications.
> *and being able to develop on the same platform that I am targeting. That is a unique selling point.*
With ARM based Macs, they basically are the same except for the screen size. Compilation speed would be much slower on an iPhone than a Mac.
When the iPhone was introduced, the worldwide penetration of cell phones was already 1 billion. Jobs said he wanted to sell 10 million or 1% of the market during the first year.
The iPad and the Mac combined is 20% of Apple’s revenue. People buy iPads because they want a larger screen.
I mean you can carry around a portable USB C monitor and plug it into your iPhone today. I have one that gets power and video from one USB cord for my laptop. But most people don’t want to do that.
> I mean you can carry around a portable USB C monitor and plug it into your iPhone today. I have one that gets power and video from one USB cord for my laptop. But most people don’t want to do that.
Have some imagination? I know plenty of folks that use their phone as their main computer, but could use more screen space on occasion to finish a complex task at a desk with a mouse and keyboard.
Something like phone mirroring that utilizes the full display (perhaps full macOS?) would be amazing for that use case. Could be wireless, or a MagSafe stand, or even a HomePod-style handoff thing with the phone’s NFC chip.
The hardware for this is pretty much there, Apple just needs to productize it.
Okay, you can already use a Bluetooth keyboard and mouse on an iPhone, and the USB powered portable monitor I have can be used with my USB C iPhone. Because of power constraints, it can only drive the display up to 50% brightness. But you can plug power into the other USB C port on my monitor to power the display at full brightness and charge the phone.
This is my iPhone 16 Pro Max with my external portable monitor. The phone is connected with a standard USB C cord and the Anker battery is plugged in with another USB C cord into the monitor.
That’s really neat, I just wish Apple would go all-in on monitor support.
There’s no good reason I shouldn’t be able to plug my phone into my Studio Display when I need a bit more room to work on a task I started on my phone. Yes, there’s handoff between iOS & macOS, but it’s very tied to specific apps, and requires another Mac that may not really be necessary.
Well, that monitor is just a regular old monitor with two USB C ports and a mini HDMI port. I am connecting it with a regular USB C to USB C cord. There is no reason you can’t use a standard USB C to HDMI cord to connect the phone to your Studio Display.
What work would you start on your phone that doesn’t either have an app on your Mac where everything is synced via cloud services, or a web app where you can start on one device and keep going on another?
I can already use GSuite (work and personal), Office365 (personal subscription), and iWork across my Mac, iPhone, and iPad using apps and/or the web, and things automatically sync. And of course notes, calendar, mail, messages, Slack, etc. are synced between everything.
My personal Trello board is synced between all of them and even my third party podcast app - Overcast - has an iPhone, iPad, Mac and web interface that syncs.
> The vast majority of the market is totally uninterested in docking their phone like that though
s/uninterested/unaware/
In reality, for such hardware to make sense, it would have to be a full MacBook Air minus the PCB. Would you be willing to spend 500-1000 USD for a piece of hardware that only works when your phone is connected? (i.e. an iPhone accessory)
This works with Macs, Windows PCs and USB C based iPhones and iPads. You just connect it with a USB C cable. It gets power and video from the USB C cable.
It would be interesting to see a tablet sized dual mode device where the OS and user apps could seamlessly switch back and forth between a 100% touch mode UI and a 100% precision pointing mode UI without restarting or losing data.
It would mean a lot more work for developers though if your app needs two different UI designs.
One issue that sticks out is that touch controls of a usable size just take up so much more screen space.
Also, look at how many years it took Microsoft to provide touch friendly access to the Windows control panels.
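To put rough numbers on the screen-space point: Apple's HIG asks for touch targets of at least 44×44 points, while a typical desktop control is closer to 24 points tall (my ballpark figure, not a spec):

    44 × 44 = 1936 pt²  vs  24 × 24 = 576 pt²  →  roughly 3.4× the area per control

So a touch-first layout simply fits fewer controls per screen, before you even account for the spacing between them.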
So your reference point is a niche device from 2013 running a stillborn mobile OS with a 1 GHz processor, and because that didn’t gain traction, you’ve concluded the segment is not viable.
My reference point is having been around for many years, seeing many people push for dockable computers, and seeing them repeatedly not take off.
Notably the people pushing for dockable computers are computer geeks who say things like "I only listen to OGG Vorbis, MP3 is unusable" and "no wireless, less space than a Nomad, lame" and "I could build Dropbox in a weekend with rsync" and "I don't understand why people need to see pictures or video, text is all I need". Normal people aren't itching to SSH from their TV using a bluetooth keyboard connected to a smartphone with a VPN so they can do X-Forwarding of Brave browser. Normal people are fine with buying a laptop with Gmail and using an Android TV stick to watch from a streaming service.
Interesting how people rationalize this stuff. My phone can dock to a screen; I've never used the feature, but it doesn't obstruct anything I do. It's a nice-to-have, and it would be kinda appealing if Firefox and Spotify both worked like normal. I might even venture to say I could get work done on it (though I've never tried).
Presumably the real reason is that Apple is still afraid to segment their market. A plug-in iPhone would stop people from buying the AppleTV or Mac Mini for home theater applications.
I'll chuck in another rationalisation from just one angle (of many). It's a low-demand feature that, if they implemented it, they would have to support. One thing that Apple does incredibly well (I really can't think of any tech company that comes close) is provide front line support for virtually every capability and feature in the Apple ecosystem. This means they have to resource support people to know how the feature works, troubleshoot it when things have issues, dedicate engineering resources if an issue cannot be fixed and requires escalation, etc.
Just having such a feature involves a cost, and the juice is just not worth the squeeze.
Curious, if your phone supports the feature and it doesn't work, what is your recourse?
You really think Apple is worried about the sales of the AppleTV and Mac Mini? Two of their lowest selling devices? You can already plug an iPhone into a TV for video with a standard non-proprietary USB C to HDMI cable, and AirPlay is available on $49 Roku sticks.
Only around 10% of Apple’s revenue is the Mac and every estimate is 70% of those are laptops. The AppleTV is far behind in market share for media center devices or whatever you call Roku, Fire sticks etc.
You can already stream from your iPhone via AirPlay to at least Roku sticks/TVs and I assume others. The number of people who want to use an iPhone as a full computer is minuscule.
I think it largely comes down to the App Store limitation at this point (the UI is... livable enough now, though it does still seem a waste to not just have the macOS UI in keyboard-attached mode). Consumers can't take things like their Steam library over and workers can't just run existing enterprise apps - it has to be Apple approved apps which fit the Apple architectural requirements and are purchased/distributed through the App Store.
For a small subset that is absolutely also stuff like terminal and Docker, but there's nothing special about that group beyond "they use a different set of apps not allowed in the App Store".
Most “enterprise apps” these days are web based SaaS apps.
The PC games market is really not that large in the grand scheme of things and what serious game player would want to play games on laptop class hardware? Even with the iPad Pro you are talking about MacBook Air level hardware with worse thermals for games.
And the most popular office apps - Microsoft Office and GSuite - are licensed per seat, for both home and office. Meaning you can use the apps on your phone, laptop, or other mobile devices and sync data between them.
How many productivity apps don’t “fit in Apple’s guidelines”?
You should talk to this person for me https://news.ycombinator.com/item?id=45399380 since they argue these chips are already ideal for high intensity video games. It doesn't have to be AAA titles though; many games (simple and advanced) make it to the "top paid apps" section of the app store once they "make the leap across", so to speak. The thing is, not every game/app has made the leap, and when they do they don't all transfer ownership.
I don't mean office apps, I mean enterprise apps. I do see them becoming more web focused with time (which I think is a good thing - it's ultra portable when they are) but we're certainly not ready to claim victory just because most email and document editing can be done from a webview. Hell, there's one app I have to use daily which is still officially only 32 bit Windows (it, thankfully, works in Crossover).
What are these enterprise apps? And how many of them would run on an ARM based Mac? If they don’t port them to Macs, what are the chances they would port them to iPads?
> Hell, there's one app I have to use daily which is still officially only 32 bit Windows (it, thankfully, works in Crossover).
As an example. This one is Intangi IRIS, a BOM and quoting tool for network products, and they officially support Crossover. They've been talking about native macOS support since before I started using it in 2019... but that's the speed of business for you :).
It's the long tail of apps like these that can make it painful - not the bulk of the day (email, conference calls, and web ticketing systems) itself.
Then in that case, it wouldn’t matter if the iPad gained the capabilities of the Mac. It would never run the bespoke enterprise software unless Apple ported the x86 emulator to it. Even then the x86 emulator doesn’t support 32 bit software.
I'm not sure I follow. As already explained, the example given works for me on macOS today since I can install Crossover (which is an officially supported install method by Intangi - not a custom hack) which is not allowed/supported on the App Store/iPad. Crossover handles 32 bit x86 Windows apps on Apple Silicon fine, even though Apple themselves don't, since right after Apple Silicon launched https://www.codeweavers.com/blog/jwhite/2020/11/18/okay-im-o...
You're saying there is some other reason it wouldn't work if I ran macOS/had macOS's capabilities on an M4 iPad chassis instead of an M4 MacBook chassis?
Btw, I don't think you'd intend to go against the site guidelines by it (I figure it's probably just two separate logins on two separate devices or something), but if you're using multiple accounts in the same conversations it does go against them.
That was not my intention at all (to have multiple accounts on the same conversation). I knew not to upvote myself or downvote a reply - which I did not do.
But back to the point, I always thought of custom bespoke enterprise apps as ones that usually ignored the Mac and especially ARM Macs.
Fair point - but a niche case I don’t see Apple going out of their way to support.
Before iPadOS 26 and real windowing, I would have thought that the iPad would be frustrating even for mainstream office work. I could definitely see myself using my wife’s 13-inch iPad Air full time now - she does, except for development tasks.
My old A12 iPad Air from 2019 runs iOS 26 halfway decently even with only 3GB RAM.
And Apple has never really cared about “the enterprise”
No worries, I figured you didn't mean anything by it; I just didn't want you to end up with a surprise problem one day over something silly.
It always seems niche when talking about examples of stuff that can't be done. Prior to iPadOS 26 people would tell me I just didn't understand that an iPad wasn't supposed to have windowing similar to macOS, but it was hard to call that a niche use case when the old way discouraged the users who wanted it from using the platform at all.
I agree Apple doesn't typically target Enterprise directly, but they do support them. They maintain things like MDM support across all products, Apple Business Manager, and AppleCare for Enterprise. The big difference between supporting those kinds of use cases and this is that these enable more Apple products to be sold, while enabling iPadOS to do MacBook like things enables fewer and cheaper devices to be sold. I don't actually expect Apple to go down this path for that reason, but I still wish they would.
The same is still somewhat true of some consumer use cases like games they already own on macOS or peripheral support not in iPadOS, but Apple has given a little over the years in this regard (e.g. allowing 1 external monitor, allowing certain peripherals and dock types, adding decent windowing support). Of course, Apple's goal in this remains to align with what feature set will make them the most money, not what feature set people would use, the two things just align slightly better in the consumer space than the enterprise space.
But a person can still dream that their phone/iPad, more powerful than most people's laptops, could take the role of one when plugged in, even if it wouldn't make Apple more money.
I work in customer facing cloud consulting specializing in app dev. My days are spent:
1. Zoom
2. The AWS console in a web browser
3. The terminal - and I can bring up CloudShell for simple things from the AWS web console
4. Slack/Notion/GSuite apps/Jira
5. Visual Studio Code and using Docker. For that, I would just spin up a Windows based AWS WorkSpace with the iPad client app and wouldn’t be able to tell the difference when using a regular Bluetooth keyboard and mouse.
That's pretty much Microsoft's approach on every system now - even if you install Outlook or Teams, it's just a web app. If it weren't for that I wonder if keyboards on iPads would even still be common at all. That said, there is a heck of a lot more to enterprise apps than things like email still out there. When I was still at a health system we'd try to use tablets for new things whenever possible (they are just physically convenient in many work environments) but we'd inevitably end up with "web stuff" on the tablet and "everything else" on a laptop (sometimes with mobile cart) for this exact reason.
I also worked in B2B health SaaS companies. Even then some health systems used whatever you call the hosted Windows servers and people used their computers as dumb terminals to run apps. There were clients for iPads.
This is an option for a lot of things, e.g. we delivered Epic Hyperspace over Citrix, but it can be extremely expensive in terms of TCO to do for every app (we still maintained 3,000+ individual on-prem hosted apps).
I think this is a pretty poor reading of the market. Everyone has a phone. An increasing segment now has little access to desktop or laptop computing. I know I hate that I have to pick up my laptop to do relatively small tasks that I'm halfway through on my phone.
Offering a dockable screen/keyboard/mouse, using the phone battery/compute/storage seems like it would be trivial for Apple.
Obviously it cannibalises laptop or tablet sales, but that's not the same as market disinterest.
I don’t want a laptop dock. I want to be able to connect my phone to the dock I already have on my desk, or carry a foldable screen and keyboard with me when I need portability. That would give me a larger, lighter screen than a MacBook and a better typing experience. I already have headphones. I already have a secondary battery. Actually, make all of this wireless and that would be perfect. Plus if the screen supports touch, you have a tablet.
The laptop as a device makes less and less sense now that the CPU and GPU inside a phone are good enough for most people. It’s propped up only because phone providers artificially limit what the software on the phone can do.
How are you reinventing a laptop? There is zero friction.
You already have the phone. All the pieces are technically there to seamlessly connect it to a screen and a keyboard wirelessly and have it become a proper computer, which is what it always was. I think Google has seen it, and that’s why they have started working on their desktop mode, but they are sadly stuck with underpowered CPUs.
You need to think bigger. The sole reason this is not already a reality is because it would be a loss of money for Apple.
Yeah, I totally agree with you, but usually people who complain about not being able to plug in their phone want a desktop, not a laptop. Even then, sitting down at a desk gives you a thermal envelope that is only limited by the noise of its cooling, so you can put extremely powerful chips on desktops, again defeating the purpose.
I stopped using Windows when Microsoft made it obvious that they fully intend to force users to use an online account to log into their own local machine.
Requiring an online account if you want to use optional online features was perfectly acceptable.
This bodes well for the rumored entry level MacBook with an A-series chip inside. If they can get the price on those down to $500-$600 it’ll run circles around everything else at the price point.
Surely the cost to Apple of A-series vs M-series isn't that significant though? I'd think they'd need to cut a lot (or their margin) to get down to that price point on a laptop.
One big difference is volume. They sell way, way more A-series devices than M-series devices and so they may benefit from economy of scale.
With all of Apple's service offerings these days, they could also potentially justify slim margins by positioning the A-series MacBook as both a loss leader and gateway drug.
The future is going to be a single line of chips that goes into iPhone, iPad and Mac. Pricing doesn't have much to do with it. If Apple wanted to sell a $500 MacBook they would have done it already. The processor isn't what is standing in the way.
What do you mean "future"? This is already the case. M-series chips share the same architecture with A-series CPUs. If you increase the A19 core count you'll end up with something close to M5.
They could technically do it today but I imagine they want differentiation so that the cheap MacBook doesn't cannibalize sales of the Air and Pro too heavily.
I'm a backend dev, and I've been using a base model M1 Air since it was released. I've never felt it slow down, and I still have no reason to upgrade. 99% of people who buy these entry level machines will not hit thermal throttling, and if they do they won't care.
I’m a backend dev with an M1 Max and 64GB RAM. I run golang, various docker containers and an occasional ollama LLM, plus a few games. This machine still feels like it has infinite resources.
Man, it's wild how we have such different experiences. I have a base level M1 air, and I feel like it chugs along anytime I ask it to do anything even vaguely computationally expensive. Obviously that's not rigorous, but that's my subjective impression.
I also have the cheapest M1 Air and the only bottleneck is memory. Once you have too many things open it's not happy. But doing webdev, this thing compiles 40% faster than my (older) 8-core Xeon desktop machine. It even plays Baldur's Gate 3 (but it takes too much space on the drive lol).
This is similar to me, without ollama. I sometimes fantasize about a new MacBook but even my docker containers offload their hard work to apis, so outside of memory, I do not want for much.
My m1max is getting along quite nicely, I love this thing.
Golang, but does it matter? What do people do with these entry models anyway? Users who have high expectations will know better than to buy these, but for the average joe they will be more than sufficient.
I disagree. I’ve used an M1 Air as my daily driver at work for the last four years. It only ever truly throttles during periods where I’m running one build after another for ~half an hour plus. If anything, the 16GB memory cap is the real killer.
More interestingly, these results are from the base iPhone with the base A19, not the iPhone Pro with the A19 Pro and a vapor chamber to act as a heat spreader.
I've been using an M1 since release date and I never have my fans kick in unless I'm doing something that pegs CPU for minutes on end. The last time I noticed fans spinning was the last time I ran a benchmark.
The A series has been passively cooled for 15 years inside of systems with much less thermal mass.
Not really a problem with how those usually only throttle with sustained load. Very few people do anything that keeps the CPU tied up for longer than a minute or two.
And even then, my M1 air is still fast under sustained load. As fast as it would be with a fan? Surely not. But fast enough that unless I'm racing compilers or something I really don't care.
It helps that for heavier work I tend to use my desktop I guess.
Yeah these things have margin for days, so they’re still reasonably snappy even when throttled. It’s not like you’re cut back to Intel Atom performance levels or something.
Ehh. A lot of people play games. I can make the fans spin hard on my M3 Pro, especially when most games on Mac today mean running through a translation layer and weren't multithreaded to begin with. Throttling is a nonstarter for games. It might be OK for compiling, but for gaming the lag induced by throttling makes it entirely unplayable.
This is great news for the entire ARM ecosystem. The fact that ARM is now exceeding the best x86 CPUs marks a historic turning point, and other manufacturers are sure to follow suit.
> The fact that ARM is now exceeding the best x86 CPUs marks a historic turning point, and other manufacturers are sure to follow suit.
Haven't they been playing leap frog for years now? I avoid the ARM ecosystem now because of how non-standardized BIOS is (especially after being burned with several different SoC purchases), and I prefer compatibility over performance, but I think there have been some high performance ARM chips for quite some time
I came to realize that with a lot of fast RAM soldered onto the board of newer laptops and phones, maybe it's not the instruction set that matters that much.
Modern Apple hardware has so much more memory bandwidth than the x86 systems they're being compared to - I'm not sure it's apples to apples.
The A19 has WAY less bandwidth on its 64-bit bus than desktop chips with 128-bit buses. AMD’s Strix Halo is also slower despite a 256-bit bus.
Pushing this point further, x86 chips are also slower when the entire task fits in cache.
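If anyone wants to sanity-check the bandwidth side of this on their own machine, a rough triad-style loop is enough. This is just a sketch in plain C, not the actual STREAM benchmark, and the numbers will swing with compiler flags and array sizes:

    /* crude memory-bandwidth probe: triad loop over arrays far larger
       than the last-level cache (a sketch, not the real STREAM) */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N (1 << 24)   /* 16M doubles = 128 MB per array */
    #define REPS 10

    int main(void) {
        double *a = malloc(N * sizeof *a);
        double *b = malloc(N * sizeof *b);
        double *c = malloc(N * sizeof *c);
        if (!a || !b || !c) return 1;
        for (long i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int r = 0; r < REPS; r++)
            for (long i = 0; i < N; i++)
                a[i] = b[i] + 3.0 * c[i];   /* 2 loads + 1 store per element */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        double bytes = (double)REPS * N * 3 * sizeof(double);  /* doubles moved */
        printf("~%.1f GB/s\n", bytes / secs / 1e9);
        return 0;
    }

Compile with -O2; with arrays this size the loop is limited by memory, not by the core.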
The real observation is how this isn’t some Apple black magic. All three of the big ARM core designers (Apple, ARM, and Qualcomm) are now beating x86 in raw performance and stomping them in performance per watt (and in performance per watt per area).
It’s not just Apple’s deep pockets either. AMD spent more on R&D than ARM’s entire gross profit last I checked. Either AMD sucks or x86 has more technical roadblocks than some people like to believe.
Spot on about the memory, even some of the actual M* models don't have all that high of memory bandwidth and still kick ass just as well as the ones that do in this kind of benchmark.
I do feel like x86 has more technical roadblocks, but I disagree that the amount of investment isn't the primary driving factor at this point. I haven't seen designs from ARM itself beat x86 on raw performance yet, and 100% of their funding goes towards this point; e.g. the X925 core certainly doesn't, nor does the top single core Android device on e.g. Geekbench come close to current iOS/PC device scores. They've announced some future shipping stuff like the C1 which is supposed to, but now we're talking marketing claims about upcoming 2026 CPUs vs Zen 5 from 2024. Perf/watt wise, absolutely of course - that ship sailed long ago. The Z1/Z2 were admirable attempts in that regard, but still a day late and a dollar short against leading ARM designs.
The other factor to consider is scale-out CPUs with massive DC core counts tend to have mediocre single core performance, and that's what AMD really builds Zen for. Compare to Graviton in the DC and AMD is actually performing really well in both single/multi performance, perf/watt, and perf/dollar. It just doesn't scale down perfectly.
Apple/Qualcomm have certainly dumped more R&D into their cores being low core count beasts, and it shows vs any competition (ARM or x86). The news likes to talk about how many of the Nuvia developers came from working on Apple Silicon, but I think that is a bit oversold - I think it's mostly that those two development programs had a ton of investment targeting this specific use case as the main outcome.
The X925 does, according to GeekerWan. The C1 Ultra is even faster. The x86 GB6 results are from the Geekbench website; I searched for the fastest overall scores I could find in the first few pages of GB6 results to steelman as best as possible.
The long and the short of it is that x86 is WAY behind in every way. The chips are larger, hotter, and slower too. If the rumored A19 Pro in a $500 laptop happens, it's going to absolutely crush the Wintel market.
The stuff about Graviton is missing a key element too. Look at the X3 scores below. They are around 30% slower than the x86 competitors. This is what Graviton 4 is using (Neoverse V2 is based on X3). Neoverse V3 was announced almost 2 years ago now and is based on X4 which is a pretty big jump. I'd expect Neoverse V4 in Feb 2026 to be based on either X925 or C1 Ultra. When these newer, faster cores hit the market, they will be beating x86 in cost (the cores are smaller) and power consumption if not peak performance too.
I talked to a guy who'd worked at Apple on the chips. He more or less said the same thing, it's the memory that's all the difference.
This makes a lot of sense. If the calculations are fast, they need to be fed quickly. You don't want to spend a bunch of time shuffling various caches.
Memory bandwidth/latency is helpful in certain scenarios, but it can be easily oversold in the performance portion of the story. E.g. the 9950X and 9950X3D are within less than 1/20th of a percentage point of each other in PassMark single thread (feeding a single core is dead easy) but have a spread of ~6.4% (in favor of the 9950X3D) in the multi-thread test (where the cache starts to help on the one CCD). It could just as easily have been in the other direction, or 10 times as much, depending on what the benchmark was trying to do. For most day to day user workloads, though, the performance difference from memory bandwidth/latency is in the "nil to some" range.
Meanwhile the AI Max+ 395 has at least twice the bandwidth + same number of cores and comes to more like a ~15% loss on single and ~30% loss on multithread due to other "traditional" reasons for performance difference. I still like my 395 though, but more for the following reason.
The more practical advantage of soldered memory on mobile devices is the power/heat reductions, same with increasing the cache on e-cores to get something out of every possible cycle you power rather than try to increase the overall computation with more wattage (i.e. transistors or clocks). Better bandwidth/latency is a cool bonus though.
For a hard number the iPhone 17 Pro Max is supposed to be around 76 GB/s, yet my iPhone 17 Pro Max has a higher PassMark single core performance score than my 9800X3D with larger L3 cache and RAM operating at >100 GB/s. The iPhone does have a TSMC node advantage to consider as well, but I still think it just comes out ahead due to "better overall engineering".
It's very possible I am misinterpreting, but the A19 seems to have less total memory bandwidth than, say, a 9800X (but not by much). And far less than the Max and Ultra chips that go into MacBooks.
So I think there's more to it than memory bandwidth.
x86 chips competed on clock speed for the longest time, so they use cell libraries designed for higher frequencies. This means the transistors are larger and less dense. ARM cores are targeted at energy efficiency first, so they use denser cells that don't clock as fast. The trade-off is they can have larger reorder buffers and larger scheduling windows to squeeze out better IPC. As frequency scaling slows, but density scaling not so much, you get better results going the slower route.
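To put the trade-off in toy numbers (illustrative figures only, not measurements of any shipping core):

    perf ≈ IPC × clock
    wide, slower-clocked core:   10 inst/cycle × 4.0 GHz = 40 Ginst/s
    narrow, faster-clocked core:  6 inst/cycle × 5.7 GHz ≈ 34 Ginst/s

The wide core wins on throughput while burning far less power, since power grows superlinearly with frequency and voltage.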
PassMark and Geekbench are closed source. I don't know why I should trust them to e.g. treat fundamentally different kinds of devices in a sane and fair way. They have vastly different cooling mechanisms, for one. It matters if they can sustain a certain load for 5 minutes or for 5 hours.
If you have a specific use case in mind you might as well just test that use case directly. E.g. PTS (Phoronix Test Suite) is not a traditional benchmark, it's a workload tester, and it's right up that alley (though you then run into the problem of "the workload I care to compare doesn't run on an iPhone at all", at which point you either compare generalities again or don't care anymore).
In some other context I'd be inclined to agree, but I swear the moment there's an open benchmark, it's going to get gamed by someone. So long as there's money to be made, the incentives don't align, look at Volkswagen for some inspiration.
SPEC has been the industry standard performance benchmark for comparing between different CPU architectures for decades.
It takes all freaking day to run, but Anandtech published the benchmarks as soon as they got their grubby little hands on a new core design for every well known architecture.
Is GeekerWan on YouTube the only outlet still doing this today, albeit in Chinese with English subtitles?
You would think that Chips and Cheese, at least, would take up the gauntlet.
This is cool but it’s just one benchmark. It’s not clear to me what exactly PassMark even measures. That doesn’t mean the result is “wrong” but take it with a grain of salt
The PassMark single threaded test is a very simple synthetic benchmark, according to their docs:
> The single threaded test is an aggregate of the floating point, string sorting and data compression tests.
It’s an impressive score, but there’s more to performance in real apps than these three simple tests. This is a reflection of the high burst clock speed of the chip and the great job they did keeping it fed, but PassMark single threaded is about the simplest measurement you can do.
Modern CPUs aren’t really optimized for peak floating point throughput because any GPU will do a much better job at that. It’s also not clear if they actually use SIMD features of the chips, where desktop and server class parts should be able to walk away from the Apple phone chip due to their higher power limit alone.
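For a sense of how simple this class of test is, here's a toy "string sorting" pass in the same spirit. My own sketch; PassMark's actual workloads are closed source, so this only illustrates the shape of such a test:

    /* toy single-thread "string sorting" microbenchmark; illustrative
       only, not PassMark's actual (closed source) workload */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    #define COUNT 1000000
    #define LEN 16

    static int cmp(const void *a, const void *b) {
        return strcmp(*(const char *const *)a, *(const char *const *)b);
    }

    int main(void) {
        char **strs = malloc(COUNT * sizeof *strs);
        srand(42);                       /* fixed seed: repeatable runs */
        for (int i = 0; i < COUNT; i++) {
            strs[i] = malloc(LEN + 1);
            for (int j = 0; j < LEN; j++) strs[i][j] = 'a' + rand() % 26;
            strs[i][LEN] = '\0';
        }
        clock_t t0 = clock();
        qsort(strs, COUNT, sizeof *strs, cmp);
        printf("sorted %d strings in %.2fs\n", COUNT,
               (double)(clock() - t0) / CLOCKS_PER_SEC);
        return 0;
    }

A loop like this rewards branch prediction, cache behavior, and burst clocks - exactly the things a phone core can shine at for a few seconds.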
> This is a reflection of the high burst clock speed of the chip
Apple's in house silicon strategy has long been to create wide cores that perform more instructions per clock cycle and then run the cores at a conservative clock speed for power savings.
Running up the clock speed for performance no matter what effect that has on power consumption and heat has been what we've seen from Qualcomm/Intel/AMD/Nvidia.
In Geekbench multicore:
> the A19 Pro delivered a powerful multi-core score of 11,054... while using a remarkably low 12.1W of power... To put that into perspective, other flagship chips from the Android camp had to push their power consumption much higher to even come close to its scores. The Snapdragon 8 Elite, for instance, had to use 17W of power to complete the same benchmark. Meanwhile, the Dimensity 9400 consumed a staggering 18.4W.
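Doing the division, and generously assuming the Android chips matched the score at those power draws:

    A19 Pro:            11,054 / 12.1 W ≈ 914 points per watt
    Snapdragon 8 Elite: 11,054 / 17.0 W ≈ 650 points per watt
    Dimensity 9400:     11,054 / 18.4 W ≈ 601 points per watt

And since they reportedly only "came close", the real efficiency gap is wider than these ratios suggest.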
Another reason why Apple should allow the UTM project to use the JIT API in their iOS app. I want to virtualize ARM64 Linux VMs on the fastest CPU in the world. Apple could put Android's Termux to shame, but they don't want to do that, and therefore I have no use for them. What great things are iPhone users going to do with a fast locked down phone? Play Roblox at 120 FPS? Doom scroll at 2x speed? Not know how to use their phone (but faster)?
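For the curious, here's roughly what the W^X JIT dance looks like in an ordinary macOS process today - it's this path that iOS withholds from apps like UTM. A sketch, assuming Apple Silicon and the com.apple.security.cs.allow-jit entitlement under the hardened runtime; on iOS the equivalent mapping fails for App Store apps:

    /* minimal JIT sketch for Apple Silicon macOS; not available to
       iOS App Store apps, which is the whole complaint */
    #include <sys/mman.h>
    #include <pthread.h>
    #include <libkern/OSCacheControl.h>
    #include <string.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        size_t size = 4096;
        /* RWX region; MAP_JIT requires the allow-jit entitlement */
        void *buf = mmap(NULL, size, PROT_READ | PROT_WRITE | PROT_EXEC,
                         MAP_PRIVATE | MAP_ANON | MAP_JIT, -1, 0);
        if (buf == MAP_FAILED) { perror("mmap"); return 1; }

        pthread_jit_write_protect_np(0);     /* flip region to writable */
        uint32_t ret = 0xd65f03c0;           /* AArch64 `ret` instruction */
        memcpy(buf, &ret, sizeof ret);
        pthread_jit_write_protect_np(1);     /* flip back to executable */
        sys_icache_invalidate(buf, size);    /* flush stale icache lines */

        ((void (*)(void))buf)();             /* run the generated code */
        puts("jitted code ran");
        return 0;
    }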
Wouldn't it be so cool if you could actually do anything with it? Imagine building a Bugatti and installing an unremovable speed limiter that forced you to top out at 75 MPH, because of course you're never going to take it to a track.
It was only intended for highway travel at best.
But hey, you can scroll at 120Hz now, right? Think different!
All this tells me is that Intel and AMD are the only manufacturers making leading chips that you can do anything with.
Upvoted because 90% of your comment is great and doesn't deserve to be grayed out, but the Intel bit is wrong. I would definitely recommend checking out AMD's offerings and then backpedaling on the Intel claim, because even my four year old AMD chip absolutely screams.
> All this tells me is that Intel is the only manufacturer making leading chips that you can do anything with.
From context you seem to be talking about overclocking. In which case, only a handful of Intel chips are unlocked per generation. By contrast, most AMD chips are unlocked.
It’s definitely sad and weird. Times and people have changed.
For whatever reason the community now is filled with a weird mix of corporate bootlickers who want to give the largest corporations money while they take away our freedoms and they simultaneously want to make known their entitled behavior by claiming individual developers and small businesses should only ever give away their software for free.
It was bad enough that people here chime into technical conversations they have no experience with, loudly contesting the points of actual experts and practitioners, but now they also want to tell us that we don’t have a right to use our property how we want.
> It’s definitely sad and weird. Times and people have changed.
Yeah, I grew up tinkering with Linux since 1994; was good times. I've changed.
> For whatever reason the community now is filled with a weird mix of corporate bootlickers who want to give the largest corporations money while they take away our freedoms and they simultaneously want to make known their entitled behavior by claiming individual developers and small businesses should only ever give away their software for free.
A bit after the turn of the millennium I became legally blind, and I also needed to eat. Apple was able to serve my needs with their accessibility tools and keep me fully functional without additional costs beyond the hardware and base software, and still does to this day. Nothing came close back then, and nothing does now. While I appreciate Stallman's ideals and how he wouldn't use assistive technology if it isn't free software, I can't succumb to those artificial restrictions. The reality is that I would have been much worse off financially, being self-sufficient, going through studies, progressing in my career, etc. if it wasn't for Apple's accessibility tech. So yeah, I'll own the whole corporate bootlicker nonsense, and it's why in later years of my life I now invest in Apple, because they deliver usable solutions to real problems for the vast population rather than cater to an insignificant population with piddly ideals.
It must be nice if your only issue with tech is that you need it to go brrrrrrrr.
What if Apple et al. decided to require an unaffordable subscription to use those assistive features you have become dependent on, or they stopped supporting or even outright removed those features? Your devices become useless, and the ability to run free software becomes the only way to salvage them.
And sure, you could go looking elsewhere in the market, but the same thing could happen with any vendor, and there's no guarantee that the specific set of features you desire is going to be available at a price point that makes sense (especially if your income changes due to further disability).
What if the current US federal administration continues its fascist descent and decides that supporting disabled people is "woke DEI shit" and puts incredible pressure on companies to discontinue features for, or even surveil and report, its disabled users? Eugenics is part and parcel of their ideology after all. Or more likely, the US govt requires a ChatControl-like feature in all commercial software.
Free software is a matter of personal self-reliance -- not just some hobby.
Eh, not really interested in entertaining hypotheticals that have negligible probability relative to a black swan event occurring. Needless to say there is very little evidence of what you describe with Apple; if anything, over the 23-ish years of having used Apple products, the access to the latest accessibility and assistive tech has gone from paid upgrades to free.
> Your devices become useless, and the ability to run free software becomes the only way to salvage them.
> Free software is a matter of personal self-reliance -- not just some hobby.
Free software provides me negligible utility; I can't rely on it for anything, since it has virtually unusable accessibility and assistive tech. The chances free software can come to the table with something usable are a lot less likely than the scenarios you have dreamt up, so we'll be fine in the long run.
Like, if the Stallman life works for you, go for it; but I would've been left stranded in many areas of my life, with stunted growth (not physically), if I had not found another way to use tech/computers, which would probably involve spending stupid amounts of money on (again) proprietary niche accessibility tech. Stallman and his FOSS sycophants only care about their ideals, not actually empowering people.
Even if times have changed, I will keep being annoying about this.
The hacker culture is something that fascinated me as a teenager and is the reason I am able to pay the bills today. I don't really know what would have become of me without it.
As another commenter said, the part about Intel being the only manufacturer making leading chips doesn't really make sense.
Right now Intel is losing a lot; lucky for them, Nvidia invested. For now I would only use AMD. My desktop has an AMD chip, because it is just the fastest desktop CPU, and the Threadrippers are absolute multicore beasts for servers.
I've been super impressed with AMD chips as well. I have actually had to try to find CPU bound use cases because the damn thing is so fast it makes you question if it actually worked
This is what they are saying - the power is there, but iOS severely gets in the way of interacting with it properly any time you step out of the box of "touch first Apple approved mobile experiences". It'd be awesome if one device could be my phone and laptop instead of trying to make do with the screen share on two completely separate systems.
The iPads even more so, as some literally have the same kinds of hardware as a MacBook already (except the keyboard detaches).
Android and ChromeOS are merging for these kinds of experiences https://chromeunboxed.com/its-official-google-says-the-andro... and both have already opened up to running anything you'd like in Linux on the side. That said, Android wasn't exactly my first pick of flexibility either.
> Buy a laptop you’ll be ok.
This kind of comment is uncalled for. Anyways, as mentioned, I do the screenshare to a MacBook today. The advantage of having both is clearly for Apple though, not me.
> Imagine building a Bugatti and installing an unremovable speed limiter that forced you to top out at 75 MPH
What workload are you envisaging to be run on an iPhone where this even matters? Hyperbole aside, what target population of iPhone users even care about overclocking, and specifically what tangible benefit will they get out of it?
You are completely missing his point. It's not about the MHz, it's about making a super high performance product but refusing to let you use it for things that actually benefit from that performance.
I actually really don’t see the point. Yeah, in this consumer smartphone, there are limits on what you can do with this chip because the operating system is tuned for phone stuff, not workstation stuff.
But there’s good news — this architecture will end up in MacBooks, Mac minis, Mac Studios etc.
It’s like complaining that they put a good engine in a Civic when you can also buy that good engine in other configurations that will let you do more with it.
The limit is artificial. That's the entire point. There are no laws or bible verses telling Apple it's illegal to let you use the CPU in your phone for workstation workloads. Wouldn't it be nice if you could hook up your phone to a USB dock, boot Linux/Windows/macOS, and get a workstation that's faster than a $2000 laptop? Sure, you can buy a Mac mini, but iPhone owners already have one in their pocket.
In a vacuum any added capability seems nice. And it seems so simple: "just let me go into desktop mode."
In practice, the engineering effort to enable that just doesn't seem worth it, and in a zero-sum world of engineering time, it's a cost better spent elsewhere. Let my laptop be a laptop, and focus on making that experience the best it can be. And let my phone be a phone, and focus on making that experience the best it can be.
I think people fundamentally don't understand Apple when they want them to engage in the same kind of "jack of all trades master of none" pursuits that led to subpar Windows experiences and the fragmented Android ecosystem.
You can kind of see Apple dabbling with this a bit with iPadOS. And it's an absolute mess. My least favorite operating system Apple makes. All available evidence right now points to Apple simply not being able to neatly converge different computing paradigms. They are right to show restraint with their most important product.
I'm OK with them experimenting with this on the iPad because, frankly, the iPad does not matter. But I do not want Apple to mess up the phone for the two people on Hacker News who want to hook theirs up to a Thunderbolt dock.
I was asking in quantity rather than identifying those users; tacitly implying that there is a negligible amount of users that actually care or want this as top priority. I know some people, especially on a site like this, might find offence in this, but I'm just stating the reality of the situation.
The focus on profit instead of hacker culture is the cancer of this site.
To me it feels like you have a parasitic relationship with the brand of your phone maker. And you will fight other individuals to defend their monopolistic actions.
I mean my relationship with the brand or phone maker is immaterial, even though I have a significant investment and appreciate their direction in general.
But to address your concern, Apple is clearly not for you and that is fine, you have choices, like the premium value and control you get with PinePhone. No corporate bootlicking required!
So a market already exists for those that want full control over hardware and embrace hacker culture and don't subject you to corporate bootlicking, yet this is still not enough? I actually meant what I said when I say Apple is not for you, because this is a segment of the market they're not interested in.
If it matters to you, put your money where your mouth is, and support the markets and initiatives that embody your values; you never know, you could contribute to efforts in creating a CPU that surpasses Apple or whoever is the incumbent.
There are already CPUs from companies like Qualcomm that come close in performance and are available to anyone with a large bank account.
The iPhone chips are great but even if they were 2X as fast as the competition it’s not going to open up an entire new world of use cases that don’t already exist
Qualcomm & Samsung kind of fill the market for high end mobile CPU though. I don't know what Apple brings to the table if their CPUs are used outside of their designs.
Fair. To entertain the thought, though, it'd make little sense as a COTS part; it would sell either at a loss or on razor-thin margins given the low demand. However, nobody is stopping a hardware integrator from approaching Apple to use it, but it'd probably be stupid expensive and shrouded in secrecy, as Apple would not want to cede full control of its deployment due to things like the Secure Enclave.
I think it's less about what hardware integrators want and more about what Apple doesn't want. Apple tries to be as soup-to-nuts about their solution as possible; they don't want to piece that out and give some of the margin on the best pieces to a 3rd party trying to compete with them. Anything they could pay is just revenue Apple could already target directly and keep all to themselves.
Everything has a price though. If you offered half of Apple's market cap, I'm sure they'd be changing their tune. (Yes it is an exaggeration, but my point is that there are exceptions.)
The problem lies in that Apple already gets 100% of the profit from being vertically integrated today (so the offer would not only need to be large, but more than what one could reasonably hope to make in return). If it were just that Apple had a really good CPU then both Apple and the 3rd party(ies) would be able to come to the table with a net positive deal, but vertical integration makes it so only 1 party (Apple, in this case) can come out ahead until they are dethroned for other reasons completely.
Does income or number of apps matter for anything here? Like, if given the choice, would you rather only play games from App Store or only games from Steam?
The point they're making is obvious. iOS was and still is a platform for mobile applications and mobile games. Despite how much processing power is crammed into those devices, you can't use it similarly for "serious work". All that power will never be used to run actual games (like games from a PC or a console), or render a 3D model in Blender, or do CAD work, or run "full" versions of any software that has a cut-down mobile variant. Which raises the question - why mimic the power of a full computer while having no way of doing the work that full computers need all that power for?
This is correct. What I'm pointing out is that the point they were making wasn't just about pure quantity, even though that's what they said directly. Obviously, the implication is that they're talking about "real" games, as in fully-produced games that are published on PCs and consoles. You may disagree over this point, but I thought this implied subtext was very obvious.
> the point they were making wasn't just about pure quantity, even though that's what they said directly
(emphasis mine)
> Obviously, the implication is that they're talking about "real" games, as in fully-produced games that are published on PCs and consoles
Is "real" games like "no true Scotsman" games? As you note, GP said "games ever made" rather than (for example) "PC games". But your reframing of the category makes me wonder - what percentage of Steam games are also on the App store? I wouldn't be surprised it if is more than .01%.
What does the emphasis add? I know what I said. Is your point that OP did actually seriously think that App Store only represents 0.01% of the video game market quantitatively? Or is it that you understood the subtext of the message but think that ignoring it is preferable?
> Is "real" games like "no true Scotsman" games?
My previous comment fully explains what I meant, and I also put "real" in quotes for a reason. All over this thread, the #1 complaint people have is that iOS devices are given lots of computing power but lack the capacity to do most "serious" performance-demanding tasks, and I explained which games are represented by those and why people want them.
There have been many games released for iOS. In fact iOS makes up about a third of all revenue for video games. Also just because it can't play some old game for PCs, it doesn't mean that it's useless. It just means that people may have to do work to port those old workloads to iOS if they want legacy software to use new, powerful hardware.
"the market", I preffer to say consumers, becuase I am not a fucking beancouter MBA, demand videogames, is the bigest entertaiment industry in the world, bigger than movies and music combined.
By your own logic, because Apple does not sell spreadsheet software, databases or porn, "the market" does not demand them.