Virtualizing iOS on Apple Silicon (nickb.website)
357 points by walterbell 6 months ago | 155 comments



> Corellium and their virtual iPhone cloud product (only publicly-available “complete” solution)

Corellium won their legal case, allowing them to rent [1] iOS Cloud VMs for security research, https://hn.algolia.com/?query=corellium

If iOS can be virtualized on Apple Silicon MacBooks, it could reduce demand for commercial iOS virtualization services.

  Individuals: $400/month
  Business: $60,000/year
[1] https://support.corellium.com/subscriptions/pricing


my god, $4-$8/hr who is paying for these VMs?


You say that, but companies routinely pay services like Circle CI similar orders of magnitude for chunkier CI builds (one place I know has builds that take 30 minutes... with 64 shards. Basically paying like 5-10 bucks per commit)

In other words, you still gotta do hardware management yourself, but CI is good business!


> Basically paying like 5-10 bucks per commit

Which isn't actually all that much, compared to the amount you pay your developers.


And not much compared to a bug in prod or waiting 3 weeks for a release (the value proposition of CI/CD)

Still, saving cloud spend by making the build faster or using self hosted runners is probably worth it.


Definitely, if you can cut costs, do it.


An additional several thousand dollars per month for any moderately-sized team, especially if you've got a bunch of microservices, sounds very expensive to me, and would probably get axed quickly by any devops team I can think of that's actively working on reducing their costs.


I mean, if you're paying 5+ digits a month on CI it starts being reasonable to say "maybe I should do things to lower the absolute cost of this". 5 dollars times a big enough number is real money!


Is there an alternative that offers a similar service? I am not sure I have ever seen anything else come close to what these guys do. I recall them going into battle with Apple, and it looks like they won. But it sucks that we do not have any open-source solution for emulating the OS of iOS or Android devices.


I remember we paid Circle CI ungodly amounts to host a dozen trash can Mac Pros to run our iOS CI. Early Swift versions caused huge spikes in build times.



No, those people are almost certainly paying far below list price.


There's probably some enterprise level deals going on there (as with every service provider), but they will still be paying them A Lot of Money every year.


Certainly not paying anyone's rent, but I've paid them a couple of bucks over the years to test software on their VMs, since they can come jailbroken out of the box.


attackers and defenders of zero day vulns in iOS black boxes


This is cheaper than 8xH100 GPU compute time for AI.


This is such a left-field comparison. One H100 costs $25,000, whereas one MacBook Pro/iMac/iOS device costs roughly a tenth of that. It's not at all surprising that it's cheaper to rent something that has CapEx costs 2 orders of magnitude less than that of 8xH100 ($200k for the GPUs alone).


I think the point was a $4-$8/hr VM is pretty small potatoes compared to other common corporate expenses.


The H100 was a terrible example to support that point because it has a much better (rent vs buy) value proposition.


The problem is that H100s are enterprise products while Apple ones aren't. If you have trouble with your H100s how does it compare cost-wise with having trouble with your consumer Apple hardware?


Depends on the kind of trouble you are having.

Many corner stores can fix a smashed iPhone screen.


You think that's a lot wait until you see what AWS charges for GPU instances...


This is great; for your next trick, can you please figure out how to install macOS on an iPad so that we can all finally get the dang computer we want Apple to build?



> It took two and a half hours for my iPad to crawl through installation.

Jailbroken Apple M1 iPads with iOS 16 can use the iOS hypervisor to run VMs without overheating their devices or waiting hours to boot.

Still, we can thank Apple for small mercies like UTM, a-Shell and iSH.

As a science experiment, Apple could silently launch a "VM store" with $100 VMs, accessible only via hidden URL. How badly do Apple customers want to use the iPad hardware they already purchased? Could Apple customers be extorted into paying for VMs? Will anyone ever ship a competitive tablet running Linux?


Apple definitely does not want bad reviews on their iPads because the VMs they are selling are crashing more often than their other offerings. Any product Apple sells for actual $ would have to meet Apple’s standards of support and customer service, or it would be deteriorating their goodwill.

Except their charging cables. Apple actively trades goodwill for those margins.


> the VMs they are selling are crashing

VMs would be a subset of existing iOS/macOS. VMs are more uniform targets than bare-metal, less likely to crash.

Think Microsoft CAL (client access license) paper entitlements.

Or papal certificates of indulgence, https://theconversation.com/the-catholic-view-on-indulgences...

Software margins with iPad-as-DRM-dongle, investors would rejoice! Broadcom and Qualcomm would be envious.


You do realize that Apple sells Macs, right?


While accurate, this misses the point. The chip is very similar to that in the Mac and it’s frustrating to have Apple prevent it being used more fully.


I agree. But this has nothing to do with Macs crashing more often.


Without JIT, it’s more a proof-of-concept than a useful tool, IMO.


With the current iPad limitations, however slow it gets, running any arbitrary code locally can be a big deal.

If you wanted to debug and print out your original papercraft without remoting into another machine, that would probably be good enough.


> running any arbitrary code locally can be a big deal

yt-dlp on iPad via iSH is surprisingly useful.


> [...], running any arbitrary code locally can be a big deal.

Well, unless you are using JavaScript?


Prior research section reads:

> [Zhuowei Zhang] concluded that (GUI) macOS applications cannot run on iOS—but (graphical) iOS apps can run on macOS. Mac Catalyst seems to work, expectedly, only one way.


Oh, 1000 times this! Literally my research subject over the last few weeks: a tablet with a dev OS. Size-wise I love the iPad Mini, but iPadOS is useless. Looking now at a Surface Go: slightly too large. If anyone has another suggestion for a small tablet with Win11 (I know there is none with macOS), please post it here. Will order from China if needed.


GPD Pocket 3. It has an 8" screen. It is a laptop form factor, but the screen can be folded 180° flat over the keyboard for use as a tablet. The spec page says it can run Windows 11 but it comes with Windows 10.

https://gpd.hk/gpdpocket3


Thank you and Joeri - this is indeed the closest to what I am looking for.



Small tablet? It has a 14” screen.


What about the GPD Pocket?


The iPad Pro is the fastest machine out there (M4)


If they called it a Macbook Air with an upside-down bulge and a detachable keyboard, would that be just as well?


With touch screen, iOS and macOS?


macOS when the keyboard is attached, and iOS when it is not, with instantaneous switching.

One can dream… but really, this is not all that far-fetched


What would happen to the macOS applications when switching to iOS? iOS applications would be fine in macOS-mode (Mac Catalyst already exists). But, there isn't a good way to make macOS applications usable in iOS-mode.


Start with macOS CLI VM. Give it a couple of years and see how customers use it.

Or allow Linux/FreeBSD VMs.


We currently have that on Pixel devices, where you can just run Windows if you want.

https://www.cnx-software.com/2022/02/14/android-13-virtualiz...

We had that with Linux on DeX, or Maru OS, or Motorola, or...

People say they want that, but when they actually do, it completely flops.


> People say they want that, but when they actually do, it completely flops.

That feature isn't yet exposed to end-users, although the plumbing is in place. So far only on Pixel devices.

There's potential for phones docking to desktops via USB-C, running both Android and Linux/ChromeOS VMs.


Except no one except a few geeks ever cared about Samsung DEX, or Windows Phone/Tablet Continuum.


Well, it couldn't run any real programs. ARM has caught up now.


I thought that was what FOSS was all about: having all those people in their Internet coffee shops using Termux under DEX to compile their real programs.


Google Android (Linux) App Store 2023 revenue was over 45 billion dollars.

Google is a major contributor to Linux in support of Android and Chromebooks.

Some non-zero portion of those value flows would benefit from mobile intermittent-desktop UX.


And that is related to people not buying DEX, because?


Near-native speeds. VMs. Android desktop mode. Etc. It's never a keyboard-less computer. It's always a bodge on top of Android. Of course it flops.

I don't think the real thing would be good enough. The compromises can only be worse.


Touch screen but no iOS.


Or add 4G to the MacBook Air and 80% of people will stop asking for macOS iPads.


Can we please get iOS on Linux so we don't have to buy the phone to develop for it?


What is it about the iPad you want with macOS? It has one port (which is largely used for power), no built in keyboard, an incompatible input device (touch) by default, and often smaller screens. What’s the appeal?


Apple doesn't sell a MacBook with a cellular modem, drawing stylus, or 4:3 screen. Also, iPads sometimes get the newest chips first (e.g. the M4).

To be clear, as an iPad Pro owner, I don't think "just put macOS on iPad" is the answer to the iPad's problems, either. But I can appreciate why people complain about it. Apple sold them "a computer", they want "a computer". With full iPad ports of Mac apps, a filesystem that isn't entirely built around share sheets and hope, project exports that don't demand you keep the app foregrounded because Apple can't be arsed to add a "Eat The Whole Battery" multitasking mode, multiple user profiles, third-party developer tools, third-party app distribution, and generally, the ability to innovate on the platform without Apple's prior written consent[0].

The iPad was originally announced as Apple's answer to the netbook: a cheap(ish) touch computer for casual computing tasks and games. In that narrow lane, it succeeded. But in 2015 with the introduction of the iPad Pro, Apple decided that the iPad was going to replace the Macintosh. The Mac was there to be to the iPad what the Apple Lisa was to early Macs: an annoying technical relic to bootstrap software onto the newer, superior platform. Except Apple didn't have the courage to pull the trigger on several features necessary for creative and development workflows on iPadOS until it was too late. e.g. The reason why iPadOS is built so heavily around share sheets is because, for the first six years of the iPad's life, that was the only way to share data between apps[1]. So there's a lot of old apps that do things the annoying way, a lot of roadblocks that get put up arbitrarily, and so on.

[0] More broadly, the creative economy needs to stop talking about consent. Consent is for sex, not creativity.

[1] iOS 9 (?) added support for shared containers, but AFAIK each app that wants to use the container has to opt into it; and all apps have to be published by the same corporate entity or otherwise consent to data sharing in this way. There was no way to just have files owned by the user and nonconsensually modified by other apps.


> But in 2015 with the introduction of the iPad Pro, Apple decided that the iPad was going to replace the Macintosh.

Although that is an intriguing (and controversial) possibility, Apple has never explicitly stated that. What they have done is continue to heavily invest in the Mac lineup. IMO the 'Pro' in iPad Pro is meant to target creative professionals, not all types of professionals (e.g. programmers).


The Mac lineup from 2015 thru 2019 was one of the least compelling sets of computers to buy. Fragile keyboards that broke in a year of normal use, high-end laptops with i9s that overheated immediately under any level of use, an annoying and gimmicky touchbar, and so on. If I wanted to absolutely murder a platform, I'd just do what Apple was doing to the Mac in this era, forever.

The iPad division was advertising their product with taglines like "what's a computer" and phrases like "desktop-class" that indicated Apple - or at least, the iPad division at Apple - considered the iPad to be a MacBook replacement, not just a companion. And they still do this, even now when the Mac team is actually putting out good hardware again (ironically, by putting iPad chips in them).


Well... the way the Mac lineup was broken in the date range you have given was less to do with wanting to kill the line, more to do with poor management of the designs being used. It was a very egotistical "we can make a better keyboard", "you want a fancy second screen", "we decided for you that you don't need ports", "our design is style over substance" type of deal. Whether that was Ive or not, after he left it seems like it all got redone.

Your second paragraph - you need to understand the audience. For most average people, they don't need a Mac. They need a web browser, a banking app and a few other sundry apps for casual tasks. For the pro artist, they want a canvas. The iPad fits that fine. With a keyboard it is a very capable computer; with a pen it is a very capable canvas. Even for professional workers, the iPad can be enough if all they are doing is creating documents [1].

The MacBook Pro/Air is for anyone who wants to code in general and/or create apps for iOS and macOS, wants to use additional hardware that requires extra power (music production interfaces for example, or other types of compute module), needs to multitask, needs to consume files from different sources (local, USB, network etc)... etc. So, professional people, or "power users". This includes those that simply prefer a traditional computer. It is a fuzzy crossover, not a precise one.

The M-series chips are not "iPad" chips. They are just chips. They happen to be used in multiple places, but this is no different to using an Intel chip in a tablet (which for sure happened; I owned a Windows 8 tablet that had the form factor of a Nexus 7 tablet).

[1] Anecdote - a guy came to my house last week to do a survey for some heating system changes, and all he had was 2 iPads, a printer and his regular tools. All the documentation and quotes he printed on-site came from the iPad Pro he had with a keyboard. The other one was mostly used to survey; he had a thermal camera to look for where our underfloor heating was laid, as we don't have the original owner's installation documents.


This question has been endlessly asked and answered, https://www.theverge.com/2021/4/22/22396449/apple-ipad-pro-m...

The answer is obvious to Apple iPad Pro + Magic Keyboard customers:

   MOBILE MICROTASKING
No, dear Apple, it will not compete with your precious MacBook revenue, because an iPad is not a laptop. Your customers who are pleading for un-crippled iPads will keep buying desktops and laptops. But it will be life-changing for on-demand, portable, anytime-anywhere access to OSS code and professional apps, for last-minute edits, quick checks during video calls, testing-while-learning and countless other scenarios. iPad enables flexible computing, i.e. unlimited use cases -- and revenue! Still the only mobile device with a 4:3 HiDPI screen.

Apple continues to pour billions into science experiments like Vision Pro (iPad-on-Skull) and anemic cloud services, while refusing to improve the workflows of millions of customers who are willing to pay for repackaging of existing technology already sold by Apple. Fortunately, the industry has not been standing still while Apple squandered a decade of feature-frozen tablet supremacy. Google is now shipping VMs on both phones and tablets.

> At a privately held event, Google recently demonstrated a special build of Chromium OS — code-named “ferrochrome” — running in a virtual machine on a Pixel 8. However, Chromium OS wasn’t shown running on the phone’s screen itself. Rather, it was projected to an external display, which is possible because Google recently enabled display output on its Pixel 8 series... Hopefully, Google will offer the ability to run Chrome OS alongside Android in a future device


As much as I would like macOS on iPads, running macOS applications on an iPad without the Magic Keyboard would suck. Windows can do it though, so it's not really an excuse.

But, in Microsoft's case, it's sort of different. They had Windows 10 Mobile for tablets which is the closest thing to iPadOS, I suppose. Windows 10 Mobile couldn't run Win32 applications, similar to how iPadOS can't run macOS applications. Microsoft killed Windows 10 Mobile...

Implementation-wise though, it's a big effort for Apple. They can't just make macOS applications runnable on iOS. Something like a reverse Mac Catalyst for iPadOS wouldn't work due to how complicated and different macOS is compared to iPadOS. It would probably have to be a full-on emulation of macOS on iPadOS for applications to run.

So, it would seem like starting with macOS then implementing iPadOS on-top would be better than starting with iPadOS and implementing macOS, which is literally what Mac Catalyst is. So now, Apple has to make sure that all iPadOS APIs work with Mac Catalyst (they don't yet) and they have to do something to make the UX work better when switching between touch and M&K.

Bringing it back to Microsoft and Windows now. It's quite similar actually. Think of Windows 10 Mobile = iPadOS, UWP = iPadOS apps, Win32 = macOS apps. Microsoft killed Windows 10 Mobile and replaced it with full-on Windows 10. Windows 10 can run UWP apps.

Similarly, Apple will likely have to kill iPadOS and fully implement compatibility with iPad apps on Mac for macOS to ever be on iPads.


> But, in Microsoft's case, it's sort of different.

Microsoft has a long history of trying to make both work, but I think the reality of it is actually not that far away:

Windows 10/11 is a very poor tablet OS; a ton of stuff will be clunky with touch, as it's currently just a very thin layer of adjustments on top of the OS, and expecting to get access to all the potential of the machine without a keyboard and trackpad is a recipe for disappointment.

So, if the iPad had to rival the Surface Pro, it could probably do it tomorrow by just sticking macOS on it with the accessibility options (virtual keyboard, mouse etc) and calling it a day.

Where Microsoft is truly different is that they still shipped the Surface Pro and let users deal with it. The UI in tablet mode is almost the same, with just the taskbar a bit bigger. Sometimes you'll absolutely need a mouse, so if you don't have one you pull up the virtual mouse, solve your situation and go back to what you were doing. Sometimes the entry field is from an obscure API and doesn't pop the keyboard, so you pop it manually and deal with it. Or you absolutely need a shortcut, so again you pop the keyboard in the middle of nowhere, hit the shortcut, and make the keyboard go away.

This is the clunkiness Microsoft has fully embraced, and it makes their tablets actually "just work", as you're given all the tools to go straight at what you want to do. That's where Apple backed out, choosing to build a Disneyland OS instead of a gritty, dirty and clunky, but fully functional, city-like environment.

I get why people want a more polished and elegant experience, but if the goal is to get things done, it's probably the most realistic way forward.


Apple could provide an opt-in, obscure Accessibility mode for clunky-functional.


> starting with macOS then implementing iPadOS on-top

This would lose the security properties of iOS, which is a big part of the value proposition for iPhones and iPads.

> They can't just make macOS applications runnable on iOS.

Other operating systems can run in _parallel_ with iOS. Hardware support for nested virtualization has shipped on Apple Silicon since M2. Google implemented Android Virtualization Framework. There's no shortage of candidate VM operating systems. Microsoft implemented WSL (Windows Subsystem for Linux) VM. Apple could ship ASF (Apple Subsystem for FreeBSD) VMs.
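
For a sense of what "another OS in parallel" already looks like on the Mac side, here is a minimal sketch of booting a Linux guest with Apple's Virtualization framework (macOS host, requires the com.apple.security.virtualization entitlement; the kernel path and sizes are placeholders):

  import Foundation
  import Virtualization

  // Boot loader pointing at a locally available ARM64 Linux kernel (placeholder path).
  let bootLoader = VZLinuxBootLoader(kernelURL: URL(fileURLWithPath: "/path/to/vmlinuz"))
  bootLoader.commandLine = "console=hvc0"

  // Basic guest configuration: 2 vCPUs, 2 GiB of RAM.
  let config = VZVirtualMachineConfiguration()
  config.bootLoader = bootLoader
  config.cpuCount = 2
  config.memorySize = 2 * 1024 * 1024 * 1024

  // Wire the guest console to stdin/stdout so the VM is usable from a terminal.
  let console = VZVirtioConsoleDeviceSerialPortConfiguration()
  console.attachment = VZFileHandleSerialPortAttachment(
      fileHandleForReading: FileHandle.standardInput,
      fileHandleForWriting: FileHandle.standardOutput)
  config.serialPorts = [console]

  try! config.validate()
  let vm = VZVirtualMachine(configuration: config)  // uses the main queue by default
  vm.start { result in
      if case .failure(let error) = result { print("VM failed to start: \(error)") }
  }
  RunLoop.main.run()  // keep the process alive while the guest runs

That API is what Apple exposes on macOS today; whether anything like it ever ships for iPadOS is exactly the open question.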


> This would lose the security properties of iOS, which is a big part of the value proposition for iPhones and iPads.

Right. Apple is adding more and more security features from iOS into macOS. But, people are going to complain about that too.

> Other operating systems can run in _parallel_ with iOS

Absolutely, but there is a performance hit and memory management becomes an issue. You still need a host OS as well to unify the UX. Also, there needs to be some way for IPC between the applications of different OSes.

> Microsoft implemented WSL

WSL has networking and memory management issues. WSLg sort of works. Its architecture is wild [1]. Display scaling is terrible though. IPC between Windows and WSL guests is limited.

WSL 2 was released in 2020 and it still has issues. It's not a simple problem to solve.

Microsoft has also gone through WinRT, UWP, WinUI, MAUI, and now WinUI 3 trying to unify application development [2]. Again, it's not a simple problem to solve.

I think the only OS that has actually unified application development across all form factors is Android/Chrome OS. But, people complain about how limited Chrome OS is.

[1] https://github.com/microsoft/wslg?tab=readme-ov-file#wslg-ar...

[2] https://www.irrlicht3d.org/index.php?t=1626


ChromeOS solves the memory management very well.

Depending on how you use it, you have up to 3 VMs running in the background with Wayland passthrough: ARCVM (Android), Crostini (Linux dev environment) and Borealis (SteamOS).

All these VMs run Linux, and Google uses MGLRU in cooperation with Chrome's tab discarder to balance memory.


Wow. Never knew about the move to ARCVM. Though the first thing that comes up when I search it is memory and CPU usage issues :/

Google et al. have put a lot of effort into improving Linux's virtualization capabilities. Goes with being the OS of choice for pretty much all servers, I suppose.


The problem with WinRT, UWP, WinUI, MAUI, and now WinUI 3 is the usual Microsoft mess.

As anyone can imagine from that list, every new acronym requires a rewrite, and most folks that aren't on Microsoft's payroll no longer care.


Well, with Android/Chrome OS they just recently released Jetpack Compose Material 3 Adaptive, which is a further iteration by Google trying to unify UI development between form factors [1]. And there are the breaking changes with every major version of Android [2]. Then there are the incompatibilities for Android apps on Chrome OS [3].

You can shit on Microsoft all you want, but those listed APIs still work. People expect Microsoft APIs to exist essentially forever, so every breaking change pretty much has to be a new API.

Having a single simple API for developing across different devices and inputs that automatically provides great UX across the board just isn't possible. It's going to be complex and it's on the developer to cater to each device. An API to cater to everything is literally web APIs.

[1] https://chromeos.dev/en/posts/io-2024#android-on-chromeos

[2] https://developer.android.com/about/versions/13/behavior-cha...

[3] https://developer.android.com/topic/arc/manifest


Tell me you never used those APIs, without telling me.

No, they don't work; that is why each iteration requires a rewrite.

WinRT for Windows 10, isn't the same as WinRT for Windows 8.1, isn't the same as WinRT for Windows 8.

WinUI 3.0 has features that are Windows 11 only, and although Project Reunion promised compatibility across Windows 10 and 11, it is still quite far from WinUI 2.0 in features and tooling, years away in fact.

Likewise the WinRT used by WinUI 3.0 in Win32 mode, isn't the same as WinRT used by WinUI 2.0 in UWP mode. Meaning the set of underlying COM plumbing differs in behaviour and exposed set of interfaces.

And I will leave it here, as Github issues and discussions already have plenty of rant material on the matter from the Windows developer community.


I'm not saying that the newer APIs are backwards compatible with the older ones. I'm saying that they're not, which is why they are different APIs. WinRT is a bit different since you target a Windows SDK. But likewise, newer SDKs have APIs that are not backwards compatible.

What I am saying is that the older APIs are forwards compatible with newer versions of Windows. On Windows 11, you can still run applications using those old APIs.

On Android and iOS, your old app may break when running on a newer OS version.

Microsoft doesn't have the luxury of changing the behavior of older APIs on newer versions of Windows, so they end up having to make completely new ones.


Try to run a Windows 8.0 WinRT application on Windows 11 to see how forward compatible it is.


> As much as I would like macOS on iPads, running macOS applications on an iPad without the Magic Keyboard would suck.

The ability to run macOS on an iPad with a keyboard and mouse (and perhaps even a second screen), and run iPadOS when you're away from those things, would be pretty great.

The hardware is already capable of this.


I wish they would at least allow VMs on iPadOS. With the release of macOS 15, you can now use iCloud in the guest macOS VM [1].

[1] https://developer.apple.com/documentation/virtualization/usi...


Bingo bango. In truth they just need to allow actual apps to use the virtualization that’s already in the goddamn things. It’s maddening.


Ferrochrome was canceled.


Could an equivalent demo be constructed using open-source components, e.g. via GrapheneOS?


I guess if enough people care, then again, what is the business case?

Being a cool technology demo will end the same way.


The same business case for headset displays powered by phones.


I have an M4 iPad Pro, it beats my M1 Max MacBook Pro on a lot of benchmarks, yet I cannot use it for programming, 3D modeling or VFX. Yes, I know that apps “technically” exist to do those things. No, those apps are not professional grade. I want VS Code, a Terminal, Blender and Adobe After Effects.

Seemingly the only thing stopping those apps from running on the iPad unmodified is the operating system. I want the operating system that runs on the other Mx devices to run on my iPad Pro. I have wanted this for years. I have never been even close to alone in wanting this.


According to the author's GitHub profile they are a fresh CS grad - seriously impressive work.


I imagine a job offer from Apple won't be too far away!


They had better swing this person an offer before they go to the dark side!


Apple didn't make an offer to the Homebrew founder. So maybe don't hold your breath.


Why would they, when Homebrew was amateurish compared to MacPorts, built by Apple engineers? (E.g. it didn't follow accepted Unix conventions or macOS conventions on installing libraries for a long time. See this discussion thread https://news.ycombinator.com/item?id=34818192 and this comment https://news.ycombinator.com/item?id=34844292 for more details.) But credit where credit is due - the founder is a better marketer than developer, and more people know about Homebrew than MacPorts.


Apple hired Max Howell to work for them on SPM.


Very cool!

I have a feeling that the reason that Apple hasn't made their Simulator into an Emulator, is because they don't want folks digging into the substrate of iOS.


Another reason it was a Simulator and not an Emulator to begin with could be because a lot of iOS (or iPhone OS) components at the time were forks of existing Mac OS X libraries.


The reason, to begin with, was that Mac OS was x86-32 and the iOS environment was ARM. Building for Intel let the UI devs have high performance by leveraging the existing network stack and graphics compositor. But most of the libraries live in parallel in the sim, not using the OS ones; using the OS ones wouldn't allow you to simulate different iOS versions.


> iOS (or iPhone OS)

Slightly OT but the first iPhone ran OS X at launch.

I think as time went by and the "OS X" running on phones diverged more and more, they renamed to iPhone OS and then iOS some time later? Something like that anyway.


The first iPhone ran iPhone OS 1.0


Well, I never owned one of them, so maybe I'm not supposed to comment on this. But their website was very clear: The OS on the first iPhone was OS X.

https://web.archive.org/web/20070112064939/http://www.apple....


It's about as much Mac OS X as the Apple Watch runs macOS.


It's not the same as Mac OS X or macOS; they share a common kernel, system libraries, etc., but the userland is very different on an iPhone or Apple Watch.


Apple advertised the first iPhone as running OS X: https://youtu.be/VQKMoT-6XSg?t=506


It's not the same version of OS X that ran on Apple's computers. The "it's OS X" was more for marketing, they just share the same "core".

You could argue that the iPhone currently still runs macOS if you used the same definition today. They share kernels (IIRC Apple always kept the ARM patches to Darwin closed-source), BSD-based userlands, and the iPhone used versions of the Mac's application libraries.

A big difference is that iOS and macOS use different compositors.


Developers still use Intel Macs, and you can't virtualize ARM iOS on those.


The overwhelming majority use ARM Macs these days


Doesn’t sound like a strong enough reason for the visionOS team.


Yeah, I was thinking about the ARM Macs. They are common enough, now, to make it worthwhile.


I really do wonder, now that iPhones, Macs, and iPads are all "arm64" (Apple Silicon, no less), how different the bootloaders are for iOS vs MacOS. Once you are past the bootloader, why would they maintain two different operating systems / lots of differences if they don't have to, especially since they control the hardware?


The hardware was drastically different between Macs and iPhones when iOS was released. That was in 2007. Apple only unified the hardware in 2020. Over the *13* years, the operating systems have diverged so much that unifying them is a massive effort. The linked blog post by Zhuowei Zhang shows some of the differences. The user-space components are just so different that it's not as simple as running a macOS app on iOS.

EDIT: You can run iOS apps on macOS without recompilation, but it uses Mac Catalyst, which is a user-space shim for iOS apps to work on macOS. Even then, not everything works.


You can run iOS apps directly on M1 Macs. Some developers flip a bit to disable this, but there are many that don't.


In this case, the app is running in a sandbox with a user-space that simulates the iOS user-space [0].

> Your apps use the same frameworks and infrastructure that Mac Catalyst apps use to run, but without the need to recompile for the Mac platform.

> Although you can run your iOS apps unmodified on a Mac with Apple silicon, Mac Catalyst lets you build your app specifically for macOS and customize your app’s behavior on that platform.

Mac Catalyst was a multi-year effort by Apple. Doing the same to run macOS apps on iOS would probably be even harder due to how complicated macOS is compared to iOS.

[0] https://developer.apple.com/documentation/apple-silicon/runn...
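
As a small aside, that runtime split is queryable from Foundation. A sketch (iOS 14 / macOS 11 and later; the properties are real ProcessInfo API, the print strings are just illustrative):

  import Foundation

  let info = ProcessInfo.processInfo
  if info.isiOSAppOnMac {
      // Unmodified iOS binary running on an Apple Silicon Mac ("Designed for iPad").
      print("iOS build running on macOS")
  } else if info.isMacCatalystApp {
      // App recompiled against the Mac Catalyst SDK.
      print("Mac Catalyst build")
  }

(isMacCatalystApp is also true for iOS-apps-on-Mac, which is why the order of the checks matters.)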


It seems you agree, except previously you said “Likewise, you can't just run a iOS app on macOS. You need to recompile your app with Mac Catalyst for it to work. Even then, it's a bit jank.” That’s not true, you can just run an iOS app on macOS.


I edited my original comment. The "It's a bit jank" part is still true though. When you enable running your app on macOS without compiling specifically for macOS with Mac Catalyst, it still uses Mac Catalyst, but transparently. So, you still get all the issues of Mac Catalyst, but without the compiler warnings. Your application may crash or behave strangely.


They are very similar. The differences are largely in that macOS generally will permit some things that iOS will not.


They aren’t actually too similar. They both use XNU, but the memory model is completely different. On macOS memory can be paged to/from disk. On iOS it isn’t and applications must free memory when asked or be terminated [0].

iOS applications are sandboxed by the kernel, with no opt out. macOS applications are not sandboxed by default and are opt in.

Then there are the API and UI differences.

EDIT: That linked blog post in the parent blog post also shows how different the userspace is: https://worthdoingbadly.com/macappsios/

EDIT: iPadOS 16 enables virtual memory swap [1]

[0] https://developer.apple.com/library/archive/documentation/Pe...

[1] https://www.apple.com/newsroom/2022/06/ipados-16-takes-the-v...


> On macOS memory can be paged to/from disk. On iOS it isn’t and applications must free memory when asked or be terminated

Not sure what you meant by that; you could always `mmap` files into memory on iOS. Back in the 32-bit days there was a ~700 MB limit due to the address space, but there isn't one anymore nowadays with 64 bits. If `didReceiveMemoryWarning` is called on your app, then you need to free resident memory, but the kernel will take care of dumping file-backed memory pages for you.
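
For concreteness, a minimal sketch of that contract (the view controller and cache are hypothetical): on a memory warning you drop resident, dirty memory, while clean file-backed pages mapped via `mmap` can simply be evicted and re-read by the kernel.

  import UIKit

  final class ThumbnailViewController: UIViewController {
      // Hypothetical in-memory cache of decoded images (dirty, resident memory).
      private var thumbnailCache: [URL: UIImage] = [:]

      override func didReceiveMemoryWarning() {
          super.didReceiveMemoryWarning()
          // Free what we can; otherwise the system may terminate the app.
          thumbnailCache.removeAll(keepingCapacity: false)
      }
  }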


> Back in the 32-bit days there was a ~700 MB limit due to the address space, but there isn't one anymore nowadays with 64 bits.

Not true, unless something changed recently (definitely more recently than the 32->64 transition). All iPhones have a virtual memory limit (although the limit is higher on phones with more physical RAM).

I know this for sure because several years ago I was the main person in charge of reducing OOM kills on the Facebook iPhone app and virtual memory exhaustion on 64-bit phones was definitely an issue.

See here for where this is enforced in XNU: https://github.com/apple-oss-distributions/xnu/blob/xnu-1121...

I assume Apple does this specifically because they want to prevent apps from simulating swap space by mapping a big file and allocating from it.


That's memory mapping. This is memory paging [1]. I.e. Windows pagefile.sys, Linux swap, macOS swap files. iOS does not have swap files, only memory compression. If you're on a Mac, open up Activity Monitor, go to Memory, and at the bottom there is `Swap Used`. That doesn't exist on iOS. So, if more memory is used than available, applications will need to free memory or be terminated. Unlike macOS, where some used memory will be swapped to disk to allow other stuff to be loaded into memory.

[1] https://en.wikipedia.org/wiki/Memory_paging
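
On the macOS side, the same "Swap Used" figure Activity Monitor shows can be read programmatically. A sketch using the vm.swapusage sysctl (macOS only; there is no iOS counterpart precisely because iOS doesn't page app memory to disk):

  import Darwin

  var usage = xsw_usage()
  var size = MemoryLayout<xsw_usage>.size
  if sysctlbyname("vm.swapusage", &usage, &size, nil, 0) == 0 {
      // xsu_used and xsu_total are byte counts.
      print("Swap used: \(usage.xsu_used / 1_048_576) MiB of \(usage.xsu_total / 1_048_576) MiB")
  }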


It's most likely just disabled rather than being completely different/non-existent. But yes, the application model is built around limited to no multitasking.


> It's most likely just disabled rather than being completely different/non-existent.

As evident by the limited Virtual Memory Swap enabled on iPadOS 16, but not iOS.

All Apple devices use the XNU kernel. But, as the parent blog post shows, the kernel configuration, device tree, and drivers are different.


> how different the bootloaders are for iOS vs MacOS



Very different user experiences, and also, the Mac development ecosystem is well-established. I suspect a lot of Mac AAA apps are done in C++.

Probably a lot of iOS AAA apps are still in ObjC.

It is unwise to pull the rug from established developers.


More like Objective-C++, as otherwise it is lots of fun calling macOS APIs from C++.

And no, C++ isn't as prevalent on Apple platforms as on other vendors.

Hence why you will find that most of the C++-related documentation is for IOKit and DriverKit, the Metal Shading Language dialect (based on C++14), LLVM, and that is about it.

Even Metal is actually implemented in Objective-C, with Swift and C++ bindings, and the C++ bindings are really low effort versus the Swift tooling.


Depends on what part of Metal you're talking about.


I explicitly mentioned the only part that is C++, well a flavour of it.


The guy that created qemu-t8030 managed to get SpringBoard running [1], but hasn't made the code public. It would be wonderful if that progress could be combined with this one.

[1] https://mastodon.social/@ntrung03/109712247237110967


Another QEMU attempt is this one [1]: q22-qemu (iPhone X)

1. https://www.xia0.sh/2024/03/09/Boot-Newer-iOS-with-QEMU-Step...



Related: https://worthdoingbadly.com/hv/ (Hardware-accelerated virtual machines on jailbroken iPhone 12 / iOS 14.1)


Slightly tangential, but has anyone virtualized ARM macOS on x86-64?


You can't. The term "virtualize" is generally used to mean running an OS via hardware virtualization, where your host CPU natively runs its code but forwards all I/O to a hypervisor. You can only do that with an OS built for the same CPU architecture as your host system.

For everything else, like running ARM software on x86 (and vice versa), you'll have to resort to emulation, which involves either interpreting the code or dynamically recompiling it. By definition, you can emulate anything on anything else (someone recently booted Linux for MIPS on an Intel 4004, the first ever microprocessor), but the performance might be a problem.
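
One concrete way to see the distinction on a Mac: the kern.hv_support sysctl reports whether the CPU exposes hardware virtualization to Hypervisor.framework at all. Same-architecture guests can use that; anything cross-architecture falls back to emulation. A sketch:

  import Darwin

  var supported: Int32 = 0
  var size = MemoryLayout<Int32>.size
  if sysctlbyname("kern.hv_support", &supported, &size, nil, 0) == 0 {
      print(supported == 1 ? "Hardware virtualization available" : "No hypervisor support; emulation only")
  }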


TL;DR: emulating ARM binaries on x86_64 via QEMU is so slow that it is unusable for general use.

This is also less of a QEMU problem and more that ARM just does not emulate well on x86_64, due to the designs of the two architectures.


I have tried emulating ARM Windows on x86 with QEMU. It is fast enough to see whether something works and not much more (imagine Windows 11 on a 400MHz equivalent processor to understand what the performance was like --- and the host was a fairly recent Intel i7.)

ARM Linux is close to usable, however.


It feels like it's just not possible in general to emulate the full instruction set of any CPU with an MMU with an acceptable performance to run modern software. QEMU running Windows for x86 on an M1 isn't very fast either.

Only emulating the portion of the instruction set available from the userspace is another story though. At least the way Apple does it with Rosetta and Microsoft with whatever their thing is called, you don't even notice that an app is running under emulation. The only giveaway is that it takes a noticeable time to start for the first time while the code is being translated. It's truly impressive.


> QEMU running Windows for x86 on an M1 isn't very fast either.

It seems the main obstacle is paging, where x86's 4KB pages clash with Apple's 16KB pages (ARM64 supports multiple sizes), so two-stage paging can't help and the emulator has to use shadow paging, which is definitely much slower.

> Apple does it with Rosetta and Microsoft with whatever their thing is called, you don't even notice that an app is running under emulation.

But they still rely on vendor-specific TSO support in hardware.


Curious, does QEMU use some kind of ahead-of-time translating scheme? Or do they rewrite instructions as they see them?


The latter.


Sounds like a huge opportunity for improvements.

A simple approach would identify basic blocks in the code and translate them to an IR for an optimizing compiler back-end like LLVM.

Of course, you have to be careful with self-modifying code.


You can try to emulate generic ARM in QEMU and see that it won't reach Raspberry Pi performance. Recent versions should have it available out of the box AFAIK. Emulating M-series CPUs would be even less useful.


You should look into the hackintosh project.


Hackintosh currently has no way of running ARM-based macOS, so it is of no help here.



But that's not ARM on x86, is it? My understanding was that it 'just' enables things to work on unsupported Intel Macs, by enabling stuff that still works on newer Intel Macs.


It’s complicated, but you have the right intuition about it. OCLP re-inserts drivers removed by Apple and patches the OS to enable functionality that doesn’t rely on hardware verification or ARM hardware. According to the devs, that’s about all they are currently able to do with current approaches.

UTM might do what you want but likely not on x86.

https://mac.getutm.app/

> Virtualize macOS as well.

> Run multiple instances of macOS on your Apple Silicon Mac with UTM. This can be useful for developers as well as security conscious users.

> Note that macOS VM support is limited to ARM based Macs running macOS Monterey or higher.


>or ARM hardware

What is ARM hardware in this case? Did you mean the T2 processor on Intel Macs?


I mean both T2 (which I meant by hardware security) and ARM hardware (which means that it relies on either the ARM CPU itself or the way it functions or is implemented). Features like iPhone mirroring apparently rely on hardware support on macOS.


Apple already provides an iOS simulator in Xcode. So, what's the benefit of this project over the Apple-provided one?


The simulator is not actually running real iOS or the iOS build of your app. Instead, when you run an app in the simulator, your app is being compiled to the current Mac’s native instruction set and links/runs against a set of Mac frameworks and libraries that _simulate_ and in some cases only stub in the expected iOS behavior. So as an example, you can’t just take an iOS binary off of the App Store and run it in the iOS Simulator (especially not on an Intel Mac). You also can’t use the simulator to probe and learn anything about how real iOS works internally, because the simulator isn’t really running full iOS. If you drill down in the simulator’s frameworks far enough you eventually just find yourself back in macOS.

Contrast with an emulator, where you are just running the full iOS build identical to the build on a real device. You would in theory be able to run any iOS binary unmodified and probe how the real os works.

It’s sort of like the difference between running an app through Wine versus running an app in a Windows VM, except in the case of the simulator it’d be like if you had to custom recompile/link a Windows app first against the Wine environment before being able to run it. If you wanted to study how Windows works internally, there's not much you can learn about that from running Wine, but there is quite a lot you could learn from probing a VM running Windows.
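
A tiny illustration of the "it's not really iOS" point from the Swift side: simulator builds compile for the host Mac's architecture, and code can (and often does) branch on that at compile time.

  // Compile-time split between simulator and device builds.
  #if targetEnvironment(simulator)
  let runtime = "iOS Simulator: host-native code on top of Mac frameworks"
  #else
  let runtime = "iOS device: the real arm64 iOS build"
  #endif
  print("Running on \(runtime)")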


Since you are someone who seems to know what they're doing, I hope you'll forgive a random unrelated question: do you happen to know if it's possible to call out to M1 code inside Rosetta 2? It seems like this should be possible since Rosetta 2 is (supposedly) a transpiler, and so it's (supposedly) really running M1 code under the hood, but I haven't been able to figure out a way to call out to native M1 code.


That's a great question. The short answer is: no, you can't, but not necessarily for the reason you might expect. The long answer is: Rosetta 2 is indeed a transpiler generating native arm64 code, but transpiled code running via Rosetta 2 vs. native arm64 code in macOS use two different ABIs. Transpiled Rosetta 2 code uses a arm64-ized version of the System V x64 ABI that contains a direct mapping between x64 and arm64 registers, whereas native arm64 code uses the standard arm64 ABI. There's a lot of magic going on in the Rosetta 2 arm64-ized System V ABI that is necessary to make Rosetta 2 work.

Koh Nakagawa's work on reverse engineering Rosetta 2 dives into this topic extensively: https://ffri.github.io/ProjectChampollion/part1/

One interesting side effect of this ABI difference comes from modern x64 macOS using AVX2 instructions by default but Rosetta 2 not supporting AVX2. Because Rosetta 2 uses a different ABI than native arm64, code running under Rosetta 2 can't just call into the native arm64 system libraries; for calls to system libraries, Rosetta 2 transpiles from the x64 versions of those as well, which are available on Apple Silicon Macs thanks to the universal binary architecture. In macOS, all of the commonly used system dylibs are pre-linked into a single giant file called the dyld cache. Since the native x64 dyld cache contains AVX2 instructions though, it isn't usable by Rosetta 2, so for when a system library call requires going into the dyld cache, Rosetta 2 ships with a _separate second version_ of the x64 dyld cache that is compiled without AVX2. This is an interesting quirk that has proven to be exceptionally useful for getting newer macOS versions running on older unsupported Macs that have Intel CPUs that are too old to support AVX2.
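
Tangentially, detection of translation is the one part of this Apple does document: a process can ask whether it is currently running under Rosetta 2 via the sysctl.proc_translated sysctl. A small sketch:

  import Darwin

  // Returns true when the current process is running translated under Rosetta 2.
  // On errors (e.g. the key doesn't exist on Intel Macs) it returns false.
  func isTranslatedByRosetta() -> Bool {
      var translated: Int32 = 0
      var size = MemoryLayout<Int32>.size
      let ok = sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0
      return ok && translated == 1
  }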


IIRC Rosetta 1 did attempt to use the native x86 versions of system frameworks, but there were issues with floating point precision between the emulated and native code. I don't remember if they gave up on it, fixed it, or just left it like that.


Thanks!


macOS 15 ships with AVX2 support on Rosetta 2 though


This is not supported. It’s possible in theory but the theory here is “breaking out of the emulator and fiddling with runtime metadata”.


Note that that reply isn't authoritative on whether Apple chose to reuse (or not reuse) code/frameworks from macOS for simulating/emulating iOS.


An early Christmas gift to all clickfarms!


Discussion of this is on the "nick's funny device emporium" Discord server. https://discord.com/invite/4HXCHWhf6r





