Don't overcomplicate it. They want to preserve their Services revenue, because Tim Cook is a bean-counter and wants to maintain the profit margin. "After Steve" by Tripp Mickle provides some background.
Many Apple indie devs and fanboys are literally screaming. I bet many people @Apple are too. There are so many things the company could do to regain geek-cred. A real package manager and proper window management on macOS. Align Mac App Store policies with the more liberal Windows 11 Store policies. Let people run what they want, even outside the EU, including other web engines. Invest in Proton so old Windows games can run on the Mac (but then, no App Store tax). Keep OpenGL and Vulkan around for the scientific-computing folks (and others). Commit to keeping Rosetta 2 around indefinitely, because compatibility is your #1 job as a platform. Open-source more stuff (god forbid, your OSes! why not?).
But they've gotten timid and conservative. Top execs see risk, and VPs seem to think they're making products only for the stereotypical technophobic grandpa, rather than power users.
I think you're right that doing these things would improve Apple's standing with developers, but it's important to note that none of the things you listed is in line with Steve Jobs' philosophy, which you seem to imply here. Recall, for example, that the original iPhone did not ship with an App Store. Apple still doesn't let you build iOS apps unless you own a sufficiently up-to-date macOS device. So when you say "regain geek-cred," it's worth pointing out that this geek-cred never existed with a large (though perhaps not majority) share of developers and power users, and where it did exist, it wasn't because Apple was doing things in line with the steps you've proposed to regain it. It was always about polish and ease of use. So when you say they've gotten timid and conservative, I'm not sure in what respect you mean that. In terms of openness they haven't changed at all, and are still completely in line with Jobs' vision.
True, but the world has changed and so have Apple's biggest fans (the ones that drive purchasing decisions for their social circles) and indie devs. They didn't mind back then, now they do. See the Accidental Tech Podcast guys for example, but there are many others (the infamous DHH and geohot just recently). People's attitudes towards huge corporations have changed. Things formerly rationalized as "for the user experience" are now seen as cash-grabs. Even many pro-Apple people are worried that rent-seeking is leading to product neglect.
Some of these things could happen, but I don’t see Rosetta being kept around indefinitely. Keeping compatibility layers around becomes increasingly expensive with time, encumbering OS development and encouraging devs to never update their apps. It also opens up the possibility of multiple compatibility layers being maintained at the same time, which multiplies these issues.
It’s much easier to make virtualizing old versions of macOS easy and let that handle compatibility with old software, which is what they’ve done: one can spin up a full-featured, GPU-accelerated macOS VM with just a few lines of Swift, so you don’t even have to use third-party software if you don’t want to.
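For what it's worth, the "few lines of Swift" claim roughly holds up via Apple's Virtualization framework. A hedged sketch (all paths are placeholders; a real setup first needs a one-time macOS install from a `VZMacOSRestoreImage`, and the VM object must be used from the main queue):

```swift
import Virtualization

// Placeholder bundle created by an earlier install pass with
// VZMacOSRestoreImage (disk image, auxiliary storage, saved
// hardware model and machine identifier).
let bundle = URL(fileURLWithPath: "VM.bundle", isDirectory: true)

let platform = VZMacPlatformConfiguration()
platform.hardwareModel = VZMacHardwareModel(
    dataRepresentation: try Data(contentsOf: bundle.appendingPathComponent("HardwareModel")))!
platform.machineIdentifier = VZMacMachineIdentifier(
    dataRepresentation: try Data(contentsOf: bundle.appendingPathComponent("MachineIdentifier")))!
platform.auxiliaryStorage = VZMacAuxiliaryStorage(url: bundle.appendingPathComponent("AuxiliaryStorage"))

let config = VZVirtualMachineConfiguration()
config.platform = platform
config.bootLoader = VZMacOSBootLoader()
config.cpuCount = 4
config.memorySize = 8 * 1024 * 1024 * 1024  // 8 GiB

// Paravirtualized, GPU-accelerated display
let gpu = VZMacGraphicsDeviceConfiguration()
gpu.displays = [VZMacGraphicsDisplayConfiguration(
    widthInPixels: 1920, heightInPixels: 1080, pixelsPerInch: 80)]
config.graphicsDevices = [gpu]

config.storageDevices = [VZVirtioBlockDeviceConfiguration(
    attachment: try VZDiskImageStorageDeviceAttachment(
        url: bundle.appendingPathComponent("Disk.img"), readOnly: false))]

try config.validate()
let vm = VZVirtualMachine(configuration: config)
vm.start { result in print("start:", result) }
```

Attach a `VZVirtualMachineView` in an app window and you get the accelerated display; third-party wrappers like UTM are basically conveniences around this same API.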
Easier for Apple, but worse for users. Apple's main argument has always been this: "encouraging devs to never update their apps", but I don't buy it. Active developers all update promptly long before any threats of deprecation; they even go out of their way to switch to shiny stuff like SwiftUI. It's the long tail that doesn't get updated, and Apple's deprecation velocity changes nothing. Portal 2 (not Rosetta I know) runs poorly in a VM, but ran well on my 2011 MBP.
I've heard from several former Apple fans who switched, and they all marvel at being able to run old binaries without recompilation. Even though they ship apps with the latest frameworks. It's still "just plain cool".
Mac devs are good about updating their apps, but it’s much more hit or miss in the Windows world. If there’s no impetus to regularly update for security reasons (e.g. browsers) or to keep people subscribed (e.g. Spotify), it’s probably not getting updated too often. There’s a considerable difference in culture between platforms, and I think it largely stems from the expectations set by Apple and Microsoft.
Linux is kind of a mixed bag. Some devs are ultra responsive while others only update when absolutely necessary, but the FOSS nature of most apps there helps since you can always take matters into your own hands and fork a project if it’s accumulated too much rust to be usable.
Linux is fragmented. Windows has shockingly few indie devs given its marketshare (compared to Apple: Omni apps, Structured, Day One, Session, Pixelmator, Fantastical...). The Mac has plenty of eager indies that care deeply about UX and design, and many discerning users who care too. Caring about UX/design is their "carrot"; they don't need the "stick" (deprecation).
The stick was mainly necessary for big devs back in the day, who never cared about making Mac-assed Mac apps. Now those have switched to Electron anyway; the stick no longer provides meaningful incentives. It just annoys people who want to play Half-Life for 5 minutes every few years.
I'm sure the people on the Mac team think exactly the way you do though, so I guess I hope they read this, or at least that they make sure their assumptions are still valid.
Mac devs are good at updating their apps because Sparkle exists.
It’s hard to overstate the impact of Sparkle: it made it easy for developers to ship updates while putting control of that process in the developers’ hands rather than the platform’s.
Linux went the opposite route, with dozens of package managers, which makes it harder for updates to reach users.
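For context on how lightweight Sparkle's model is: the app just polls an RSS "appcast" feed hosted wherever the developer likes (the `SUFeedURL` key in Info.plist points at it). A minimal Sparkle 2-style entry, with hypothetical app name, URL, and a placeholder signature:

```xml
<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0"
     xmlns:sparkle="http://www.andymatuschak.org/xml-namespaces/sparkle">
  <channel>
    <title>MyApp updates</title>
    <item>
      <title>Version 2.1</title>
      <pubDate>Mon, 01 Jul 2024 10:00:00 +0000</pubDate>
      <sparkle:version>2.1</sparkle:version>
      <sparkle:minimumSystemVersion>12.0</sparkle:minimumSystemVersion>
      <!-- EdDSA signature of the archive goes in sparkle:edSignature -->
      <enclosure url="https://example.com/MyApp-2.1.zip"
                 sparkle:edSignature="PLACEHOLDER"
                 length="1234567"
                 type="application/octet-stream"/>
    </item>
  </channel>
</rss>
```

No store, no review queue: push a new item to the feed and every copy of the app offers the update on its next check.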
There's no reason to update a program once it's feature complete if it works offline. Apple is pushing harmful updoot ideology that kills old (but perfectly usable) programs.
This is maybe true for software that can never conceivably interact with the outside world (can’t open files, isn’t scriptable, etc.), but on Windows and macOS, where shipping static binaries or bundling the necessary libraries with the software is the norm, vulnerabilities pile up quickly enough that it becomes a bad idea to regularly use, or in some cases even keep installed, software past a certain age. For those programs it’s better to run them in a VM, where compatibility can be perfect (you’re running the old OS) and the size of the potential crater from an exploit is minimized.
It’s a bit of a different situation on Linux, where the norm is dynamically linked libraries kept up to date by a package manager, but even there static binaries and things like Flatpaks and AppImages can be bad news.
> VPs seem to think they're making products only for the stereotypical technophobic grandpa, rather than power users.
Hasn't Apple always been "computers for people who don't want to think about computers", while power users finding the tech interesting is just an accident?
That's exactly who they're making products for. Buy it, hand it over, and kids from age 1 to 92 will be just fine. Developer interest in Apple is the product of macOS being incidentally a Real UNIX, the hardware being second to none, and the aforementioned market of kids 1-92 being quite large.
The third-party developers that built the platform are becoming increasingly resentful of Apple and of the platform. It hurts all of us if they switch, even if we just want to "never think about computers".
This isn't 2012 anymore. Developers are into freedom and openness, and either Apple aligns itself, or Mac/iOS will have just as much appeal to cool indie devs as Android/Windows (a fraction of its current appeal). Apple platforms no longer feel exciting to devs.
And before the Apple fanbois come and tell you how Apple is a multi trillion dollar business so they know what they’re doing, a reminder that Apple didn’t want the App Store in the first place, and wanted 3rd party apps to be web apps.
It was the popularity of unofficially created apps running on unlocked original iPhones (which led to many unlocking their iPhone) that convinced Apple to create the App Store.
Apple didn’t really want 3rd party apps to be web apps. The infamous “just build web apps” was a stop-gap measure while they were building the App Store.
> It was the popularity of unofficially created apps running on unlocked original iPhones (which led to many unlocking their iPhone) that convinced Apple to create the App Store.
This just isn't the case. The first iPhone with iOS 1.0 was essentially a very advanced demo that just barely made it out the door. There was no SDK, API documentation, or much of any developer toolchain for early iOS. Ask anyone that fought to build even simple unofficial apps what a nightmare using those early frameworks was like. They were not ready for public consumption.
Web apps were never the long term goal for third parties. Steve Jobs might have said that in public but it was a deflection about native SDK questions. Web apps were a stopgap until the dumpster fire of an internal SDK could be rebuilt.
Apple makes money hand over fist already. Their stores are the most profitable retail stores per square foot in the history of retail. There's nobody not buying Apple today who would suddenly start if they did these things.
>There are so many things the company could do to regain geek-cred.
This implies they have lost geek cred. I think that Apple could install razor blades in their keyboards tomorrow and have more than half of HN believe it's a good thing.
> There are so many things the company could do to regain geek-cred.
But why do geeks have to destroy everything for everyone else, just for the benefit of their hobbies? Isn't Linux and Android enough and everything else hardware wise and software wise that isn't Apple? Apple devices work great for people who want to get stuff done in the real world and are willing to pay developers for great software. Why does that annoy geeks? Not everybody is interested in tech, they want something that works and get on with their day.
Just because geeks understand computers doesn't give them the right to dictate how other people should use their devices. Just like car modders shouldn't have a say on how normal people use their cars.
> Just because geeks understand computers doesn't give them the right to dictate how other people should use their devices. Just like car modders shouldn't have a say on how normal people use their cars.
No one is telling normal people how to use their phones/cars. If you want the option of doing something different, however, you may need the government to help you stop companies from being anti-consumer.
Everything on the computer is fake, it's just ones and zeros. It means nothing until it interfaces with the real world. Geeks like to make computers and other devices operate with no real meaning, for their own amusement. But that can and will get in the way for people wanting to use their devices as tools for real world tasks.
You can make the same comparison for any machine. If your hobby is off-roading or dirt biking, you might want to be able to adjust fuel injection exactly as you see fit. But making every car or motorcycle owner have to adjust their gas/air mixture is not what normal people want. They want a safe vehicle that they can rely on for their commute.
There are a gazillion non-iOS devices that work perfectly fine in the real world. I do my banking, bookkeeping, everything with regular end-user software on my non-Apple devices, and so do a lot of people in my extended family and friend groups, believe it or not.
You should seriously reconsider how much you believe Apple has a monopoly on usable personal computing.
That's great. Why can't people be satisfied with those devices and use them then? Why enforce your style of computing onto other people's devices?
Instead of nerfing Apple and ruining a good thing, why can't open-source-loving hackers try to improve Linux and (non-Google) Android instead, to make people want to use and pay for those systems and devices? It worked for Linux on servers. More competition and better products are better for everybody; trying to ruin Apple's systems for consumers out of jealousy is not.
I'm not the European Union. It's not my personal jealousy and distaste for Apple that's the driving force of the EU legislating against Apple's practices.