Minecraft Running on Asahi Linux with Open Source GPU Drivers (treehouse.systems)
376 points by pimeys on Nov 20, 2022 | 96 comments



It doesn't just run Minecraft, it now runs a smooth GPU-accelerated GNOME desktop, including things like YouTube videos: https://cdn.masto.host/sigmoidsocial/media_attachments/files...

This doesn't yet work out of the box but the next few months will be very exciting.


Man, that's so cool. When things went quiet around the GPU work and Alyssa said something along the lines of "we are very far away", I lost hope a bit. That hope is back!


> and Alyssa said something along the lines of "we are very far away" I lost hope a bit.

I think the quote you're thinking of was a reference to modern OpenGL and Vulkan support, not accelerated graphics in general. Older OpenGL is a lot easier to implement, and sufficient for an accelerated desktop and games like Minecraft.

Marcan and Alyssa have been saying for a while that we were on track to get GPU acceleration in the near-ish future.


Maybe Vulkan could be the first priority, since OpenGL can be provided by Zink once it's in good shape.


Vulkan is a lot more work to get to, with far fewer applications using it. It would be a worse ROI, and take significantly longer to make an OS that can be a daily driver for most people.


Not sure if OpenGL 4.0-4.6 is less work in general. I'd argue implementing Vulkan gives the best ROI, since you get all of OpenGL after that without extra effort.

And I don't think that few applications use it. Anything modern tries to. The rest are planning to move to it.


What common or high profile applications do you think actively target Vulkan? Outside of games very few do in my experience but I’d love to see some examples outside what I know of.


As shmerl pointed out in their first comment, once Vulkan support is there, you get OpenGL support for free through Mesa's Zink driver.


I'm not trying to say that they "should" have focused on Vulkan first, by the way. There may be good reasons to focus on OpenGL first. Maybe it's what Rosenzweig has the most experience with. Maybe getting a Vulkan implementation to the point where it can be used to usefully emulate OpenGL is more work than getting an OpenGL implementation to the point where it can do 2D and basic 3D acceleration. Maybe Zink has some serious performance issues or isn't solid yet. I don't know.


OpenGL is a much shorter path to stand up. They are already passing big chunks of the GL conformance tests, but it would take a long time to stand up Vulkan support.

This is about trying to get something daily drivable for most people in the most efficient way possible, and sometimes efficiency does mean implementing part of the tech stack twice if the first time unblocks you on other things.


Wayland compositors are going to switch to it, and some already have (like Sway). Video players like mpv use it. Blender has plans for it.

Basically, it's the way forward, not OpenGL.


Sure, but that's in the future, and it's not like they're never going to add Vulkan. It makes sense to prioritize GL for now, especially because a lot of the GL work comes much easier thanks to their prior Mesa work.

Even among the applications you list, I don't see mass adoption today of ones that don't also have a GL backend.

Blender alone will be a gargantuan lift to get to Vulkan, because so much of the ecosystem is coded against GL directly.


This kind of experimental project looks pretty well aligned with where things are heading, so I think a focus on Vulkan would still be fitting.

Meaning that by the time it's more usable, Vulkan will be more widely used as well.

And as above, if you implement Vulkan, you get OpenGL through Zink. If you implement OpenGL, you still have to implement Vulkan. So with limited resources, the first option looks way more effective.
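
For what it's worth, Zink can already be tried today on hardware with a working Vulkan driver: Mesa's loader can be pointed at it with the MESA_LOADER_DRIVER_OVERRIDE environment variable. A minimal sketch in Rust (glxgears is just a stand-in GL client here; any GL app would do):

    use std::process::Command;

    // Ask Mesa's loader to route OpenGL through the Zink (GL-on-Vulkan)
    // driver for a child process, then report how the process exited.
    fn main() -> std::io::Result<()> {
        let status = Command::new("glxgears")
            .env("MESA_LOADER_DRIVER_OVERRIDE", "zink")
            .status()?;
        println!("glxgears exited with: {status}");
        Ok(())
    }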


You're looking at it from a number-of-implementations perspective.

They're looking at it from a time-to-viable-product and ROI perspective.

These are often at odds within engineering, and it makes sense for them to pick their way because they already have a lot of the GL stuff done and it's a faster route to a viable product.

Again, going for Vulkan would mean they'd have to spend significantly more time up front.

You seem to be optimizing for avoiding duplicate work (e.g. implementing both Vulkan and OpenGL), but in many cases it's better to get something stable and workable out sooner.

Basically, don’t let perfect be the enemy of good.


I don't see an OpenGL-only option as something worth using seriously, so it's not a usable option in practice if I can simply get hardware where Vulkan works fine.

So as a fun experiment, it can be interesting. As something practical, it doesn't seem so until all the pieces are in place.


Sure. No one disagrees with that. But say the choice is: 5 years of no workable GPU, or 2 years of no GPU, then a workable OpenGL driver, and then a Vulkan driver after 4 additional years.


I believe, though I couldn't quickly find it in the docs, that Mesa provides certain OpenGL versions if you implement certain OpenGL ES versions. For example, if you have OpenGL ES 3 you get OpenGL 3 for free via Mesa (these version numbers are made up; I don't actually know which OpenGL versions can be implemented in terms of OpenGL ES).


It's the other way around: GLES versions become a subset of later OpenGL versions.


From a spec-based standpoint you may be right, but I'm clearly not talking about that. Otherwise it wouldn't be possible for Asahi Linux to run non-ES OpenGL-based applications when there is ONLY an OpenGL ES driver.


These open-source GPU driver guys are sick! About two decades ago, I had a Via Unichrome integrated GPU, and the OpenChrome project let me run games on Linux. It was sick. I played Unreal Tournament (which had a Linux version that worked better for me than the Windows one), and I think at one point my introduction to open source was having to modify another game's source code so that it would allocate less GPU memory for a texture (the texture then kind of smeared over the rest of the screen, but it was for a UI element so the game was still playable).

Love to see there are people still doing that stuff today, especially since this stuff is probably more complex now than it was back then.


Yeah, I have massive respect for those developers. I wonder how they learned to do what they do. Adding Linux support for hardware is some kind of super power.


> These open-source GPU driver guys are sick!

I believe it's guys + girls specifically for this project

edit: it might just be girls? Asahi Lina and Alyssa Rosenzweig?


No genderedness intended! My dialect has "guys" as a gender-neutral noun!


Already using Asahi Linux on my M1 Air; can't wait until the GPU stuff lands!

Extremely impressive work by all involved!


I struggle to understand their endurance in those live coding sessions, for example https://youtu.be/VYAT6NZUQUE. I want to watch them carefully and slowly from the start; it's so good to see the development workflow, testing, etc. Initially the music was a bit difficult for me, but seeing such progress and knowledge is amazing, just amazing.


What’s your battery life like compared to macOS?


I get around 6 hours on Asahi Linux. macOS battery is at least twice as good - not sure exactly, since I've never run out of battery on macOS. Sleep works on macOS, so I always have to remember to turn off the laptop on Linux. Good for taking notes though; the CPU-rendered desktop dips to 40 FPS sometimes and can't play videos.


Thanks for answering. This is totally without hardware graphics acceleration, right? My understanding is that the GPU drivers are still an experimental work in progress.

If so, it’s pretty cool that you still get 6 hours even with that handicap. I’m excited to see how it looks once the GPU works well.


The battery life is more about power management logic. I'd expect it'll be a long, long time before Apple's power management logic is replicated.


I read elsewhere on HN the other day that a PSCI API extension patch is awaiting upstream review to be merged into the Linux kernel, and that power management would work better with that merged. Maybe a usable Asahi Linux on M* MacBooks isn't that far off.


Sounds more like it would be lower priority than very difficult?


It's not a single, difficult feature.

It's system-wide tuning of voltage and frequency, plus idle times for all sorts of microcontrollers, plus who knows what else.

In its own way, it's extremely, extremely difficult.


Yes, I’m aware. I’m just putting it in the context of the rest of the project.


True true, fair enough.


Yes, no GPU acceleration. All on the CPU. Mainly Google Docs and other websites though, nothing very intensive.


Damn, if that's the battery life I might as well get a Framework or System76 laptop built more specifically for Linux.


I'd also like to know. We recently had an interesting thread about Linux power management and efficiency on laptops.

https://news.ycombinator.com/item?id=33644165


As a hobby or are you actually getting some benefits?


I guess the biggest benefit is to be able to run Linux and stay away from macOS. For many of us this is a huge thing.


this whole project has been amazing to watch develop. not only is this technically impressive, it seemed like it happened _so fast_. really talented people!


And the Apple GPU kernel driver is written in Rust, so it kinda proves that concept as a side effect!


Yea I'm really waiting for the day that we see it get far enough along to push for mainlining it. It'll be a really cool testament to both the work being done by Lina and Alyssa and also the work that was done by the kernel team to even enable Rust to work there in the first place.


I think a lot of people might say something like "It might be OK for simple drivers, but what about something like a GPU" knowing that GPU drivers are probably the most complex. But now we can use this the opposite way and say "well if they can write a GPU driver in Rust, surely it could be used for the simpler devices."

The driver complexity argument is shattered, and the blog posts suggest it's better - confirming fearless concurrency and no memory management issues! So weird that this seems to be the very first driver written in Rust.


It's the first non-trivial one, but not the first entirely. The patch set, I believe, comes with a simple NVMe driver to prove that the system itself works. I'm curious if we'll see some network drivers ported to it because of the concurrency and memory management benefits. They tend to be just as performance sensitive as GPUs and even more security focused, since they pretty much explicitly handle untrusted input (externally and internally) in ways that lots of devices don't.
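
For a sense of scale, the Rust-for-Linux scaffolding those drivers sit on is pretty small. A minimal sketch of a trivial module against the in-tree kernel crate (roughly what the sample modules in the patch set look like, not code from the Asahi driver itself):

    use kernel::prelude::*;

    // The module! macro generates the registration boilerplate the
    // kernel expects (module metadata plus init/exit glue).
    module! {
        type: HelloRust,
        name: "hello_rust",
        license: "GPL",
    }

    struct HelloRust;

    impl kernel::Module for HelloRust {
        // Called at module load time; returning Err fails the load.
        fn init(_module: &'static ThisModule) -> Result<Self> {
            pr_info!("Hello from Rust!\n");
            Ok(HelloRust)
        }
    }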


This is impressive for a render distance of 12 (you can see this as "D: [render distance]" in the F3 overlay). Lower resolution and looking upwards do help, though. Also, Minecraft does a draw call for every single piece of text in the F3 overlay, which significantly lowers FPS. This might be playable at 60 FPS with Sodium (which beats OptiFine in performance) and a lowered view distance.


absolutely insane development pace. Congrats to the asahi team


I wonder if this would work on 32-bit systems? I had an old one and was surprised to see that Minecraft wouldn't run on it because of a video driver issue. It's odd to me to consider x86 hardware actual trash even when computationally capable of running something at a useful speed.


Totally awesome that this is now working, though performance seems relatively lackluster for now. Still, it's good they are focusing on getting it to work first before getting it fast.


Wait, I can't even run it on amd64 right now! It's stalled at this point: https://github.com/minecraft-linux/mcpelauncher-manifest/com...


That's the Bedrock edition, not the Java edition. Entirely different beasts.


Wrong version! Arguably (my opinion) an illegitimate version. Another comment mentions some reasons to think that: https://news.ycombinator.com/item?id=33690135


Now we still need Minecraft to become open source.


Notch once promised to make Minecraft open source "once sales start dying"[0] but he received 2.5 billion reasons to let that decision be taken off his hands.

[0] https://web.archive.org/web/20100301103851/http://www.minecr...


I'm not blaming him. But I guess at some point in the future the value of keeping Minecraft closed source will decrease.


Switch to Minetest!

I'm told by long-time Minecraft players that it doesn't suffer the same data corruption issues as Minecraft.


I've tried Minetest a few times. My issue with it is that it just feels very unpolished. Things like the lack of head bobbing, so it feels like the character is gliding across the ground, a pretty bad looking block break animation without enough particles, and a very floaty-feeling jump. On macOS, the scrolling in menus is completely broken and it's not running at the display's native resolution (no HiDPI/"Retina" support). You can't change settings in-game, hitting escape in the main menu just exits the program, etc etc etc. It feels like a game engine demo project, not a game. Everything down to the name "Minetest" feels unfinished and unpolished.


There are probably mods for all those things.

The idea in Minetest is that you add mods for the stuff you want.


Right. That might make it a good game engine playground, but a pretty bad Minecraft alternative.


It's like a couple of clicks to install mods…


This is exactly the mind-set which prevents projects like Minetest from becoming "real" games. Mods are great, but the core game has to be good by itself to attract people. And if Minetest doesn't want to be a "real" game which is good out of the box, that's fine, being a game engine playground is perfectly fine, but that means it's not even trying to be a Minecraft alternative.


The state of open source Minecraft is nuanced enough to be interesting. The Minecraft ip and codebases are currently owned by Microsoft and have technically always been closed source, but the nature of the game and its development history are such that third party mods and modding apis are very common. Because the original edition of the game is written in Java, it's relatively easy to decompile and develop for. In fact, Mojang Studios has been including a deobfuscation map with each snapshot release of the game for a few years now which would allow you to decompile the Java edition of the game and use the same names that they use internally [0]. Though, most modders prefer to use the mappings provided by modding apis like forge and fabric due to tradition and licensing conflicts. Depending on your definition of Minecraft, there's also plenty of open source implementations of Minecraft Classic, a much older version of the game that's easy to reimplement but apparently still has an active community [1]. Then of course there's Minetest, an independent totally open source voxel game that happens to have a lot in common with Minecraft [2].

I should probably also note that there's something of a conflict in this area surrounding the other edition of Minecraft, Bedrock Edition, which is written in C++ and therefore difficult to decompile and mod. Bedrock Edition is much more closed than Java edition in a number of ways, such as including a store built into the launcher for buying maps, texture packs and skins, whereas these would need to be retrieved externally (usually for free) on Java edition. However, Bedrock Edition does have a first-party modding api, a more feature-rich internal scripting system with the concept of "behavior packs" [3], and has led to the development of a number of tools used by mod authors for both editions, such as Blockbench [4]. Ultimately Java Edition and its community have the legacy of Notch and Mojang with a long history of community contribution, while Bedrock Edition was only developed after the Microsoft acquisition and is much more in Microsoft's style.

EDIT: Turns out the Bedrock Edition modding api was discontinued earlier this year [5].

[0] https://minecraft.fandom.com/wiki/Obfuscation_map [1] https://wiki.vg/Category:Minecraft_Classic [2] https://www.minetest.net/ [3] https://learn.microsoft.com/en-us/minecraft/creator/document... [4] https://www.blockbench.net/ [5] https://www.minecraft.net/en-us/creator/article/removing-the...


> EDIT: Turns out the Bedrock Edition modding api was discontinued earlier this year [5].

Embrace, extend, extinguish. Any day now they will announce the eventual deprecation of the original Java-based Minecraft.


I don't see this happening, for three reasons:

1. Community oldtimers and loyalists really like Java.

2. Bedrock has no Linux port.

3. Java is the lead version - new features are added to it first and then backported to Bedrock.

While #1 and #2 could be written off as "costs of doing business", #3 is a significant problem. The current creative process that Microsoft and Mojang have adopted is that Mojang implements new features and versions in Java first, and then another team in Washington reimplements them in C++ for Bedrock. Deprecating Java means making everyone at Mojang switch development tools and languages and adopt an entirely different codebase. It would be just as disruptive for Mojang as it would be for modders.


I don't disagree with your conclusion, but I do think your information is a bit outdated. In recent updates, effort has been made to keep both editions as much in sync as possible with explicit goals of feature parity and synchronous releases. I don't work at Mojang so I can't speak with complete certainty, but I know there are a number of developers who develop features for both editions simultaneously [0]. That said, all the core developers I've heard from are extremely passionate about the Java edition of the game and have absolutely no plans to stop supporting it. If that ever happens, it will have been a Microsoft decision that they had no say in.

Unfortunately sifting through the last few years of Minecraft.net articles and Minecraft Live footage to find sources isn't a great use of my time right now, but I'm sure it's out there. Bedrock was shifted to the same version numbering system as Java in the last year iirc and they've been releasing snapshots and betas with the same content on the same day for all of the 1.19.3/1.20 snapshots so far and I think the 1.19.0 ones as well.

[0] https://mobile.twitter.com/kingbdogz/status/1509482290304659...


And lose a chunk of people that only enjoy Java edition specifically because it's Java edition.


If you choose to use Linux as your daily driver instead of MacOS, why? Just curious.


As someone who has a MBP M1 sitting there and barely used, I’m asking myself the same question, but targeted to MacOS users.


Why did you buy it then?


In the early days, RISC and other non-x86 hardware was the future to some of us. Whether the OS is good or not, I wanted to try it out, as a newer era in CPU architecture and a possible competitor to the dominant architecture. To me the hardware is very much not the problem here, if I disregard non-serviceable parts. The OS is just as quirky as Linux, and has some elements that are comparable to Linux (the terminal is much more POSIX-y than Windows cmd), however your options are limited in ways that clash with what you may have grown used to as a Linux user:

- no alternative DEs, WMs, or the ability to compute without either one (both are in my submission history with 0 solutions; I've asked)

- no customization (top bar can't go away, bottom bar can't slide out like on the PowerPC Macs)

- no involvement in the bootloader

- no upgrading hardware without 'tricking' the OS

- no alternative drivers/firmware options

the list goes on, and like I said, it's just as quirky as Linux, so most of us just 'make do' with macOS as 'good enough' on what is probably the most performant and efficient laptop I've ever owned. I never knew how much fan noise, vibration, and hot fingers negatively impacted me until I put my hands back on my old laptop.


Company bought it for me


Declarative setup with NixOS. I can choose what services and software get run and installed. I can define my user experience (Sway, a tiling WM). I have control over OS updates, not Apple. Native Docker. The CLI works much better on Linux compared to Apple; macOS command line tools feel limited and crippled compared to almost any Linux distribution.

And, of course, I can modify any of the tools I use. The whole stack is open source.


MacOS doesn't support the software I run (as of Catalina). Linux does, and as a bonus I get native Docker support.


At 28 FPS…


Yes, and at a really low resolution. That's okay, it's expected.

Make It Work, Make It Right, Make It Fast. I personally didn't expect Asahi would get this far on steps #1 and #2 nearly as quickly as they have. It's bloody impressive.


It's at a 2004 resolution at sub-2004 FPS, and the title is clickbait claiming it "runs"; sorry, but 28 FPS isn't considered playable in my book. I'm just not as impressed by that as people pretend I should be. They should have waited and at least gotten a solid 60 before showing this off. The Apple (and Microsoft too) reverse engineering scene is in a sad state, obliterated by lawsuits, and this whole Asahi Linux thing is a fluke that somehow hasn't been shut down yet. It actually saddens me what shambles the Apple RE'ing scene is in for it to be taking this long to achieve only this meager level of support. Seemingly only 2 or 3 people in the world are working on this? For the most powerful laptops in the world?


The title never claims that it's playable, just that it runs.

That said, I've played many games at <30 fps and low resolution in the past, due to not being able to buy the latest & greatest hardware - it's perfectly playable, even if not an ideal experience.

Of course, it's different here - the hardware is more than capable. But do you expect them to jump from nothing to smooth 60 fps full resolution with nothing in between? Maybe a post like this will motivate more people to join development, as opposed to waiting until it's perfect.


> 28 FPS isn’t considered playable

My dude, the first 50 years of gaming struggled to reach 24 fps. 24 fps is the framerate of the film industry; we've only had access to 4K 60 fps in the last few years.


It's a start - remember this developer has basically bootstrapped Linux on Apple's custom hardware, and this is the early stage of the latest area she's focused on (getting the GPU working). This has been an impressive project; it'll no doubt continue.


Minor correction, but most of the "bootstrapping" work unrelated to the GPU has been done by Hector Martin; Alyssa's focus has always been on the GPU side of it.


Ah apologies, it's Alyssa who I usually see in relation to Asahi Linux so maybe I over-attributed it to her :)


Which is pretty incredible in a day 1 driver. It's already playable. What a great year this team has had. Thanks Alyssa, Yuka, and Lina.


>day 1 driver

Mesa is 27 years old.


I believe they're calling it "day 1" in the sense that the devs themselves are saying a lot of it is hacked together and not daily-driver material for most users. There will be a lot of releases and revisions before this is even upstreamed.


When talking about impressive performance of a "day 1" build I feel there is a difference between new drivers and new ports of existing drivers.


This isn't a port of an existing driver. It's a completely new kernel space + user space driver. Of course it makes use of the Mesa "framework", but that doesn't mean the driver is 27 years old.


What do you think porting a driver to a completely new GPU means? Doing so will require new kernel space and user space code. The existence of these new components doesn't mean there is a completely new graphics driver. Only parts of it that are platform specific are new.

The graphics driver of a system spans from talking to the hardware to exposing a graphics API such as OpenGL or Vulkan for applications to use. Splitting up the graphics driver into separate components and calling each component a driver is different from what I mean when I am referring to a driver.


Mesa isn't a driver. Mesa is just an abstraction on top of the software that DRIVES the hardware (a driver), which is being written from scratch. Nobody (including the Asahi developers) but you subscribes to your definition of a driver. Drivers implementing Mesa may share next to nothing in common, so no, it's not a "port".


The Asahi driver inside Mesa builds upon Gallium3D, so it does use shared components of the Mesa library stack. This is not a from-scratch driver; it's one that uses the powers of the Mesa library.


Correct. But this is a new driver.


Part of the driver is new, but part of it is just existing code that is part of Mesa.


So... it's a new driver. You don't say a program written days ago in C89 is 33 years old because the program uses the C89 standard library.


I don't think Mesa is referred to as a driver here and I'm not sure why it should be.


Just wait until they install OptiFine.


Better than no GPU driver and needing to spin the CPU for that work.


Wonder how much the macOS version gets out of the box?


On a MacBook with an M1 Pro, I generated a single new world that started in a forest biome, and with the default settings I got about 80 FPS at a resolution of 854x480.



