
WebGPU, all the bare metal crashes/exploits and none of the broad support. Browsers should not be operating systems, and the push to make them so will end up making them just as exploitable as your host OS, only much slower due to the additional abstraction layers.


There's very little that GPUs can do that CPUs can't do so far as exploitation is concerned. The GPU driver runs in a sandboxed userland process just like the browser engine does, and the GPU respects process isolation just like the CPU does. There is no “bare metal” here!

Now, sure, there are bound to be plenty of memory-safety issues in GPU drivers, but why find an exploit in a driver that only some users have installed when you can find one in the browser everyone has installed? A GPU driver exploit doesn't give you any higher privileges.


The main difference is that it's a lot easier to crash GPU drivers (e.g. with a 100% busy loop in a shader) or to slow down UI rendering. There are shader verifiers to avoid some of this.
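To illustrate, here's a minimal sketch (assuming a WebGPU-capable browser; the TypeScript typings for navigator.gpu come from the @webgpu/types package, and the function name is hypothetical) of the kind of shader that passes static validation but can still occupy the GPU indefinitely, which is why OS-level watchdog timeouts exist on top of the verifiers:

```ts
// A WGSL loop whose exit depends on runtime data: validation accepts it
// (the loop has a reachable break), but if the buffer never contains 1u,
// the dispatch spins until the OS GPU watchdog resets the device.
// That's a hang/crash, not a privilege escalation.
const busyLoopWGSL = `
  @group(0) @binding(0) var<storage, read> flag : array<u32>;

  @compute @workgroup_size(1)
  fn main() {
    loop {
      if (flag[0] == 1u) { break; }  // may never become true at runtime
    }
  }
`;

async function validateBusyLoop(): Promise<void> {
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) { throw new Error("WebGPU not available"); }
  const device = await adapter.requestDevice();
  // Compiles and validates fine; whether it terminates depends entirely
  // on buffer contents at dispatch time.
  device.createShaderModule({ code: busyLoopWGSL });
}
```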


The driver runs in ring 0 and talks over the PCI bus to a black box that is running arbitrary code your web browser just compiled...


WebGPU is not significantly more "bare metal" than WebGL is. There are still several layers of abstraction between user code and the GPU in all cases. No operating system implements WebGPU or WGSL directly, so there is always a translation and verification layer in the browser, then the OS API layer, then the driver layer (and probably multiple layers inside each of those). In fact, on operating systems that implement OpenGL, WebGL sits closer to the OS/driver interface than WebGPU does.
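As a rough sketch of what that layering means from user code's point of view (the backend names below, Dawn/Tint for Chrome and wgpu/Naga for Firefox, describe current implementations, not anything the spec requires):

```ts
// User code only ever talks to the browser's WebGPU implementation.
// The browser parses and validates the WGSL, translates it to HLSL
// (D3D12), MSL (Metal), or SPIR-V (Vulkan), and only then do the OS
// API and driver layers see it -- never vendor machine code from the web.
const adapter = await navigator.gpu.requestAdapter();
if (!adapter) { throw new Error("WebGPU not available"); }
const device = await adapter.requestDevice();

// createShaderModule is the deepest "access" a page gets: a source
// string handed to the browser's own compiler (Tint in Chrome's Dawn,
// Naga in Firefox's wgpu).
const module = device.createShaderModule({
  code: `@compute @workgroup_size(1) fn main() {}`,
});
```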

WebGL has been around for a long time and the feared exploits never materialized. It's been no worse than other parts of the browser and better than many.


One benefit is that web browsers have to respect a standard set of APIs that abstracts the hardware and makes them all interoperable (the usual incompatibilities aside, obviously).

Now, POSIX was arguably successful, but it was limited in scope (it never covered the UI side of things). Also: Windows...

Additionally, the absence of that big piece of global state, the filesystem, makes developing portable applications much easier. There have always been limits to what browsers can do, but personally I don't miss iTunes, Outlook, or any of those desktop apps advantageously replaced by SPAs. I can log in from any computer or smartphone and get my apps: no installation process, no configuration, none of the usual missing/corrupted libraries/DLLs, etc.

And the problem of data ownership is not a technical problem. If we didn't have browsers, software companies would have moved that data away from you and had desktop apps access it remotely anyway.

I get that abstraction layers suck, but only when the cost/benefit ratio is bad. Java applets had awful UX AND they were slow. SPAs, by contrast, can be quite responsive, and some of them are pretty cool to use.


>WebGPU, all the bare metal crashes/exploits

It's a goldmine. Think of all the future jailbreak entry points this will make possible.


WebGPU has the same sandboxing in place as WebGL; despite the name, there is no "direct GPU access" (minus any implementation bugs, of course, which will hopefully be ironed out quickly).
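One concrete consequence of that sandboxing (a minimal sketch; buffer and binding names are illustrative): WGSL gives out-of-bounds accesses defined behavior, e.g. by clamping the index or substituting a default value, rather than letting a shader read memory outside its bound buffer:

```ts
const oobWGSL = `
  @group(0) @binding(0) var<storage, read_write> data : array<u32>;

  @compute @workgroup_size(1)
  fn main(@builtin(global_invocation_id) gid : vec3<u32>) {
    // This index may exceed arrayLength(&data), but the result is
    // well defined (implementation-dependent clamping or a default
    // value) -- never a peek at another context's VRAM.
    let i = gid.x + 1000u;
    data[0] = data[i];
  }
`;
```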


That ship has long sailed.


Doesn't mean people can't drop anchor and get off the boat. Lots of terrible ideas were popular and later became uncommon.



