I enabled the requested chrome:flags in Brave, but it still doesn't work. I haven't downloaded Chrome on any of my M1 Macs, and don't plan to start now.
I tried on the latest (normal) Chrome, beta, and canary/nightly, enabled both options one by one, and let it relaunch, but it still wouldn't work at all. ¯\_(ツ)_/¯
It's running only a single thread, so I think the specs are a little less relevant. It takes about 80s per iteration and I ran the 4 iterations set by default, so a little less than 6 minutes.
Hold on, to run your demo does one have to click the "Load Model" button before doing anything? 'cos what I see is a form that is greyed out with the error message still at the top:
> You need latest Chrome with "Experimental WebAssembly" and "Experimental WebAssembly JavaScript Promise Integration (JSPI)" flags enabled!
Now I'm wondering whether the top message goes away once the flags are enabled?
> Hold on, to run your demo does one have to click the "Load Model" button before doing anything?
Yes. I thought it wouldn't be good if it downloaded 3.5 GB the moment you opened the page.
> Now I'm wondering whether the top message goes away once the flags are enabled?
No, I haven't added any checks for that (and I'm not sure how the first one can be properly checked), so it's just an info bar, which is admittedly misleading.
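The JSPI half can at least be probed from script, though. A rough sketch in TypeScript, assuming the current proposal surface (WebAssembly.Suspending / WebAssembly.promising; earlier experimental builds exposed a different shape), with a made-up element id:

    // Heuristic check: probe for the objects the JSPI proposal adds to the
    // WebAssembly namespace. Assumes the current API shape; older builds
    // behind the flag used a different surface, so this is not definitive.
    function hasJspi(): boolean {
      const wasm = WebAssembly as unknown as Record<string, unknown>;
      return typeof wasm["Suspending"] === "function" &&
             typeof wasm["promising"] === "function";
    }

    // Hypothetical usage: only keep the info bar when the check fails.
    if (hasJspi()) {
      document.getElementById("jspi-info-bar")?.remove();  // made-up id
    }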
It works on Canary on an M1 Mac and on Windows w/ an NVIDIA RTX GPU. I believe there are custom command-line options that have to be passed to make it work. The MLC site has the deets that work.
Nah, I don't use Chrome so I don't have it installed. I'm not a web developer, so testing across different platforms isn't useful to me. I've used StableDiffusion before, so hacking around to make this demo work in my browser isn't particularly interesting either.
I agree with the poster 100%. I'm convinced any Google applications immediately suck every iota of data they possibly can at install time / first launch. It’s not worth it to me either.
Why is it that implementing something in wasm stalls for so long, but doing it as a JS feature is so fast? Anyone have insights? As an outsider it feels like wasm is being developed in an impossibly slow way.
Implementing something new in JS can be done relatively easily using a slow path, where you just write some privileged JS or C++ and then wrap it, without doing any optimizations. Then if it gets popular the vendors can optimize it at their own pace.
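A user-land analogue of that slow path is a polyfill. As an illustration (not anything vendor-specific), Promise.withResolvers, a real and recently standardized method, can be expressed as a thin wrapper over existing primitives and only later given a fast path:

    // Illustration of the "write it slow, wrap it, optimize later" pattern:
    // a user-land polyfill of Promise.withResolvers (ES2024). An engine can
    // ship the same surface as a thin self-hosted wrapper before investing
    // in any optimization.
    if (typeof (Promise as any).withResolvers !== "function") {
      (Promise as any).withResolvers = function <T>() {
        let resolve!: (value: T | PromiseLike<T>) => void;
        let reject!: (reason?: unknown) => void;
        const promise = new Promise<T>((res, rej) => {
          resolve = res;
          reject = rej;
        });
        return { promise, resolve, reject };
      };
    }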
Implementing a new feature in WebAssembly is a bit more complex due to its execution model and security constraints. I expect it's also just the case that a lot of these new WASM features are very complex - promise integration is super nontrivial to get right, and so are WebAssembly GC and SIMD.
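To give a flavor of why promise integration is hard: the engine has to suspend a running Wasm stack when an import returns a Promise and resume it later. A rough sketch of the intended usage, assuming the current proposal API (WebAssembly.Suspending / WebAssembly.promising); the module name, import, and export below are placeholders:

    // Sketch under assumptions: "demo.wasm", the fetch_data import, and the
    // main export are placeholders; the API shape is the current JSPI
    // proposal (WebAssembly.Suspending / WebAssembly.promising).
    async function run(): Promise<void> {
      const imports = {
        env: {
          // An async JS function wrapped so Wasm can call it as if it were
          // synchronous; the engine suspends the Wasm stack across the await.
          fetch_data: new (WebAssembly as any).Suspending(async (id: number) => {
            const res = await fetch(`/data/${id}`);  // placeholder endpoint
            return (await res.arrayBuffer()).byteLength;
          }),
        },
      };
      const { instance } = await WebAssembly.instantiateStreaming(
        fetch("demo.wasm"), imports);  // placeholder module
      // Wrap a Wasm export so calling it from JS returns a Promise, letting
      // the module suspend internally on the import above.
      const main = (WebAssembly as any).promising(instance.exports.main);
      console.log(await main());
    }

Getting that suspend/resume to behave correctly against the JS event loop and the engine's existing stack and security machinery is presumably the hard part.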
JS Promises in something like their modern form were first played around with in ~2010, and it was ~2016 before browsers were shipping them natively. Good standards can take a while!
Because it basically covers what PNaCl, the Java plugin, the Flash plugin, Silverlight, and asm.js were doing.
Anything beyond those use cases is really meh, especially given how clunky compiling and debugging WASM code tends to be.
Then we have all those startups trying to reinvent bytecode executable formats on the server, as if it weren't something that has been done every couple of years since the late 1950s.
> Because it basically covers what PNaCl, the Java plugin, the Flash plugin, Silverlight, and asm.js were doing.
Right, but it doesn't right now? Like, you can't just write arbitrary code as you would with a Java plugin or a PNaCl C++ plugin. Wasm is extremely difficult to use for those use cases.
> Then we have all those startups trying to reinvent bytecode executable formats on the server, as if it weren't something that has been done every couple of years since the late 1950s.
Yes, because people really want this and the solutions have all been fraught with security issues historically.
I didn't say WASM is without flaws; I said the predecessors had flaws but that the premise is valuable, which is why we keep trying it over and over again.
Notably, the first paper is about exploitation of WebAssembly processes. That's valuable, but the flaw of previous systems wasn't that the programs running in them were exploitable; it was that the virtual machines themselves were. Some of this was due to the fact that the underlying virtual machines, like the JVM, were de facto unconstrained and the web use case attempted to add constraints after the fact; obviously WebAssembly has been designed differently.
I hope Wasm sees more mitigations, but I also expect that Wasm is going to be a target primarily for memory-safe languages, where these problems are already much less of an issue. And to reiterate, the issue was not the exploitation of programs but the exploitation of the virtual machines' isolation mechanisms.
Darn, guess I'll have to wait for stuff to land in Firefox.