WebGL doesn't have compute shaders. There are (gross) hacks that "emulate" GPU compute on top of stock shaders, but they're limited and can't possibly compete with CUDA.
It's only a matter of time until this is available in Chrome and Firefox by default. Probably never in Safari, though, since it doesn't even support WebGL 2.
I wonder why they're talking about "WebGL 2 compute" instead of a new WebGL version based on OpenGL ES 3.1. WebGL 2 is based on OpenGL ES 3.0, and the major feature of OpenGL ES 3.1 was compute shaders.
Apparently one snag in all this is that Apple's OpenGL version is stuck in a time before compute shaders, and all Mac/iOS browsers currently implement WebGL on top of OpenGL (ANGLE doesn't support Metal).
People were doing GPU computation for a long time before the GL "compute shader" feature existed. There's nothing emulated about it: the shading language (GLSL) and the available data types are the same in "compute shaders" as in the familiar fragment/vertex shaders.
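For anyone who hasn't seen it, the classic technique is render-to-texture: pack your data into a texture, draw a full-screen triangle into a framebuffer-attached texture, and let an ordinary fragment shader do the arithmetic, one output texel per fragment. Here's a minimal sketch that squares every value of an input texture (WebGL 2 with EXT_color_buffer_float assumed; all names are illustrative, not any particular library's code):

    const canvas = document.createElement('canvas');
    const gl = canvas.getContext('webgl2')!;
    gl.getExtension('EXT_color_buffer_float'); // needed to render into float textures

    const vsSource = `#version 300 es
    in vec2 pos;
    void main() { gl_Position = vec4(pos, 0.0, 1.0); }`;

    const fsSource = `#version 300 es
    precision highp float;
    uniform sampler2D data;   // input values
    out vec4 result;
    void main() {
      // One fragment per output texel: an ordinary shader doing plain arithmetic.
      vec4 v = texelFetch(data, ivec2(gl_FragCoord.xy), 0);
      result = v * v;
    }`;

    function compile(type: number, src: string): WebGLShader {
      const s = gl.createShader(type)!;
      gl.shaderSource(s, src);
      gl.compileShader(s);
      return s;
    }

    const prog = gl.createProgram()!;
    gl.attachShader(prog, compile(gl.VERTEX_SHADER, vsSource));
    gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fsSource));
    gl.linkProgram(prog);
    gl.useProgram(prog);

    const N = 4; // a tiny 4x4 "buffer"
    const input = new Float32Array(N * N * 4).map((_, i) => i);

    // Upload the input as a float texture.
    const inTex = gl.createTexture()!;
    gl.bindTexture(gl.TEXTURE_2D, inTex);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA32F, N, N, 0, gl.RGBA, gl.FLOAT, input);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

    // Render into an output float texture instead of the screen.
    const outTex = gl.createTexture()!;
    gl.bindTexture(gl.TEXTURE_2D, outTex);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA32F, N, N, 0, gl.RGBA, gl.FLOAT, null);
    const fbo = gl.createFramebuffer()!;
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, outTex, 0);

    // Full-screen triangle: 3 vertices cover every output texel.
    const buf = gl.createBuffer()!;
    gl.bindBuffer(gl.ARRAY_BUFFER, buf);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1, -1, 3, -1, -1, 3]), gl.STATIC_DRAW);
    const loc = gl.getAttribLocation(prog, 'pos');
    gl.enableVertexAttribArray(loc);
    gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

    gl.bindTexture(gl.TEXTURE_2D, inTex); // sampler "data" defaults to unit 0
    gl.viewport(0, 0, N, N);
    gl.drawArrays(gl.TRIANGLES, 0, 3);

    // Read the result back: out[i] == input[i] squared.
    const out = new Float32Array(N * N * 4);
    gl.readPixels(0, 0, N, N, gl.RGBA, gl.FLOAT, out);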
Compute shaders are just a shader type in OpenGL (and possibly a future version of WebGL) with some convenient properties: for example, they can more easily run out of step with the rendering pipeline, which matters if your application wants to mix OpenGL/WebGL graphics and non-graphics compute concurrently. See e.g. https://www.khronos.org/opengl/wiki/Compute_Shader#Dispatch
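For concreteness, this is roughly what the draft "WebGL 2.0 Compute" API looked like, mirroring OpenGL ES 3.1. The "webgl2-compute" context only ever shipped behind a flag in Chromium, so take the host-side calls below as a sketch of the draft spec rather than something you can rely on in a browser today:

    // Draft API: the context name and the compute-related calls follow the
    // abandoned WebGL 2.0 Compute proposal, hence the "any" casts.
    const gl = document.createElement('canvas')
      .getContext('webgl2-compute' as any) as any;

    const csSource = `#version 310 es
    layout(local_size_x = 64) in;   // 64 invocations per workgroup
    layout(std430, binding = 0) buffer Data { float values[]; };
    void main() {
      // Arbitrary read-modify-write into a buffer: no framebuffer, no rasterizer.
      uint i = gl_GlobalInvocationID.x;
      values[i] = values[i] * 2.0;
    }`;

    const cs = gl.createShader(gl.COMPUTE_SHADER);
    gl.shaderSource(cs, csSource);
    gl.compileShader(cs);
    const prog = gl.createProgram();
    gl.attachShader(prog, cs);
    gl.linkProgram(prog);
    gl.useProgram(prog);

    // A shader storage buffer the compute shader can scatter into.
    const data = new Float32Array(1024).fill(1);
    const ssbo = gl.createBuffer();
    gl.bindBuffer(gl.SHADER_STORAGE_BUFFER, ssbo);
    gl.bufferData(gl.SHADER_STORAGE_BUFFER, data, gl.DYNAMIC_COPY);
    gl.bindBufferBase(gl.SHADER_STORAGE_BUFFER, 0, ssbo);

    // Launch 1024 / 64 = 16 workgroups, decoupled from any draw call.
    gl.dispatchCompute(data.length / 64, 1, 1);
    gl.memoryBarrier(gl.SHADER_STORAGE_BARRIER_BIT);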
I don't know the details, but the WebGL backend will have severe limitations because it doesn't use compute shaders or CUDA. Certain functionality, like random writes to arbitrary buffers, can only be emulated through workarounds that are an order of magnitude slower than compute shaders, and some things will be impossible entirely. The link you provided does mention CUDA, but since that isn't natively supported by browsers either, it would require the user to install something or to use a server backend, where communication between server and client becomes an excruciatingly slow bottleneck.
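To illustrate one of those workarounds: without compute shaders, a scatter (each input writing to a data-dependent output location) usually has to be rewritten as a gather, where every output texel scans all inputs for the ones addressed to it. In the naive form that turns O(n) scattered writes into O(n^2) work, which is where the slowdown comes from. A sketch of such a shader; the names and data layout are illustrative, and this is just one common workaround among several:

    // Scatter emulated as gather: each output slot loops over every input
    // and accumulates the values whose target index matches its own.
    const gatherFS = `#version 300 es
    precision highp float;
    uniform sampler2D indices;  // where each input wanted to write
    uniform sampler2D values;   // what it wanted to write
    uniform int n;              // number of inputs
    out vec4 result;
    void main() {
      int me = int(gl_FragCoord.x);          // this output slot
      vec4 acc = vec4(0.0);
      for (int i = 0; i < n; i++) {          // scan every input
        if (int(texelFetch(indices, ivec2(i, 0), 0).x) == me) {
          acc += texelFetch(values, ivec2(i, 0), 0);
        }
      }
      result = acc;
    }`;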
So no, this isn't a web thing and it isn't useful for web apps/web pages, except maybe for a very limited set of use cases.
When I last benchmarked tensorflow.js, it was 40 times slower than native TensorFlow on my laptop. However, even if WebGL compute shaders were available, Nvidia's cuBLAS and cuDNN libraries would still be faster.
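If anyone wants to reproduce that kind of comparison on the tf.js side, timing a matmul is the easy starting point. The matrix size and timing approach below are arbitrary choices for illustration, not the original benchmark:

    import * as tf from '@tensorflow/tfjs';

    async function bench(n: number): Promise<number> {
      const a = tf.randomNormal([n, n]);
      const b = tf.randomNormal([n, n]);
      const t0 = performance.now();
      const c = tf.matMul(a, b);
      await c.data();   // forces the GPU work to finish before we stop the clock
      return performance.now() - t0;
    }

    bench(1024).then(ms => console.log(`1024x1024 matmul: ${ms.toFixed(1)} ms`));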