This is from the same gentleman who (among other things) demonstrated that printf() is Turing complete and wrote a first-person shooter in 13 kB of JavaScript.
Maybe a generous interpretation of the comment and a realisation that common language isn't always 100% precise would be better than pointless arguments about semantics.
There is only a single printf call written in the source code.
There was an earlier version of the underlying 3D engine that used only Canvas. The WebGL use is justified like this:
> Once I actually got that working doing all of the math by hand in JavaScript I decided that using WebGL would probably be worth it, and actually probably wasn't cheating all that much. WebGL exposes access to the GPU-enabled rendering engine through JavaScript. While it does abstract away some of the rendering, it's less than I thought---it just supports the ability to do the necessary math efficiently---so I decided this wouldn't be cheating. And fortunately, it didn't take long to reproduce the initial renderer, but this time supporting much better (and more efficient) graphics.
That makes sense: as far as I understand, OpenGL ES 2.0 (which WebGL is based on) doesn't really provide any fixed-function/predefined rendering logic anymore, e.g. you need to provide your own vertex and fragment shaders.
You could argue that rasterization itself is being taken care of by the implementation, though.
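To illustrate the point: even drawing a single flat-colored triangle in WebGL requires you to write both shaders and compute your own transformation matrix; the API only runs the math you hand it. A minimal sketch (the shader and uniform names here are illustrative, not taken from the game's source):

```javascript
// WebGL 1 has no fixed-function pipeline: you supply a vertex shader
// that does the projection math and a fragment shader that picks the color.
const vertexShaderSource = `
  attribute vec3 aPosition;
  uniform mat4 uModelViewProjection; // you build this matrix yourself in JS
  void main() {
    gl_Position = uModelViewProjection * vec4(aPosition, 1.0);
  }
`;

const fragmentShaderSource = `
  precision mediump float;
  uniform vec3 uColor;
  void main() {
    gl_FragColor = vec4(uColor, 1.0);
  }
`;

// In a browser, each shader is compiled and checked like this:
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader));
  }
  return shader;
}
```

Rasterization (turning the projected triangle into pixels) is the one big step the implementation still does for you, which is what the comment above is getting at.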
I'm not sure how this quote applies to harmless pursuits like making and solving absurd puzzles. It's not like regex chess is going to be a weapon of mass destruction.
https://github.com/HexHive/printbf
https://github.com/carlini/js13k2019-yet-another-doom-clone