Right, but you still have the latency of frame buffers inside the emulator, plus more again when that's converted out to analog, especially if an HDMI connection is still in the mix. Ideally you'd do this on original hardware, or at least a PC with a graphics card that has native S-Video or VGA outputs.
You only need one pixel's worth of RAM to display HDMI input on a CRT. You don't need to buffer the whole frame at all, especially if you're driving the tube with your own drive circuit.
That said, yeah, in the special case of an HDMI-driven CRT designed specifically with ultra-low latency in mind, you could buffer far less than a frame, though I imagine you'd want to buffer at least a line at a time just for sanity in timing the electron gun. And obviously this would depend on the HDMI picture resolution exactly matching that of the CRT.
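For a sense of scale, here's a rough sketch of what those buffer depths cost in latency, assuming standard 480p60 timing (27 MHz pixel clock, 858 total pixels per line, 525 total lines per frame); the exact figures depend on the video mode:

```python
# Back-of-envelope latency for different buffer depths on a 480p60 signal.
# Timing figures assume the standard 480p mode (27 MHz pixel clock,
# 858 pixels/line and 525 lines/frame including blanking); illustrative only.
pixel_clock_hz = 27_000_000
pixels_per_line = 858
lines_per_frame = 525

pixel_time_us = 1e6 / pixel_clock_hz
line_time_us = pixels_per_line * pixel_time_us
frame_time_ms = lines_per_frame * line_time_us / 1000

print(f"one pixel: {pixel_time_us:.3f} us")  # ~0.037 us
print(f"one line:  {line_time_us:.1f} us")   # ~31.8 us
print(f"one frame: {frame_time_ms:.2f} ms")  # ~16.68 ms
```

So a line buffer costs tens of microseconds against roughly 17 ms for a full frame, which is why buffering "way less than a frame" matters.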
HDMI is RGB plus clock on four differential pairs. Fundamentally you just need three shift registers with reset tied to the clock. Out comes the signal, and you wire that to the RGB electron guns through resistors from an E24 assortment pack.
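To make the resistor part concrete, here's a sketch of choosing E24 values for a binary-weighted DAC on one color channel, one resistor per shift-register output bit. (This glosses over the fact that HDMI's TMDS data is transition-minimized 10-bit encoded on the wire, so raw shift-register bits aren't directly RGB; the 8-bit depth and 1 kΩ base value are illustrative assumptions, not from the comment above.)

```python
import math

# E24 standard resistor values (one decade); an "assortment pack" covers
# several decades of these.
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def nearest_e24(r_ohms):
    """Closest E24 value to r_ohms, searching this decade and the next."""
    decade = 10 ** math.floor(math.log10(r_ohms))
    candidates = [v * decade for v in E24] + [v * decade * 10 for v in E24]
    return min(candidates, key=lambda c: abs(c - r_ohms))

# Binary-weighted DAC for one color channel: the MSB gets the base
# resistor, and each lower bit doubles it.  1 kΩ base is an arbitrary
# choice for illustration.
ideal = [1000 * 2 ** bit for bit in range(8)]
chosen = [round(nearest_e24(r)) for r in ideal]
print(chosen)  # [1000, 2000, 3900, 8200, 16000, 33000, 62000, 130000]
```

The E24 rounding error in the upper bits is one reason real designs tend to use R-2R ladders (only two resistor values) instead of binary weighting.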
LLMs probably don't know enough about CRTs to be useful in this discussion; classic Google Search would serve you better. Yours fixating on pixels shows that.
I think the difference here is more that I'm talking about the practical reality of today's display interfaces (both sides hold a full frame in memory, and typical overall latency is 5-50 ms), whereas you're describing what could be theoretically possible with dedicated emulation hardware that streams out an unbuffered HDMI signal and an HDMI-supporting CRT that operates like a modern VRR gaming display.