Hacker News

Do you understand how WASM and Web workers work? Do you understand that low-enough-latency audio doesn't take a supercomputer anymore? Yeah, if you were working on DSP stuff in the 1990s, you were a hot-shit programmer. Nowadays it doesn't really say much at all, and it certainly doesn't justify talking about DSP as if it were a moral failure not to treat it with respect.


> Do you understand that low-enough-latency audio doesn't take a supercomputer anymore

It never did. Low latency audio has almost nothing to do with CPU power. Here's a summary of some of the issues faced on modern general purpose computers:

https://manual.ardour.org/setting-up-your-system/the-right-c...

I know how WASM and Web workers work. Since nothing you can do in WASM or a web worker touches either (a) realtime scheduling priority or (b) actual audio hardware I/O, they don't do much to solve the hard parts of this. Browsers in general do not solve it: they rely on relatively large amounts of buffering between themselves and the platform audio API. Actual music-creation/pro-audio software sometimes (not always) requires much lower latencies than you can get routing audio out of a browser.
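For anyone unfamiliar with what "realtime scheduling priority" means in practice: native audio software asks the kernel for a realtime scheduling class so the audio thread isn't preempted by ordinary processes, which is exactly what a browser sandbox never exposes. A minimal Linux-oriented sketch (the priority value is an arbitrary illustration; real applications usually go through the platform audio API or rtkit rather than calling this directly):

```python
import os

def try_realtime_priority(priority: int = 70) -> bool:
    """Try to move the current process into the SCHED_FIFO realtime
    scheduling class. Returns False if the platform lacks the call or
    the kernel refuses (typical for unprivileged users without
    limits.conf / rtkit configured)."""
    try:
        os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(priority))
        return True
    except (AttributeError, OSError):
        return False

print(try_realtime_priority())
```

On a stock desktop this prints False unless the user has been granted realtime privileges, which is one reason "it works on my machine" pro-audio setups are fiddly.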


And even when it doesn't require it, we always want it. :-)


And even when we set it, we don't get it, because we blithely read a "latency" label in a GUI instead of measuring the round-trip latency on the specific device in question.


That wouldn't be correct either, at least half the time. The problem is that "latency" is used with at least two different meanings:

1. the time from an acoustic pressure wave reaching a transducer (a microphone), being converted to a digital representation, being processed by a computer, being converted back to an analog representation, and finally causing a new acoustic pressure wave care of another transducer (a speaker).

2. the time between someone using some kind of physical control (a mouse, a MIDI keyboard, a touch surface, many others) to indicate that they would like something to happen (a new note, a change in a parameter) and an acoustic pressure wave emerging somewhere that reflects that change.

The first one is "roundtrip" latency; the second one is playback latency.


How do you measure playback latency? Is there a way for an end user to do it as easily as measuring roundtrip latency?

Edit: clarification.

Also, is there some measurement being done to come up with the latency given in the following dialog?

https://qjackctl.sourceforge.io/qjackctl-screenshots.html

Or is that just the result of arithmetic for the given configuration options above it?
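For what it's worth, the figure qjackctl shows there is, as I understand it, plain arithmetic from the settings above it rather than a measurement: one direction of buffering is frames-per-period × periods ÷ sample rate. A small sketch of that calculation (the 256 / 2 / 48000 values are just an illustrative configuration):

```python
def buffer_latency_ms(frames_per_period: int, periods: int,
                      sample_rate: int) -> float:
    """One-way buffering latency implied by a JACK-style
    configuration, in milliseconds."""
    return frames_per_period * periods / sample_rate * 1000.0

# e.g. 256 frames/period, 2 periods/buffer, 48 kHz -> ~10.7 ms
print(round(buffer_latency_ms(256, 2, 48000), 1))  # -> 10.7
```

Note this is only the software buffering; converter latency and anything the drivers add on top are invisible to this arithmetic, which is why measured roundtrip numbers always come out higher.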



