This is true, but in my opinion also misleading. Speed and latency are fundamentally different. Speed would be a Performance Feature in the Kano model, meaning there is usually a linear relationship between speed and user satisfaction.
Latency would be a Basic Feature.
Once you get below 7 ms (or 5 ms, or even 3 ms if you absolutely insist) you're happy; above that, everything is unusable.
You are missing the jitter. This is often the worst part of modern implementations. If there is 4 ms of jitter with occasional peaks of 20 ms, then even a 5 ms latency is still bad. That implementation is basically unusable. Like many modern USB ones.
The Atari has absolutely stable and extremely low jitter. Someone measured it at about 1 µs. Cannot find the link though, sorry.
So the Atari has a low latency of around 2-4 ms with extremely low jitter. This is exactly what you want from a MIDI clock and sequencer driving multiple MIDI devices.
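To make the latency/jitter distinction concrete, here's a minimal sketch of how clock jitter is usually quantified from pulse timestamps (the timestamps and the 120 BPM figure are made up for illustration, not measurements):

```python
# Nominal interval between MIDI clock pulses: 24 ppqn at 120 BPM
# gives 60 / (120 * 24) ≈ 20.833 ms per pulse.
NOMINAL = 60 / (120 * 24)

def jitter_stats(timestamps, nominal):
    """Return (mean absolute jitter, peak jitter) of inter-pulse intervals."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    deviations = [abs(iv - nominal) for iv in intervals]
    return sum(deviations) / len(deviations), max(deviations)

# A constant 5 ms latency shifts every pulse by the same amount, so the
# intervals stay nominal: jitter is ~0 and the offset can be compensated.
steady = [i * NOMINAL + 0.005 for i in range(10)]
print(jitter_stats(steady, NOMINAL))

# Irregular intervals are what actually ruin the timing -- no constant
# offset can compensate for them.
wobbly = [0.0, 0.021, 0.044, 0.062]
print(jitter_stats(wobbly, NOMINAL))
```

The point of the sketch: a fixed latency drops out of the interval math entirely, while jitter shows up directly as deviation from the nominal pulse spacing.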
How do you think any professional works nowadays with MIDI? A good, modern USB interface (from Focusrite or similar) has a jitter well below 1ms, usually in the range of 200µs. If that is too much, simply sync your DAW with an external, dedicated clock, which will usually give you a jitter in the single µs range.
I have a Focusrite and the MIDI timing is terrible. Sure, there is more to it than just the interface. With USB you just cannot guarantee stable MIDI timing, because there is no good MIDI buffer implementation for it. Technically it would be possible, but no one cares.

Professionals use something like MIDI-to-audio converters: a VSTi plugin takes the MIDI signals and modulates them onto an audio signal (which can easily be buffered), and some dedicated outboard equipment converts this back to MIDI. If you are working with hardware synths etc., this is the only option you have nowadays with non-vintage hardware. A lot of producers do not work with MIDI anyway, they use plugins; that's why it is something of a niche problem and there is not much discussion about it.
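For a sense of why routing MIDI through the audio path helps: audio buffers are rendered sample-accurately, so an event scheduled at a given sample index has a timing resolution of one sample period (about 20.8 µs at 48 kHz), independent of USB buffer jitter. A rough sketch of that mapping (the sample rate and the note event are made-up illustration values, not any specific product's protocol):

```python
SAMPLE_RATE = 48_000  # Hz; one sample period ≈ 20.8 µs of timing resolution

def schedule_to_samples(events, sample_rate):
    """Map (time_in_seconds, midi_bytes) events to exact sample indices.

    Because the audio stream itself carries the timing, hardware that
    decodes the MIDI back out of the audio signal recovers the events
    with sample accuracy, regardless of how the USB buffers arrived.
    """
    return [(round(t * sample_rate), data) for t, data in events]

# Note-on for middle C on channel 1 at t = 0.5 s (made-up example event).
events = [(0.5, bytes([0x90, 60, 100]))]
print(schedule_to_samples(events, SAMPLE_RATE))  # sample index 24000
```

This is only the scheduling half of the idea; the actual modulation of the bytes onto the audio signal and the decoding back to a DIN plug is what devices like the USAMO do in hardware.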
First off, I'm assuming of course we are talking Mac here, because Windows is unusable for MIDI. If you have terrible MIDI timing with a Mac, then yes indeed, you'll need to sync via audio, but there are nice and inexpensive solutions for this, for instance the Midronome.
Look, I'm not trying to convince you to get rid of your Ataris, quite the contrary. I'm just disagreeing that it's impossible to have low jitter nowadays, but I fully agree that things used to be simpler before everything was done via USB.
Agreed. It is of course not impossible, but it is almost impossible out-of-the-box (literally ;-)). I have a USAMO (Universal Sample-Accurate MIDI Output) device, but do not use it, because as I said, the Atari is king here. :-) I am not sure how the Midronome can solve the problem of MIDI notes arriving inaccurately from a modern DAW? But maybe I do not understand it completely. Need to have a deeper look. For some years now I have been using Linux with a Focusrite for mastering and audio tracking. MIDI has been bad on Linux and Windows ever since I got my first USB interface and moved away from PCI interfaces. But this shouldn't matter too much. :-)
Note that this is an old version. I just saw that there's now the "Nome II", and at least for Mac, he has actually developed a USB protocol to provide a stable clock (which, as you've already written, is totally possible via USB, it's just that nobody cared enough):
Thanks a lot!
The Scotsman is cool, and his t-shirt too. :-D The t-shirt says "little pig" in German.
Regarding "MIDI notes", Sim'n Tonic himself says this about the Midronome:
"Note that only these MIDI messages are simply forwarded when they are received, their timing is not changed. So if your DAW sends them with a lot of latency and/or jitter, the Midronome will forward them with the same latency/jitter. Actually this is a problem I plan on tackling as well [...]"
So the Midronome does not solve the problem of inaccurate MIDI notes coming from a modern DAW. The USAMO does, by the way, but only with one MIDI channel at a time. And of course, coming back to the actual topic, the Atari has no problem at all with accurate MIDI notes; it is absolutely tight on all 16 channels. So it seems there is indeed nothing comparable to the Atari nowadays. Maybe there will be in the future.
Not sure if that is still accurate. This might only be available for Mac, but on the FAQ for Nome II it says this:
Can Nome II send MIDI Notes?
Nome II is like a MIDI hub, you can ask it to forward any MIDI sent over USB to one of its MIDI outputs. It will not only forward these instantly but merge them smartly with the MIDI Clock, without affecting it.
The Windows MIDI/USB stack adds a considerable amount of jitter to the MIDI clock, compared to the much superior one in macOS. I will fully admit that "unusable" is a personal opinion based on my experience. Of course performers also use Windows, but I heavily doubt you are able to see which device in their rack acts as the master clock, and how they sync their devices, apart from the fact that most performers nowadays don't use MIDI at all.
MIDI is used heavily for guitar patch and lighting automation, as well as for triggering backing tracks in a DAW running on stage. The use of MIDI (over USB) has only increased on stages.
This is getting ridiculous. We are talking about making music, i.e. triggering notes from different devices in sync. You know, what MIDI was originally designed for, not triggering some lights, guitar patches or a backing track. You are exactly proving my point: MIDI nowadays is pretty much reduced to SysEx for doing simple automations. None of that is seriously affected by jitter in the ms range. You sound like you have no idea how electronic music was done before VSTs were a thing.