Meh. This is an extremely basic drum machine / step sequencer with a large number of samples, and I'm pretty sure I've seen the "sample cloud" approach lots of times by now in various VSTs. Probably still fun for folks with zero exposure to music production!
It's funny, but hard to make anything sound good due to inconsistent hit points in the samples. Even with a very simple pattern nothing sounds very rhythmic to my ears.
No hi-hat, no tom-tom. No "taiko" drums, which I realize is a bit broad. "Cymbals" nets me three "percussion finger bells" (okay, technically, I suppose?) and one "percussion drums".
This is a fairly basic interface; I'm kind of surprised there hasn't been more of a scene of making unusual music interfaces in the style of Electroplankton on the Nintendo DS.
It does seem to be one of those things where the people with the musical knowledge to potentially make super creative interfaces have gotten so deep into the technical side of things that they struggle to make intuitive UIs.
I think it's supposed to be some kind of latent vector space where the coordinate maps to the sound's timbre. So in theory you can move seamlessly between sounds, but I seem to only be able to snap to the nearest sound.
Edit: they use t-SNE for the embedding, which is why it's discrete. But if they used something like an autoencoder, you could have a smooth continuum and interpolate between, say, a dog bark and a billiards clack.
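To be concrete about what I mean: with an autoencoder you'd keep each sample's latent vector around, and the "in between" sound is just a decode of a point on the line between two of them. A minimal sketch (the `decode`/`playBuffer` calls in the comment are hypothetical stand-ins for the model's decoder and the app's playback, not real APIs):

```javascript
// Elementwise linear interpolation between two latent vectors a and b,
// with t in [0, 1] (t=0 gives a, t=1 gives b).
function lerpLatent(a, b, t) {
  return a.map((v, i) => v + (b[i] - v) * t);
}

// Hypothetical usage: sweep from a dog bark's latent vector to a
// billiards clack's, decoding audio at each step:
// for (let t = 0; t <= 1; t += 0.1) {
//   playBuffer(decode(lerpLatent(barkLatent, clackLatent, t)));
// }
```

t-SNE can't do this because it only produces 2D coordinates for the existing points; there's no decoder to turn an arbitrary coordinate back into audio.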
Usually with these things I have to actively turn off the silent switch on my iPhone for audio to play. Definitely if it's a Tone.js app, which it probably is?
They should be able to detect it's an iPhone and show a warning message for that, though, seeing as a lot of people will have that switch enabled.
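As far as I know browsers can't read the hardware mute switch directly, but a user-agent check is enough to show a heads-up banner on iOS. A minimal sketch (the `showWarning` call is a hypothetical app function, not a real API):

```javascript
// Rough iOS detection from the user-agent string. Good enough for a
// non-critical "check your silent switch" banner, not for feature gating.
function isIOS(ua) {
  return /iPhone|iPad|iPod/.test(ua);
}
// Caveat: iPadOS 13+ reports a desktop "Macintosh" UA, so this check
// misses newer iPads.

// Hypothetical usage in the app:
// if (isIOS(navigator.userAgent)) {
//   showWarning('On iPhone, turn off the silent switch to hear audio.');
// }
```

Separately, Web Audio (and so Tone.js) won't start until a user gesture anyway, so the banner could be shown alongside whatever tap-to-start prompt the app already needs.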