keymasta's comments | Hacker News

That particular algorithm doesn't care whether the instruments are guitar or otherwise. There are other algorithms in vamp that would deal with individual notes. But in terms of separating tracks, vamp doesn't do that. There are some new ML-based solutions for this though. So you could separate them and run vamp on those outputs.
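
If you do want to try the separation route, the plumbing is roughly this (a sketch assuming the demucs CLI and the Python vamp host module, plus librosa for loading, are installed; the stem layout and the chordino plugin key are from memory, so treat them as assumptions):

  import subprocess
  from pathlib import Path

  import librosa
  import vamp

  track = "song.mp3"
  # demucs writes stems (vocals/drums/bass/other) under separated/<model>/<track>/
  subprocess.run(["demucs", track], check=True)

  for stem in Path("separated").rglob("*.wav"):
      data, rate = librosa.load(stem, sr=None, mono=True)
      chords = vamp.collect(data, rate, "nnls-chroma:chordino")
      print(stem.name, chords["list"][:5])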

But to get the chords I don't think you need to worry about that.


To add to this, it's vanishingly rare to have a finished track where multiple parts of the ensemble are playing different chords, rather than each part contributing to the same overall chord structure.

What a cool thread! I like how you included the specifics of your workflow, and especially the details of the commands you used! Particularly the vamp commands, because, as you say, they are somewhat inscrutably named/documented.

I started dabbling with vamp as well a couple of years ago, but lost track of the project as my goals started ballooning, although the code is still sitting (somewhere), waiting to be resuscitated.

For many years I've had the idea that chord analysis could be built out further, such that a functional chart can be made from it. With vamp, most or all of the ingredients are there. I think that's probably what chordify.com does, but they clearly haven't solved segmentation, or the mapping from clock time to musical time, as their charts are terrible. I don't think they are using chordino, and whatever they do use is actually worse.

I got as far as creating a Python script which would convert the audio files in a directory into separate MIDI files, to start collecting the data needed to construct a chart.

For your use case, you'd probably just need to quantize the chords to the nearest beat, so you could maybe use:

vamp-aubio_aubiotempo_beats, or vamp-plugins_qm-barbeattracker_bars

and then combine those values with the actual time values that you are getting from chordino.
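
The snapping step itself is tiny once you have the two event lists. A toy sketch, assuming you've already parsed chordino's output into (time, label) pairs and the beat tracker's output into a sorted list of beat times:

  import bisect

  def snap_to_beats(chord_events, beat_times):
      # quantize each chord-change time to the nearest beat
      snapped = []
      for t, label in chord_events:
          i = bisect.bisect_left(beat_times, t)
          candidates = beat_times[max(i - 1, 0):i + 1]  # beats on either side of t
          nearest = min(candidates, key=lambda b: abs(b - t))
          snapped.append((nearest, label))
      return snapped

  beats = [0.0, 0.5, 1.0, 1.5, 2.0]
  chords = [(0.07, "C"), (0.96, "Am"), (1.62, "F")]
  print(snap_to_beats(chords, beats))  # [(0.0, 'C'), (1.0, 'Am'), (1.5, 'F')]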

I'd love to talk about this more, as this is a seemingly niche area. I've rarely heard it discussed, so I was happy to read this!


> I think that's probably what chordify.com does [...] I don't think they are using chordino

I think they were initially using the Chordino chroma features (NNLS-Chroma) but with a different chord language model "front end". Their page at https://chordify.net/pages/technology-algorithm-explained/ seems to imply they've since switched to a deep learning model (not surprisingly).


I really appreciate the opinion that using args and kwargs is "bad". It always annoys me when you get them as your parameters, and it's even worse when you jump to the declaration and find that it also contains unlabeled parameters. A lot of wrapper libraries do this. I try to name every parameter I can; the code is so much easier to use now that we have autocomplete in the IDE.
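
A contrived before/after of what I mean (all the names here are made up for illustration):

  class _Backend:
      def save(self, path, dpi=150, transparent=False):
          print(f"saving {path} at {dpi} dpi (transparent={transparent})")

  _backend = _Backend()

  def save_plot_opaque(*args, **kwargs):
      # the signature tells the caller (and the IDE) nothing
      return _backend.save(*args, **kwargs)

  def save_plot_explicit(path, dpi=150, transparent=False):
      # every parameter is named, so autocomplete and go-to-declaration both help
      return _backend.save(path, dpi=dpi, transparent=transparent)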


I really like the link you provided and have watched it before!

But I do want to say that C == green is not arbitrary at all. It matches my own calculations, which in turn agree with Newton's. Usually the colours I see assigned to notes are wrong, but C == green is exactly what you get by mapping pitch to light using octave equivalence, given by the following:

  f_prime = f * 2 ** (i / 12)
  # where:
  #   f_prime is the derived frequency (in this case green, 5.66 × 10^14 Hz)
  #   f       is the reference frequency (in this case C, 261.63 Hz)
  #   i       is the interval in semitones
Let's say that C == 261.63 Hz, and that green == 5.66 × 10^14 Hz. Using the preceding formula we can write a small Python program to check whether C == green.

  light_range_min = 400 * 10 ** 12  # Hz (~400 THz, the red end of the visible range)
  light_range_max = 790 * 10 ** 12  # Hz (~790 THz, the violet end)
  C = 261.63  # Hz
  for octave in range(100):  # just a safely high upper bound
      # i = 12 * octave semitones, so 2 ** (i / 12) reduces to 2 ** octave
      f_prime = C * 2 ** octave
      if f_prime >= light_range_min:
          break
  print(f"C in the range of light has f == {f_prime}, which is {f_prime / 10 ** 12} THz. We had to go {octave} octaves up to arrive there")
  # outputs: C in the range of light has f == 575330454350069.8, which is 575.3304543500698 THz. We had to go 41 octaves up to arrive there
We can look up colour charts like [0] or [1] and find that this frequency is in fact associated with the colour green.

The rest of your commentary seems valid.

[0] https://en.wikipedia.org/wiki/Visible_spectrum

[1] https://sciencestruck.com/color-spectrum-chart


Exactly! It's all made visual and interactive here: https://chromatone.center/theory/interplay/spectrum/ - you can even slide the tuning of A4 by dragging the last graph left to right; it will show the distribution of the 44th-octave notes along the visible spectrum.


Perceptually, the C pitch class and the color green have nothing to do with each other.


Not sure why I got downvoted, but the parent has deleted their post, so OK! :D


It could be related, but I also want to weigh in here and say this: hypoglycemia can occur with no relation to the other side of diabetic symptoms, i.e., hyperglycemia. In other words, there are people who suffer from hypoglycemia without ever getting high blood sugar, and so they are not "diabetic", which would mean having issues in both directions.


As a piano/keyboard player, I can say that a lot of musicality is possible on a keyboard. It is possible to modify your technique to better utilise the velocity response of a particular keybed, weighted or non-weighted. When playing keyboards you are working within a subset of the dynamics available on a piano. Though expressivity is lessened, there is still a huge palette once you learn to use less total force and less differentiation in force (dynamics).

I know I can play with high musicality on almost any keyboard with velocity, because I was blessed to have learned to use bad instruments. But it doesn't compare to the depth of the sound generated by all the moving parts and interactions happening in a real piano. Not only the sounds, but also the sheer weight of the keys.

Most* keyboards/VSTs just trigger a (pitch-shifted, looped) sample for a given note, do that for n notes, and then take an additive sum.
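
In pseudo-numpy terms it's a pipeline like this (a cartoon, not any particular engine; the "sample" here is a synthetic stand-in):

  import numpy as np

  SR = 44100
  t = np.linspace(0, 1.0, SR, endpoint=False)
  sample = np.sin(2 * np.pi * 261.63 * t) * np.exp(-3 * t)  # stand-in "C4" sample

  def trigger(semitones, velocity):
      # naive pitch shift: change the playback rate (shifts duration along with pitch)
      rate = 2 ** (semitones / 12)
      idx = (np.arange(int(len(sample) / rate)) * rate).astype(int)
      return sample[idx] * (velocity / 127)

  # a "chord" is just the additive sum of independently triggered notes;
  # nothing here lets one string excite another
  notes = [trigger(st, 100) for st in (0, 4, 7)]  # C major triad
  n = min(len(x) for x in notes)
  mix = sum(note[:n] for note in notes)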

That is definitely not what occurs in a piano, though. There you have the 3-dimensionality of the physical world: waves travelling through distance and shape. When sounds' harmonics interact, resonant nodes in the overtone series can excite each other, which can trigger further resonances throughout the tone. Maybe you know the feeling of depressing the sustain/damper pedal while sitting in front of the instrument and giving it a smack (or holding down the keys you are not playing and doing it). Or running your nail or a pick over all the low notes with sustain... like you are in a cave.

In MIDI/digital, there's also the fact that dynamics are usually 7-bit: MIDI defined 128 velocity levels, which made sense at the time, and other keyboards and VSTs have mostly followed suit. I'm surprised this generally gets passed over. Obviously there are more than 128 strengths of a note in real life.
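
To put a number on it:

  strike = 0.731                   # a continuous strike strength in [0, 1]
  velocity = round(strike * 127)   # -> 93, one of only 128 representable levels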

But all that said, I think it's possible to learn keyboards/music theory/songs/playing on a non-weighted keyboard; it's just false to say that digital/non-weighted is equivalent to an acoustic piano. You only really need the real thing for really dynamic music like jazz, classical, instrumental et al. It just feels so very wrong to play that kind of music on bad keyboards.

* The Roland V-Piano and Pianoteq, as well as others I'm unaware of, do in fact use physical/acoustic modeling as opposed to triggering samples, but that approach has not been predominant even among high-end digital instruments.


And then you can see that the bitrate is 2-20x higher. I personally get bothered by anything under CD quality (44.1 kHz, 16-bit), so all these platforms are basically unlistenable for me.


> either with therapy, medication, supportive people around them, or a mix of all of those

I feel like you forgot the primary thing that would help people to stop thinking of ending it: being able to afford necessities.


It breaks my heart that this is the case. Honestly, even online you hear so much about people struggling with everything from affording food, to hoping that they'll ever be able to pay off their debts or even afford a place to live. :(


I've been using Firefox for about as long as I can remember, and really don't notice sites not working. I do notice, however, that using any browser without uBlock Origin makes my eyes bleed in an unending agony of capitalist garbage. It's like using a browser and then putting sand in your eyeballs.


Kind of awesome how Behringer went from being the crappy version of gear to being the generic, affordable version of actually good gear within the past ~10-15 years.

When I was selling gear, just before that time-line, they were known as basically a Yamaha* or Peavey-ish company (a very wide line of products at affordable prices), but "ever-so-slightly" crappish. Since then they have become a/the goto if you want essentially Moog or the other big names for less than one carrot.

I have a Prophet 5 v2 so no real need for me (in terms of analogue synth), but great to see this kind of stuff getting into the hands of more people.

Sure, the pots, etc. aren't quite as solid as the big ($) versions, but that kind-of doesn't matter at all, for a lot of use cases.

I might even get one of their thingies at some point. For less than the price of a beater car it's become a tempting case to have GAS [0] for Behringer [1].

[0] https://library.oapen.org/handle/20.500.12657/48282

[1] https://www.sweetwater.com/c510--Behringer--Synthesizers

*Yamaha actually makes a lot of very high-end gear as well, from pianos to guitars, and they have fairly extensive custom-shop offerings. I brought them up because, in addition to handling mid- and high-end gear, they have always made really good quality entry-level gear too.


Yup, I had a DeepMind 12. Kinda regret selling it. The keyboard and build were decent. I found you had to menu-dive into the built-in FX to make interesting sounds, but as a unique synthesizer it was a very good entry :). I also have a small Xenyx mixer, which was basically fine.

