> Perfect pitch means that, if someone plays a sine wave with no context, you can tell what pitch it is.
What bothers me about that definition is that it doesn't specify accuracy. If I play a sine wave at 498.753 Hz, how many of those decimal places will someone with perfect pitch get correct? Since classical music is based on discrete notes with intervals of at least a semitone, maybe an error of less than half a semitone is acceptable, because that's enough to sort a note into the correct bucket? But then what's the average error for people who don't have perfect pitch?
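To make the half-semitone bucket idea concrete, here's a minimal sketch (in Python, with a hypothetical `classify` helper, assuming 12-tone equal temperament and A4 = 440 Hz): distance from A4 in semitones is `12 * log2(f / 440)`, and rounding that gives the nearest note, with the remainder as the error in cents (100 cents = 1 semitone).

```python
import math

# Note names in semitone order starting from A.
NOTE_NAMES = ["A", "A#", "B", "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#"]

def classify(freq_hz, a4=440.0):
    """Return the nearest equal-tempered note name and the error in cents."""
    # Fractional number of semitones above (or below) A4.
    semitones = 12 * math.log2(freq_hz / a4)
    nearest = round(semitones)
    cents_error = 100 * (semitones - nearest)  # how far off-center the pitch is
    return NOTE_NAMES[nearest % 12], cents_error
```

For the 498.753 Hz example, this lands about 17 cents sharp of B4 (493.88 Hz), well within the half-semitone bucket, so a listener who only ever needs to name discrete notes never has to resolve those decimal places at all. (Octave numbers are omitted here for brevity.)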
I think it comes down to: at what point is it a sharp A versus a flat A#? When we talk about perfect pitch, it's in a musical context, in the sense that a person can name the note that would classify that wave, even without knowing how many Hz it is.
At what point does red become orange, brown, or pink? What about cherry, or coral? We describe color names by association: sky blue, grass green, ochre, rose, pink, violet. Rick Beato pointed out that children pick up note names by association with songs.
> average error for people who don't have perfect pitch
I tried this once with a 10-year-old kid: she expected and answered C, D, E, while I was playing the same note (in unison) with long pauses on a harmonica. Our perception is skewed by our beliefs. I've heard the same applies to drawing: one has to learn to see.