Absolute pitch, the ability to identify or produce the pitch of a sound without a reference point, has a critical period, i.e., it can only be acquired early in life. However, research has shown that histone-deacetylase inhibitors (HDAC inhibitors) enable adult mice to establish perceptual preferences that are otherwise impossible to acquire after youth. In humans, we found that adult men who took valproate (VPA) (an HDAC inhibitor) learned to identify pitch significantly better than those taking placebo, evidence that VPA facilitated critical-period learning in the adult human brain. Importantly, this result was not due to a general change in cognitive function, but rather a specific effect on a sensory task associated with a critical period.
I think it's not so cut-and-dried. As a violinist, I can recognize absolute violin pitches, but can't do so with other instruments. This is definitely a learned skill; presumably there's something about the timbre of the different pitches that my brain has memorized after so many years of playing.
However, someone with "true" perfect pitch is able to detect pitches across instruments (or even a sine wave) and can distinguish minute differences in pitch; for example, someone with perfect pitch can tell that an instrument has been tuned to A415 (a typical tuning for Baroque music) instead of the standard A440.
The difference between A=415 and A=440 is not "minute" at all; it's a full half-step difference!
Also, no: absolute pitch does not necessarily confer the ability to distinguish very small differences in pitch. Many people with perfect pitch can name a given note if played, but can't reliably identify if the pitch in question is 5 or 10 cents off. This is a common misconception.
Similar thing here. I'm a pianist, and while I don't have perfect pitch, there are certain piano chords that I can pick out and definitively say, "That's an F major, root position, with the root just over middle C". But other chords or even individual notes – I'm not able to distinguish them in that way.
You're talking about relative pitch, not perfect (absolute) pitch. Any competent musician will have excellent relative pitch, which is the ability to recognize intervals between pitches. This is necessary for transcription, learning music by ear, jamming, etc.
Perfect pitch means that, if someone plays a sine wave with no context, you can tell what pitch it is.
We had a guy in our college band with perfect relative and absolute pitch. One time we walked out of the dorms and the AC was squealing on top of the building across the street.
"Hey Dave, what note is that?"
"Do you really want to know?"
heads nod
"It's a G"
One of the other tales was that someone with true perfect pitch could tell the difference between A at 440 Hz vs 441 Hz.
I've seen this. When I was in grad school I was playing with assembly language to click a speaker in a timed loop, such that it would generate A440.
I played it for my coworker who was working in my lab, and he said: "I'm a singer and I have perfect pitch. That isn't A440, it's slightly low. Probably one hertz."
I checked my code and indeed, I had miscomputed the timer; it was slightly longer than it should have been. I fixed the code and he confirmed it sounded better, but still not absolutely perfect.
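For anyone curious about the arithmetic, here's a rough Python sketch of how a small error in the click-loop period shifts the pitch (the 5-microsecond figure is just an illustrative guess, not the actual bug):

    import math

    def pitch_from_period(period_s):
        # One speaker click per loop iteration, so frequency = 1 / period.
        return 1.0 / period_s

    target = 440.0                     # intended A440
    correct_period = 1.0 / target      # ~2.2727 ms per click

    # Suppose each loop iteration ran 5 microseconds longer than intended.
    actual = pitch_from_period(correct_period + 5e-6)

    cents_off = 1200 * math.log2(actual / target)
    print(f"{actual:.2f} Hz ({cents_off:+.1f} cents)")  # ~439.03 Hz (-3.8 cents)

A timing error of just a few microseconds per click already leaves you about a hertz flat, which is roughly the size of error the singer called out.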
AC squealing (and a lot of other noises that are somehow synchronous to the grid frequency) would be a slightly off G outside of North America (slightly too high: G1 is ~49 Hz and the grid frequency is 50 Hz almost everywhere else), so you don't need perfect pitch to guess that.
In North America (60 Hz grid frequency), I'm not quite sure whether one would recognise it as an A# (58.3 Hz) or a B (61.7 Hz).
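Quick sanity check on those numbers, as a Python sketch (note names assume A4 = 440 Hz equal temperament):

    import math

    NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def nearest_note(freq, a4=440.0):
        # Distance from A4 in semitones, snapped to the nearest equal-tempered note.
        semis = 12 * math.log2(freq / a4)
        nearest = round(semis)
        cents_off = 100 * (semis - nearest)
        midi = 69 + nearest            # MIDI number of the nearest note
        return NAMES[midi % 12] + str(midi // 12 - 1), cents_off

    for hz in (50, 60):
        name, cents = nearest_note(hz)
        print(f"{hz} Hz -> {name} ({cents:+.0f} cents)")
    # 50 Hz -> G1 (+35 cents)   (G1 is ~49 Hz, so 50 Hz mains hum is a sharp G)
    # 60 Hz -> B1 (-49 cents)   (almost dead center between A#1 at 58.3 Hz and B1 at 61.7 Hz)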
> Perfect pitch means that, if someone plays a sine wave with no context, you can tell what pitch it is.
What bothers me about that definition is that it doesn't specify accuracy. If I play a sine wave at 498.753 Hz, how many of those decimal places will someone with perfect pitch get correct? Since classical music is based on discrete notes with intervals of at least a semitone, maybe an error of less than half a semitone is acceptable, because that's enough to sort a note into the correct bucket? But then what's the average error for people who don't have perfect pitch?
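For what it's worth, the half-semitone bucket is easy to work out for that example (a sketch, assuming A4 = 440 Hz equal temperament):

    import math

    f = 498.753
    semis = 12 * math.log2(f / 440.0)      # ~2.17 semitones above A4
    cents = 100 * (semis - round(semis))   # error relative to the nearest note
    print(round(semis), f"{cents:+.0f} cents")  # 2 semitones above A4 = B4, about +17 cents

So 498.753 Hz is a B4 that's roughly 17 cents sharp; the extra decimal places never matter for the bucket, since anything within ±50 cents of B4 (493.88 Hz) snaps to B.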
I think it's more like: at what point does it become a sharp A versus a flat A#? When we talk about perfect pitch, it's in a musical context, in the sense that a person can tell which note classifies that wave, even without knowing how many Hz it is.
At what point does red become orange, brown, or pink? What about cherry or coral? We describe color names by association: sky blue, grass green, ochre, rose, pink, violet. Rick Beato pointed out that children pick up note names by association with a song.
> average error for people who don't have perfect pitch
I tried this once with a 10-year-old kid: she expected and answered C, D, E, while I was playing the same note (unison) with long pauses on a harmonica. Our perception is skewed by our beliefs. I've heard the same applies to drawing: one has to learn to see.