Hacker News

That image lacks an obvious source or any explanation of how the data was gathered, and I can find no record of a study or context that corresponds to it.

What I can find is a Wikimedia entry with the image but no attribution beyond "US Census," and no link to any actual publication put out by the Census Bureau. The archive link goes to a page that contains neither this graphic nor the data needed to generate it, making it suspect to begin with.

The Census Bureau also doesn't systematically collect IQ scores or administer IQ tests itself, making the details, data, and methodology of any study it produces paramount to interpreting this barebones graph. The title of the graph is borderline ridiculous: awkwardly stated at best, downright deceptive at worst:

IQ tests are not a requirement for graduating college, and taking them at all is relatively uncommon these days.

As it stands, this image is worthless without context, and that context is oddly elusive apart from an anonymous Wikimedia post that did not cite its source with the specificity required to authenticate it.




This image is even more worthless than it seems. The post on Wikimedia is an original work. Its description states: "As the percentage of graduates increases the minimum IQ to include at least that percentage of graduates inherently decreases. Since 2000 the intelligence required to be a college graduate has been less than the intelligence required to graduate from high school in 1940, based on a standard distribution."

It seems the author took the percentage of the population that graduated high school/college each year, found the corresponding percentile on an IQ bell curve, and used those scores as the y-values. This methodology only makes sense if you assume that high school/college graduates are exactly the highest-IQ segment of the population and that everyone who does not graduate isn't intelligent enough to do so. The chart also almost certainly doesn't normalize IQ over time, even though IQ tests are periodically renormed so that 100 remains the average while raw scores have risen over time [1].

What this chart actually shows is the highest possible IQ of the graduate with the lowest IQ in a given year, a statistic that seems to have dubious value.
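Assuming a standard mean-100, SD-15 IQ scale, the presumed procedure can be sketched in a few lines (the function name and example graduation rates are illustrative, not taken from the chart):

```python
from statistics import NormalDist

# Hypothetical reconstruction (not the chart author's published code):
# treat the top `grad_rate` fraction of an IQ bell curve (mean 100,
# SD 15) as exactly the graduates, and report the cutoff score.
def min_graduate_iq(grad_rate: float) -> float:
    return NormalDist(mu=100, sigma=15).inv_cdf(1 - grad_rate)

# With a ~6% college graduation rate the implied cutoff is about 123;
# at ~35% it falls to about 106. The "decline" is pure arithmetic: it
# follows from the rising graduation rate alone, not from any measurement.
print(round(min_graduate_iq(0.06)))  # 123
print(round(min_graduate_iq(0.35)))  # 106
```

Note that no IQ data enters the computation at all; the only input is the graduation rate.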

[1] https://en.wikipedia.org/wiki/Flynn_effect


You’re right, it is worse. It looks like they assumed college-going students would approximate the standard distribution of scores for the population as a whole, which… no. That's an awful assumption for an activity where academic ability is a primary gatekeeper, at the same time that they’re applying IQ essentially as a proxy for academic ability/knowledge/whatever. (“Whatever” because the entire concept of intelligence is filled with varying definitions.)





