Another, more optimistic way to describe it is that students are spending more of their wealth to invest in their own human capital, which isn't measured in statistics.
> Another, more optimistic way to describe it is that students are spending more of their wealth to invest in their own human capital, which isn't measured in statistics.
Statistics can measure this fairly easily; it's just that the particular statistics cited in this article don't.
However, the root of the argument is probably depressing. It is pretty unlikely that all the money spent on higher education produces a true positive return on investment (please don't cite any "college grads make more money" stats without controlling for covariates). Most loans taken out to fund schooling are a form of malinvestment.
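To make the covariates point concrete, here is a minimal sketch in Python using simulated, made-up numbers (a hypothetical illustration, not real earnings data): if something like ability drives both college attendance and wages, the naive grad vs. non-grad gap overstates the college premium, while a regression that controls for the confounder lands closer to the true effect.

```python
# Hypothetical, simulated data: "ability" (the confounder) drives both
# college attendance and wages, so the raw grad/non-grad gap is inflated.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
ability = rng.normal(size=n)
college = (ability + rng.normal(size=n) > 0).astype(float)
wage = 30 + 2 * college + 10 * ability + rng.normal(size=n)  # true premium = 2

# Naive comparison: mean wage of grads minus mean wage of non-grads.
naive_gap = wage[college == 1].mean() - wage[college == 0].mean()

# OLS of wage on college *and* ability (the control) via least squares.
X = np.column_stack([np.ones(n), college, ability])
coef, *_ = np.linalg.lstsq(X, wage, rcond=None)

print(f"naive grad/non-grad gap:         {naive_gap:.1f}")  # roughly 7-8
print(f"premium controlling for ability: {coef[1]:.1f}")    # roughly 2
```

Real studies control for far more than one variable, of course; the point is just that the headline gap and the causal return can differ a lot.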
Thinking about this a lot, a university program at the end of one's schooling as the way to develop “human capital” doesn't seem like such a great idea.
People live close to 70 years, and the world changes a lot.
So many things that we took for granted 10 years ago are just mistaken now. Anyone wanting to be well rounded can't stop learning once they get out of school, whatever school they went to.
Do they learn less on their own, reading papers and watching lectures, while having a good sense of how society and the world work from an adult's point of view?
What would happen to someone who focuses only on work-related education at school but keeps reading and watching lectures on whatever subject piques their interest?
Wouldn't they be in a better place than someone who spent a crazy amount of money on a “well rounded” education and goes out into the world with the impression that they now know most of what they need to know?
Very few people use many of the concrete facts they learn in college in normal life. If a college education can be defended from a standpoint of practical usefulness, it must be because it teaches some generalized/abstract "how to think" rather than specific facts. And if so, that should age much better.
The "aging better" part is where I feel there must be a better way to think about it.
It needs to age better because we tend to think we only get one chance to learn these things. If it were more the norm to keep learning once in the workforce, there would be less worry about knowledge aging, since it would be easier to keep it fresh.
On the other hand, I think a one- or two-year specialized course on a very practical subject to enter the workforce is a clever approach, as long as the person keeps learning about the other subjects (history, geography, philosophy, etc.) along the way.
Many things being taught in college today will be out of date by the time a current college student dies. But I think you're overstating the proportion to which that applies, and discounting the usefulness of that knowledge between now and when it's no longer useful. Most of what's really subject to change is the stuff on the outer edges of fields, which is mostly only being taught to people specializing in those fields. Most of the rest is in relatively new fields (much of computer science), and people in new fields are going to have to work to keep up anyway.
I agree with you when it comes to fields like mathematics and physics, where knowledge is basically cumulative and a basic understanding goes a long way.
But then, for instance, Freud is still studied in philosophy classes, and history as we learn it is so basic and biased that the most important parts (the ones that directly affect how we see the world and plan for the future) need to be completely revisited as an adult to get any acceptable grasp of where we are and where we're going.
For instance, what I learned about the Cold War that still stands today could be summarized on one A4 sheet, and we spent a few months on it. The rest just turned out to be mostly propaganda, unfounded assumptions and half-truths.
Even going back to the Greeks and the Romans, actually learning about the subject gives a completely different impression than what we got at school. It might be because I'm an adult now, but then what was the point of those months spent on it?
Basically, I feel that subjects closely related to humans and society are not a good fit for a generic 3- or 4-year program, and are almost useless if that's the only time someone studies them.
Is computer science really that subject to change, though? Especially given what most CS graduates end up doing in industry? The tech stack may change, the syntax of the language may change, but the underlying math/theory mostly doesn't.
I think most decent CS programs are roughly the same as they have been for decades.
It should include a lot of math, including a few semesters of calculus, linear algebra, probability, etc.
Students should be required to take a few of the introductory CE courses like digital logic, and of course low-level OS development and networking layers, which means C and assembly, plus algorithms and some sort of numerical computing course that uses the math they learned.
It wasn't until my last year, when I had electives, that I needed an IDE for dealing with high-level languages and frameworks.
But I have a feeling that a lot of CS programs - especially those housed in liberal arts colleges instead of under engineering schools - have really become watered down into glorified boot camps, for several reasons:
* need more women and diversity to pass the program.
* colleges are addicted to foreign students who pay full tuition. Who cares if half of them cheat, can't speak English, or can't complete the work - they're paying $100k for that degree. And the smartest ones can be used as indentured servants in grad school, since they can't stay otherwise.
Anyway, when I interview candidates, I have noticed a decline even among those with a fresh CS degree. Many of them know things like React and JavaScript, but are clueless about most of the subjects I wrote about above.
Human capital manifests statistically as employment. It’s not a complete measure of labour but it is very convenient for measuring the natural value of a degree.
Reminder to those downvoting this comment: downvotes are for comments that don't contribute to the discussion in a meaningful way. They are not for expressing disagreement. Use the reply button if you disagree with the parent comment.
Is downvote the opposite of upvote? It seems that people upvote comments that are helpful/insightful and that they agree with.
Because upvote and downvote seem like they should be opposites, it's not surprising that people treat a downvote as the negative of "I agree with this comment" rather than the opposite of "this comment is helpful/insightful". The former is easier to measure than the latter.
pg has in the past explicitly endorsed the idea that votes should, to some extent, reflect agreement/disagreement, but he's making a mistake. He has also, correctly, said that votes should answer the question "do we want more of this on HN?", since that's functionally what they are used for. If anything, we should err on the side of upvoting thoughtful things we disagree with, rather than agree with, since we learn more from that.
The OP should acknowledge that difference, but it's not a good policy to condone downvoting to disagree. It freezes out unpopular points of view and breeds resentment.