The universities did build such systems. See Andrew at CMU or Athena at MIT. They built a ton of infrastructure, from distributed file systems (AFS) to chat systems (Zephyr), word processors (EZ), multimedia email (Messages), and so on.
Some developments from those projects live on today, such as Kerberos. But most of those innovations had crappy user interfaces and never made it outside the university. Commercial companies took their ideas and built products mere mortals could use. Now there is no reason not to use the commercial products, which are more stable, more secure, and have more applicability outside the university. Plus the same network has to serve students who aren’t there for CS.
Isn't the fact that these products were not of commercial quality an indication that maybe the courses were not that great? How can people learn to build something that is useful, easy to operate, and adds value if, at the very place where they are supposed to learn it, the work isn't taken seriously or there isn't enough skill to teach it?
I hope this doesn't look like an attack, I am genuinely interested.
Like most things, I believe the issue comes down to incentives. If you are a university student studying CS, what’s your incentive? To get the best grade possible in your course, most likely. What is your shortest path to a great grade? Is it adding user-friendly features? Or demonstrating mastery of applying theoretical principles in software, for example by implementing a novel distributed consensus algorithm? Plus your course lasts at most a semester, or perhaps a few years if you are lucky enough to work on the same project for the entire time you’re paying to attend university.
The incentive for commercial companies, on the other hand, is the exact opposite. Their incentive is to build a product that appeals to the widest population faster than their competitors. They optimize for user friendliness and eschew the untested in favor of hacky solutions that work now. From the developers’ perspective, they are now being paid to work, so they have more of an incentive to do things that may not be personally attractive to them, such as fixing bugs.
Also, there is a difference between a university and an apprenticeship. Traditionally, universities focused on teaching the soft skills, the “liberal arts”, providing a broad base of knowledge from history onward to widen the minds of those who attend. A university is not meant to be a job training center. Unfortunately, these days it seems that most employers are uninterested in mentoring and apprenticeships, instead looking for the public to subsidize job training for them. Universities, in my opinion, are poorly set up for this, but alas, this is what most expect.
University programs staffed primarily by 18-to-21-year-olds are supposed to compete against for-profit companies with highly experienced and specialized engineering talent?