I sometimes wonder if, in addition to an Ethics class, Computer Science students need to take a Computer Archaeology class. It might not be a bad thing to bring up a lot of the concepts that have been tried but aren't in the mainstream anymore.
Surely then younger generations would learn how Algol and PL/I were used to write OSes, how Xerox PARC and ETH Zurich managed to write OSes without a single line of C code, and that UNIX wasn't the genesis of operating systems.
I definitely support that. Many tips I give to people for their projects came straight out of 1960s-1980s CompSci or industrial work. It has been a ridiculous amount of effort finding all of it, though. So many silos. It needs to be integrated on a per-topic basis, cleaned up, given an executive summary, and have references available for follow-up. Preferably with FOSS tools if it's an analytical method, language, etc.
Then people might quit reinventing the wheel or missing obvious things as much as they do now.