The stuff you learned in your CS course is still just as relevant as it was 20 years ago. The things you did in your development career are nowhere near as relevant.
Did you use SCM, automated testing and design patterns 20 years ago? If yes, do you think this was the norm in the industry?
I'm not saying those things are the only relevant things, I'm just pointing out obvious examples of how much has changed since then. As I've said in another comment, we've moved from the mainframe/single-PC shared-memory model into distributed/cloud/networked computing in the last 20 years, and the problems from that era are completely different from the problems of today. Even on the low-level end, basic design constraints such as the gap between CPU instruction speed, memory access and cache misses have dramatically changed what it means to write optimized code. Nobody these days rolls their own SQRT function to squeeze out a few extra CPU cycles; nowadays it's about making sure your memory is laid out so that you get as few cache misses as possible.
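To make "memory layout" concrete, here's a toy sketch in C (the particle struct and field names are made up for illustration, not taken from any real codebase): summing one field over a struct-of-arrays layout drags far fewer cache lines through the CPU than the array-of-structs version does.

    #include <stddef.h>

    /* Array-of-structs: fields of one particle are interleaved, so a loop
       that only reads x also drags y, z and mass into the cache. */
    struct ParticleAoS { float x, y, z, mass; };

    /* Struct-of-arrays: all x values sit next to each other, so a pass
       over x touches roughly a quarter of the cache lines. */
    struct ParticlesSoA { float *x, *y, *z, *mass; };

    float sum_x_aos(const struct ParticleAoS *p, size_t n) {
        float s = 0.0f;
        for (size_t i = 0; i < n; i++)
            s += p[i].x;      /* 16-byte stride per element */
        return s;
    }

    float sum_x_soa(const struct ParticlesSoA *p, size_t n) {
        float s = 0.0f;
        for (size_t i = 0; i < n; i++)
            s += p->x[i];     /* 4-byte stride per element */
        return s;
    }

Same arithmetic either way; the only thing that changes is how many cache lines the loop pulls in, which is exactly the kind of optimization that matters now.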
> Did you use SCM, automated testing and design patterns 20 years ago? If yes, do you think this was the norm in the industry?
Design patterns are way older than 20 years... hell, the design pattern bible (the GoF book) was published 22 years ago, which means design patterns were in wide circulation well before I could tie shoes. According to Wikipedia, CVS has been around for 26 years.
Even though I am only 25, I can recognize that the computing foundations are still the same. I work at an '80s-style company developing embedded systems, and I constantly talk with the founders, who used to program the first system we sold. They implemented everything from scratch. Shared code, open source and the Internet make our lives easier. But at the end of the day, the knowledge behind the libraries and frameworks we use is the same as it was in the early times.
I picked up the CVS habit almost exactly 20 years ago (1996) from someone in the banking & finance industry -- hardly a bastion of radical adoption. By then, CVS was 10 years old. We used RCS extensively to manage config files.
To the earlier parent who mentioned process isolation: You're thinking too much about the consumer/desktop world. The entire enterprise world was used to it and demanded it, be it on (hey, humor here) SCO UNIX (yes, they really did make a useful product before they turned evil), SunOS, VAX/VMS, MVS, BSD UNIX, etc.
The desktop world was far behind the state of the art in commercial systems in those days. Even in the 80s, you were quite possibly running your enterprise with an Oracle database, on fully memory-protected hardware. Heck - you may have even been running some of your stuff in a VM on an IBM mainframe. We took some big jumps backwards in pursuit of making computers cost-effective enough for mass adoption, and it took a while before the consumer side of things jumped forward to become the same platforms used for enterprise use.
> Did you use SCM, automated testing and design patterns 20 years ago? If yes, do you think this was the norm in the industry?
Just because people did not use a standalone program for version control does not mean that they did not version their code. This goes all the way back to card/tape drawers and paper tape revisions (there is a great anecdote about this in Steven Levy's Hackers). Look at software from the late 1970s/early 1980s - there will be versioning information there, and usually changelogs describing what was changed. A lot of operating systems also had file versioning built into the filesystem (https://en.wikipedia.org/wiki/Versioning_file_system#Impleme...) that was used for revision control. Since most of these OSes were timesharing systems this was in effect team-based version control.
Now of course, many PC developers came from 8-bit computers rather than mainframes, where SCM didn't really matter, because programs were very small.
Automated testing - well, I have seen a Lisp compiler written in the '80s that had automated tests. So where it made sense and was possible to do, I think people did it. (IMHO it only makes sense for programs or their parts that are written in a purely functional style, which is not a big chunk.) The workflow was different in the past (big emphasis on integration and system testing), and I wouldn't say it was necessarily worse.
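To illustrate the "purely functional" point: a pure function is the easy case, because the whole test is a handful of assertions with no environment to set up or tear down. A throwaway sketch in C (nothing to do with that Lisp compiler, just an illustration):

    #include <assert.h>

    /* A pure function: same inputs, same output, nothing to set up or
       tear down - the easy case for automated testing. */
    static int gcd(int a, int b) {
        while (b != 0) {
            int t = a % b;
            a = b;
            b = t;
        }
        return a;
    }

    int main(void) {
        /* The entire "test suite" is a handful of assertions. */
        assert(gcd(12, 18) == 6);
        assert(gcd(7, 13) == 1);
        assert(gcd(0, 5) == 5);
        return 0;
    }

Code that talks to terminals, files or other systems doesn't break down into assertions this neatly, which is why the old emphasis was on integration and system testing.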
Design patterns definitely existed, but they weren't named as such.
I am not even sure if people (writing business applications) are more productive today. We have a lot of new artificial requirements. In the past, it was typical for datatypes and other things to be constrained by design. Today, people are unwilling to do that, even though it wouldn't change anything from the business perspective.
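For example, "constrained by design" used to look roughly like this (a made-up record in C, in the spirit of a COBOL copybook or a CHAR(n) column; the names and widths are invented): the field sizes were fixed up front and nobody treated that as a business problem.

    /* Fixed-width fields decided up front, in the spirit of a COBOL
       copybook or a CHAR(n) column. Names and sizes are invented. */
    struct CustomerRecord {
        char account_no[10];  /* always exactly 10 characters */
        char name[30];        /* truncated at 30, by design */
        char region[2];       /* two-letter code */
    };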
Or take a look at the web. It's a tangled mess of HTML, CSS and JS (it's been 20 years and people still can't agree on whether it's a good idea to produce HTML in JS!). In the past, you would use something like CICS or VB or Delphi: a consistent framework, written for the purpose of building interactive business applications.
> we've moved from the mainframe/single-PC shared-memory model into distributed/cloud/networked computing in the last 20 years, and the problems from that era are completely different from the problems of today
Not really. The tradeoffs are simple to understand for somebody who took a CS course 20, or even 40, years ago. And even back then people writing applications didn't hand-roll their own SQRT; frameworks and libraries did that for you.
The design pattern book came out 22 years ago so they were certainly named as such.
My first Unix programming job had me learn RCS, and later we switched to CVS. I used CVS personally for years after that until I switched to Subversion, then Mercurial, and now Git. What I'm doing isn't that different from when I used RCS. There are more steps because of the added complexity, but it's roughly similar.
I was in high school and didn't know the first thing about programming 20 years ago. But it seems to me that "design patterns," SCM, and automated testing aren't the hard part of this discipline.
Sure they are. They are the "hard part" of what every 9-5 LOB dev has to do, and even in the more CS-heavy fields, in my experience, the algorithms and the fancy CS are 20% of the work; the rest is implementing mundane details, bug fixing, testing, cooperating with team members, etc. And that's the stuff that has improved since the '90s.
Delivering reliable software is hard - but we've gotten a lot better at it in the last 20 years. If you don't believe me, boot up an old Windows 95/98 image, install some random software, and see for yourself.