Umm, the more apt comparison would be "or the proverbial graybeard who has done ~~it~~ kidney surgery a thousand times"
The stuff you did in software 20 years ago has absolutely nothing to do with the stuff that's being done today. Like holy shit, most people weren't even using source control back then; unit testing and automated testing in general were sci-fi - testing was done by QA departments, if you were big enough to have one.
People were writing C, and the major problems of the industry were how to fit shit into x MB of RAM and x CPU cycles.
Security was abysmal - you didn't even have process isolation on OS-es.
The languages and the practices are also completely different - disregarding the "we did it in the 60s with LISP in this paper" crowd - outside of academia, people were just getting into OOP, which was the pinnacle of abstraction back then.
This is the drag&drop and copy-paste as an abstraction VB shit era.
Unless you were at one of the cutting edge places that were actually innovating back then in terms of process and stuff, there's likely nothing from that skill set that transfers onto the problems that we are dealing with today beyond the CS grad basics.
Stuff that hasn't changed markedly: geometry, math, statistics. SQL - 1970s. SOLID - parts of that are late 1980s. C still looks like C. Unix-like environments. Awk/sed/etc.
Go read about the history of the web or XML; you get a very strong sense that these things have been thought about for a long time. Data interchange has varied formats, but there is still a lot of tedious ETL-type work. The importance of naming things well hasn't changed. Identifiers for data being a hard thing hasn't changed. Schemas/vocabularies and more are still important problems.
If you can't see some of these things underpinning much of the work we do, you might be missing the forest for the trees.
One thing I've found that's significantly changed since I began my career, before my beard went grey, is that projects are now in general run much better. People never make a total mess of estimating timings, and the Agile methodology (and Kanban boards, and Trello, and JIRA, and of course Slack) really fix all organisational and prioritisation problems such that projects never end in disaster; gone are the days of frantic last minute firefighting and all-nighters, as we desperately work out which features are critical to getting the project live before everything explodes. Clearly the experience of older, wiser heads in moments of crisis is now basically never required because projects run so smoothly.
>The stuff you did in software 20 years ago has absolutely nothing to do with the stuff that's being done today. Like holy shit, most people weren't even using source control back then; unit testing and automated testing in general were sci-fi - testing was done by QA departments, if you were big enough to have one.
That stuff a competent graybeard knows already.
You don't think they disappeared in 1990 and reappeared magically today, or that they didn't keep up with the times during their career, right?
In addition, they would know all the solid stuff -- the things programmers before the internet had to learn the hard way, and that lots of today's dilettantes lack.
And you must be very young (20 something?) if you really believe that "the stuff you did in software 20 years ago has absolutely nothing to do with the stuff that's being done today". That's simply not true -- and triply not true for 15 years ago.
I mean if you were a good dev in the 90s chances are right now you're in some senior architect/management/consultant position. If you're competing for the same jobs as 20 something devs then you've probably screwed up somewhere in your career.
I started programming in the late 90s, but not professionally until 2006 I think. That was probably the late phase of the transition to the internet as a dominant factor in computing, so I think it gave me enough perspective on how things changed. Between that, talking to older developers, and reading stuff online and on the forums, I know we've come a loong way as an industry.
>I mean if you were a good dev in the 90s chances are right now you're in some senior architect/management/consultant position. If you're competing for the same jobs as 20 something devs then you've probably screwed up somewhere in your career.
Or you just don't like management / like programming?
Yeah, I had some chances, but never liked the role of the pointy haired boss.
On the other hand, I don't like having to suffer stupid enterprise project managers either, though in my line of work it's better than the norm (more startup-y -- heck, we are at 100 people or so and our founder/boss still codes like crazy, same for the CTO).
I think the ideal for our kind of person is having your own company -- either a one-person one, or a bigger outfit with someone else handling the marketing and finance parts. You get to both program and choose what and how to deliver.
Typical ageist bullshit. Your argument is that if someone is good at something then they are probably not doing it anymore. Newsflash: you were moved out of dev because you have some issue. Most people avoid "architects," which is code for a cranky bastard who can't be fired but is likely breaking lots of stuff in the codebase and generally making things bad for devs of all ages.
"I mean if you were a good dev in the 90s chances are right now you're in some senior architect/management/consultant position. If you're competing for the same jobs as 20 something devs then you've probably screwed up somewhere in your career."
Not everybody's interested in climbing the corporate ladder or in architecture, management, or consulting. Some people are content to be senior engineers.
"we've come a loong way as an industry"
The industry has certainly changed a lot, but if you keep your skills up to date, that's not going to be an issue.
> The stuff you did in software 20 years ago has absolutely nothing to do with the stuff that's being done today.
The longer I'm in software, the more I find this not to be true. The stuff that has changed is all surface level. The underlying principles are the same. The skills needed are the same. The attention to detail and other attributes needed to get the job done are the same. The problems are basically recycled or scaled-up versions of the problems we had years ago.
There's new stuff that's a significant advance (machine learning that really works), there's stuff that's just different (like most of the web backend technologies), and there's stuff that's worse (security).
18 years ago we had much of what we have today. Things might move on but they do not change. I mean hey, everyone still hates Visual Basic!
At least 18 years ago we were actually writing the majority of our code rather than relying so heavily on libraries and third-party code. I mean, someone only has to pull a Node.js package for padding and there is a complete meltdown!
I've only been in the industry 13 years, and I'm not using a single tool today that I used back then, but I'm still using the lessons I learned:
How much time should my team spend on testing vs. code review vs. building new features?
How risky is this feature, and how much QA does it need?
Which of the brand-new engineers on my team needs a careful eye on their code so they don't blow everything up?
When is it time to say screw it, we have to ship?
Which new hip framework is likely to have staying power, because of the people and the commitment behind it?
How do I convince open-source maintainer X to accept my new feature, so I'm not maintaining patches into eternity?
Those are the important questions I deal with every day. I couldn't answer them as well 10 years ago, and I bet I can answer them better 10 years from now.
Most programs don't really use algorithms or data structures that were discovered much later than the '70s. This idea that old experience is worthless because we're using different libraries now strikes me as altogether wrongheaded.
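To put a concrete (and deliberately boring) example behind that, here's a hedged sketch in C, not lifted from any particular codebase: binary search predates the '70s by a wide margin, and the version you'd write today looks essentially the same as one written decades ago.

    #include <stdio.h>
    #include <stddef.h>

    /* Plain binary search over a sorted int array -- nothing here
       depends on any post-1970s idea. Returns the index of key,
       or -1 if it is not present. */
    static long binary_search(const int *a, size_t n, int key)
    {
        size_t lo = 0, hi = n;                 /* search [lo, hi) */
        while (lo < hi) {
            size_t mid = lo + (hi - lo) / 2;   /* avoids overflow */
            if (a[mid] < key)
                lo = mid + 1;
            else if (a[mid] > key)
                hi = mid;
            else
                return (long)mid;
        }
        return -1;
    }

    int main(void)
    {
        int primes[] = { 2, 3, 5, 7, 11, 13 };
        printf("%ld\n", binary_search(primes, 6, 7));  /* prints 3 */
        return 0;
    }

The libraries around it keep changing; the technique doesn't.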
The stuff you learned in your CS course is still just as relevant as it was 20 years ago. The things you did in your development career are nowhere near as relevant.
Did you use SCM, automated testing and design patterns 20 years ago? If yes, do you think this was the norm in the industry?
I'm not saying those things are the only relevant things; I'm just pointing out obvious examples of how much things have changed since then. As I've said in another comment, we've moved from the mainframe/single-PC shared-memory model into distributed/cloud/networked computing in the last 20 years, and the problems from that era are completely different from the problems of today. Even on the low-level end, the basic design constraints - the relative costs of CPU instructions, memory access, and cache misses - have dramatically changed what it means to write optimized code. Nobody these days rolls their own SQRT function to squeeze out a few extra CPU cycles; nowadays it's about making sure your memory is laid out so that you get as few cache misses as possible.
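To give a hedged sketch of the kind of thing I mean (toy code, not from any real project): these two loops do the same arithmetic over the same matrix, and the only difference is memory access order; on most hardware the cache-friendly one is noticeably faster.

    #include <stdio.h>
    #include <stdlib.h>

    #define N 4096

    /* C stores 2D data row by row, so walking along a row touches
       adjacent memory (cache-friendly), while walking down a column
       jumps N * sizeof(double) bytes per step (cache-hostile). */
    int main(void)
    {
        double *m = malloc((size_t)N * N * sizeof *m);
        if (!m) return 1;
        for (size_t i = 0; i < (size_t)N * N; i++) m[i] = 1.0;

        double row_sum = 0.0, col_sum = 0.0;

        /* Row-major traversal: sequential access, few cache misses. */
        for (size_t i = 0; i < N; i++)
            for (size_t j = 0; j < N; j++)
                row_sum += m[i * N + j];

        /* Column-major traversal: same result, but strided access
           that misses the cache far more often. */
        for (size_t j = 0; j < N; j++)
            for (size_t i = 0; i < N; i++)
                col_sum += m[i * N + j];

        printf("%f %f\n", row_sum, col_sum);
        free(m);
        return 0;
    }

Time the two loops separately and the gap shows up immediately, even though the arithmetic and the result are identical.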
> Did you use SCM, automated testing and design patterns 20 years ago? If yes, do you think this was the norm in the industry?
Design patterns are way older than 20 years... hell, the design pattern bible was published 22 years ago, which means design patterns were in wide circulation well before I could tie my shoes. According to Wikipedia, CVS has been around for 26 years.
Even though I am only 25, I can recognize that the computing foundations are still the same. I work at an 80s-style company developing embedded systems, and I constantly talk with the founders who used to program the first system we sold. They implemented everything from scratch. Sharing code, open source, and the internet are making our lives easier. But at the end of the day, the knowledge behind the libraries and frameworks we use is the same as it was in the early days.
I picked up the CVS habit almost exactly 20 years ago (1996) from someone in the banking & finance industry -- hardly a bastion of radical adoption. By then, CVS was 10 years old. We used RCS extensively to manage config files.
To the earlier parent who mentioned process isolation: You're thinking too much about the consumer/desktop world. The entire enterprise world was used to it and demanded it, be it on (hey, humor here) SCO UNIX (yes, they really did make a useful product before they turned evil), SunOS, VAX/VMS, MVS, BSD UNIX, etc.
The desktop world was far behind the state of the art in commercial systems in those days. Even in the 80s, you were quite possibly running your enterprise with an Oracle database, on fully memory-protected hardware. Heck - you may have even been running some of your stuff in a VM on an IBM mainframe. We took some big jumps backwards in pursuit of making computers cost-effective enough for mass adoption, and it took a while before the consumer side of things jumped forward to become the same platforms used for enterprise use.
> Did you use SCM, automated testing and design patterns 20 years ago? If yes, do you think this was the norm in the industry?
Just because people did not use a standalone program for version control does not mean that they did not version their code. This goes all the way back to card/tape drawers and paper tape revisions (there is a great anecdote about this in Steven Levy's Hackers). Look at software from the late 1970s/early 1980s - there will be versioning information there, and usually changelogs describing what was changed. A lot of operating systems also had file versioning built into the filesystem (https://en.wikipedia.org/wiki/Versioning_file_system#Impleme...) that was used for revision control. Since most of these OSes were timesharing systems this was in effect team-based version control.
Now of course, many PC developers came from 8-bit computers rather than mainframes, where SCM didn't really matter, because programs were very small.
Automated testing - well, I have seen a Lisp compiler written in the 80s that had it. So where it made sense and was possible to do, I think people did it. (IMHO it only makes sense for programs, or parts of them, written in a purely functional style, which is not a big chunk.) The workflow was different in the past (big emphasis on integration and system testing), and I wouldn't say it was necessarily worse.
Design patterns definitely existed, but they weren't named as such.
I am not even sure if people (writing business applications) are more productive today. We have a lot of new artificial requirements. In the past, it was typical for datatypes and other things to be constrained by design. Today, people are unwilling to do that, even though it wouldn't change anything from the business perspective.
Or take a look at the web. It's a tangled mess of HTML, CSS and JS (it's been 20 years and people still can't agree whether or not it's a good idea to produce HTML in JS!). In the past, you would use something like CICS or VB or Delphi - a consistent framework, written for the purpose of building interactive business applications.
> we've moved from the mainframe/single-PC shared-memory model into distributed/cloud/networked computing in the last 20 years, and the problems from that era are completely different from the problems of today
Not really. The tradeoffs are simple to understand for somebody who took a CS course 20, or even 40, years ago. And even back then, people writing applications didn't hand-roll their own SQRT; even back then, frameworks and libraries did that for you.
The design pattern book came out 22 years ago so they were certainly named as such.
My first Unix programming job had me first learn RCS, and then we later switched to CVS. I used CVS personally for years after that until I switched to Subversion, then Mercurial, and now Git. What I'm doing isn't that different from when I used RCS. There are more steps because of the added complexity, but it's roughly similar.
I was in high school and didn't know the first thing about programming 20 years ago. But it seems to me that "design patterns," SCM, and automated testing aren't the hard part of this discipline.
Sure they are. They are the "hard part" of what every 9-5 LOB dev has to do, and even in the more CS-heavy fields, in my experience the algorithms and the fancy CS are 20% of the work; the rest is implementing mundane details, bug fixing, testing, cooperating with team members, etc. And that's the stuff that has improved since the 90s.
Delivering reliable software is hard - but we've gotten a lot better at it in the last 20 years. If you don't believe me, boot up an old Windows 95/98 image, install some random software, and see for yourself.
I see the opposite: rehashes of old ideas rebranded as the latest hotness. NoSQL was basically what everyone was doing before relational databases came along. Clojure is a Lisp dialect running on the JVM. ORMs are great until you need some complex query; then it helps to know SQL. Algorithmic complexities are the same whether you were squeezing things into a tiny underpowered computer or you are dealing with terabytes of data.
Security was abysmal probably because there were nowhere near as many threats.
I've been programming in functional languages for twelve years and recognizably modern C++ for fifteen at this point. And yet, I'm not a graybeard; I'm 28. But what I learned at that point still informs what I do today. And it's why I respect that those graybeards' old work still has impact today.
Back in 1996 I was using CVS at work. Before that it was RCS or SCCS (RCS was the first version control system I used). This was at Bell-Northern Research and Nortel, which was not a bastion of cutting edge techniques. ClearCase was pretty commonly used as well (it came out in 1992).
Automated testing wasn't science fiction. It was regularly done, just as system or integration tests. You'd see who did the commit that broke the build. This was before QA even got to do their work. This was something done across the company and I'm certain that it wasn't the only company that did this.
If the parent is sarcasm, please disregard my reply.
> The stuff you did in software 20 years ago has absolutely nothing to do with the stuff that's being done today.
Not even remotely true. Most of computer science is built on the past: either something new composed on a previous foundation, or a poor re-implementation of an already existing idea. A trendy favorite of many at the moment, Go, is a good example of the former. For instance, CSP as used in Go (and Clojure) is definitely not a new idea and has everything to do with what we do today. Another example is data structures - not sure what you're doing if you aren't using these, but the most common ones were invented long ago, and most newer things I see are just refinements, additions, or behavioral changes to what existed before.
Speaking of Clojure, functional programming and Lisp are additional examples of old things "we" used to do in software 20+ years ago that are very much relevant today. You are making blanket statements about research, practices, and more here. It's true that Lisp was not widely used in many circles, but there are numerous reasons for that beyond what you mention, many of them not good ones. Many people outside academia used a lot of old concepts successfully, or otherwise knew they were good but were held back by limiting factors, most commonly hardware or market share.
I encourage anyone "inventing" anything new to simply dig through old research - whether it is 60s Lisp-weenie stuff as you describe, Plan 9, or anything else - before you think you invented something. 99% of the time someone has at least researched it, written about it, and maybe even implemented it better than you will. Timing is often everything, as are other factors like technological resources (hardware, people, market skills, etc.). Computer science, like many fields, excels at reinventing things poorly. It usually comes down to thinking before acting, as very few of us are smart and talented enough to produce something properly on impulse and limited knowledge alone.
> This is the drag&drop and copy-paste as an abstraction VB shit era.
And this has changed how? Look at what a lot of people use Google and Stackoverflow to do. Of course these tools are great, but so was the VB GUI designer. It's how you use it and who is using it. We have far more copy-paste style tools than ever before and far more languages that are suited to producing unmaintainable spider webs of awful.
> People were writing C, and the major problems of the industry were how to fit shit into x MB of RAM and x CPU cycles.
What are you doing now? People are still doing this constantly, not only for embedded programming but for games, apps, everything. Just about every month there are multiple posts on HN about people patting themselves on the back for shaving memory and CPU cycles off their code on a new platform and/or running on something like commodity hardware. I know you want to say hardware wasn't as good, but it's also been my experience that fewer people know how to manage performance properly, and we still spend a lot of time cleaning up messes - only they are on the gigabyte level instead of the byte level.
> Security was abysmal - you didn't even have process isolation on OS-es.
Not all OSs worked this way. True, the popular ones were often pretty bad. They are still not good and with more stuff accessible faster remotely, even more malicious actors, and bigger consequences, the situation is arguably worse.
> Unless you were at one of the cutting edge places that were actually innovating back then in terms of process and stuff, there's likely nothing from that skill set that transfers onto the problems that we are dealing with today beyond the CS grad basics.
The same holds true today. If you only do one thing, you won't be good at programming except perhaps in that narrow field, if you are lucky. If anything, it's been my observation that far fewer people know what they are doing as the barriers to entry into the field have gone down. It's not as if everyone who once upon a time wrote C, COBOL, Lisp, Fortran, or whatever else learned nothing along the way. Plenty of people pivot and do amazing things.
Overall, it sounds to me like you are the one who lacks the depth and breadth of experience. You need to meet more people, keep an open mind, and do research before you pass judgment. Moreover, you need to avoid blanket statements (I appreciate the irony).
Like I've said, the stuff that hasn't changed much is the CS; the actual engineering part has nothing to do with what we do today. 20 years ago the net was nowhere near as ubiquitous, hardware was incomparably slower, and the bottlenecks were different - the focus shifted from single-computer/mainframe/shared-memory models to distributed computing.
When I say copy-paste in Visual Basic, I'm talking about professional programmers who couldn't even abstract stuff into functions but would copy-paste shit all over the code - not copy-pasting from other sources.
As much as people like to bitch about our industry, we have heavily raised engineering standards: if you don't use source control these days you get laughed at, if you don't use automated testing you're an amateur, even junior programmers know about patterns, and nobody copy-pastes code all over their code base - stuff like this was the exception back then, not the norm as it is now. You can say MVC was invented in the 80s or whatever, but up until maybe 10 years ago the internet was full of tutorials/guides on how to do web coding by interpolating PHP inside your HTML, building SQL queries with string concatenation, and such nonsense. Sure, the top 10% never did that, but the vast majority of programmers did, and the broader industry has improved significantly since then, in no small part thanks to frameworks that constrain the bad developers onto the "good path".
I'll give you that the tooling is quite a bit better these days, but that's it. Practices are as horrid as ever. Cut-and-pasting from Stack Overflow seems like something every new grad knows how to do, and it's everywhere. Not sure about your focus on design patterns -- I'd often call their use a negative, not a positive. When faced with a problem, a good developer asks, "How can I solve this?" A bad developer says, "I must find some named design pattern to mash this problem into."
At its core, 90% of professional programming is:
1. Plumbing: Routing data from this component to that one through this API
2. Processing: Algorithms, most of which have been published for decades
3. Formatting: Ingesting data in format A and displaying it for the user in format B.
This is true today, was true when I started, and was true 20 years before that.
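A hedged, toy-sized sketch of that split in C (hypothetical record format, nothing from a real system): read records in format A, do a bit of processing that has been textbook material for decades, and emit them in format B.

    #include <stdio.h>

    /* Plumbing: read "name,score" lines from stdin.
       Processing: keep a running total / average.
       Formatting: print a fixed-width report. */
    int main(void)
    {
        char name[64];
        double score, total = 0.0;
        int count = 0;

        while (scanf(" %63[^,],%lf", name, &score) == 2) {
            total += score;
            count++;
            printf("%-20s %8.2f\n", name, score);
        }
        if (count > 0)
            printf("%-20s %8.2f\n", "average", total / count);
        return 0;
    }

Swap the input and output formats for whatever your shop uses; the shape of the work stays the same.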
"When I say copy paste in visual basic I'm talking about professional programmers that couldn't even abstract stuff in to functions but would copy paste shit all over the code - not copy-pasting from other sources."
The inability to use abstractions properly has long been a sign of inexperience or incompetence. I'd really like to see some evidence that this was any more prevalent in 1996 than it is in 2016.
There's definitely more of a concern these days in the general programming world about "best practices". But those best practices weren't invented yesterday. They're usually incremental changes of older practices, and the results of bitter, painful experience of doing things the "wrong" way. Exactly the kind of experience that senior engineers have that junior engineers lack. It's senior engineers who create those best practices in the first place.
We don't really have many true standards, just pockets of consensus on best practices and collections of so-called standards that are quite often ignored. Thanks to better communication, I think it's safe to agree that these practices are more widespread. As a result, yes, quality for certain groups has definitely increased.
I think you underestimate the fact that a large number of people actually do not follow these practices, even those who know better. Further, it's hard to make statements about software quality, especially with so many moving variables like the increase in the total number of people in the industry and lower barriers to entry. Regarding best practices, I am sure many people have stories about asking questions in interviews about a company's software development, only to be told, "Well, we'd love to do X, but we just don't have the resources or time, so we do Y." Even people who know better act otherwise, in addition to those who are ignorant.
I think at a conceptual level, many things have not changed. There have always been sets of new ideas and practices that people adopted, many times blindly. These are of course a mixed bag, but many so-called "best practices" are often abandoned later - fads are the norm, and while that's sometimes OK, it does not always result in "better" as much as "different." I think some people fail to realize that in the moment, it's all happened before, and all will happen again. We both learn from our past mistakes with new trends and repeat those mistakes.
MVC and frameworks are actually interesting examples. I don't want to get too far into individual tech discussions, but I've personally seen people handcuff themselves trying to religiously implement things such as MVC and actually fail because of the pattern, since they were distracted from solving the problems properly. Adopting a pattern or framework does not automatically produce better code and can even lead you the opposite way. It's hard to say whether things like this are a win for everyone so much as simply a good fit for certain problems, and therein lies the challenge - one that hasn't become easier. The golden hammer and the wrong tool will forever be problems.
Frameworks are often oversold and are not standards. In fact, for many projects I've had to abandon a particular framework long-term because it was just getting in the way for various reasons - performance, maintainability, velocity of change, conceptual failings, zealotry, etc. Frameworks can derail "bad" developers, as you call them, just as much as good ones. Frameworks often explode if you don't do things the "one true way", "bad" developers are arguably just as prone to that as everyone else, and simply adhering to those rules still doesn't always produce better software. There have always been frameworks in some shape or form, but they didn't always have names; sometimes they were just the language or tool imposing its constraints, sometimes for the better, sometimes for the worse. As a counterpoint, I've found working in languages like Clojure, Lisp, and Racket, which tend to value composition of tools and functions over frameworks, more productive - though I would never apply this to all projects and situations.
I see what you're getting at overall, and on some level I agree. I was around in the 80s, and I don't want to go back to that. At the same time, for all the old problems solved, I see new ones arise, along with things that were never problems before coming back to haunt us. It sounds like your experience is very compartmentalized to web development and you're projecting those views onto both the past and the present. People are people, and no time, technology, or tool will fix that - it just deals you different cards.
Yeah, I think I agree with what you said, especially that increased communication helped (I remember when I started programming I didn't have access to the internet, and once I actually got it, it was a complete game changer in terms of learning).
And don't get me wrong, I don't think what we have today is anywhere near perfect or even good - frameworks are often more trouble than they're worth - but at least they showed people that <?php mysql_query("select * from foo where id='" . $_GET["id"] . "'") ?> isn't the only way to write code. I do not want to go back to those days.
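For contrast, here is roughly what the safer habit looks like - a hedged sketch using SQLite's C API rather than PHP/MySQL, purely for illustration: the user-supplied value is bound as a parameter instead of being concatenated into the SQL string, so it can't change the shape of the query.

    #include <stdio.h>
    #include <sqlite3.h>

    /* Parameterized query: the id is bound, never spliced into the SQL. */
    static int query_foo(sqlite3 *db, const char *user_supplied_id)
    {
        sqlite3_stmt *stmt;
        const char *sql = "SELECT name FROM foo WHERE id = ?";

        if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) != SQLITE_OK)
            return -1;
        sqlite3_bind_text(stmt, 1, user_supplied_id, -1, SQLITE_TRANSIENT);

        while (sqlite3_step(stmt) == SQLITE_ROW)
            printf("%s\n", (const char *)sqlite3_column_text(stmt, 0));

        sqlite3_finalize(stmt);
        return 0;
    }

    int main(void)
    {
        sqlite3 *db;
        if (sqlite3_open(":memory:", &db) != SQLITE_OK) return 1;
        sqlite3_exec(db, "CREATE TABLE foo(id TEXT, name TEXT);"
                         "INSERT INTO foo VALUES ('42', 'example');",
                     NULL, NULL, NULL);
        query_foo(db, "42");              /* finds the row */
        query_foo(db, "42' OR '1'='1");   /* finds nothing: it's just data */
        sqlite3_close(db);
        return 0;
    }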
You would actually be surprised how much things are the same. You just can't see it because you didn't experience it, and you are confusing transferable knowledge with tool knowledge. The latter is easy - anybody can learn a new language/tool; using it properly and efficiently is the real problem.