
I wouldn't worry. We can actually Get Stuff Done, which at the end of the day is what really counts. Inexperienced young programmers working 2x the hours do not necessarily produce Output That Works (I know, I've been one myself). Sure, not every company will realize that, but then again, you don't necessarily want to work for every ping-pong table company, either. Cultural fit and all that.


I hate smug answers like this. I'm old and the young people I work with at Uber are amazing and get tons of things done. Just because you're old doesn't mean you know any better, and just because they are young, it doesn't mean they can't get shit done.

Pretending that just because you're old you hold some magical advantage doesn't fly. You need to prove yourself every day because this is the industry we are in.

And the reasons why young people are more desirable are obvious. They generally have more energy, fewer time constraints, and aren't afraid of change. I work very hard to keep myself current. But it's hard.

I have 5 side projects gathering dust at home because after work, commute and kids, I'm exhausted. It's easy to keep up when your only commitments are work and relationship.

So it's fairly obvious why younger people are more desirable. As long as companies give us old guys a chance I don't mind. I'm smart and will fight for my opportunities. I don't think I'm entitled to anything. My boss is almost 20 years younger than me but he's smart and cool. At some point this may end but that's just reality. No one hires 60 year old coal miners either when there are 20 year olds available.


> Pretending that just because you're old you hold some magical advantage doesn't fly

Experience is not magic; it is not directly a function of age but of time in the industry. I'm not even that old yet, but my experience (aka larger training set) so far allows me to pattern-match/foresee potential disasters way before they appear on the radars of more junior devs, and this goes for technology, people, and processes.

> No one hires 60 year old coal miners either when there are 20 year olds available

That's because mining coal is physically demanding. I would hire a 60 year old boilermaker before any 20 year old, since I don't want things to go boom.

Edit:

> just because they are young, it doesn't mean they can't get shit done.

With experience, you know which shit not to do


You're comparing great old people with shitty young people. I know plenty of very shitty old people and a lot of great young people who would invalidate what you said above. And in my experience there aren't very many great old people who are current with technology, who can code quickly, etc. Most just want to keep doing what they're doing and earn a paycheck, which is fine, but then they shouldn't complain when they get replaced by younger, faster models.


You might have misunderstood me. My point is experience is a great asset to any developer, young or old.

> And in my experience there aren't very many great old people that are current with technology, that can code quickly, etc.

That might mean that the great old people are either not great at these things despite their efforts (which I believe is your opinion), OR it could mean the old people optimize for the things you did not list, and don't place a lot of value on "coding quickly", for example.

> And in my experience there aren't very many great old people that are current with technology

The older I get, the more I realize there is more to life than work. I mean, I love technology, web development particularly (because the web is awesome for humanity), I really do enjoy it. However, I can make better, more fulfilling use of (some of) my weekends and evenings than staring at my laptop, rewriting an app in LatestHypeJS. Is there any other industry that requires so much running just to stay in one place?

Bear in mind the "younger faster models" won't stay young forever.


I think an interesting question is - more desirable to what end? To what end are these youths toiling? How, after millennia of evidence that experience is valuable, are we to believe that somehow, magically, it is not? What if the end to which the young are applying themselves (or being applied) is not in fact of any value? Historically, young inexperienced men have been considered entirely disposable. Perhaps now they are being disposed of in a different way?


Are you saying experience counts for nothing? Besides the muscle memory of being able to knock out a script quickly, there's also the advantage of having seen weird one-off problems before.

To give a simple example (not really a one-off problem): a dev was wondering why a function in his cron script wasn't working when the same function was working fine elsewhere in the application. I told him to check that cron was using the same config file as the webserver, but I only knew that because I'd seen the same thing before. No amount of energy will give you that 'seen this before' experience. I remember the first time a server ran out of disk space it really confused me - none of the logging/error messages seemed to make sense and the behaviour of the application seemed weird. It takes experience to pick up these patterns. Age isn't the best heuristic here, because a 25 year old might have 15 years of experience, and a 40 year old might have picked up programming only 5 years ago, but experience counts.

> I have 5 side projects gathering dust at home because after work, commute and kids, I'm exhausted. It's easy to keep up when your only commitments are work and relationship.

That's just you. I've known young people with the same commitments of relationships and kids, and old people who could fully commit themselves to work, so I don't think your point about time constraints holds.

And outside of Silicon Valley, I don't see this same enthusiasm for young people. Maybe it's the bias of the field I'm in (business software), but with age also comes industry experience independent of programming knowledge (I used to naively think automating all processes was the end goal - there are some processes that shouldn't be automated, for the sake of good customer relationships). Bear in mind most successful startups are started between the ages of 35 and 45 - source here - http://www.forbes.com/sites/krisztinaholly/2014/01/15/why-gr... - and the people who did the study attributed it to industry experience.

At the end of the day, the best filters/heuristics aren't visibly external ones like age (or race) but practical ones like discussions on theory, conversations about commitment - actual data about the subject matter.

> No one hires 60 year old coal miners either when there are 20 year olds available.

I'd definitely prefer a 60 year old doctor (I don't mean a surgeon, but a consultant), lawyer or accountant. Do you think programming is more similar to those or to coal mining?


> We can actually Get Stuff Done, which at the end of the day is what really counts

It's harder for me to "get stuff done" when the "stuff" is bullshit. There are times when what's being asked for is wrong (stealing code/images/whatever), or a wrong solution to the problem. "Right/wrong" can be subjective, so I shy away from making those snap judgements until I know more about why something is being asked for, but after a while, you tend to develop a sixth sense for crazy.

A 23 year old in their first job will (generally) just plug away at whatever's given them. Older folks generally won't deal as well with bullshit work.


> It's harder for me to "get stuff done" when the "stuff" is bullshit

Same here, I guess tolerance to bullshit diminishes with age. Perhaps it has to do with experience. I cringe when asked to do something that will increase technical debt because I can see what it will be like repaying that debt in the future. Younger me could not see that far.


> A 23 year old in their first job will (generally) just plug away at whatever's given them. Older folks generally won't deal as well with bullshit work.

On the contrary, I've seen far too many 20 somethings that will refuse to do anything that isn't new development.


I've seen a bit of that too, but perhaps not as much as you.

I've seen loads of folks that don't know how to work with older code (and have been guilty of that myself).

Worked on a project recently where some bit of PHP was causing trouble. The answer pushed was "just build it in node!". This was pushed by people who don't know PHP, and don't know what problem was actually occurring, but hey - "move to node!" can't ever have a downside, can it? "I'm not sure what problem I'm solving, but it'll be in nodejs, so everything will be great" seems to be a common trend over the last couple of years (at least in some of the circles I travel in).


I see that as a selling point to being more experienced. Intolerance for bullshit is a good trait in many a job.


It really depends on the politics of the office/team/company you're working with. It can either be seen as a huge plus, or a huge minus.


Honestly, even as I'm getting older, this isn't at all obvious to me. Someone with five years solid experience with web development might very well be far more effective than someone with twenty years experience of working in corners of large companies with various technologies, especially when you consider the expected salary.

I think it's good that people are talking about these things, but sometimes it seems like people think it's a big conspiracy, rather than a function of how the ecosystem and market look.


If you were having cardiac surgery, would you prefer your surgeon to be a year out of med school having done the operation oh 3 or 4 times, or the proverbial graybeard who has done it a thousand times?

If you were wrongly convicted and awaiting a death sentence, would you pick a 22 year old attorney fresh from law school to defend you?

If you have a field full of tomatoes that needs picking, do you save a few dollars an hour hiring the young guy at minimum wage who is healthy, has no family, and will easily work beyond maximum hours without reporting it, or do you hire the older tomato picker who needs a higher salary because of his kids and can't work more than 7.5 hours a day because of his bad back?

So are we law and medicine or are we tomato pickers? I had always thought of engineering as the former but sadly the more articles I see on how "culture fit" discrimination works, maybe we've commoditized the industry into a low wage manual labor job.


>If you were having cardiac surgery, would you prefer your surgeon to be a year out of med school having done the operation oh 3 or 4 times, or the proverbial graybeard who has done it a thousand times?

If you were hiring, would you want the starry-eyed, finger-on-the-pulse-of-the-latest-fads young programmer you can pay less, overwork, boss around, and screw over, or the senior engineer who values sane hours and time with family, doesn't blindly adopt any BS fad technology you want them to use, and demands a salary that matches their experience?

Especially if the first can still get out something that "kinda works" and you couldn't give a rat's arse about technical debt and bad decisions behind the UI facade?


Soooo.... Tomato pickers it is! :-)

Good counterpoint, liked both this and the grandparent comment. Ouch.


If that's what the hiring person is like, though, I'm fine with not getting that job...


As long as there are enough other good programming gigs available though...

As in, most McDonalds workers also hate both the hiring person AND the job, but they do it anyway.


- until that's _all_ there is.


Weirdly, the mortality rates from surgeries can go up with much older doctors, for a variety of reasons, but mostly because they have a lower volume of work, so they aren't as practiced and may not be current on the latest techniques, which may be less invasive or have a higher success rate. It could also be that older people like having older doctors, and the mortality rate is higher for older people getting surgery.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1856535/

There is probably a statistical sweet spot, where someone has years of experience, is active enough to be up on the latest techniques, or at least learns the new techniques and adopts them. Parallels can be drawn between this and tech workers I'm sure.


The vast majority of software engineering work does not have the same rigor that other fields of engineering do. Standards, safety, inspection, accountability, documentation, testing, certification, etc... many other fields focus on these things, but in software it's pretty much "anything goes so long as we make money." The trick to making companies value good engineers is getting them to value good engineering.


Churning out code usually isn't engineering, and it isn't tomato picking. It's more like being a machinist in an industrial setting. You need to know the tools and how to read the spec. That's why you can post a technology position, get 1,000 resumes from people working professionally, and find that 80% of the candidates lack basic knowledge.

The problem is that if you don't up your game to be in a more engineering-y role, you're just one ROI calculation away from being automated out of existence.


Umm, the more comparable parallel would be "or the proverbial graybeard who has done ~~it~~ kidney surgery a thousand times"

The stuff you did in software 20 years ago has absolutely nothing to do with the stuff that's being done today. Like holy shit, most people weren't even using source control back then; unit testing and automated testing in general was SciFi - it was done by QA departments, if you were big enough to do it.

People were writing C, and the major problems of the industry were how to fit shit into x MB of RAM and x CPU cycles.

Security was abysmal - you didn't even have process isolation on OSes.

The languages and the practices are also completely different - disregarding the "we did it in the 60s with LISP in this paper" crowd - outside of academia, people were just getting into OOP, which was the pinnacle of abstraction back then.

This is the drag&drop and copy-paste as an abstraction VB shit era.

Unless you were at one of the cutting-edge places that were actually innovating back then in terms of process and stuff, there's likely nothing from that skill set that transfers to the problems we are dealing with today beyond the CS grad basics.


Stuff that hasn't changed markedly: geometry, math, statistics. SQL - 1970s. SOLID - parts of that are late 1980s. C still looks like C. Unix-like environments. Awk/sed/etc.

Go read about the history of the web or XML; you get a very strong sense that these things have been thought about for a long time - data interchange has varied formats, but there is still a lot of tedious ETL-type work. The importance of naming things well hasn't changed. Identifiers for data being a hard problem hasn't changed. Schemas/vocabularies and more are still important problems.

If you can't see some of these things underpinning much of the work we do, you might be missing the forest for the trees.


One thing I've found that's significantly changed since I began my career, before my beard went grey, is that projects are now in general run much better. People never make a total mess of estimating timings, and the Agile methodology (and Kanban boards, and Trello, and JIRA, and of course Slack) really fix all organisational and prioritisation problems such that projects never end in disaster; gone are the days of frantic last minute firefighting and all-nighters, as we desperately work out which features are critical to getting the project live before everything explodes. Clearly the experience of older, wiser heads in moments of crisis is now basically never required because projects run so smoothly.

Haha, sorry, I'm only kidding.


Love it--Thanks for the last line there. I'm glad I didn't have any food in my mouth while reading this because it'd be all over the screen right now.


This made my day. I only wish I could have read it around 3:00 this morning when I was still up writing code.


>The stuff you did in software 20 years ago has absolutely nothing to do with the stuff that's being done today. Like holy shit most people weren't even using source control back then, unit testing and automated testing in general was SciFi, it was done by QA departments if you were big enough to do it.

That stuff a competent graybeard knows already.

You don't think they disappeared in 1990 and re-appeared magically today, or that they didn't keep up with the times during their career, right?

In addition, they would know all the solid stuff -- the things programmers before the internet had to learn the hard way, and lots of today's dilettantes lack.

And you must be very young (20 something?) if you really believe that "the stuff you did in software 20 years ago has absolutely nothing to do with the stuff that's being done today". That's simply not true -- and triply not true for 15 years ago.


I mean if you were a good dev in the 90s chances are right now you're in some senior architect/management/consultant position. If you're competing for the same jobs as 20 something devs then you've probably screwed up somewhere in your career.

I started programming in the late 90s but not professionally until 2006, I think - probably the late phase of the transition to the internet as a dominant factor in computing - so I think it gave me enough perspective on how things changed. Between that, talking to older developers, and reading stuff online and on forums, I know we've come a loong way as an industry.


>I mean if you were a good dev in the 90s chances are right now you're in some senior architect/management/consultant position. If you're competing for the same jobs as 20 something devs then you've probably screwed up somewhere in your career.

Or you just don't like management / like programming?


My last two jobs I got hired for, I said while interviewing that it was my explicit career goal to never become a manager.


Yeah, I had some chances, but never liked the role of the pointy haired boss.

On the other hand, I don't like having to suffer stupid enterprise project managers either, though in my line of work it's better than the norm (more startup-y -- heck, we are at 100 people or so and our founder/boss still codes like crazy, same for the CTO).

I think the ideal for our kind of person is having your own company -- either a one-person shop, or a bigger outfit with someone else handling the marketing and finance parts. You get to both program and choose what and how to deliver.


Typical ageist bullshit. Your argument is if someone is good at something then they are probably not doing it anymore. Newsflash, you were moved out of dev because you have some issue. Most people avoid "architects" which is code for cranky bastard who can't be fired but is likely breaking lots of stuff in the codebase and generally making things bad for the devs of all ages.


I guess the latest code review didn't go well?


"I mean if you were a good dev in the 90s chances are right now you're in some senior architect/management/consultant position. If you're competing for the same jobs as 20 something devs then you've probably screwed up somewhere in your career."

Not everybody's interested in climbing the corporate ladder or in architecture, management, or consulting. Some people are content to be senior engineers.

"we've come a loong way as an industry"

The industry has certainly changed a lot, but if you keep your skills up to date, that's not going to be an issue.


> The stuff you did in software 20 years ago has absolutely nothing to do with the stuff that's being done today.

The longer I'm in software, the more I find this not to be true. The stuff that has changed is all surface level. The underlying principles are the same. The skills needed are the same. The attention to detail and other attributes needed to get the job done are the same. The problems are basically recycled or scaled-up versions of the problems we had years ago.


There's new stuff that's a significant advance (machine learning that really works), there's stuff that's just different (like most of the web backend technologies), and there's stuff that's worse (security).


Hmmm ... disagree!

18 years ago we had much of what we have today. Things might move on but they do not change. I mean hey, everyone still hates Visual Basic!

At least 18 years ago we were actually writing the majority of our code rather than relying so heavily on libraries and third party code. I mean, someone only has to pull a Node.js package for padding and there is a complete meltdown!


> The stuff you did in software 20 years ago has absolutely nothing to do with the stuff that's being done today.

This is basically false in every way it's possible to be false.


I've only been in the industry 13 years, and I'm not using a single tool today that I used back then, but I'm still using the lessons I learned:

How much time should my team spend on testing vs. code review vs. building new features?

How risky is this feature, and how much QA does it need?

Which of the brand-new engineers on my team needs a careful eye on their code so they don't blow everything up?

When is it time to say screw it, we have to ship?

Which new hip framework is likely to have staying power, because of the people and the commitment behind it?

How do I convince open-source maintainer X to accept my new feature, so I'm not maintaining patches into eternity?

Those are the important questions I deal with every day. I couldn't answer them as well 10 years ago, and I bet I can answer them better 10 years from now.


Most programs don't really use algorithms or data structures that were discovered much later than the '70s. This idea that old experience is worthless because we're using different libraries now strikes me as altogether wrongheaded.


The stuff you learned in your CS course is still just as relevant as it was 20 years ago. The things you did in your development career are nowhere near as relevant.

Did you use SCM, automated testing and design patterns 20 years ago? If yes, do you think this was the norm in the industry?

I'm not saying those things are the only relevant things, I'm just pointing out obvious examples of how much things have changed since then. As I've said in another comment, we've moved from the mainframe/single-PC shared-memory model into distributed/cloud/networked computing in the last 20 years, and the problems from that era are completely different from the problems of today. Even on the low-level end, basic design constraints such as the gap between CPU instruction speed, memory access and cache misses dramatically changed what it means to write optimized code: nobody these days rolls their own SQRT function to squeeze out a few extra CPU cycles; nowadays it's about making sure your memory is laid out so that you get as few cache misses as possible.
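
To make that last point concrete, here is a minimal C sketch (my own illustration, not anything from the thread; the sizes and names are arbitrary). Both functions compute the same sum over the same matrix, but one walks memory in the order it is laid out while the other strides across rows, which on typical hardware means far more cache misses and a noticeably slower run:

    /* Illustration only: same arithmetic, two traversal orders.
       The row-major version reads memory sequentially (cache friendly);
       the column-major version jumps COLS*sizeof(double) bytes per step. */
    #include <stdio.h>
    #include <stdlib.h>

    #define ROWS 4096
    #define COLS 4096

    static double sum_row_major(const double *m)
    {
        double total = 0.0;
        for (int r = 0; r < ROWS; r++)        /* contiguous access */
            for (int c = 0; c < COLS; c++)
                total += m[(long)r * COLS + c];
        return total;
    }

    static double sum_col_major(const double *m)
    {
        double total = 0.0;
        for (int c = 0; c < COLS; c++)        /* strided access: one element per row */
            for (int r = 0; r < ROWS; r++)
                total += m[(long)r * COLS + c];
        return total;
    }

    int main(void)
    {
        double *m = malloc(sizeof(double) * ROWS * COLS);
        if (!m) return 1;
        for (long i = 0; i < (long)ROWS * COLS; i++)
            m[i] = 1.0;

        printf("row-major sum: %f\n", sum_row_major(m));  /* fast */
        printf("col-major sum: %f\n", sum_col_major(m));  /* slow, same result */

        free(m);
        return 0;
    }

Timing the two makes the gap obvious; the point is that this kind of optimization work is about data layout and access order, not shaving individual instructions.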


> Did you use SCM, automated testing and design patterns 20 years ago? If yes, do you think this was the norm in the industry?

Design patterns are way older than 20 years... hell, the design patterns bible was published 22 years ago, which means design patterns were in wide circulation well before I could tie my shoes. According to Wikipedia, CVS has been around for 26 years.


> According to wikipedia CVS has been around for 26 years.

Not to mention SCCS for 44 years, and RCS for 34.


Again - I'm not saying these things didn't exist - I'm saying they weren't the norm in the industry.


You're saying a lot of things, without stopping to check whether you're right or not.


Even though I am only 25, I can recognize that the computing foundations are still the same. I work at an 80s-style company developing embedded systems, and I constantly talk with the founders who used to program the first system we sold. They used to implement everything from scratch. Sharing code, open source stuff, and the internet are making our lives easier. But at the end of the day, the knowledge behind the libraries and frameworks we use is the same as in the early times.


I picked up the CVS habit almost exactly 20 years ago (1996) from someone in the banking & finance industry -- hardly a bastion of radical adoption. By then, CVS was 10 years old. We used RCS extensively to manage config files.

To the earlier parent who mentioned process isolation: You're thinking too much about the consumer/desktop world. The entire enterprise world was used to it and demanded it, be it on (hey, humor here) SCO UNIX (yes, they really did make a useful product before they turned evil), SunOS, VAX/VMS, MVS, BSD UNIX, etc.

The desktop world was far behind the state of the art in commercial systems in those days. Even in the 80s, you were quite possibly running your enterprise with an Oracle database, on fully memory-protected hardware. Heck - you may have even been running some of your stuff in a VM on an IBM mainframe. We took some big jumps backwards in pursuit of making computers cost-effective enough for mass adoption, and it took a while before the consumer side of things jumped forward to become the same platforms used for enterprise use.

Kids these days. ;)


Can I ask how old you are? And if you are below 25, which I think you are, judging by your comments, then how do you know?


> Did you use SCM, automated testing and design patterns 20 years ago? If yes, do you think this was the norm in the industry?

Just because people did not use a standalone program for version control does not mean that they did not version their code. This goes all the way back to card/tape drawers and paper tape revisions (there is a great anecdote about this in Steven Levy's Hackers). Look at software from the late 1970s/early 1980s - there will be versioning information there, and usually changelogs describing what was changed. A lot of operating systems also had file versioning built into the filesystem (https://en.wikipedia.org/wiki/Versioning_file_system#Impleme...) that was used for revision control. Since most of these OSes were timesharing systems this was in effect team-based version control.


> Did you use SCM, automated testing and design patterns 20 years ago? If yes, do you think this was the norm in the industry?

SCM was the norm, at least in certain circles; maybe you should read up on things like:

https://en.wikipedia.org/wiki/IBM_Software_Configuration_and...

https://en.wikipedia.org/wiki/Source_Code_Control_System

Now of course, many PC developers came from 8-bit computers rather than mainframes, where SCM didn't really matter, because programs were very small.

Automated testing - well, I have seen a Lisp compiler written in the 80s that had it. So where it made sense and was possible to do, I think people did it. (IMHO it only makes sense for programs or parts of programs written in a purely functional style, which is not a big chunk.) The workflow was different in the past (big emphasis on integration and system testing) and I wouldn't say it was necessarily worse.

Design patterns definitely existed, but they weren't named as such.

I am not even sure if people (writing business applications) are more productive today. We have a lot of new artificial requirements. In the past, it was typical for datatypes and other things to be constrained by design. Today, people are unwilling to do that, even though it wouldn't change anything from the business perspective.

Or take a look at the web. It's a tangled mess of HTML, CSS and JS (it's been 20 years and people still can't agree whether or not it's a good idea to produce HTML in JS!). In the past, you would use something like CICS or VB or Delphi, which is a consistent framework written for the purpose of building interactive business applications.

> we've moved from mainframe/single PC shared memory model in to distributed/cloud/networked computing in the last 20 years and the problems from that era are completely different to problems today

Not really. The tradeoffs are simple to understand for somebody who took a CS course 20, or even 40, years ago. And even back then, people writing applications didn't hand-roll their own SQRT; even back then, frameworks and libraries did that for you.


The design pattern book came out 22 years ago so they were certainly named as such.

My first Unix programming job had me first learn RCS and then we later switched to CVS. I used CVS personally for years after that until I switched to Subversion, then Mercurial, and now Git. What I'm doing isn't that different from when I used RCS. There's more steps because of the added complexity but it's roughly similar.


I was in high school and didn't know the first thing about programming 20 years ago. But it seems to me that "design patterns," SCM, and automated testing aren't the hard part of this discipline.


Sure they are. They are the "hard part" of what every 9-5 LOB dev has to do, and even in the more CS-heavy fields, in my experience the algorithms and the fancy CS are 20% of the work; the rest is implementing mundane details, bug fixing, testing, cooperating with team members, etc. And that's the stuff that has improved since the 90s.

Delivering reliable software is hard - but we've gotten a lot better at it in the last 20 years. If you don't believe me, boot up some old Windows 95/98 image, install some random software, and see for yourself.


We also do a lot of typing but nobody's arguing that that should be the primary yardstick by which we measure.


I wasn't in high school 20 years ago; I wasn't even ten. I dunno why I said that. Whoops.


"nobody these days rolls their own SQRT function to squeeze out a few extra CPU cycles"

Um, nobody did this 20 years ago, either.



Also an incredibly specialized case and in those kinds of extremely performance-intensive scenarios it might still make sense.


I see the opposite: rehashes of old ideas rebranded as the latest new hotness. NoSQL was basically what everyone was doing before relational databases came along. Clojure is a Lisp dialect running on the JVM. ORMs are great until you need some complex query, then it helps to know SQL. Algorithmic complexities are the same whether you were squeezing things into a tiny underpowered computer or you are dealing with terabytes of data.

Security was abysmal probably because there were nowhere near as many threats.

Some things change. None of it goes away.


I've been programming in functional languages for twelve years and recognizably modern C++ for fifteen at this point. And yet, I'm not a graybeard; I'm 28. But what I learned at that point still informs what I do today. And it's why I respect that those graybeards' old work still has impact today.


Back in 1996 I was using CVS at work. Before that it was RCS or SCCS (RCS was the first version control system I used). This was at Bell-Northern Research and Nortel, which was not a bastion of cutting edge techniques. ClearCase was pretty commonly used as well (it came out in 1992).

Automated testing wasn't science fiction. It was regularly done, just as system or integration tests. You'd see who did the commit that broke the build. This was before QA even got to do their work. This was something done across the company and I'm certain that it wasn't the only company that did this.


If the parent is sarcasm, please disregard my reply.

> The stuff you did in software 20 years ago has absolutely nothing to do with the stuff that's being done today.

Not even remotely true. Most of computer science is built on the past and either composed into something new upon a previous foundation, or a poor re-implementation of an already existing idea. A trendy favorite of many at the moment, Go lang is a good example of the former. For instance, CSP used in Go (and Clojure) is definitely not a new idea and has everything to do with what we do today. Another example is data structures - not sure what you're doing if you aren't using these, but the most common ones were invented long ago and most newer things I see are just refinements, additions, or behavioral changes on what existed before.

Speaking of Clojure, functional programming and Lisp are additional examples of old things "we" used to do in software 20+ years ago that are very much relevant today. You are making blanket statements about research, practices, and more here. It's true that Lisp was not widely used in many circles, but there are numerous reasons for that beyond what you mention, many of which were not good ones. Many people outside academia used a lot of old concepts successfully, or otherwise knew they were good but faced other limiting factors, most commonly hardware or market share.

I encourage anyone "inventing" anything new to simply dig through old research whether it is 60s Lisp weeny stuff as you describe or Plan9 or anything else before you think you invented something. 99% of the time someone has at least researched it, written about it, and maybe even implemented it better than you will. Timing is often everything, as are other factors like technological resources (hardware, people, market skills, etc.). Computer science, like many fields, excels at reinventing things poorly. It usually comes down to thinking before acting as very few of us are both smart enough and talented enough to produce something properly acting on impulses and limited knowledge alone.

> This is the drag&drop and copy-paste as an abstraction VB shit era.

And this has changed how? Look at what a lot of people use Google and Stackoverflow to do. Of course these tools are great, but so was the VB GUI designer. It's how you use it and who is using it. We have far more copy-paste style tools than ever before and far more languages that are suited to producing unmaintainable spider webs of awful.

> People were writing C and the major problems of the industry were how to fit shit in to x MB of ram and x CPU cycles.

What are you doing now? People are still doing this constantly, not only for embedded programming but for games, apps, everything. Just about every month there are multiple posts on HN about people patting themselves on the back for shaving memory and CPU cycles off their code on a new platform and/or running on something like commodity hardware. I know you want to say hardware wasn't as good, but it's also been my experience that fewer people know how to manage performance properly, and we still spend a lot of time cleaning up messes, only now they are on the gigabyte level instead of the byte level.

> Security was abysmal - you didn't even have process isolation on OS-es.

Not all OSs worked this way. True, the popular ones were often pretty bad. They are still not good and with more stuff accessible faster remotely, even more malicious actors, and bigger consequences, the situation is arguably worse.

> Unless you were at one of the cutting edge places that were actually innovating back then in terms of process and stuff there's likely nothing from that skill set that transfers on to the problems that we are dealing with today beyond the CS grad basics.

The same holds true today. If you only do one thing, you won't be good at programming except perhaps in that narrow field, if you are lucky. If anything, it's been my observation that far fewer people know what they are doing as the barriers to entry into the field have gone down. Not everyone who once upon a time wrote C, COBOL, Lisp, Fortran, or whatever else learned nothing along the way. Plenty of people pivot and do amazing things.

Overall, it sounds to me you are the one who lacks the depth and breadth of experience. You need to meet more people, keep an open mind, and do research before you make judgement. Moreover, you need to avoid blanket statements (appreciate the irony).


Like I've said, the stuff that hasn't changed much is the CS; the actual engineering part has nothing to do with what we do today. 20 years ago the net was nowhere near as ubiquitous, hardware was incomparably slower, and the bottlenecks were different - the focus shifted from single-computer/mainframe/shared-memory models to distributed computing.

When I say copy-paste in Visual Basic, I'm talking about professional programmers who couldn't even abstract stuff into functions and would copy-paste shit all over the code - not copy-pasting from other sources.

As much as people like to bitch about our industry, we have heavily raised engineering standards: if you don't use source control these days you get laughed at, if you don't use automated testing you're an amateur, even junior programmers know about patterns, and nobody copy-pastes code all over their code base - stuff like this was the exception back then, not the norm as it is now. You can say MVC was invented in the 80s or whatever, but up until maybe 10 years ago the internet was full of tutorials/guides on how to do web coding by interpolating PHP inside your HTML, using string concatenation to build SQL queries, and such nonsense. Sure, the top 10% never did that, but the vast majority of programmers did, and the broader industry has improved significantly since then, in no small part thanks to the frameworks that constrain the bad developers onto the "good path".


If you don't use source control these days, you're below average.

If you don't use automated testing, you are probably average.

Plenty of junior programmers know nothing about patterns except how to cargo-cult them.

Plenty of people copy-paste code everywhere.

I'm curious how long you've been doing this stuff, because this sounds like neophilia.


I'll give you that the tooling is quite a bit better these days, but that's it. Practices are as horrid as ever. Cut-pasting from Stack Overflow seems like something every new grad knows how to do, and is everywhere. Not sure about your focus on design patterns--I'd often call their use a negative, not a positive. When faced with a problem, a good developer asks, "How can I solve this?" A bad developer asks, "I must find some named design pattern to mash this problem into."

At its core, 90% of professional programming is:

1. Plumbing: Routing data from this component to that one through this API

2. Processing: Algorithms, most of which have been published for decades

3. Formatting: Ingesting data in format A and displaying it for the user in format B.

This is true today, was true when I started, and was true 20 years before that.


"When I say copy paste in visual basic I'm talking about professional programmers that couldn't even abstract stuff in to functions but would copy paste shit all over the code - not copy-pasting from other sources."

The inability to use abstractions properly has long been a sign of inexperience or incompetence. I'd really like to see some evidence that this was any more prevalent in 1996 than it is in 2016.

There's definitely more of a concern these days in the general programming world about "best practices". But those best practices weren't invented yesterday. They're usually incremental changes of older practices, and the results of bitter, painful experience of doing things the "wrong" way. Exactly the kind of experience that senior engineers have that junior engineers lack. It's senior engineers who create those best practices in the first place.


We don't really have many true standards, just pockets of consensus on best practices and collections of so-called standards that are quite often ignored. Thanks to better communication, I think it's safe to agree that these practices are more widespread. As a result, yes, quality for certain groups has definitely increased.

I think you underestimate the fact that a large number of people actually do not follow these practices, even those that know better. Further, it's hard to make statements about software quality, especially with so many moving variables like the increase in the total amount of people in the industry and lower barriers to entry. Regarding best practices, I am sure many people have stories about asking questions in interviews about a company's software development, only to be told, "Well we'd love to do X, but we just don't have the resources or time, so we do Y." Even people who know better do otherwise in addition to those that are ignorant.

I think at a conceptual level, many things have not changed. There have always been sets of new ideas and practices that people adopted, many times blindly. These are of course a mixed bag, but many so-called "best practices" are often abandoned later - fads are the norm, and while that's sometimes OK, it does not always result in "better" as much as "different." I think some people fail to realize that in the moment, it's all happened before, and all will happen again. We both learn from our past mistakes with new trends and repeat those mistakes.

MVC and frameworks are actually interesting examples. I don't want to get too into individual tech discussions, but I've personally seen people handcuff themselves trying to religiously implement things such as MVC and actually failing because of a pattern like that since they were distracted from solving the problems properly. Adopting a pattern or framework does not automatically produce better code and can even lead you the opposite way. It's hard to say if things like this are a win for all so much as if they are simply a good fit for certain problems, and therein lies the challenge and one that hasn't become easier. The golden hammer or wrong tool will forever be problems.

Frameworks often are oversold and are not standards. In fact I've found that for many projects I have actually had to abandon a particular framework long-term because it was just getting in the way for various reasons - performance, maintainability, velocity of change, conceptual failings, zealotry, etc. Frameworks can derail "bad" developers as you call them just as much as good ones. Frameworks often explode if you don't do things the "one true way" and "bad" developers can be argued to be prone to that just as much as everyone else, and simply adhering to these rules still doesn't always produce better software. There have always been frameworks in one shape or form, but they didn't always have names, sometimes they were just the language or tool imposing its constraints, sometimes for the best, sometimes the worst. As a counterpoint, I've actually found working in various languages like Clojure, Lisp, and Racket that tend to value composition of tools and functions over frameworks more productive, however I would never apply this to all projects and situations.

I see what you're getting at overall, and on some level I agree. I was around in the 80s, and I don't want to go back to that. At the same time, for all the old problems solved, I see new ones that arise, along with other things that were never problems now coming to haunt us. It sounds like your experience is very compartmentalized to web development and you're projecting those views onto both the past and present. People are people, and no time, technologies, or tools will fix that; they just deal you different cards.


Yea, I think I agree with what you said, especially that the increased communication helped (I remember when I started programming I didn't have access to the internet, and once I actually got it, it was a complete game changer in terms of learning).

And don't get me wrong, I don't think what we have today is anywhere near perfect or even good - frameworks are often more trouble than they're worth - but at least frameworks showed people that <?php mysql_query("select * from foo where id='" . $_GET["id"] . "'") ?> isn't the only way to write code. I do not want to go back to those days.
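
For contrast with that snippet, here is a rough sketch of the parameterized alternative that frameworks pushed people toward. It is written against SQLite's C API rather than the PHP/MySQL of the snippet, purely as an illustration; the table, column, and function names are made up. The unsafe version splices user input into the SQL text, the safer one binds it as data:

    /* Illustration only: string-concatenated SQL vs. a prepared statement.
       Build with: cc example.c -lsqlite3 */
    #include <stdio.h>
    #include <sqlite3.h>

    /* Dangerous pattern: user input becomes part of the SQL text. */
    static int lookup_unsafe(sqlite3 *db, const char *user_id)
    {
        char sql[256];
        snprintf(sql, sizeof(sql),
                 "SELECT name FROM foo WHERE id = '%s'", user_id); /* injectable */
        return sqlite3_exec(db, sql, NULL, NULL, NULL);
    }

    /* Safer pattern: the statement is compiled once, the input is bound as data. */
    static int lookup_safe(sqlite3 *db, const char *user_id)
    {
        sqlite3_stmt *stmt = NULL;
        int rc = sqlite3_prepare_v2(db,
                 "SELECT name FROM foo WHERE id = ?", -1, &stmt, NULL);
        if (rc != SQLITE_OK) return rc;

        sqlite3_bind_text(stmt, 1, user_id, -1, SQLITE_TRANSIENT);
        while ((rc = sqlite3_step(stmt)) == SQLITE_ROW)
            printf("%s\n", (const char *)sqlite3_column_text(stmt, 0));

        sqlite3_finalize(stmt);
        return rc == SQLITE_DONE ? SQLITE_OK : rc;
    }

    int main(void)
    {
        sqlite3 *db = NULL;
        if (sqlite3_open(":memory:", &db) != SQLITE_OK) return 1;
        sqlite3_exec(db, "CREATE TABLE foo (id TEXT, name TEXT);"
                         "INSERT INTO foo VALUES ('42', 'widget');",
                     NULL, NULL, NULL);

        lookup_unsafe(db, "42"); /* works here, but a crafted id can rewrite the query */
        lookup_safe(db, "42");   /* the input can never change the shape of the query */

        sqlite3_close(db);
        return 0;
    }

The design point is the same one the comment makes about frameworks: once the query text and the user data travel separately, the "good path" is the default rather than something each developer has to remember.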


You would actually be surprised how much things are the same. You just can't see it because you didn't experience it, and you are confusing transferable knowledge with tool knowledge. The latter is easy; anybody can learn a new language/tool. Using a tool properly and efficiently is the hard part.


>So are we law and medicine or are we tomato pickers?

There's engineering and engineering. Making apps using electron.js and writing high-frequency trading software isn't the same gig. I think much of the problem described by the parent comment falls upon those programmers doing the former.


> If you were having cardiac surgery, would you prefer your surgeon to be a year out of med school having done the operation oh 3 or 4 times, or the proverbial graybeard who has done it a thousand times?

I can make ridiculous scenarios too. If the cardiac surgery is using the latest technology, which the younger surgeon had used all 4 times successfully, and the older surgeon had used once and killed the patient?

This idea that being around since COBOL = smarter/better/"we GTD while they sit around doing nothing for 16 hours" is idiotic and not at all based in reality.


>If you were having cardiac surgery, would you prefer your surgeon to be a year out of med school having done the operation oh 3 or 4 times, or the proverbial graybeard who has done it a thousand times?

http://www.npr.org/sections/health-shots/2014/12/22/37250839...


> If you were having cardiac surgery, would you prefer your surgeon to be a year out of med school having done the operation oh 3 or 4 times, or the proverbial graybeard who has done it a thousand times?

Rationally one would care about the outcomes, and the number of times would merely make us more certain about the outcome.

Relative to the two scenarios you showed, programming is usually more like tomato picking—nobody is going to die or be injured because of a bug or failure to deliver something.


Perhaps most of the time that's true, I suspect that a lot of health care is the same. There have been plenty of instances of software bugs killing or injuring people:

https://en.wikipedia.org/wiki/List_of_software_bugs


"nobody is going to die or be injured because of a bug or failure to deliver something"

You should read through RISKS Digest.[1]

[1] - https://en.wikipedia.org/wiki/Risks_digest


self driving cars.


If you were building a team to win the Super Bowl, would you pick the hot shot 23 year old fresh grad from college, or the 50 year old experienced Hall of Famer?


A 50 year old Hall of Famer? It's rare for a QB to make it to 40 years old in the NFL. Peyton Manning retired at 39 years old after he won the Super Bowl last year, and he was considered ancient and past his prime despite that achievement.

Furthermore, I just looked it up, and the oldest player to ever have played in the league was George Blanda, who retired at age 48. http://www.profootballhof.com/football-history/40-and-over-c...


As a cardiologist, I would prefer to refer my patients to the cardiac surgeon who is the most skilled, regardless of their time out of training. Similarly, I don't think that a programmer's age tells me much.


Engineering might be the former. Programming and software "engineering" is definitely the latter.


I came to this realization a few years ago. I found that while my judgment was still improving, my output wasn't and my ability to think through problems wasn't where it used to be, especially when it came to the tiny details.

So I moved into management. It allowed me to leverage my strengths while putting me in a situation where my value wasn't determined by my raw output. It sucks going to all those meetings, but it honestly made me enjoy the programming I did do considerably more than before. The programming I did for the team was all prototype work with new technologies that either I or the team felt could help us out... short 2-3 hour projects whenever I could fit them into my schedule. But everything was new, and I wasn't implementing things I'd done a ton of times before.

In the end, I don't think this is a satisfying answer. There just aren't enough managerial jobs for all the old programmers and a lot of them can't handle the meeting workload of that kind of position. But I still suggest it on an individual basis because it's a chance to put all that experience to work programming at a higher level. Instead of linked lists you're dealing with fresh-out-of-college devs who know their big-Os by heart and can crank out code at a frightening pace. But you still need someone to ensure the overall design doesn't suck. To read over what they're producing since there will always be some crazy mis-step that they make from time to time. You still need someone who has seen a project evolve over multiple years to know which decisions will become problematic down the road. The young guys can't do that. And while it's been an adjustment to learn to find satisfaction without typing out the code myself, I've learned to enjoy the launches (both product and career) in a way that's more fulfilling than my time as a full-time coder was.


How does a programmer go about moving into management?


Ask. I had a conversation with the VP of Eng where I told him that's what I wanted, asked if it was a possibility and what I needed to do to make it happen. He told me what he thought I needed to learn and allowed me to take a couple of hours each week to read, meet with other managers and otherwise prepare. A few months later, when a position became open, he gave me a trial run which converted to perm after a successful month.

Most engineering leadership is very reasonable with requests like this provided you are a very strong performer with the respect of the engineering organization. They realize that promotions from within are morale boosters and developing a reputation for nurturing talent is an amazing recruiting tool.


What's your answer for the general case? Not necessarily what you know, just areas you suspect.


This is one thing that really worries me. I'm in my early 30s, and I'm perfectly content working a mid-level position. I've seen people my age pass me by several times, and I'm fine with that. I'm not good enough to keep up with the really good engineers, and it's probably going to be a long time before I have 'Senior' in my title, if ever.

When I'm in my 50s, I'm just not going to be one of those awesome elder coders. My boss is one of those graybeards, and he's awesome, but I'll never be him. I also have less than zero desire to ever work in management (and even if I wanted to do that, I don't have the people skills for it). So I'm seriously worried that I'm going to be unemployable in 10-20 years because companies will all prefer to hire people in their 20s with equivalent skills to mine out of ageism.


In twenty years you'll be in your 70's. Most people in their 70's are unemployable, no matter the field.

If you don't have a retirement plan, then you've got to do some extreme saving and lifestyle adjustment (or you'll face an even more extreme lifestyle adjustment once you retire).

If you do have a retirement plan, you have nothing to worry about.


No, they're in their 30s. They're concerned about their 50s.


Whoops, badly misread that.


If you haven't adapted in 20 years, is it really ageism?


I think "older" programmers are more likely to stick around longer. If your looking for someone to fill space so you can brag about the hours your team is putting in and party with while you blow threw a few more of some investors millions go with the young guy. Since, the companies' going to blow up in six month anyway why worry about how long they stay in a job.


Honestly I think that might be in the back of these people's heads a bit.

"Woo! We're doing what we want, when we want, how we want, this is how work should be! What, you think it would be better if we did things this way because reasons and experience? No, this is how I always wanted to work, fast and loose, I'm the one with the CEO title so this is what we'll do!

6 months later I'm so sorry guys. I have to let you all go, then come up with another idea I can get investors to pump money into so I can hop on this ride all over again and learn absolutely nothing.

This isn't true of all startups, but I've seen elements of it at several of the ones I have experience with.


I am with you on this - it is what causes me a lot of anxiety, but it's also what drives me to at least know about "all the shiny things" that the new breed of engineers basically comes equipped with as standard.


Ignoring the Very Smart Capitalization, I haven't seen any correlation between age and {skill, productivity, intelligence, results}.

I've worked with very smart, cutting-edge literal graybeards who could code circles around me, and guys who have been at the same company in the same entry-level or mid-level programming job for 25 years yelling in a meeting about "this damn web shit."


Right - I took the comment to be about how working sane hours yields better results than working twice as long... and generally older folks are less inclined to work the 2x hours.


Even though you personally haven't observed any correlation, wouldn't you think that experienced people can make better judgement calls when designing software? Because of their experience (age), that is. Personally, I feel like I'm 10x the developer I was 20 years ago. And that is 100% because of the experience and the years. One learns by doing, you know.


>I wouldn't worry. We can actually Get Stuff Done, which at the end of the day is what really counts.

This is a myth. It doesn't count for that much. Programmer productivity cannot be effectively measured even by other programmers and certainly can't be measured by non-technical managers.

The fact that you are doing something can be easily verified. The fact that you are doing it effectively at an appropriate speed and quality cannot.

There's a lot more trust, salesmanship, snake oil and bullshit surrounding the development of software than most of us would like to admit.


If you are correctly following an Agile process such as Scrum, the productivity of each team member can indeed be quantified. You may think it's pure bullshit numbers, but the ratio of story points to hours worked tends to stabilize over time and give management an idea of how productive you really are.


Absolute bullshit.

* A team that is being judged on scrum velocity can simply keep inflating story points to make it look like they're going faster.

* A team that is optimistic has a lower velocity. A team that is pessimistic has a higher velocity.

* If a story is 8 points, that gives no clue as to whether it ought to have been a 2-pointer (if the technical debt accrued by the team were not so high).

In practice I've found that it doesn't stabilize either. Of course that automatically means you just "weren't doing Scrum properly".


>If you are correctly following an Agile process such as Scrum, the productivity of each team member can indeed be quantified.

No, no, no, a thousand times no. Tracking story points by developer is the diametric opposite of "correctly following an Agile process". It gives team members, individually and collectively, an incentive to game the system, thereby corrupting your metrics and estimates. You are literally begging your team members to lie to you.

Any competent lead should be able to identify who is more and less productive without using what is essentially micromanagement by story point. Story points should only ever be tracked by team for overall velocity measurements.


> The fact that you are doing it effectively at an appropriate speed and quality cannot.

Well it can, but not in a way we necessarily like - it's called the marketplace.


It's called a market for lemons.


If "it" is "producing lemons" then yes, you would need a market for lemons in order to guage the production process' quality. Well done, have a cookie. Or a lemon.


I've seen plenty of old developers who fancy themselves experienced who are not efficient too.


I agree more experienced people usually get stuff done and it's generally been my experience. Of course there are always exceptions. The problem I see is that often it is not the output itself that wins, rather the perception around the output.

I've seen many talentless people rise through the ranks of companies or start their own just because they were good at bs'ing, marketing themselves, had a certain personality type, and many other things that had little to do with programming directly. Of course some of these things are important and valuable, but many of them are toxic, especially in a software development context.

Indeed, we've all observed countless startups that become industry darlings without ever producing real revenue, let alone profits. We can argue about what their output actually is, but often it's smoke and mirrors at best, fraud at worst. This all happens at the expense of progress, or of another company that actually does produce value but fails at playing the game properly (the same applies to people). But these startups "succeed," at least temporarily, because they create the perception that one day, with just many more millions and many more articles, they will be great and rule the world. Whether it catches up with them or not is something else. Perception is the way so many things in life work, and it is particularly important to many companies and to the stock market.

I think there are just so many examples where the belief that if you work hard and do good work you'll be rewarded does not hold. I personally value these things, but many people do not. It's a complicated topic I could ramble on about forever, but I think we've all been in these situations and had things happen to us, like receiving praise for arguably our worst work.



