
"I suspect that most discrimination that happens against older programmers is not, in fact, age discrimination, but is, in fact, ‘wisdom discrimination’."



As long as hiring and technology decisions remain largely hype-driven, having someone around who says "we don't need k8s, hadoop, machine learning, and AWS with some big-ass Terraform-driven constellation of supporting services for our expected data, workload, and use case; look, give me one middle-weight server, or one heavyish VM somewhere, and I can show you how to do all this with awk, sed, cron, and maybe a few lines of Perl or Python. We can have it up fast and anyone who can read man pages and a README can admin it, and anyway it's so simple and the workload static enough that once some initial kinks are worked out it'd probably run in a closet without issue for so long you'd forget it was in there—so, longer than this project is likely to be alive, anyway" isn't really something you want unless you plan to ignore them. For their fellow developers, that's résumé poison; you need "real world" experience with all this crap for the next job hunt. For the product manager, owner, and sales, you can't baffle stakeholders or investors or prospects with fancy bullshit when there isn't any fancy bullshit.
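To make that concrete, here is the sort of thing I mean, as a made-up sketch (the log location, field layout, and report path are all invented for illustration): one cron entry plus a few lines of Python covers a lot of what gets pitched as a "data pipeline".

    # Hypothetical crontab entry: roll up yesterday's traffic at 02:00 every night.
    #   0 2 * * *  /usr/bin/python3 /opt/reports/daily_rollup.py
    #
    # daily_rollup.py: count requests per path from a plain-text access log.
    import collections
    import datetime
    import pathlib

    LOG_DIR = pathlib.Path("/var/log/myapp")      # invented paths
    OUT_DIR = pathlib.Path("/var/reports/myapp")

    def main() -> None:
        day = datetime.date.today() - datetime.timedelta(days=1)
        counts = collections.Counter()
        with open(LOG_DIR / f"access-{day}.log") as log:
            for line in log:
                fields = line.split()
                if len(fields) > 6:               # assume field 7 holds the request path
                    counts[fields[6]] += 1
        OUT_DIR.mkdir(parents=True, exist_ok=True)
        with open(OUT_DIR / f"rollup-{day}.tsv", "w") as out:
            for path, n in counts.most_common():
                out.write(f"{path}\t{n}\n")

    if __name__ == "__main__":
        main()

Nothing about that needs a cluster, a scheduler, or a managed service until the data stops fitting on one machine.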


It’s always fascinating to me how people can get to this point where they eschew new technologies just because there is an older way that works fine. Yes, having a tried and true pipeline is important, but it shouldn’t be surprising that companies are made of people who want their company to push the envelope to further evolve the systems. Someone was the first person to use Python for a production project, and that evolution has to happen somewhere, with real applications to test them.


It’s always fascinating to me how some people churn from tech to tech, burning most of their time reading docs and fighting with configurations, just because a new technology might be a bit better than the existing, perfectly adequate system.

It’s not that things can’t be improved, just that there is so very much low-hanging fruit that has never been implemented at all in a usable manner. We should focus on the huge gains to be made there before the tiny incremental gains to be made by endlessly iterating.


There was a blog post on HN or reddit (I forget which) talking about how they were moving the tech behind their blog yet again. It went through how they did static when it was popular, reactJS, node, and a few others.

I read it thinking to myself "my god, it's just a fucking blog engine, why are you spending so much time rewriting it in shit, realizing what you wrote it in has downsides, and thinking the new shiny wouldn't?".

It's like some people actually believe there's a tech silver bullet just over the horizon, rather than a new way of doing things that eases some burdens and creates others.


True, it is an amazing waste of time. But then, so are all human endeavors. :-)


You really missed the point. It's not "there is an older way that works fine". It's that there is a simpler way that can be built in less time, takes fewer developers to maintain, will have fewer bugs, performs better, and is more reliable, because it's simpler than all the crap that people want to use. But people will insist on using all the trendy crap because it's trendy, FAANG is using it, it looks good on a resume, etc, etc.


If all those were actually the advantages of established technologies, you would have a point. But anything that does something useful is inherently complex. To someone who's been programming for 25 years, sure, sed/Perl/Makefiles are straightforward and well-established. But to everyone else, to build or modify them, first you have to go back and learn Perl and sed, which are both riddled with layers of historical baggage, gotchas, and divergent dialects.

You can't actually get all of those things: speed, team size, low defect rate, high performance, high reliability, simplicity, as they are at odds with each other. Generally, the rule is "good, cheap, fast: pick two."


The post you originally replied to listed specific technologies that purely add complexity when introduced to a project. Hadoop isn't replacing something equally complex, it's simply adding complexity. Now if you simply can't make do with anything else, and you're at the point of rolling your own distributed filesystem and job scheduler and zookeeper and all the rest of it, then sure, try Hadoop, but that's not what we're talking about. Note that I can pick on Hadoop because the hype cycle has peaked there, but this applies to plenty of newer techs where it hasn't.

You literally can get more of all the good things in my list by removing unnecessary complexity from a project. And yes, there really are some technologies that, most of the time you see it being used, the entire project would literally be better off with literally nothing in place of it. And yes, people still prefer to use the trendy thing.


Fair enough, if replacing that complex of a stack with Python/Makefile/cron is even possible, then it's over-complicated and you can refactor the complexity away. Unless of course you will soon need to implement a feature that requires bringing back in that whole stack. But I suppose this is why hindsight is a valuable asset when you get a project in maintenance mode, as you can see where the peak complexity of the project actually ended up being.


I understand your point, but a minor retort: you get the same sort of "gotchas", historical choices, baggage, etc. from new-ish code that does "too much" and can be considered black-box "magic" without delving into it, reading docs and tutorials, and sometimes just plain having experience with it.

I think a potential overarching point that started this thread: Tech should evolve, but it should also stay steady and simple to use. Too much of the "new" stuff is not just a good natural evolution of people solving problems, it's people adding layers of complexity and fluff where there need be none.


> Too much of the "new" stuff is not just a good natural evolution of people solving problems

I guess it depends on what you're lumping in with "new stuff". I've experienced the opposite quite a lot: refusal to acknowledge that a new constraint or condition has modified the parameters of what a solution is trying to solve, issues dismissed as one-off events or something to monitor, then 6 months later the once-isolated problem becoming the new norm, and monkey patches being required because of that refusal to acknowledge that the environment has become more complex.


Unless you plan to work at the same job until you retire, you gotta think about your resume. You don't want to be the guy looking for jobs with Visual Basic and Excel macros experience.


Your job as a developer is not to improve your CV; you need to solve a problem, and you should use the best tool for that job, not what looks cool on your CV. Use cool shit on your own time instead of forcing the latest cool stuff onto your work project, leaving a year later, and letting the rest deal with the fallout.


I think your resume looks really good if you explain that you solved complex problems with Visual Basic because it was the right tool for the job given the circumstances. If it’s the only thing you know, maybe not.


Companies push the envelope via their product they are selling or marketing. If they are not selling consulting services then, in a vacuum, using e.g. nodejs instead of Java is not pushing the envelope in any meaningful way. If using nodejs actually lets you push the envelope in your actual product then yeah that's good.

But in my experience companies are full of people thinking there are lots of silver bullets out there. The silver bullet is the new way to do it. The wrong way is the old way to do it.

Related, lots of people jump straight from being 100% naive about performance and architecture to thinking FAANG is the only way to go. They don't stop to think that there are a multitude of midpoints between those two options.

If your business model is to be Facebook or bust it might make sense. But I see a lot of pretty boring and simple businesses resorting to really over-architected solutions. Because the developers are so inexperienced that they think you need eventually consistent distributed microservice soup to serve 10,000 requests per hour.
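For a sense of scale (my own back-of-the-envelope, not a figure from anyone above): 10,000 requests per hour ÷ 3,600 seconds per hour is under 3 requests per second, which a single modest server running a boring framework handles with room to spare.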


Software engineering would have been a higher quality profession if Python had never gotten into any production project. Now we need to patch over the mess by adding mypy, but we could have used a language with explicit types right from the start and it would have saved a lot of bugs in the category of 'every typo is a new variable'.
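To illustrate that bug category with a made-up example (mine, not from any real codebase): a misspelled assignment in Python just binds a new name and runs fine, while a misspelled read only blows up at runtime on the path that hits it; a static checker such as mypy at least catches the read-side version.

    # Hypothetical sketch of the "every typo is a new variable" bug class.
    def apply_discount(price: float, discount: float) -> float:
        total = price
        if discount > 0:
            totl = price - discount  # typo: silently binds a new variable,
                                     # 'total' is never updated, nothing errors
        return total                 # callers quietly get the undiscounted price

    # The read-side twin of the same typo is at least statically detectable:
    #
    #     return totl   # mypy: error: Name "totl" is not defined
    #
    # whereas plain Python would only raise NameError at runtime, and only on
    # the code path that actually executes the typo.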

This so-called evolution is basically creating messes everywhere and code bases that only use one language and one build system are the most understandable ones. Also, once upon a time a programming language was a thing in which you could write anything of your fancy. If that is the case 'multiple programming languages' is itself somewhat of a weird thing.


Software engineering would have been a higher quality profession if we were still punching machine code into cards! Now we need to patch over the mess, deal with all this operating system nonsense, and not to mention this entire internet hype.


I once had a coworker who, when working to pointlessly extend the lifespan of an obsolete external tool, wrote a translation layer. Then he stopped and re-implemented it in Rust. This added nothing to the tool except time to the release schedule.

I don't eschew new technologies. But increasingly I need to see an expected value to justify the uptick in cost.


With wisdom you come to the conclusions you have. The value is somewhere in between. You shouldn't mindlessly reimplement working tools, but you shouldn't keep patching old tools either. Somewhere in between, you project the current and future requirements. Sometimes you have the old tool on life support, sometimes you restructure parts of it, sometimes you reimplement it from scratch. You will often make a wrong decision and that is okay - you just learn that minimal work is often the best way forward unless you positively know otherwise.


Choosing technologies with higher long-term maintainability (support, operations, ongoing development) potential may be worth it in absence of strong software development culture and extensive engineering resources—which is often the case with customers of small consulting businesses like mine.

This does not mean eschewing new technologies, necessarily. Still, it is somewhat like investing in stocks: there may be signs allowing one to tell whether a new trend or stack is likely to go out of fashion soon (which may lead to increased maintenance costs), but it is often safe to assume that the older the tech, the longer it will remain in active use and easy to hire for, making it a sensible recommendation.

Goes without saying that it's not the only metric, but I find it important, and not applying it at all a violation of the customer's trust.

Software-driven companies with stronger culture (including but not limited to FAANG), on the other hand, have legitimate reasons and resources required to prefer the riskier on average newer stacks—not coincidentally, their teams are frequently at the forefront of developing such tech.


> the older the tech, the longer it will remain in active use

You may already be aware, but this effect has a name:

https://en.wikipedia.org/wiki/Lindy_effect


Yes, I definitely agree there are other reasons to discount the new tech (maintainability, maturity, risk, etc.).


Sure, lots of newer tech has legitimate uses and reasons to choose it over other options in some situations, but there are a hell of a lot of projects out there that are way more complex than they need to be, often at UX cost in addition to development cost, because everyone making the decisions has incentives that don't align very well with UX or controlling costs. For every legitimately good use of a hyped new software technology, there are probably (I'm being conservative) three unnecessary ones. Sometimes these can compound into some real monsters of over- and mis-engineering.


Well, maybe if salary compression and inversion didn't exist, and companies paid their existing developers based on market value instead of HR-enforced policies of "no raises more than slightly above cost of living", incentives would be better aligned.

But as long as statistically the best way to make more money is by job hopping every two or three years, developers are always going to look out for their own resume and best interest.

No job is permanent. It is irresponsible not to be focused on keeping yourself marketable.


The older way only stopped working fine for an extremely limited audience. The newer stuff is not without merit, it does have a purpose, but many more people wish they had Netflix-scale problems than actually have them.


I've been fiddling with the Kaldi speech recognition toolkit over the past 6 weeks or so, which provides "recipes" for creating speech recognition frameworks (basically a combination of preprocessing/featurizing/training steps for various speech recognition models: HMM/GMM, neural nets, speaker adaptive training, etc.).

These recipes are a bunch of bash scripts to glue the toolkit binaries together, all connected via Unix pipes.

With all respect to the authors, I feel that Python would have been a much better choice for this task. Bash is a verbose scripting language to begin with, not to mention much more difficult to modularize. I find it takes me at least twice as long to decipher bash scripts compared with their pythonic equivalent.
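For comparison, here's the rough shape I mean, as a made-up sketch rather than actual Kaldi code (the binary names, flags, and paths are all invented): the classic "tool A | tool B" recipe step written with Python's subprocess, where it's easier, for me anyway, to pull stages out into functions and reuse them.

    # Invented example, not a real Kaldi recipe: run a feature-extraction binary
    # and pipe its output into a training binary, like `a | b` in a shell script.
    import subprocess

    def run_stage(wav_list: str, model_out: str) -> None:
        featurize = subprocess.Popen(
            ["compute-features", "--config=conf/mfcc.conf", wav_list],
            stdout=subprocess.PIPE,
        )
        subprocess.run(
            ["train-model", "--output", model_out],
            stdin=featurize.stdout,
            check=True,
        )
        featurize.stdout.close()
        if featurize.wait() != 0:
            raise RuntimeError("feature extraction failed")

    if __name__ == "__main__":
        run_stage("data/train/wav.scp", "exp/mono/model.mdl")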

I think it's a good example of "old, tried and true" not necessarily being "best".


Incidentally this frustration with Kaldi is exactly why we made the Persephone ASR project in Python: https://github.com/persephone-tools


Bash and Python are effectively the same age. Python is 30 years old and Bash is 31.


True, but I don't think Python really eclipsed Bash in terms of popularity (or Perl, for that matter) until much, much later. I've always thought of Bash as the "traditional" approach to gluing binaries/pipes together.

Bash has been a staple of most Linux distributions for as long as I can remember, but Python only seemed to start being packaged as default in the mid-2000s (or at least, that's the impression I have. My memory could be failing me).


Bash is largely a reimplementation of ksh, which is just an extension of sh.

Pretty much anything in bash would be written much the same way in sh, which dates from, when, 1972?

So, no, bash and Python are not the same age, in practice.


Yes, indeed. But not all problems need or benefit from a modern solution.


I find this argument to be tantamount to saying English is fine, but nobody really needs it when there is French. Yes, French is perhaps older and much of English draws from French, and yes, English is more complex than it needs to be. But in the end, pretty much anything you need to say, you can say it in both French and English.

Most problems don't need a modern solution, but that isn't an argument against modern solutions for those problems. In the end, it comes down to, which would the group of people writing the code rather have, an old-style solution or a new-style solution? If both are equivalent, there's no reason not to choose the option that looks like it will be the future.


Well, the "older tech" is ridiculously cheaper.


I work mostly in the "webshit" and mobile industry. The last time, in a work context, that I saw a site (in this case, from a vendor) and went "goddamn, for what it's doing, that is fast, this is really nice to use!" it turns out it was... PHP running on some normal-ass single instance VM or server somewhere, serving mostly just HTML.


It's really a spectacular outlier how well-optimized PHP is contrasted to how poorly designed the actual language is.


Thank Facebook for that. Likely, more specifically, Andrei Alexandrescu.


Not universally, no. For instance, C programmers are harder to find than Python coders, and the farther back you go the harder it is to find good people (COBOL, Fortran). The newer the tech, the more people there are riding in on the hype train to choose from.


There have been multiple submissions here on HN about how somebody figured out how to replace Hadoop with a well-crafted bash script that calls several UNIX tools, and achieved a 235x speed improvement.

I think the posters you reply to meant exactly such cases.

For the record, I have entirely replaced `grep` with `rg` (ripgrep) in all my workflows. `rg` is newer, written in Rust, and utilises all CPU cores -- unlike `grep`. So I am not against objective and demonstrable progress.

But as another poster replied to you, many of us are against wasting huge amounts of time and effort on very marginal improvements just because the tech is new.

Tell me your tech offers 2x or more speed (or fewer defects) and I'll use it. But tell me "hey, let's use Kubernetes because.. well.. well... a lot of people use it", and I'll pay zero attention.


Eschewing newer technologies in general is clearly a terrible idea but it's not such a bad idea to eschew hyped technologies (particularly corporate driven hype).

Python is an example of a technology that grew relatively slowly and without a hype cycle.

IME hype in this industry correlates pretty strongly with crap.


That is true, but the trick is how to separate the hype from the gold, which is almost as much superstition as science. I would disagree that Python didn't have a hype phase; there were comics like this[1] after all.

[1] https://xkcd.com/353/


If you look at the Google trends for ruby on rails (where the hype is clear) it looks markedly different to python (steady trend).

I remember that comic - I think it coincided not with anything special happening in the Python world; it was just when Randall Munroe learned Python.


Although I would say 2007 was close to the peak of Python bleeding "mindshare" due to the hype cycle of Ruby and Rails.


Very little money was ever spent hyping Python.

Python benefited enormously by contrast to Perl, despite being slower.


That's a function of when Python was popularized, where there was very little major corporate support for new open source libraries / frameworks. Ruby and Rails had their hype cycle around the same time, more or less on the back of a <50 employee company.

Nowadays, open source development has centralized far more on large public companies, so there's a lot more marketing effort being put on all the languages / frameworks / libraries out there.


I think there should be a middle ground. On the one hand, what you are saying is correct for many projects/companies and they might be needlessly burning money on tech. On the other hand, these solutions (devops, CI/CD, version control, etc...) were made for a reason. At a certain scale, the old ways break down badly.

Would you suggest that version control (git) is overkill? You can still use FTP to upload your website.


Excellent point. Ironically, FAANGs, and especially Amazon, are quite conservative in adopting technology from other sources and are not hype-driven.


Or just Heroku with a Docker container. Use any language with an HTTP stack. Admin qualifications include: can slide a bar left and right based on demand for the service.


Honestly, for smaller teams just starting out, Docker itself might be overkill. For most popular languages, just `heroku create` is more than enough to get started.



