> Software development, as it has been done for decades, is over.
I'm pretty sure the way I was doing things in 2005 was completely different compared to 2015. Same for 2015 and 2025. I'm not old enough to know how they were doing things in 1995, but I'm pretty sure they were doing things very differently compared to 2005.
For sure, we are going through some big changes, but there is no "as it has been done for decades".
I don't think things have changed that much in the time I've been doing it (roughly 20 years). Tools have evolved and new things have been added, but the core workflow of a developer has more or less stayed the same.
I also wonder what those people have been doing all this time... I also have been mostly working as a developer for about 20 years and I don't think much has changed at all.
I also don't feel less productive or lacking in anything compared to the newer developers I know (including some LLM users) so I don't think I am obsolete either.
At some point I could straight-up call functions from the Visual Studio debugger Watch window instead of editing and recompiling. That was pretty sick.
Yes I know, Lisp could do this the whole time. Feel free to offer me a Lisp job drive-by Lisp person.
Isn't there a whole ton of memes about the increase in complexity, full-stack everything, and having to take on devops? That hardly sounds like nothing has changed at all.
I don't think that's true, at least for everywhere I've worked.
Agile has completely changed things, for better or for worse.
Being a SWE today is nothing like 30 years ago, for me. I much preferred the earlier days as well, as it felt far more engineered and considered as opposed to much of the MVP 'productivity' of today.
MVP is not necessarily opposed to engineered and considered. It's just that many people who throw that term around have little regard for engineering, which they hide behind buzzwords like "agile".
Yeah, I remember being amazed at the immediate incremental compilation on save in Visual Age for Java many years ago. Today's neovim users have features that even the most advanced IDEs didn't have back then.
I think a lot of people in the industry forget just how much change has come from 30 years of incremental progress.
Ads would lower the quality of the training data, so ads via RAG are more likely: pay to get your product's INSTALLME.md ranked under some specific semantic vectors.
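To make that concrete, here's a toy sketch of how RAG-style retrieval ranks documents by embedding similarity to a query, so whoever gets their INSTALLME.md embedded "close" to the right queries wins the slot. The embeddings and document names are made up; this is not any vendor's actual pipeline.

```python
# Toy sketch: RAG retrieval ranks documents by cosine similarity of embeddings.
# Embeddings and document names below are hypothetical, for illustration only.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 4-dimensional embeddings; a real system would use an embedding model.
docs = {
    "INSTALLME.md (vendor A)": np.array([0.9, 0.1, 0.3, 0.0]),
    "README.md (vendor B)":    np.array([0.2, 0.8, 0.1, 0.4]),
    "blog_post.html":          np.array([0.1, 0.2, 0.9, 0.3]),
}
query = np.array([0.8, 0.2, 0.3, 0.1])  # e.g. "how do I install X?"

# Documents whose vectors sit closest to the query get retrieved first.
ranked = sorted(docs, key=lambda name: cosine(query, docs[name]), reverse=True)
for name in ranked:
    print(f"{cosine(query, docs[name]):.3f}  {name}")
```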
I think they are completely screwing up the AI integration.
After years of JetBrains PyCharm Pro, I'm seriously considering switching to Cursor.
Before Supermaven was acquired, PyCharm + Supermaven felt like having superpowers ... I really hope they manage to somehow catch up; otherwise the path is already written: crisis, acquisition by some big corp, enshittification.
I'm biased (work at Cognition) but I think it's worth giving the Windsurf JetBrains plugin a try. We're working harder on polish these days, so happy to hear any feedback.
I have running subscriptions to both Claude and Codex. They are good but, at least for me, they don't fully replace the coding part. Plus I tend to lose focus because of the basically random response times.
In the engineering team velocity section, the most important metric is missing: the change rate of new code, i.e. how many times it is changed before being fully consolidated.
I would consider feature complete with robust testing to be a great proxy for code quality. Specifically, that if a chunk of code is feature complete and well tested and now changing slowly, it means -- as far as I can tell -- that the abstractions contained are at least ok at modeling the problem domain.
I would expect code that continually changes and deprecates and creates new features is still looking for a good problem domain fit.
Most of our customers are enterprises, so I feel relatively comfortable assuming they have some decent testing and QA in place. Perhaps I am too optimistic?
That sounds like an opportunity for some inspection: coverage, linting (type checking??), and a by-hand spot check to assess the quality of testing. You might also inspect the QA process (ride along with folks from QA).
It's tricky, but one can assume that code written once and not touched in a while is good code (it didn't cause any issues, performance is good enough, etc.).
I guess you can already derive this value if you sum the total lines changed by all PRs and divide it by (SLOC end - SLOC start). Ideally it should be a value slightly greater than 1.
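As a rough sketch of that derivation (hypothetical PR numbers, not tied to any real tooling): a ratio near 1 means most code landed once and stayed put, while a much larger value means code kept getting reworked before it consolidated.

```python
# Rough sketch of the churn ratio described above, with hypothetical numbers.
# churn_ratio = (total lines changed across all PRs) / (SLOC_end - SLOC_start)
#   ~1.0  -> most code landed once and stayed put
#   >>1.0 -> code was rewritten many times before consolidating

# Hypothetical PR stats: (lines added, lines deleted) per PR over the period.
prs = [(120, 10), (300, 250), (80, 60), (45, 5)]

sloc_start = 10_000
sloc_end = 10_220  # net growth of 220 lines over the period

total_changed = sum(added + deleted for added, deleted in prs)
net_growth = sloc_end - sloc_start

churn_ratio = total_changed / net_growth
print(f"total lines changed: {total_changed}")
print(f"net SLOC growth:     {net_growth}")
print(f"churn ratio:         {churn_ratio:.2f}")
```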
FYI: You headline with "cross-industry", lead with fancy engineering productivity graphics, then caption it with small print saying it's from your internal team data. Unless I'm completely missing something, it comes off as a little misleading and disingenuous. Maybe intro with what your company does and your data collection approach.
Apologies, that is poor wording on our part. It's internal data from engineers who use Greptile, which is tens of thousands of people from a variety of industries, as opposed to external, public data, which is where some of the charts are from.
Just out of curiosity: if I'm located in Spain and I set up an EC2 or DigitalOcean instance in Germany and use it as a SOCKS proxy over SSH, will you detect me?
btw, big fan of postgres :D