If your workflow feels comfortable, there's really no need to change it, in my opinion. The reason I do this is to avoid the frustrating feeling that I'm spending all my time moving around in the text file rather than actually making the program. It's less about speed and more about feeling good while programming.
> But for the remaining 2%, it's provably true that mouse is better
This is definitely true. The thing is that using the mouse is a habit, and until you break it, you'll find yourself instinctively reaching for it in situations where the keyboard would serve you better. So the 'hard' mouse disable is more of a going-cold-turkey thing to break the habit. I agree that once it's broken, it makes sense to relax this.
It's undeniably true that estimates are often driven mostly by what number will be acceptable. But that doesn't invalidate the points the article makes. Even when this distorting element of business pressure is removed, estimation is still very hard.
You might get reprimanded if you give accurate estimates - but that doesn't change the fact that you mostly can't give accurate estimates even if you wanted to.
> you mostly can't give accurate estimates even if you wanted to.
Ah, an optimist.
I long for a world where software development estimates and those who expect them are perceived as the unfunny jokes they are. Why do naked emperors make such wretched despots?
On the other hand, you really do need some kind of estimate. You'd never hire an hourly contractor to redo your roof if he refused to give any kind of estimate, so why wouldn't the same apply to software engineering?
That being said, accurate estimates are usually not needed, but the order of magnitude is. Knowing if the change is days, weeks or years of work is important - and while we're bad at estimates, we're rarely "I estimated 2 weeks but it took 6 months"-bad.
Estimate how long it will take to develop room temperature superconductors.
To assist you, let me paint a picture to put you in the right frame of mind:
You have never worked with superconductors of any kind and you’re not even a physicist. You’re one of those “scientist types” that are indistinguishable in the eyes of account managers.
You’re in a thirty minute Teams meeting with a disinterested project mismanager that wants not just a finish date (on a specific date), but the milestone dates on the way there.
You haven’t even met the team you’ll be working with. You haven’t yet spoken with the customer. Your “requirements” (lol) are literally just three words.
Your “obstinance” in refusing to be professional and “do your job” is being thrown in your face by the PM and recorded for your next review meeting.
Replace “room temperature superconductors” with any one of dozens of IT technologies or tasks and you have a nearly verbatim replay of my career and my challenges with estimation.
Here’s the thing: if you’re doing something for the first time, you can’t estimate it. If you’re doing it a second time, you can’t estimate it either because it’ll benefit from reuse in a way not experienced the first time. If you’re doing things three or more times in IT, you had better automate the process… another unpredictable first-time activity.
You’re either a meat robot doing repetitive work best done by scripts or LLMs - or by definition you are doing things for the first time and can’t estimate accurately.
I don’t do the equivalent of putting down roof tiles in my work.
I think you have that exactly backwards. ONLY successful ideas are still being debated more than 20 years after they were proposed. Only successful ideas spawn reaction and counter-reaction, which lead to their evolution - which you say is not happening, but certainly is, as evidenced by the tree of methodologies that branched from it.
In that respect (which might not be the one the coiners hoped for), capital A agile has been extremely successful.
If it were successful, it would be so normal that there would be no argument about it. It would be taken for granted as the platform that other improvements are built on. It would be like water to fish.
That is not what I see.
> My experience with deps.edn is that every time I start a project and make a deps.edn file, I immediately draw a blank and don’t know how to structure it, so I open ones from other projects to start lifting stuff out of them.
My new project deps file is always literally "{}". I love that that's all I need to do to start doing stuff. I add a couple libraries as needed. Maybe at some point an alias or two.
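For anyone curious what that growth typically looks like, here's a rough sketch (the specific libraries, versions, and the :dev alias are just placeholders I picked for illustration, not a recommendation):

    ;; day one
    {}

    ;; a bit later: a couple of libraries and a dev alias
    {:deps    {org.clojure/clojure {:mvn/version "1.11.1"}
               cheshire/cheshire   {:mvn/version "5.12.0"}}
     :aliases {:dev {:extra-paths ["dev"]
                     :extra-deps  {nrepl/nrepl {:mvn/version "1.0.0"}}}}}

For a small project it rarely needs to get more structured than that.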
I would love to know the answer to this! It sounds like something that would be both possible and very useful. I assume there's a good reason why it can't be done based on how the Clojure / JVM execution model works, but I don't know nearly enough about it to hazard a guess.
> The site should ask for the language used to solve the problem and then use this as semi-scientific data for comparing "time-to-solution" for various languages.
I don't believe any valid causal inference about languages could be drawn from this, because the skill of the individual programmer at solving these types of puzzles (recognizing the patterns, knowing the number theory, etc.) would surely dominate any effect of the language itself.
I suppose you could conclude something along the lines of "newer programmers use Python more often than Haskell", but I doubt there would be many surprises there.
It would work if the language-to-programmer mapping were largely random with respect to skill level, or if the site randomly assigned languages rather than letting programmers choose - but then you'd also need to give six months' notice or something so people could learn the assigned language if they didn't already know it.
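Just to make the selection effect concrete, here's a toy Clojure sketch (the numbers are entirely made up): in the model, solve time depends only on skill, but skill also drives language choice, so a naive per-language average still "finds" a language effect.

    ;; Toy model: time is a function of skill alone, but more skilled
    ;; programmers are more likely to pick Haskell, so grouping by language
    ;; picks up the skill difference and attributes it to the language.
    (defn solver []
      (let [skill (rand)                                  ; 0 = novice, 1 = expert
            lang  (if (< (rand) skill) :haskell :python)  ; choice correlates with skill
            time  (+ 10 (* 90 (- 1 skill)) (rand 5))]     ; minutes, driven by skill only
        {:lang lang :time time}))

    (let [sample (repeatedly 10000 solver)]
      (into {}
            (for [[lang ps] (group-by :lang sample)]
              [lang (/ (reduce + (map :time ps)) (count ps))])))
    ;; => roughly {:python 72.5, :haskell 42.5}
    ;;    a large apparent "language effect" with zero actual language effect.

Make the language assignment random in that model (ignore skill when choosing lang) and the gap disappears, which is exactly the randomization point above.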
> Do you have any recommended materials for gaining skill in organizational theory?
Absolutely.
First, start with Manager Tools [1]. It's my go-to - I dip into their material a couple of times a week. Start with their "getting started" page, where they map out how effective relationships work within an effective organization.
Second, have a look at Peter Drucker. Depending on where you want to go, he's probably literally written a book on the topic. The Effective Executive [2] might be useful, depending on your job. I haven't read his others.
Third, tangentially, is Good Strategy / Bad Strategy, by Richard Rumelt [3]. It's not organizational theory, but it's adjacent, in that it pretty ruthlessly dissects good and bad strategies for various organizations and entities, and tries to summarize what a "good" strategy looks like if you have to form one, or what a "bad" strategy looks like if you're subject to one.

I link to a video here because the guy's a great speaker, and for us here, he looks (in 2011) at why NVIDIA was dominating the market at the time. :-) Skip to 47:06 for the NVIDIA bit. As an aside, the way he explains the 3D marketplace, and -gets it right-, makes me trust everything else he wrote. He explains how 3DFX was crushed, along with everyone else. He explains how NVIDIA's simulators and driver expertise helped them fundamentally disrupt the market, then continue to stay ahead of it, and eventually become multiples bigger than Intel, who at the time was the behemoth in the space.