I agree with you, but I just have to say that if you're having to change a bunch of 255s to 65535 everywhere, you've done things poorly. Ada has a bunch of nice features for referring to these sorts of limits by name, so they never need to be changed by hand. For those not familiar with Ada, one way is MyInteger'Last, which gives the largest valid value for the type.
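A minimal sketch of what that looks like (the type name and range here are purely illustrative):

    --  Widening the range later only touches this one declaration.
    type Sensor_Value is range 0 .. 65_535;

    --  Everything written against 'Last picks up the bound for free.
    Max : constant Sensor_Value := Sensor_Value'Last;  --  = 65_535

Change the range in the type declaration and every use of Sensor_Value'Last follows, with no search-and-replace for magic numbers.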
> I agree with you, but I just have to say that if you're having to change a bunch of 255s to 65535 everywhere, you've done things poorly.
Sure, but that was just one example that stuck in my head from using Ada 30+ years ago. I also remember one of my project partners threatening me with bodily harm because I wanted to change a name, and it was going to ripple through his modules.
Perhaps there were better ways to do this in the Ada of that era. However, the fact that people using the language didn't find them says something, no?
A whole class of us absolutely LOATHED Ada because of this kind of stuff. An entire engineering class learned to reach for FORTRAN (unsurprising as it was just beginning its decline) and C (a little surprising as it really wasn't a juggernaut yet) instead of Ada. That says something about Ada, and it isn't good.
Sure, we weren't geniuses, but we weren't stupid. If Ada had been a real help in our classes, we would have used it. The HP-28/HP-48 came out in a similar timeframe, and EVERYBODY in our class jumped on those despite them costing $400+ (1988-1990 money) and requiring RPN--being able to chew through matrices on a calculator was huge.
Maybe modern Ada doesn't suffer from these kinds of problems (or at least has decent IDE support to deal with them), but this history certainly feeds the path dependence behind why Ada isn't popular.
I'm not sure why changing a name would be especially painful in Ada. If anything, as Lucretia09 pointed out, Ada probably tends to be a bit easier to rename things in than other languages.
I'm not surprised an engineering class tended away from Ada. It's designed to ensure robust software is produced more than it is anything else. It tends to force you to write down far more of your mental model than other languages do, and it will hold you to it. While I find this very helpful in ensuring my programs actually do what I intended, it also incurs some up-front costs. It's harder to just start writing anything and then slowly twist it into a solution, and it takes a little more time before you run the program for the first time. The certainty that it actually works at that point is what makes it all worth it.
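To make "writing down your mental model" concrete, here's a hypothetical sketch (all names invented for illustration): a constrained subtype states the valid range of a value, and the language holds you to it.

    with Ada.Text_IO;

    procedure Demo is
       --  The subtype encodes the intent: brightness is 0 .. 100.
       subtype Percent is Integer range 0 .. 100;

       procedure Set_Brightness (Level : Percent) is
       begin
          Ada.Text_IO.Put_Line ("Brightness:" & Integer'Image (Level));
       end Set_Brightness;

       N : Integer := 150;
    begin
       Set_Brightness (N);  --  raises Constraint_Error: 150 not in 0 .. 100
    end Demo;

The up-front cost is spelling out the range; the payoff is that an out-of-range value fails loudly at the call site instead of quietly corrupting state three modules away.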
It's a bunch of trade-offs ill-suited for a large number of simple programs.
Emotionally, I think it's also a bit of a harder sell because of this. I spend far more time getting Ada to compile than I would in other languages. Particularly if you're in a rush, that can feel worse: you don't even have an executable yet - and it's the damned language/compiler that won't let you make one! Never mind that the reason it's stopping you is that the theoretical executable wouldn't work properly. I can't decide if it's actually a matter of delayed gratification or merely something very similar, but either way, I think it's one of the adoption issues I haven't seen talked about much.
> Sure, but that was just one example that stuck in my head from using Ada 30+ years ago. I also remember one of my project partners threatening me with bodily harm because I wanted to change a name, and it was going to ripple through his modules.
Sounds like a psycho to me. If you change the name of a variable in any language, it will ripple through other modules.
I may or may not be a little crazy about Ada...