
> Why is ADA not adopted more broadly?

I would argue that Ada suffered because IDEs weren't a common thing when it had its heyday.

For example, Ada (and VHDL which took after it) is really verbose and a single change to a declaration can ripple all over the place.

"Refactoring" is no big deal today. IDEs chop through it really easily.

Back in 1990, on the other hand, you wanted to strangle a language that made you ripple a domain change of 0..255 to 0..65535 through all the code by hand. EVERYWHERE.

There is a reason programmers are so huffy about "type inference", you know.

Currently popular languages are quite "path dependent"--they needed to become popular before IDEs existed, but they still gained benefits once IDEs became commonplace.

Now that IDEs are common, I suspect that some languages like Ada may slowly gain ground over time. There is far less need for a language to be "stupid editor" friendly.

Edit: Changed ADA to Ada.




I agree with you, but I just have to say that if you're having to change a bunch of 255s to 65535s everywhere, you've done things poorly. Ada has a bunch of nice features for referring to these sorts of limits by names or attributes that won't need to be changed. For those not familiar with Ada, one way is MyInteger'Last, which gives the largest valid value for the type.
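
For instance, something along these lines (just a throwaway sketch; the definition of MyInteger and the constant name are made up here):

    --  The bound is referenced through the type, never repeated as a
    --  literal, so widening the range is a one-line edit to the declaration.
    type MyInteger is range 0 .. 255;

    Largest : constant MyInteger := MyInteger'Last;  --  255 today, 65535 after one edit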

I may or may not be a little crazy about Ada...


> I agree with you, but I just have to say that if you're having to change a bunch of 255s to 65535 everywhere, you've done things poorly.

Sure, but that was just one example that stuck in my head from using Ada 30+ years ago. I also remember one of my project partners threatening me with bodily harm because I wanted to change a name, and it was going to ripple through his modules.

Perhaps there were better ways to do this in contemporary Ada. However, the fact that people using the language didn't find them says something, no?

A whole class of us absolutely LOATHED Ada because of this kind of stuff. An entire engineering class learned to reach for FORTRAN (unsurprising as it was just beginning its decline) and C (a little surprising as it really wasn't a juggernaut yet) instead of Ada. That says something about Ada, and it isn't good.

Sure, we weren't geniuses, but we weren't stupid. If Ada had been a genuine help in our classes, we would have used it. The HP-28/HP-48 came out in a similar timeframe, and EVERYBODY in our class jumped on those in spite of them costing $400+ (1988-1990 money) and requiring RPN--being able to chew through matrices on a calculator was huge.

Maybe modern Ada doesn't suffer from these kinds of problems (or, at least has decent IDE support to deal with it), but it certainly pertains to the path dependence about why Ada isn't popular.


I'm not sure why changing a name would be so particularly bad in Ada. If anything, as Lucretia09 pointed out, Ada probably tends to be a bit easier to change names in than other languages.

I'm not surprised an engineering class tended away from Ada. It's designed more to ensure that robust software gets produced than for anything else. It tends to force you to write down far more of your mental model than other languages do, and it will hold you to it. While I find this very helpful in ensuring my programs actually do what I intended, it also incurs some up-front costs. It's harder to just start writing anything and then slowly twist it into a solution, and it takes a little longer before you can run it for the first time. The certainty that it actually works at that point is what makes it all worth it.

It's a bunch of trade-offs ill-suited for a large number of simple programs.

Emotionally, I think it's also a bit of a harder sell because of that. You spend far more time trying to get a program to compile than you would in other languages, and particularly if you are in a rush, it can feel worse. You don't even have an executable yet - and it's the damned language/compiler that won't let you make one! Never mind that the reason it's stopping you is that the theoretical executable wouldn't have worked properly anyway. I can't decide whether it's actually a matter of delayed gratification or merely something very similar, but either way, I think that's one of the adoption issues I haven't seen talked about much.


> Sure, but that was just one example that stuck in my head from using Ada 30+ years ago. I also remember one of my project partners threatening me with bodily harm because I wanted to change a name, and it was going to ripple through his modules

Sounds like a psycho to me. If you change the name of a variable in any language it will ripple through other modules.

With Ada, in his modules, he could've done this:

    Old_Name : Type_Name renames New_Name;
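
In context it might look something like this (a rough sketch; the type and procedure names are invented), so his modules could keep compiling against the old name:

    procedure Rename_Demo is
       type Counter is range 0 .. 255;
       New_Name : Counter := 0;
       --  A view of the same object under the old name, so code that
       --  still says Old_Name needs no edits at all.
       Old_Name : Counter renames New_Name;
    begin
       Old_Name := 42;
       pragma Assert (New_Name = 42);  --  both names denote one object
    end Rename_Demo;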


> Now that IDEs are common, I suspect that some languages like Ada may slowly gain ground over time. There is far less need for a language to be "stupid editor" friendly.

I kind of disagree. While it seems to me that languages now tend to want to support tooling, the trend of tolerating dependence on the IDE instead of improving the ergonomics of the language alone peaked sometime around the height of Java’s relative industrial popularity (I'm not saying Java is some kind of extreme example of IDE dependence; Java itself has focussed a lot on its own ergonomics since that time).

OTOH, I don't think Ada is so unergonomic that, other than in a case of pathologically bad design to start with, you'd have to manually change integer ranges everywhere; I'm fairly certain it supported type aliases for things like that, and it would have been idiomatic at the height of its popularity (such as it was) to use them, so that you'd only have to make that change in one place, not everywhere.
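
Concretely, I'd expect that to look something like the following (a sketch; the names are invented), with an Ada subtype serving as the alias:

    --  The numeric range is written exactly once; everything else refers
    --  to the names, so 0 .. 255 becomes 0 .. 65535 with a single edit here.
    type Raw_Value is range 0 .. 255;

    subtype Channel_Value is Raw_Value;  --  alias used throughout the rest of the code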


>Back in 1990, on the other hand, you wanted to strangle a language that made you ripple a domain change of 0..255 to 0..65535 through all the code by hand. EVERYWHERE.

If you're using hard-coded literals instead of actual named constants, you deserve the pain of changing them everywhere.

    Something_Lower  : constant := 0;
    Something_Higher : constant := 255;

    type Something is range Something_Lower .. Something_Higher;

Is how you do it. Changing the higher value to 65535 or anything else won't ripple.

Now, do that in C and you've got hardcoded magic numbers all over the place, and the compiler would've just silently compiled them without warning. Have fun debugging that mess.


Or

    type Something is range 0 .. 255;
And now Something'First equals 0, and Something'Last equals 255, or whatever upper bound you chose.

There is no reason to hard code a magic constant for the upper end of the range anywhere.
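
A small self-contained illustration of leaning on the attributes (a sketch; the procedure name is invented):

    with Ada.Text_IO;

    procedure Range_Demo is
       type Something is range 0 .. 255;
    begin
       --  The bounds come from the attributes, never from literals, so
       --  widening the type above is the only change ever needed.
       Ada.Text_IO.Put_Line
         (Something'Image (Something'First) & " .." & Something'Image (Something'Last));
    end Range_Demo;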


My first IDE was Turbo Basic in 1989.

Changing from

    type Domain is range 0 .. 255;

to

    type Domain is range 0 .. 65535;

is quite easy.


"Turbo<foo> had a great IDE" in no way refutes that Ada suffered from not having an IDE.


Except it did have one; that is how Rational Software was born.

https://datamuseum.dk/wiki/Rational/R1000s400

Eventually that experience was moved onto UNIX workstations, as Rational, like everyone else, tried to capitalise on that market.

The "I don't need IDE" culture is more related to languages born in UNIX culture.



