The seven programming ur-languages (madhadron.com)
57 points by felixyz on Sept 30, 2021 | 29 comments


The blog post seems to be getting some ML language history wrong: Caml was developed at INRIA (https://en.wikipedia.org/wiki/Caml) and ML was originally from Edinburgh (https://en.wikipedia.org/wiki/ML_(programming_language)).


I think the CaML he's talking about is not the same as the CAML/Caml that then became OCaml, though I can't find any info on it. There's this page about the history of OCaml: https://ocaml.org/learn/history.html

The Wikipedia page for ML mentions that it was developed at Edinburgh by Robin Milner: https://en.m.wikipedia.org/wiki/ML_(programming_language). Maybe the author made a pun?


Interestingly, there is a Cambridge ML (the binary was called CML, not to be confused with Concurrent ML, also CML), which seems to underlie the Nuprl theorem prover (http://www.nuprl.org/book/Metalanguage.html) and was used in Cambridge FCS courses (https://www.cl.cam.ac.uk/teaching/0910/FoundsCS/usingml.html). There was also Cardelli ML (https://smlfamily.github.io/history/SML-history.pdf). I am not aware of either being called CaML, but who knows, maybe there is more history out there.


Yes, Robin Milner was at Edinburgh when he created the first version of ML.

The author might have been confused by the fact that much later Robin Milner moved to Cambridge.


That was a great article, I like that notion of ur-language. I would love to know if there are other ur-languages out there.

> A lot of work was done on how to make Smalltalk run fast and efficiently, culminating in the Strongtalk project. Strongtalk is historically important because its discoveries became the basis of the HotSpot just-in-time compiler for Java.

It's also the basis for V8! Truly an important project.

> Second, Erlang switched the notion of a thread of execution jumping from object to object to run various code and instead had parallel threads of execution that explicitly listen for and send messages.

I'm glad I'm not the only one who noticed the similarities between OO in the Alan Kay sense and Erlang. An extension of that would be the microservice architecture of today. You have encapsulation, late binding, and message passing.
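
To make the comparison concrete, here is a minimal Python sketch (threads and queues standing in for Erlang processes and mailboxes; nothing here is from the article, and the message names are made up) of a thread of execution that explicitly listens for and sends messages:

    import threading, queue

    def counter(mailbox):
        # A tiny "actor": owns its own state, reacts only to messages it receives.
        count = 0
        while True:
            msg = mailbox.get()          # explicitly listen
            if msg[0] == "incr":
                count += 1
            elif msg[0] == "get":
                msg[1].put(count)        # explicitly send a reply
            elif msg[0] == "stop":
                return

    mailbox = queue.Queue()
    threading.Thread(target=counter, args=(mailbox,), daemon=True).start()

    mailbox.put(("incr",))
    mailbox.put(("incr",))
    reply = queue.Queue()
    mailbox.put(("get", reply))
    print(reply.get())                   # -> 2
    mailbox.put(("stop",))

The state is encapsulated in the receiving thread and reachable only through messages, which is the Alan Kay / Erlang / microservice overlap the parent is pointing at.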


Erlang is a very nice language, but nevertheless it does not include any feature that can be considered an innovation over older languages.

A programming language does not need to include anything new to be a better language; it is enough to offer a combination of good features that do not exist together in competing languages.

"Parallel threads of execution that explicitly listen for and send messages" as the authors says, already existed in Ada (1979).

Moreover, such concurrent programming already existed in PL/I in 1965. PL/I was actually much more convenient than the supposedly modern POSIX pthreads, because it was the first to include the equivalent of WaitForMultipleEvents, which is sorely lacking in POSIX.
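
To illustrate the point about POSIX: with plain pthreads-style primitives there is no direct way to block on "whichever of these unrelated events happens first", so code typically funnels every event source into one channel. A rough Python sketch of that workaround (purely illustrative, not PL/I or Win32 code; the event sources are made up):

    import threading, queue, time

    done = queue.Queue()    # the single "funnel" every event source must feed

    def source(name, delay):
        time.sleep(delay)   # stand-in for an arbitrary event
        done.put(name)      # signal completion through the shared queue

    for name, delay in [("disk", 0.3), ("network", 0.1), ("timer", 0.2)]:
        threading.Thread(target=source, args=(name, delay), daemon=True).start()

    # "Wait for multiple events": block once, wake on whichever fires first.
    print("first event:", done.get())    # -> network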


Interesting points about PL/I. The more I hear about the 60s/70s in computer science, the more I wonder what exactly happened. I've heard a lot about interesting things existing at that point, and about people just rediscovering them recently. I don't know much about this time, and know even less about other engineering disciplines, so I'm not sure if it's specific to computer science. Or maybe open source means that we actually hear about it, whereas when 10 companies design almost the same kind of wheel in NIH fashion we don't hear about it? I wish we had a bit more of a focus on history in our field.


What happened was that some programming languages, like PL/I, were available only on very expensive computers, e.g. IBM mainframes.

The languages available on cheap computers, e.g. BASIC, Pascal or C, had fewer features, especially features for parallel processing, which were not useful on cheap hardware.

In time, the cheap computers became more powerful than the old supercomputers. Then many of the features formerly available only on powerful computers were added to the new popular programming languages, but not all of them.

The languages considered large during the sixties, e.g. PL/I and ALGOL 68, had a few serious flaws, but they also had many nice features that are still not present in the most popular programming languages of today. That is mostly a consequence of the fact that C, while taking most of the features it added over BCPL and B from either PL/I or ALGOL 68, simplified those features a lot or even crippled them compared to the originals, in order to allow implementation on much cheaper computers.

Later languages attempted to be better than Pascal or C, which was a very easy target, but their designers did not study what was available in earlier languages in order to also be better than those.


> I would love to know if there are other ur-languages out there.

At least to me, "The seven ..." reads as the author saying "there are exactly seven of them, and they are these: ..."


Was a little shocked not to see any mention of Snobol4 in this article. Fantastic language that really does not resemble any other language and is in a category of its own. The syntax is pretty dated but I still use it today.


A close friend in college managed to do a lot of Snobol4; I got the impression that it was a language based around string transformations, like regex everywhere.

Mathematica is essentially a transformation engine like that. A standard Lisp unification engine, initially.

And didn't Snobol lead to some language that's embedded in large healthcare management systems?


Indeed the primary usage was text and string manipulation; it had a huge following in the 70s and 80s in the humanities. However, as with most languages, all the bases are covered for complex functions, mathematics, etc.; it just never got as much traction in those spaces. As far as I know it was the first language to use associative arrays; it also supports backtracking (like Prolog), meta-programming (code generation during runtime), map functions, lambda-like functions, etc. It really had it all!


https://thedailywtf.com/articles/a_case_of_the_mumps makes unflattering comparisons to both FORTRAN and SNOBOL, but it sounds like its own special hell.


MUMPS - that was it. The Sendmail-config of the healthcare industry.


Are there any free implementations of Snobol4 available? Every couple of years I search for them again and I only find broken links...

(Btw, Icon - Snobol4's successor - was my favorite language for several years. When I tried to learn Snobol4 by myself at university I couldn't understand its semantics, but the book "The Icon Programming Language" was superbly written and it explained generators and backtracking very well.)


I highly, highly recommend learning Forth, solely because it will melt your brain. (In a good way.) Keeping track of a stack in your head as you're writing is a delightful puzzle.

Then go write a Forth. Easy to implement, and a rewarding exercise.
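
To give an idea of how small the core is, here is a toy Python sketch of a Forth-style interpreter (a handful of words, no definitions or control flow; purely illustrative):

    def forth(source):
        """Toy Forth-style evaluator: whitespace-separated words, one data stack."""
        stack = []
        words = {
            "+":    lambda: stack.append(stack.pop() + stack.pop()),
            "*":    lambda: stack.append(stack.pop() * stack.pop()),
            "dup":  lambda: stack.append(stack[-1]),
            "swap": lambda: stack.extend([stack.pop(), stack.pop()]),
            ".":    lambda: print(stack.pop()),
        }
        for word in source.split():
            if word in words:
                words[word]()
            else:
                stack.append(int(word))   # anything else is a number literal
        return stack

    forth("2 3 + dup * .")   # (2 + 3) squared -> prints 25

A real Forth adds a dictionary of user-defined words (: ... ;), a return stack and an outer interpreter, but the overall shape stays about this simple.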


POSTSCRIPT, baby. Months and months of hand-written Postscript, before Illustrator was a thing.


Didn't early versions of Fortran precede Algol 60? I also thought Algol came out of Europe more than N America and Fortran the reverse? Or is Fortran considered what the Neanderthals used before they all died off?


The first version of Fortran was defined internally at IBM in November 1954 and introduced as a public product in October 1956.

In 1958, IBM introduced Fortran II, which added user-defined functions and procedures. In the 1956 Fortran, a user could add custom functions, but they had to be written in assembly language, not in Fortran.

During 1958, several proposals for IAL, later renamed ALGOL, were discussed by American and European teams, but the final 1958 report was closer to the European proposals than to the American proposals.

A year and a half later, the ALGOL 60 report had few changes compared to the 1958 version, but those were important changes, e.g. the block structure requiring stack memory and allowing recursive procedures, or the while-do loop besides the for loop.

So yes, Fortran was a real usable product in 1956, 4 years before 1960, when 3 other major programming languages were introduced: LISP I, COBOL 60 and ALGOL 60.

However, Fortran was very primitive and it had even fewer capabilities than the Superplan of Heinz Rutishauser from 1951/1952.

While Fortran brought little conceptual progress, it had a performant compiler for that time, it was well documented and easily available for anyone who could afford to lease an IBM computer.

Still, Fortran belongs to the prehistory of programming languages, because the vast majority of the features in any modern programming language come from ALGOL 60 and LISP I, with very few coming from Fortran and COBOL.

Most later improvements in programming languages had appeared by the introduction of ALGOL 68; a smaller number were introduced in the next decade, up to Ada (1979) and C++ (1984), and an even smaller number of improvements are more recent than that.

Unfortunately that does not mean that the modern programming languages are very good, because none of the currently popular languages includes all the best features of the older programming languages.

Due to various historical accidents that could not be corrected later without breaking backward compatibility, when comparing a modern popular programming language with some language from 50 years ago, the modern language will be better from many points of view, but it will also sometimes be worse in various details.

For example, all C-like languages are worse at handling arrays than Fortran II was in 1958.


Yes, Fortran was earlier -- in 1957. But ALGOL introduced structured programming with code blocks and programmer-defined functions. These were later added to Fortran, but Fortran was originally structured like 1980s BASIC -- with GOTOs to various line numbers.


According to Google, Fortran II in 1958 introduced functions and subroutines. Structure, yeah, and non-numeric statement labels and more sensible ifs.


Also, Simula preceded Self by several years.

Simula started as Algol with object-orientation, but it deserves mention for introducing the ideas.


I'd add a few more entries

Excel - the spreadsheet model of programming has quite a bit of power that is hard to duplicate in other ways (a rough sketch of this model follows at the end of this list)

Verilog/VHDL - Logic programming that happens all at once is a very important paradigm to know

Ladder Logic - The low-level predecessor to the above, much like assembler is to C.

SQL - If you can't natively get data to/from a database from a command line tool, you're missing a very valuable toolset.

Other things I've come across of note:

TKsolver! - Version 1 ran on MS-DOS, and could find whatever variables weren't supplied from those that were, by working backwards through equations.

Metamine - this appeared and winked out of existence; the main feature was the ability to do a "magical assignment" that reactively updated the left side throughout the life of the program, and it had sane provisions for dealing with both normal and magical assignments in the same program.

STOICAL - A Forth variant that did data types, lists, queues, dictionaries and hashes along with the usual Forth stuff. The author moved on to write DAW, an audio editor. Unfortunately, it's in C, and makes a lot of 32-bit assumptions that I don't know enough to fix.

GNU Radio - If you want to learn DSP without the math, install this, and play with your audio inputs and outputs, and work your way up from there. You build flowgraphs, and they compile/run in real time.
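
Not part of the list above, but to make the Excel entry (and, loosely, Metamine's "magical assignment") concrete, here is a toy Python sketch of the spreadsheet recalculation model: cells are constants or formulas over other cells, and reads always see up-to-date values. Purely illustrative, nothing like how Excel is actually implemented:

    class Sheet:
        """Toy spreadsheet: each cell is a constant or a formula over other cells.
        Values are recomputed on read, so dependents always stay current."""
        def __init__(self):
            self.cells = {}

        def set(self, name, value):
            # store constants and formulas uniformly as callables
            self.cells[name] = value if callable(value) else (lambda s, v=value: v)

        def get(self, name):
            return self.cells[name](self)

    s = Sheet()
    s.set("A1", 2)
    s.set("A2", 3)
    s.set("A3", lambda s: s.get("A1") + s.get("A2"))   # =A1+A2

    print(s.get("A3"))   # -> 5
    s.set("A1", 10)      # the "magical assignment": A3 follows the change
    print(s.get("A3"))   # -> 13

A real spreadsheet tracks the dependency graph and recomputes incrementally instead of on every read, but the programming model is the same.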


Excellent...

Ladder logic, or "Linear Programming", usually implements state machines for industrial control systems.

Constraint systems like TKsolver! are sort of like Prolog, maybe.

Spreadsheets: reminded me of a similar but different beast, the scriptable outline programs. Dave Winer's UserLand Frontier is maybe the best example.

Omni Software's OmniFocus product grew from Ethan Schoenover's beast of a script system that he built in Omni's outliner product.

Is industrial "Operations Research" computer programming? Or is it just in a bigger bag, alongside the computers, called "Cybernetics"?


I'm not sure how linear programming has anything to do with ladder logic. Did you mean to put it next to constraint programming, which it does have more in common with?


Yes: oops. Linear programming as constraint solving. Thanks!

I have not actually done any ladder logic; it seemed to be like flow charts or network transaction diagrams...

I should not post so much when tired. But your great comments got me thinking about all of the ways that we approach programming in non-lexical ways, be that with a graphical formal analysis like Yourdon/Coad or (shudder) Rational Rose, or more direct translation like (ouch) UML, or just wireframes and story boards when hashing things out with a client.

Perhaps all of that is wildly off-topic; they all distill down into written code, usually in one of the "7 ur-languages".


> Excel - the spreadsheet model of programming has quite a bit of power that is hard to duplicate in other ways

Aargh -- at least frigging call it VisiCalc, please. Or improve on it, drop the final -e, and call it "Improv". Either way, please don't contribute to (or are you yourself suffering from?) the current pandemic of thinking "spreadsheet" is synonymous with "Excel".


Does Tcl qualify as an ur-language? No reserved words, words compose into commands, commands into scripts. Ideas like upvar and uplevel.


Is a shader patch or modular audio synthesis a computer program?




