I think I'm confident in saying that K&R saw C as a lower-level language. As you say, it is relative [1]. I just don't think enough people considered C to be a high-level language (given that Smalltalk, APL, and Lisp were around) to support your broader characterization that "C was considered a high-level language."

Here's my reasoning:

From "The C Programming Language" book (1st. ed., 1978) at https://archive.org/details/TheCProgrammingLanguageFirstEdit... we can read 'C is not a "very high level language"' (p. ix) and 'C is a relatively "low level" language' (p. 1).[1]

They describe what "low level" means to them: "This characterization is not pejorative; it simply means that C deals with the same sort of objects that most computers do, namely characters, numbers, and addresses."
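
To make that concrete, here is a minimal sketch of my own (not from the book) of the three kinds of objects they mean: the char and the int map onto machine bytes and words, and the pointer is the "address".

    #include <stdio.h>

    int main(void) {
        char c = 'A';     /* a character: one byte, really just a small integer */
        int  n = 65;      /* a number: the machine's native integer */
        char *p = &c;     /* an address: where c happens to live in memory */

        printf("%c %d %p\n", c, n, (void *)p);
        return 0;
    }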

And on page 2 we see how they don't regard C as the lowest level: "Of 13000 lines of system code, only about 800 lines at the very lowest level are in assembler."

I also found "The C Programming Language" paper in The Bell System Technical Journal (1978) saying "All three languages [BCPL, B, and C] are rather low-level", at https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=6770408... .

Now, regarding [1]: I found the 1977 paper "Implementing LISP in a high-level language", where that high-level language is BCPL, a precursor to C. So clearly a good number of people at the time would have considered C a high-level language, at the very least in the context of developing a Lisp.




K&R are not the last word on this. They made their comment in 1978, and now it's 2021, and computing is very different.

"Characters, numbers, and addresses" are very much not what CPUs deal with internally today. Most languages no longer reference addresses directly, and "characters and numbers" live behind abstractions of their own.

The point is that C assumes a certain model of computing that was baked into both hardware and software from the late 70s onwards. That model has been superseded, but hardware and software still lose a lot of cycles emulating it. The claim is that this is both inefficient and unnecessary.

But the advantage of the C model is that it's simple, comprehensible, and general.

If you expose more of what goes on inside a modern CPU, programming becomes more difficult. If you build a CPU optimised for the abstractions of some specific other language, you bake those assumptions and compromises into the hardware, and other languages become less efficient.
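
As a small illustration of that trade-off, here is a sketch of my own (assuming x86 with SSE and a length that is a multiple of 4): the portable loop leaves the vector units to the compiler, while the intrinsics version exposes them and makes alignment and the loop tail the programmer's problem.

    #include <xmmintrin.h>   /* SSE intrinsics */

    /* The "C model" version: the compiler decides how this maps onto the
       real hardware (vectorisation, unrolling, scheduling). */
    void add_scalar(float *dst, const float *a, const float *b, int n) {
        for (int i = 0; i < n; i++)
            dst[i] = a[i] + b[i];
    }

    /* The same loop with some of the hardware exposed: explicit 128-bit
       SIMD registers. Assumes n is a multiple of 4; purely illustrative. */
    void add_sse(float *dst, const float *a, const float *b, int n) {
        for (int i = 0; i < n; i += 4) {
            __m128 va = _mm_loadu_ps(a + i);
            __m128 vb = _mm_loadu_ps(b + i);
            _mm_storeu_ps(dst + i, _mm_add_ps(va, vb));
        }
    }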

So if you want to replace the C model you'd first have to define an industry standard for - say - highly parallel languages with object orientation. That is not a small or simple project. And previous attempts to tie hardware to more abstract languages haven't ended well (Lisp machines and Intel's iAPX 432 come to mind).

So C persists not because it's high or low level, but because it's general in a way that other potential abstractions aren't.

This is not to say that alternatives couldn't be both more general and more performant. It's more a reminder that designing performant alternatives is harder than it looks, and this is not a solved problem.

My guess (FWIW) is that nothing credible will emerge until radically new technologies become more obviously better for general purpose computing - whatever that looks like - than current models.


> K&R are not the last word on this. They made their comment in 1978, and now it's 2021, and computing is very different.

Yes, but what K&R said back then is relevant to refuting GP's contention that

>> C was always considered a high-level language.


Thank you for clarifying my intent!



