Its predecessor compiled on an EDSAC, it compiled on a PDP-11, and it was the language of UNIX (its killer app): that's why it spread everywhere. Network/legacy effects took over from there. There were languages like Modula-2 that were safer, efficient, easy to implement, and still close to the metal. Such languages kept being ignored, and mostly are to this day, for system programming. They even ignore the ones that are basically C with enhanced safety (eg Cyclone, Popcorn).
It's all social and economic reasons justifying C both at its creation, during its spread, and for its continuing use. The technical case against such a language has always been there and field-proven many times.
C was quite a few years before Modula-2. C was developed in 1972 and was heavily used in UNIX by 1973. Development of Modula-2 wasn't started until 1977 and it wasn't widely available until the 1980s.
1. Languages designed to be high-level, readable, optionally easy to compile, work with large programs, have safety, and compile to efficient code.
2. Two languages, BCPL and C, that strip as much of that as possible to compile on an EDSAC and PDP-11 respectively.
Thompson and Ritchie, along with the masses, went with No 2, while a number of groups with minimal resources went with No 1 with better outcomes. At each step of the way, groups in No 1 were producing better languages with more robust apps. Yet Thompson et al continued to invest in improving No 2. By the present day, most fans of No 2 forget No 1 existed, push No 2 as if it was designed like No 1, and praise what No 2 brought us when it actually hampered advances that No 1 enabled easily.
Compiler, HW, and OS people should've invested in a small language from No 1 category as Wirth's people did. That Pascal/P got ported to 70 architectures, some 8-bit, tells me No 1 category wasn't too inefficient or difficult either. Just gotta avoid heavyweights like PL/1.
But C on microcomputers did not come any earlier than Pascal or Modula-2 compilers.
My first compiler was Modula-2 on my Atari ST. But it was difficult to do much in it because so much of the OS documentation and example code was geared towards C. Also, compiling on a floppy-based system (couldn't afford a hard disk) was terrible.
The irony, too, is that Ritchie, Thompson, Pike et al at the Unix labs were then enamoured of Modula-2 & Oberon and used the ideas to build plan9, but in a new version of C.
The Wikipedia article says that, when designing Google's language, all three of them had to agree on every single feature so no "extraneous garbage" crept in. The C developers' dream language was basically an updated Oberon. That's them dropping the QED on C supporters for me. :)
Funny thing is, the Oberon family was used to build both apps and whole operating systems. Whereas Thompson et al's version is merely an application language, lambasted in comparisons with systems languages. I don't know if they built Oberon up thanks to tooling and such, or if they've dropped back down a notch from Component Pascal since it's less versatile. Just can't decide.
Note: Imagine where we'd be if they figured that shit out early on like the others did. We'd be arguing why an ALGOL68 language wasn't good enough vs ML, Eiffel DbC, LISP's macros, and so on. The eventual compromise would've been better than C, Go, or ALGOL68. Maybe.
I still use Acme, a plan9 text editor that was based on Oberon and it is the best software ever. Just the notion that "all text is a potential action" blows all other usability notions out of the water.
Want a drop down menu ? : make a file in the current directory with the name of the menu, put the commands in it you want in the menu. Done. And that can go as deep as you like.
Those commands can include snippets of shell code that act on the current document, e.g. | sed 's/[a-z]/[A-Z]/g'
Highlight some text, middle click that command and the command runs on the selected text.
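Outside acme the same idea is just an ordinary pipe. A minimal sketch in a plain shell, with stdin standing in for the selection (the sample text is made up):

```shell
# In acme, a leading '|' pipes the selection through the command and
# replaces it with the output; here, echo plays the role of the selection.
echo "make this loud" | sed 's/loud/LOUD/'
```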
Add to that a file system for the text editor :
date > /n/acme/new
execute that and you get a new document containing the date
When using it, it feels limitless. You build a command set that matches your current project. There's a pattern matcher too, the plumber, so you can click somefile.c:74 and the editor opens that file at that line. So "grep -n someregex *.c" gets you a list of clickable "links" to those files.
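The plumber's file:line pattern is the same format grep -n already emits, which is why the two compose so well. A minimal sketch on any POSIX-ish system (demo.c is a made-up file):

```shell
# Create a small file to search (hypothetical example).
printf 'int add(int a, int b) { return a + b; }\nint main(void) { return add(1, 2); }\n' > demo.c

# -H forces the filename prefix even for a single file, so every hit
# comes out as file:line:text -- exactly the pattern the plumber turns
# into a clickable link.
grep -Hn 'add' demo.c
```

Each output line, e.g. demo.c:1:..., is something the plumber can open at the right spot.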
When you browse the plan9 source code you might see the unusual
void
main(args)
We follow that convention so one can 'grep -n ^functionname *.c' and get a source code browser.
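That convention is easy to demo with plain grep; a sketch (kern.c's contents are invented, not actual plan9 source):

```shell
# Plan 9 style: return type on its own line, function name at column 0.
cat > kern.c <<'EOF'
void
main(int argc, char *argv[])
{
    helper();
}

static int
helper(void)
{
    return 0;
}
EOF

# Definitions start at column 0; call sites are indented.  So ^name
# matches only the definition, giving a cheap source-code browser:
grep -n '^helper' kern.c
```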
I'll stop there, and I've only scratched the surface
That's pretty wild. I might have to try it some time. The Oberon interface was certainly interesting. More interesting was that hyperlinked documents became the default way of doing apps that replaced native ones in many places. Something Oberon did before that they rejected. ;)
Nonetheless, I'm hesitant to put an execution system in a text editor. One of my favorite aspects of them is that they load, edit, and/or render data. The data stays data. That means throwing one in a sandbox was always the safest route to inspect files I wasn't sure about. An Oberon-style text editor that links in other apps' functionality might be a nightmare to try to protect. It's why I rejected that style of GUI in the past.
Composability, it's what makes Unix Unix. Doug McIlroy and his pipes, that's what makes it powerful in the hands of a skilled operator. Your system grows with you.
Possibly a nightmare for someone else to reason about, so I accept that aspect. But every long-time plan9 user I know (and that's about 20 I know by name to their face, plus more I meet at conferences) finds going back to plain old unix a retrograde step. It's like going back to a 16" b&w tv.
"Have you ever actually tried to write code in Modula-2?"
Three people wrote a consistent, safer platform in it from OS to compiler to apps in 2 years. Amateurs repeatedly did stuff like that with it and its successors like Oberon for another decade or two. How long did the first UNIX take to get written and reliable in C?
"Has there ever been an efficient, portable implementation of Cyclone or Popcorn? "
I could've said that about C in its early years given it wasn't intended to be portable. Turns out that stripping almost every good feature out of a language until it's a few primitives makes it portable by default.
"And, if you just randomly picked three examples, why are they all bad? What does that mean about the other examples, statistically?"
It means that, statistically, the other examples will have most of C's efficiency with little to none of its weaknesses. People using them will get hacked or lose work less. Like with all the safe, systems languages that people turned down in favor of C. Least important, it means I picked the examples closest to C in features and efficiency because C developers' culture rejects anything too different. That includes better options I often reference.
Why is that bad for their ability to produce robust programs? And what does that mean about C developers statistically?
Here's what's wrong with your whole line of reasoning. People wrote an OS in Modula-2? That's great. How many of them wrote an OS that people wanted to use? That's a considerably harder task, you know. It's not that all OSes are equivalent and the one with the most publicity or corporate support wins.
In particular, I assert that Unix was considerably bigger and harder to write than those OSes written with Modula-2, in the same way that Linux quickly became much bigger than Minix. That "bigger" and "harder" isn't the result of bad technique or bad languages - it's the result of making something that actually does enough that people care to use it.
Next assertion: C makes some parts easy that are hard in Modula-2. That makes it more likely that the harder parts actually get written, that is, that the OS becomes usable instead of just a toy. (True, C also makes it easier to write certain kinds of bugs. In practice, though, the real, usable stuff gets written in C, and not in Modula-2. Why? It's not just because everybody's too stupid to see through the existing trends and switch to a real language. It's because, after all the tradeoffs are weighed, C makes it easier to do real work, warts and all.)
I dunno, I sure as hell never asked for Unix. That was a decision made for me by AT&T and its monopoly a decade before I was born.
As for why "the real, usable stuff" gets written in C? Because C is the only first class programming language in the operating system.
And of course 20 years ago "the real, usable stuff" was actually written in assembly because C was slow and inefficient. Or Fortran if you needed to do any math more complicated than middle school algebra.
Well, the "decision" was actually the consent decree in the AT&T antitrust case, which said that AT&T couldn't go into the software business. (Note well that AT&T did not have any kind of a monopoly on computer operating systems - not then, and not ever.)
The result, though, was that Unix became available for the cost of distribution plus porting, and it was portable. It was the easy path for a ready-for-real-work non-toy operating system on new hardware.
As with scythe, you're ignoring the greater point to focus on tactics that it makes irrelevant. C started with stuff that was literally just what would compile on an EDSAC. It wasn't good on about any metric. They couldn't even write UNIX in it. You know, software that people wanted to use. So, seeing good features of easy compilation and raw performance, they decided to invest in that language to improve its deficiencies until it could get the job done. Now, what would UNIX look like if they subsetted and streamlined a language like ALGOL68 or Modula-2 that was actually designed to get real shit done and robustly?
It's programming, science, all of it. You identify the key goals. A good chunk of it was already known, as Burroughs kept implementing it in their OS's. Their sales numbers indicated people wanted to use those enough they paid millions for them. ;) Once you have goals, you derive the language, tools, whatever to achieve those goals. Thompson failed to do this unless he only cared about easy compilation and raw speed at the expense of everything else. Meanwhile, Burroughs, Wirth, Hansen, the Ada people, Eiffel later... all sorts of people did come up with decent, balanced solutions. Some, like the Oberons or Component Pascal, were very efficient, easy to compile, easy to read, stopped many problems, and allowed low-level stuff where needed. That came straight from the strengths of the designs they imitated. A form of that would've been easy to pull off on a PDP, as Hansen showed in an extreme way.
C's problems, which contributed to the UNIX-Haters Handbook and many data losses, came straight from the lack of design in its predecessors, which solely existed to work on shit hardware. They tweaked that to work on other shit hardware. They wrote an OS in it. Hardware got better but the language's key problems remained. Whether we use it or not, we don't have to pretend those effects were necessary or good design decisions. Compare ALGOL68 or Oberon to BCPL to see which looks more thought out if you're still doubting.
Are you referring to C here, or to B? If C, I'd like to see your source. If B, that's a highly misleading statement. You're attributing to C failings that belong to another language - failings which C was designed to fix.
> Now, what would UNIX look like if they subsetted and streamlined a language like ALGOL68 or Modula-2 that was actually designed to get real shit done and robustly?
But they weren't. I mean, yes, those languages were designed to get stuff done, and robustly, but in practice they were much worse at actually getting stuff done than C was, especially at the level of writing an OS. Sure, you can try to do that with Algol. It's like picking your nose with boxing gloves on, though. (The robust part I will give you.)
> Their sales numbers indicated people wanted to use those enough they paid millions for them. ;)
But, see, this perfect, wonderful thing that I can't afford is not better than this piece of crap that crashes once in a while, but that I can actually afford to buy. So what actually led to widely-used computers was C and Unix, and then assembly and CP/M, and assembly and DOS.
> Once you have goals, you derive the language, tools, whatever to achieve those goals. Thompson failed to do this unless he only cared about easy compilation and raw speed at the expense of everything else.
Nope. You don't understand his goals, though, because they aren't the same as your own. So you assume (rather arrogantly) that he was either stupid or incompetent. (The computing history that you're so fond of pointing to could help you here.)
"Are you referring to C here, or to B? If C, I'd like to see your source. If B, that's a highly misleading statement. You're attributing to C failings that belong to another language, and which C was designed to fix those failings."
My side of this discussion keeps saying C's design is bad because it avoided good attributes of (insert prior art here) and has no better design because its foundations effectively weren't designed. The counters, from two or three of you, have been that the specific instances of the prior art were unsuitable for the project due to specific flaws, some you mention and some not. I countered that error by pointing out C's prior art, BCPL to B to original C, had its own flaws. Rather than throw it out entirely, as you all are saying for C alternatives, they just fixed the flaws of its predecessors to turn them into what they needed. The same thing we're saying they should've done with the alternatives.
So, on one hand you talk like we had to use Modula-2 and the others as-is or not at all. Then, on the other, you justify that prior work had to be modified to become something usable. It's a double standard that's not justified. If they could modify & improve the BCPL family, they could've done it with the others. The results would've been better.
"The robust part I will give you."
As I've given you speed, ease of porting, and working best in HW constraints. At least we're both trying to be fair here. :)
"but that I can actually afford to buy. So what actually led to widely-used computers was C and Unix, and then assembly and CP/M, and assembly and DOS."
It did lead to PL/M, which CP/M was written in. And to the Ceres workstations that ETH Zurich used in production. And the A2 Oberon system that I found quite useful and faster than Linux in recent tests despite almost no optimization. Had almost no labor invested vs UNIX and its basic tools. I imagine data, memory, and interface checks in a micro-ALGOL would've done them well, too.
"Nope. You don't understand his goals, though, because they aren't the same as your own. "
That's possible. I think it's more likely they were very similar to my own as security and robustness were a later focus. I started wanting a reliable, fast, hacker-friendly OS and language for awesome programming results. Started noticing other languages and platforms with a tiny fraction of the investment killed UNIX/C in various metrics or capabilities with various tradeoffs. Started exploring while pursuing INFOSEC & high assurance systems independently of that. Eventually saw connection between how things were expressed and what results came from them. Found empirical evidence in papers and field backing some of that. Ideas you see here regularly started emerging and solidifying.
No, I think he's a very smart guy who made many solid achievements and contributions to IT via his MULTICS, UNIX, and Plan 9 work. UNIX has its own beauty in many ways. C a little, too. Especially, when I look at them as an adaptation to survive in specific constraints (eg PDP's) using specific tech (eg BCPL, MULTICS) he learned before. Thing is, my mental view of history doesn't begin or end at that moment. So, I can detach myself to see what foolish choices in specific areas were made by a smart guy without really thinking negative of him outside of that. And remember that we're focusing on those specific topics right now. Makes it appear I'm 100% anti-Thompson, anti-C, or anti-UNIX rather than against them in certain contexts or conditions while thinking better approaches were immediately apparent but ignored.
"The computing history that you're so fond of pointing to could help you here."
I've looked at it. A ton of systems were more secure or robust at the language level before INFOSEC was a big consideration. A number of creations like QNX and MINIX 3 achieved low-fault status fast while UNIX took forever due to bad architecture. Oberon Systems were more consistent, easier to understand, faster to compile, and eventually included a GC. NeXTSTEP & SGI taught it lessons for desktops and graphics. BeOS, like Concurrent Pascal before it, built into the OS a consistent, good way of handling concurrency to get great performance in that area. System/38 was more future-proof plus object-driven. VMS beat it for cross-language design, clustering, and right functions in the OS (eg distributed locking). LISP machines were more hacker-friendly, with easy modifications & inspections even to running software, with the same language from apps to OS. And so on.
The prior history gave them stuff to work with to do better. Hence, me accusing them. Most of the above are lessons learned over time building on aspects of prior history plus just being clever that show what would've happened if they made different decisions. If not before, at least after the techs showed superiority we should've seen more imitation than we did. Instead, almost outright rejection of all that with entrenched dedication to UNIX style, bad design elements, and C language. That's cultural, not technical, decision-making that led to all related problems.
> My side of this discussion keeps saying C's design is bad because it avoided good attributes of (insert prior art here) and has no better design because it's foundations effectively weren't designed. The counters, from two or three of you, have been that the specific instances of the prior art were unsuitable for the project due to specific flaws, some you mention and some not.
No, my counter in the specific bit that you are replying to here is that your history is wrong. Specifically, you said that C was initially so bad that they couldn't even write Unix in it. That statement is historically false - except if you're calling BCPL and B as "part of C" in some sense, which, given your further comments, makes at least some sense, though I still think it's wrong.
I'm not familiar enough with Modula or Oberon to comment intelligently on them. My reference point is Pascal, which I have actually used professionally for low-level work. I'm presuming that Modula and Oberon and that "type" of languages are similar (perhaps somewhat like you lumping BCPL and C together). But I found it miserable to use such a language. It can protect you from making mistakes, but it gets in your way even when you're not making mistakes. I would guess that I could write the same code 50% to 100% faster in C than in Pascal. (Also, the short-circuit logical operators in C were vastly superior to anything Pascal had).
So that's anecdote rather than data, but it's the direction of my argument - that the "protect you from doing anything wrong" approach is mistaken as an overall direction. It doesn't need for later practitioners to re-discover it, it needs to die in a fire...
... until you're trying to build something secure, or safety-critical, and then, while painful to use, it still may be the right answer.
And I'm sure you could at least argue that writing an OS or a network-facing application is (at least now) a security situation.
My position, though, is that these "safety-first" languages make everything slower and more expensive to write. There are places where that's appropriate, but if they had been used - if C hadn't won - we would be, I estimate, ten years further behind today in terms of what software had already been written, and in terms of general availability of computers to the population. The price of that has been 40 years of crashes and fighting against security issues. But I can't say that it was the wrong choice.
" Specifically, you said that C was initially so bad that they couldn't even write Unix in it. That statement is historically false "
Well, if you watched the Vimeo video, he looks at early references and compares side-by-side C with its ancestors. A lot of early C is about the same as BCPL & its squeezed version B. The first paper acted like they created C philosophy and design out of thin air based on B w/ no mention of BCPL. Already linked to it in another comment. Fortunately for you, I found my original source for the failed C attempt at UNIX which doesn't require a video & side-steps the BCPL/B issues:
You'll see in that description that the B -> Standard C transition took many intermediate forms. There were several versions of C before the final one. They were simultaneously writing UNIX in assembly, improving their BCPL variant, and trying to write UNIX in intermediate languages derived from it. They kept failing to do so. Ritchie specifically mentions an "embryonic" and "neonatal" C followed by this key statement:
"The language and compiler were strong enough to permit us to rewrite the Unix kernel for the PDP-11 in C during the summer of that year. (Thompson had made a brief attempt to produce a system coded in an early version of C—before structures—in 1972, but gave up the effort.)" (Ritchie)
So, it's a historical fact that there were several versions of C, Thompson failed to rewrite UNIX in at least one, and adding structs let them complete the rewrite. That's ignoring BCPL and B entirely. That they just produced a complete C magically from BCPL or B, then wrote UNIX, is part of C proponents' revisionist history. Reality is they iterated it with numerous failures. Which is normal for science/engineering and not one of my gripes with C. Just got to keep them honest. ;)
" I would guess that I could write the same code 50% to 100% faster in C than in Pascal. (Also, the short-circuit logical operators in C were vastly superior to anything Pascal had)."
Hmm. You may have hit sore spots in the language with your projects, or maybe it was just Pascal. Ada would've been worse. ;) Languages like Modula-3, Component Pascal, and recently Go [but not Ada] are usually faster to code in than C. The reasons that keep turning up are straightforward: designed to compile fast to maximize flow; default type-safety reduces hard-to-debug problems in modules; often fewer interface-level problems across modules or during integrations of 3rd-party libraries. This is why what little empirical work I've read comparing C, C++, and Ada kept showing C behind in productivity & with 2x the defects. As far as low level goes, the common trick was wrapping unsafe stuff in a module behind safe, simple interfaces. Then, use it as usual but be careful.
"... until you're trying to build something secure, or safety-critical, and then, while painful to use, it still may be the right answer."
Not really. Legacy software is the counterpoint: much stuff people build sticks around to become a maintenance problem. These languages are easier to maintain due to type protections countering common issues in maintenance mode. Ada is strongest there. The simpler ones are between Ada and C in catching issues but allow rapid prototyping due to less debugging and fast compiles. So, reasons exist to use them outside safety-critical.
"My position, though, is that these "safety-first" languages make everything slower and more expensive to write. "
In mine, they're faster and less expensive to write but more expensive to run at same speed if that's possible at all. Different, anecdotal experiences I guess. ;)
What I think is interesting is Intel is adding bounds-checking registers to their processors. That should eliminate a lot of the issues people complain about. (Except your program's footprint will be larger due to needing to manage bounds information.)
>Three people wrote a consistent, safer platform in it from OS to compiler to apps in 2 years.
I take it you didn't link to the OS because it was only ever a toy, correct? If it had been useful for anything, it would be worth linking to -- otherwise, you're better off if I know less about it.
Lilith wasn't popular because it didn't solve any problems that needed to be solved -- it was too slow to be useful as a microcomputer OS, and it wasn't designed for that anyway. Microcomputer OS's of the time were written in assembly, and assembly continued to be important until around the time of Windows 3.1. Not integrating well with assembly is an anti-feature; the fact that it would become unnecessary some 15 years later is irrelevant. C did the job that needed to be done; Modula-2 did the job that somebody thought was cool. Also, you have yet to give me a reason to believe that Lilith was somehow safer or better-designed than UNIX of the time, considering that security wasn't even really a thing in 1980.
That's not to mention it's [Pascal family] just poorly designed from a usability standpoint, with the language creators doing silly things like removing "for" loops from Oberon (a decision which both Clojure and Rust eventually admitted was bad). Lilith itself was "succeeded" by an OS that was written in an entirely different language and made no attempt to be backwards compatible (but portability is a red herring amirite?).
>I could've said that about C in its early years given it wasn't intended to be portable. Turns out that stripping almost every good feature out of a language until it's a few primitives makes it portable by default.
I guess it's not a big surprise then that C wasn't popular in its early years? The implementations of Cyclone and Popcorn were never even complete enough to write software on normal operating systems, much less to write UNIX.
>It means that, statistically, the other examples will have most of C's efficiency with little to none of its weaknesses
It means that, statistically, they will all have other huge, glaring weaknesses that you pretend don't exist...
>Least important, it means I picked the examples closest to C in features and efficiency because C developers' culture rejects anything too different. That includes better options I often reference.
If you have better options, why didn't you name them? Because they're equally bad if you spend even a few minutes thinking about it? (Lemme guess: Limbo, SML, Ada)
"I take it you didn't link"... "you have yet to give me"... "is a red herring amirite?"... "you pretend don't exist"... "why didn't you name them?"
The answer to all that is that my response to you was less thorough and referenced on purpose. That's due to the trolling style of your first comment that this one matches albeit with more information. A comment more about dismissal and challenge with little informational content didn't deserve a high-information response. My observation and recommendation is here though:
That is, there were specific languages in existence from the PL family, the Pascal tradition, the ALGOLs of Burroughs, ALGOL68, and so on that balanced many attributes. A number of features, from safety checks to stronger typing to interface protections, had already been proven to prevent problems or provide benefits. Thompson would've known that given MULTICS was one of the systems that provided evidence of that, even if it failed in other ways. He even re-introduced some problems that PL/0 prevented. Let's focus on the fundamental failure, though.
So, if I were him, I'd note that the consensus of builders of high-reliability systems and CompSci researchers is we need a language like ALGOL68 that balances various needs. I'd still need as much of that as possible on my crappy PDP. So, I'd subset a prior, safe language then make it compile with tight integration to the hardware due to resource constraints. Might even made similar tradeoffs they did although not in a permanent way. If safety checks wouldn't work yet, I'd turn off as many as I needed to get it to run. As hardware improved, I could turn more on. I'd keep well-written language definition and grammar as others showed to do. I might also borrow some syntax, macro's or easy parsing from LISP to make my job easier as Kay is doing and numerous Scheme teams did at the time. Keep language imperative and low-level, though.
Thompson had a different idea. There was a language that had no good qualities except raw performance on crap hardware. It was the opposite of its authors' design intent, to top it off. Thompson decided on it. Any gripes you have with Modula-2, Cyclone, Popcorn, etc apply at this point because the language, esp his B variant, wasn't good enough for about any job. As I'm advising for safe languages, it had to be modified to get stuff done. The first version of C was still crap to the point that UNIX was mostly assembly. They added structs to get information hiding/organizing, then were able to write UNIX in C. Both C and UNIX were simple enough to be portable as a side effect. Rest is history.
Almost any gripe you've had against safer languages of the time I can apply to C's predecessors, other than compilation difficulty. That you will dismiss them on grounds that they need modification for the job at hand, but don't do that for C, shows the bias and flaws of your arguments. I've called out everyone from McCarthy to Wirth to Thompson for stuff I thought were bad ideas. For-loop removal is a good example. I didn't settle against C until years of watching people's software burn over stuff that was impossible or improbable in better-designed, even lean, languages. Evidence came in even from empirical studies on real programmers: C showed up behind in every metric except raw performance, its history tells us why, and it logically follows that they should've built on better, proven foundations. Those that did got more reliable, easier to understand, easier to maintain systems.
Of course, don't just take my word for it: Thompson eventually tried to make a better UNIX with Plan 9, but more importantly a better language. He, Pike, and Griesemer all agreed on every feature in their language. That language was Go: a clone of the ALGOL68 and Oberon style with modern changes and updates. Took them a long time to learn the lessons of ALGOL that came years before them.
Kronos, the Soviet 32-bit workstation, was designed and built from hardware to all compilers, OS and miscellaneous applications by a small group of students in a couple of years. The only language they used was Modula-2.
Thanks for the link! Had no idea Russians built a Wirth-style system based on Lilith. There's almost a trend with them as they received the Oberons and especially Component Pascal better than most countries. My foray into Blackbox showed lots of Russian use.
There have to be attributes of the language and tooling that appeal to Russian programmers more than others for whatever reason. I wonder what those are, both aspects of Russian style and language that appeals.
My strategy for that sort of thing was wrapping all the stuff in unsafe modules that exported safe API's. Just do some basic checks as function is called then messy stuff is handled in its own little file of scary code.
I was doing this in other languages way after you were dealing with Modula-2 on even older hardware. So, I'm curious as to how you and others then handled that problem? What were the tactics?
I was too young and too distracted by the world of being young to focus on it enough back then to truly sit down and do that. By the time I got my Modula 2 compiler I was 13 or 14 years old and then high school happened and about 3 or 4 years later I had a PC running (pre 1.0) Linux, so. I never had to truly deal with it.