
By those criteria there's no meaningful C++ compiler/spec.


How so? There are compiler-agnostic C++ specs, and compiler devs try to be compatible with them.

What the GP is suggesting is that the Rust compiler should be written first and then a spec codified after the fact (I guess just for fun?).


> compiler devs try to be compatible with it.

You have to squint fairly hard to get here for any of the major C++ compilers.

I guess maybe someone like Sean Baxter will know the extent to which, in theory, you can discern the guts of C++ by reading the ISO document (or, more practically, the freely available PDF drafts; essentially nobody reads the actual document, not even Microsoft bothers to spend $$$ to buy an essentially identical PDF).

My guess would be that it's at least helpful, but nowhere close to enough.

And that's ignoring the fact that the popular implementations do not implement any particular ISO standard. In each case their target is just C++ in some more general sense; they might offer "version" switches, but they explicitly do not promise to implement the actual versions of the ISO C++ programming language standard denoted by those versions.


"Cancel correctness" makes a lot of sense, because it puts the cancellation in some context.

I don't like the "cancel safety" term. Not only is it unrelated to Rust's concept of safety, it's also unnecessarily judgemental.

Safe/unsafe implies there's a better or worse behavior, but what is desirable for cancellation to do is highly context-dependent.

Futures awaiting spawned tasks are called "cancellation safe", because they won't stop the task when dropped. But that's not an inherently safe behavior – leaving tasks running after their spawner has been cancelled could be a bug: piling up work that won't be used, and even interfering with the rest of the program by keeping locks locked or ports used. OTOH a spawn handle that stops the task when dropped would be called "cancellation unsafe", despite being a very useful construct specifically for propagating cleanup to dependent tasks.
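The "stops the task when dropped" behaviour can be sketched with std-only Rust (a toy stand-in for an async runtime's spawn handles; the `AbortOnDrop` name and demo function are made up for illustration): dropping the handle signals a flag, and the worker exits when it sees it.

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
use std::thread;
use std::time::Duration;

// Hypothetical handle that propagates cancellation: dropping it stops the
// worker. By the common naming this would be "cancellation unsafe", yet it's
// exactly the cleanup-propagating behaviour you often want.
struct AbortOnDrop {
    stop: Arc<AtomicBool>,
    join: Option<thread::JoinHandle<()>>,
}

impl Drop for AbortOnDrop {
    fn drop(&mut self) {
        self.stop.store(true, Ordering::Relaxed); // signal cancellation
        if let Some(j) = self.join.take() {
            let _ = j.join(); // wait for the worker to observe it
        }
    }
}

// Returns true if the worker observed cancellation and exited cleanly.
fn demo_abort_on_drop() -> bool {
    let stop = Arc::new(AtomicBool::new(false));
    let finished = Arc::new(AtomicBool::new(false));
    let (s, f) = (stop.clone(), finished.clone());
    let join = thread::spawn(move || {
        while !s.load(Ordering::Relaxed) {
            thread::sleep(Duration::from_millis(1)); // stand-in for real work
        }
        f.store(true, Ordering::Relaxed);
    });
    drop(AbortOnDrop { stop, join: Some(join) }); // cancellation propagates here
    finished.load(Ordering::Relaxed)
}

fn main() {
    assert!(demo_abort_on_drop());
}
```

The mirror-image handle, which detaches and leaves the worker running on drop, is what gets called "cancellation safe", which is exactly the naming asymmetry being complained about.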


Another aspect is that the majority of projects that keep using C, do it specifically to maximize performance or low-level control (codecs, game engines, drivers, kernels, embedded).

For such projects, a GC runtime goes against the reason why they used C in the first place. Rust can replace C where Fil-C can't.

A technically memory-safe C with overhead is not that groundbreaking. It has already been possible with sandboxing and interpreters/VMs.

We've had the tradeoff between zero-overhead-but-unsafe and safer-but-slower languages forever, even before C existed. Moving C from one category to the other is a feat of engineering, but it doesn't remove the trade-off. It's a better ASAN, but it doesn't miraculously fix C.

Most projects that didn't mind having a GC and runtime overhead are already using Java, C#, Go, etc. Many languages are memory-safe while claiming to have almost C-like performance if used right.

The whole point of Rust is getting close to removing the fast-or-safe tradeoff, not merely moving to the other side of it.


There are so many programs written in C or C++, but not in any of the languages you cite, that run just fine in Fil-C.

The thing that is interesting about Fil-C isn’t that it’s a garbage collected language. It’s that it’s just C and C++ but with all of the safety of any other memory safe language so you can have a memory safe Linux userland.

Also, Fil-C isn’t anything like ASAN. ASAN isn’t memory safe. Fil-C is.


Many people know and like C. Many companies have access to plenty of C talent, but no Rust talent.

These are two great reasons to try Fil-C instead of Rust. It seems that many Rustaceans think in terms of their own small community and don’t really have a feel for how massive the C (or C++) universe is and how many different motivations exist for using the language.


I agree, but the converse is also true and is where the value of this DARPA grant lies:

There's a lot of legacy C code that people want to expand on today, but they can't because the existing program is in C and they don't want to write more potentially unsafe C code or add onto an already iffy system.

If we can rewrite in Rust, we can get safety, which is cool but not the main draw. The main draw, I think, is you now have a rust codebase, and you can continue on with that.


Many projects keep using C for cultural and human reasons as well. They could have long since moved to a memory-safe language, but it goes against their world view, even when proven otherwise.


This has been recognised as the next important milestone for Rust and there's work towards making it happen. The details are still uncertain, because move constructors are a big change for Rust (it has promised that the address of owned objects is meaningless, and that they can simply be memcpy'd to a new address).


The Rust folks want a more usable Pin<> facility for reasons independent of C++ support (it matters for the async ecosystem) and Pin allows objects to keep their addresses once pinned.
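The address-stability point can be illustrated with a std-only toy (the function name is mine; note that for an `Unpin` type like `u64` this is really `Box`'s behaviour, and `Pin` is what turns it into an API-level promise for `!Unpin` types such as self-referential futures): moving the pinned handle does not move the pointee.

```rust
use std::pin::Pin;

// Returns the pointee's address before and after moving the pinned handle.
fn pinned_addresses() -> (usize, usize) {
    let pinned: Pin<Box<u64>> = Box::pin(42);
    let before = &*pinned as *const u64 as usize;
    let moved = pinned; // moving the *handle* is fine; the heap value stays put
    let after = &*moved as *const u64 as usize;
    (before, after)
}

fn main() {
    let (before, after) = pinned_addresses();
    assert_eq!(before, after); // same address: the pinned value never moved
}
```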


Carbon devs explicitly say use Rust if you can.

Non-trivial C++ programs depend on OOP design patterns, webs of shared-mutable objects, and other features which Rust doesn't want to support (Rust's safety also comes from not having certain features that defeat static analysis).

Rust really needs things written the Rust way. It's usually easier with C, because it won't have clever templates and deep inheritance hierarchies.

Even if your goal is to move to Rust, it may make sense to start with Carbon or Circle to refactor the code first to have more immutability and tree-like data flow.


In Cloudflare's case, Rust is much more productive.

Rust modules can handle any traffic themselves, instead of a split between native code and Lua that is too slow to do more than config (Lua is relatively fast for a scripting language, but on the critical path it was a "peanut butter" slowdown adding latency).

At the size and complexity of a server serving 20% of the Web, Lua's dynamic typing was scary. Modules in Rust can enforce many more requirements.

Rust's solid dependency management also helps share implementations and config logic across modules and even different products, instead of everything having to go literally through the same server.

https://blog.cloudflare.com/20-percent-internet-upgrade/


> instead of a split between native code and Lua that is too slow to do more than config

In case you’re not aware, Cloudflare ran on LuaJIT (not just for config) until not that long ago.

https://news.ycombinator.com/item?id=23856875


Five years ago is a pretty long time, and kornel does (or did) work at Cloudflare.


This is more nuanced in Rust's case.

Rust is trying to systemically improve safety and reliability of programs, so the degree to which it succeeds is Rust's problem.

OTOH we also have people interpreting it as if Rust was supposed to miraculously prevent all bugs, and they take any bug in any Rust program as a proof by contradiction that Rust doesn't work.


> Rust is trying to systemically improve safety and reliability of programs, so the degree to which it succeeds is Rust's problem.

GNU coreutils first shipped in what, the 1980s? It's so old that it would be very hard to find the first commit. Whereas uutils is still beta software which didn't ask to be representative of "Rust", at all. Moreover, GNU coreutils are still sometimes not compatible with their UNIX forebears. Even considering this first, more modest standard, it is ridiculous to hold this software to it, in particular.


You would not be able to find the first commit. The repositories for Fileutils, Shellutils, and Texutils do not exist, at least anywhere that I can find. They were merged as Coreutils in 2003 in a CVS repository. A few years later, it was migrated to git.

If anyone has original Fileutils, Shellutils, or Textutils archives (released before the ones currently on GNU's ftp server), I would be interested in looking at them. I looked into this recently for a commit [1].

[1] https://www.mail-archive.com/coreutils@gnu.org/msg12529.html


In this case I agree. Small, short-running programs that don't need to change much are the easy case for C, and they had plenty of time to iron out bugs and handle edge cases. Any difficulties that C may have caused are a sunk cost. Rust's advantages on top of that get reduced to mostly nice-to-haves rather than fixing burning issues.

I don't mean to tell Rust uutils authors not to write a project they wanted, but I don't see why Canonical was so eager to switch, given that there are non-zero switching costs for others.


>OTOH we also have people interpreting it as if Rust was supposed to miraculously prevent all bugs, and they take any bug in any Rust program as a proof by contradiction that Rust doesn't work.

Yeah, that's such a tired take. If anything this shows how good Rust's guarantees are. We had a bunch of non-experts rewrite a sizable number of tools that had 40 years of bugfixes applied. And Canonical just pulled the rewritten versions in all at once, and there were mostly just a few performance regressions on edge cases.

I find this such a great confirmation of the Rust language design. I've seen a few rewrites in my career, and it rarely goes this smoothly.


It might be a bit of bad publicity for those who want to rewrite as much as possible in Rust. While Rust is not to blame, it shows that just rewriting something in Rust doesn't magically make it better (as some Rust hype might suggest). Maybe Ubuntu was a bit too eager in adopting the Rust Coreutils, caring more about that hype than about stability.


> Rust is not to blame

Isn't that an unfalsifiable statement until the coreutils get written in another language and can be compared?


> Isn't that an unfalsifiable statement

Sounds pretty axiomatic: Rust is not to blame for someone else's choice to ship beta software?


> OTOH we also have people interpreting it as if Rust was supposed to miraculously prevent all bugs

That is the narrative that Rust fanboys promote. AFAIK Rust could be useful for a particular kind of bug (memory safety). Rust programs can also have coding errors or other bugs.


>That is the narative that rust fanboys promote.

Strawmanning is not a good look.


People in Europe don't have the automatic anti-regulation sentiment that the US has. Regulations, at least from a consumer perspective, seem to be working pretty well in the EU.

- My mobile operator wanted to charge me $6/MB for data roaming, until the anti-business EU regulation killed the golden goose. Roaming is free across EU. The mobile operator is still in business.

- USB-C not just on iPhone, but also all the crappy gadgets that used to be micro-USB. Consumer prices on electronics probably rose by $0.01 per unit.

- Chip & pin and NFC contactless payments were supported everywhere many years before ApplePay adopted them. European regulators forced banks to make fraud their problem and cooperate to fix it.

- The card payment system got upgraded despite card interchange fees being legally capped at ~0.3%. The bureaucrats killed the innovative business model of ever-increasing merchant fees handed back to card owners as cashback, which made everyone else, paying the same prices in cash, the suckers subsidising the card businesses.

- Apple insinuates they only give 1 year of warranty, but it magically becomes 2 years if you remind them they're in the EU.


> - Apple insinuates they only give 1 year of warranty, but it magically becomes 2 years if you remind them they're in the EU.

3 actually, if bought after 2021


> How do we boost innovation and deliver cutting-edge products to Europe while navigating complex and untested new rules?

Ask your lawyers? They can parkour through the most complex laws when you need European tax loopholes.


What if you just tried making products that don’t seem to violate the rules on their face?

It’s not like the DMA outlaws software. It deals with certain practices, and makes certain business models pretty untenable.

But it doesn’t just ban everything.


It's hard to find good data sources for this, especially since StackOverflow is in decline[1].

IEEE's methodology[2] is sensible given what's possible, but the data sources are all flawed in some ways (that don't necessarily cancel each other out). The number of search results reported by Google is the most volatile indirect proxy signal. Search results include everything mentioning the query, with no promise of being a fair representation of 2025. People using a language rarely refer to it literally as the "X programming language", and it's a stretch to count all publicity as "top language" publicity.

TIOBE uses this method too, and has the audacity to display the result as popularity with two decimal places, but their own historical data shows the "popularity" of C dropping by half over two years, then doubling the next year. Meanwhile, actual C usage didn't budge at all. This method has a +/- 50% error margin.

[1]: https://redmonk.com/rstephens/2023/12/14/language-rankings-u... [2]: https://spectrum.ieee.org/top-programming-languages-methodol...


By far the most useful and helpful is job ads: it literally defines the demand side of the programming language market.

Yes, that does not show us how much code is running out there, and some companies might have huge armies with very low churn, so the COBOL stacks in banks don't show up, but I can't think of a more useful and directly measurable way of understanding a language's real utility.


> the most useful and helpful is job ads

That would certainly be the case, if it were not for the fact that [fake job postings][1] are a thing.

[1]: https://globalnews.ca/news/10636759/fake-job-postings-warnin...


Is there a reason to believe this would skew results?

i.e. Are you assuming (insinuating) that jobs for some programming languages are more likely to be fake?


I would assume so. I expect there to be a lot of job postings looking for "sexy" technologies to create the impression that those companies are growing and planning for the future. And conversely I wouldn't expect any job postings for old "streets behind" technologies like COBOL to be fake, as they wouldn't help with such signalling.


Yes, to your point: COBOL, which ranks very low here, is still fundamental to the infrastructure of several major industries, with some sources [1] reporting that it is used in:

43% of all banking systems.

95% of all US ATM transactions.

80% of all in-person credit card transactions.

96% of travel bookings.

This may very well dramatically change in the next few years with such an emphasis on enterprise AI tools to rewrite large COBOL repositories. [2]

[1] https://www.pcmag.com/articles/ibms-plan-to-update-cobol-wit...

[2] e.g. Blitzy https://paper.blitzy.com/blitzy_system_2_ai_platform_topping...


I can only speak to the two bigger German banks (i.e., Sparkasse and VR banks), but if you look at their outsourced development providers (Atruvia and Sparkasse Informatik), they're still offering incentives for their apprentices to learn COBOL, especially in the German dual apprenticeship programs which they can steer more easily than university courses. My wife has been doing COBOL for one of them since 2012, and the demand has never diminished. If anything, it's increased because experienced developers are retiring. They even pull some of these retired developers back for particularly challenging projects.


Sparkasse and VR aren't the two largest German banks. DB is at least double the size of Commerzbank which is again 100mn in assets ahead of DZ. I don't find it all that surprising that these small banks are still trying to keep their legacy systems alive, but it's not the case for the bigger boys. (Source: work for several of them)


You are right if we only talk about assets. Should've clarified I meant more in regard to retail customers and branches.


Oh, right, consumer banks. Yes I can imagine they're all extremely legacy bound. They're a very small percentage of banking, though.


Cobol is used in pretty much all enterprise legacy systems.

But "used in" doesn't mean that it's actively being developed by more then a tiny team for maintaining it.

As the graph we're commenting on is mostly talking about popularity/most used, COBOL is never going to rate higher, because for every one COBOL dev there are more than 100 Java devs employed by the same company.


That's a pretty wild claim. What's legacy for you? I'd consider legacy to be e.g. the J2EE crap running on web[sphere|logic], which holds most of the points in that league table vs COBOL.


Legacy software, to me, is whatever the company that employs me says is legacy software.

Pretty much every business I've worked at to date has had such legacy software, which was inevitably still used in some contexts.

It's not always obvious, because - following with the previous example numbers - only 1-2 Java devs will have to interact with the legacy software again, hence from the perspective of the remaining 98, Cobol doesn't exist anymore.


If they're talking about Cobol, it's usually systems originating before the early 90s that haven't been completely rewritten.

J2EE would be late 90s and 2000s.


In retail banking I'm sure that this could be true. Working in investment banking, I never saw a single COBOL application, or had to have my C++/Java/$MODERNLANGUAGE code interact with one.


Corp bank here: everyone has rumours about COBOL systems, but no one I've ever spoken to has seen them, interacted with them, or has any other evidence that these really exist anymore either.


Me neither.

But I asked for a bank statement from my old savings account a few years ago, and it took two weeks to print out, printed in monospace dot matrix.

Or the betting company I was a customer of, which suspends betting every day at 6:30am for an hour of daily maintenance. Ironically, they would accept bets on football matches being played at the time, but the system was shut down.

I suspect both are run on COBOL.


You haven’t seen or heard them because they are abstracted away by APIs, circuit breakers and proxies. Almost ALL banks, credit card companies, travel systems and other high throughput transaction systems run on mainframe that is written in COBOL.


I think the issue here is that people working in fintech don't seem to come across these systems much, if at all - if you know one specifically, please tell us.


It's still there at the accounting/backend level. Automated Financial Systems' Level 3 and its replacement, Vision, are commercial loan systems.

LVL3 is pure COBOL. It has recently been deprecated, but many banks own the code and are still self-hosting it, along with its IBM green-screen support.

Vision is a Java front end in front of an updated COBOL backend. When your reputation is based on your reliability and long-term code stability, at what point do you risk making the conversion, versus training new developers to work on your existing system?

https://www.linkedin.com/jobs/view/business-analyst-afs-visi...


No, we are not afraid of our own systems. The fabled computer system that everyone is too scared to touch doesn't exist (I work in payment processing). There are layers of controls well outside these systems which provide the safety nets (e.g. settlement/reconciliation controls).

If the cobol is still there, it’s not due to risk. If anything, the cobol is a much higher operational risk than replacing it.


Analogously, GDSes like SABRE still ran on mainframes until very recently (c. 2023) [0]. SABRE was written in some combination of assembly and some kind of in-house dialect of PL/I, if I recall.

[0] https://www.theregister.com/2022/04/11/gds_gets_over_histori...


I worked briefly at a company that wrote applications that interacted with bank mainframes. Think end point bank teller systems and in branch customer/account management. They definitely do exist - every major bank has a mainframe written in (usually) cobol.

But it's very abstracted, part of our main product offering WAS abstracting it. On top of our ready to use applications, we offered APIs for higher-level data retrieval and manipulation. Under the hood, that orchestrates mainframe calls.

But even then, there could be more levels of abstraction. Not every bank used screen-level mainframe access. Some used off-the-shelf mainframe abstractors like JxChange (yes, there's a market for this).

Fintech would be even more abstracted, I imagine. At that point you can only interact with the mainframe a few levels up, but it's still there. Out of sight.


Yeah when I worked in investment banking it was VBA and Java everywhere, never saw or heard of COBOL.


    > Working in investment banking, I never saw a single COBOL application
What was the back office settlement or wire transfer system written in? There is a good chance that some part of them was written in COBOL. And while Bloomberg terminals are a vendor product, for a bloody long time, many of their screens had some COBOL.

Also, lots of quantitative software at i-banks use LINPACK or BLAS, which use FORTRAN.


Well, I had a very badly specified project to write a library for our back office systems to do Swift payments from our C++ applications, via COM. There was no obvious COBOL involved, on either side, but it has to be said that the whole use case for the library was very murky. And it never worked, due to the lack of spec, not the languages.


First hand knowledge: ERGO and MunichRE both still have a lot of COBOL doing the core business. You will most likely never run into the systems, because they just run batch jobs, sometimes configured via a "nice" web UI: you configure your job, submit it, and the next morning you have your report. That's why you never actually see COBOL.


1. Not all roles are advertised. I've actually only been interviewed for two of the jobs I've ever had, both at the same place: my current employer, because it's a public institution and so it always advertises and interviews for jobs even if it has an internal candidate who is likely to be a good fit. In fact the first of those jobs was basically my shape on purpose; another candidate was an equally good fit and they hired both of us.

Everywhere else people hired me because they knew who I was and what I could do and so in place of an "interview" maybe I grab lunch with some people I know and they explain what they want and I say yeah that sounds like a job I'd take and maybe suggest tweaks or focus changes. No shortlist of candidates, no tech interview, no tailoring a CV to match an advert. Nothing -> Lunch or Drinks -> Job offer.

So that can cause some distortion, especially for the niche languages where there are like six experts and you know them - an advert is futile there.


> measurable way of understanding a languages real utility

It feels like that metric misses "utility" and instead comes from a very American (or capitalistic maybe is better) mindset.

What about Max/MSP/Jitter? Huge impact in the music scene, probably a very small number of jobs available, so it'd rank fairly low while it's probably the top media/music language out there today. There are tons of languages that provide "the most utility for their domain" yet barely have any public job ads at all.

I think such metric would be useful to see the "employability of someone who knows that language" if anything, but probably more pain than gain to link "# of job ads" with "utility".


Thinking about how to measure this properly, why not just use a 30-day moving average of daily downloads from each package repository?

… yes CI would be a lot of these downloads, but it’s at least a useful proxy
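Assuming such per-day counts were available, the smoothing itself is trivial; a sketch with made-up numbers:

```rust
// Trailing moving average over a fixed window. CI-driven download spikes on a
// single day get smoothed out, while sustained usage shows through.
fn moving_average(daily: &[u64], window: usize) -> Vec<f64> {
    daily
        .windows(window)
        .map(|w| w.iter().sum::<u64>() as f64 / window as f64)
        .collect()
}

fn main() {
    // Toy numbers, not real registry data; a real run would use window = 30.
    let daily = [100, 110, 90, 105];
    assert_eq!(moving_average(&daily, 2), vec![105.0, 100.0, 97.5]);
}
```

The harder problem is the data itself (registries rarely publish per-day, bot-filtered counts), not the arithmetic.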


Yeah except job adverts have enormous lag behind what's actually popular. For example we used Rust quite a lot at my previous company but we didn't advertise for Rust developers at all.

Also then you're looking at which languages were popular in the past whereas the interesting stat is which languages are being used to start new projects.


Interesting might not be the same as useful.

If I'm trying to figure out which language to learn next, knowing what I can get paid for might be more useful, even if it's not that "interesting".

If lots of projects are starting up in Rust, but I can't get interviews because nobody is advertising, how useful is learning Rust?


Well, we have to define what a language's popularity means. Because Rust is surely more 'hyped' than Java, but Java has at least an order of magnitude more developers/software written, etc.

So in which meaning do you use 'popular'?


Ideally we'd like to know both, as they tell us different things.


    > we used Rust quite a lot at my previous company but we didn't advertise for Rust developers at all.
How did you find Rust developers when you needed to hire?


Find developers. Tell them they get to use Rust. You now have Rust developers.


Find a CS program that teaches Rust and hire their graduates.


Existing C++ developers learned Rust.


Plus TIOBE had Perl enter the top 10 suddenly this year but I do not see any new developers. And Ada too! Where are all those Ada programmers?



It's more like people in golf suits agree on corruption schemes rather than actual devs making decisions.


Which nonetheless reveals it is more relevant than others.


https://pkgstats.archlinux.de/packages?compare=ada,gcc,go,ja...

Ada seems pretty popular on Arch

This data is kinda worthless for popularity contests, since packages may get pulled in as dependencies of AUR packages, but it gives a solid insight into which languages are foundational

I wish the same was available for other distros

You can do the same with docker images

    curl -s https://hub.docker.com/v2/repositories/library/python/ | jq -r ".pull_count"
    8244552364

    curl -s https://hub.docker.com/v2/repositories/library/golang/ | jq -r ".pull_count"
    2396145586

    curl -s https://hub.docker.com/v2/repositories/library/perl/ | jq -r ".pull_count"
    248786850

    curl -s https://hub.docker.com/v2/repositories/library/rust/ | jq -r ".pull_count"
    102699482


"Top Languages" doesn't mean "better" nor does it mean "best"


That’s a C++ URL parser library, has nothing to do with the programming language.


You want gcc-ada for the programming language.


>>Perl enter the top 10 suddenly this year but I do not see any new developers.

Perl is almost as active as Javascript. And more useful than Python.

https://metacpan.org/recent

I write Perl to do all sorts of things every week. It's strange it's not in the top 5 list.


You're joking right?


If you look at the programming language list, apart from Python and Java, most are targeted at specific platforms (databases, browsers, embedded systems) or tech (SQL for databases).

The general purpose programming languages today are still Python, Java, and Perl. Make of this whatever you will.

Larry Wall at one point said that if you make something very specific to a use case (like awk, sed, PHP etc.), it sort of naturally falls out of general purpose use.

It's just that Kotlin, Rust, Go, SQL, Julia, Javascript etc. are not general purpose programming languages.


That was an active debate ... 15 years ago


Yep. And the sources are too often self-reinforcing and self-referential.

Use the "right"/better tool from the toolbox, the tool you know best, and/or the tool that the customer wants and/or makes the most money. This might include Ada[0] or COBOL[1]. Or FORTH[2] or Lua[3]. Popularity isn't a measure of much of anything apart from SEO.

0. https://www2.seas.gwu.edu/~mfeldman/ada-project-summary.html

1. https://theirstack.com/en/technology/cobol

2. https://dl.acm.org/doi/pdf/10.1145/360271.360272

3. https://www.freebsd.org/releases/12.0R/relnotes/#boot-loader


> It's hard to find good data sources for this

I like this:

https://madnight.github.io/githut/#/pull_requests/2024/1

It gives you a count of public repos on GitHub by language used, going back to 2012.


This is much better, and aligns with the StackOverflow survey's "what are you working on in your free time".


Perhaps the best source would now be the statistics of LLM queries, if they were available.

Edit: I see they raise this point at length themselves in TFA.

