
> dominant memory model in use by, well, everybody.

"...by, well, x86"

Fixed that for you.



No, you didn't. It's the C++ memory model, which was borrowed by C, then by every compiler IR targeting something in the C/C++ space, and then by every other language that decided to support language-level atomics.

Your mistake is thinking solely in terms of the hardware memory model. Indeed, if you care only about x86 (and ignore the potential for compiler optimizations), most of the variety provided by the C/C++ memory model is unnecessary, as every operation on x86 (well, except for the exceptions) is inherently a release/acquire operation.

Please do read up on the C/C++ memory model added in C11 and C++11; it may help you understand this space better.
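
If you want a concrete taste of the model, here's a minimal sketch (the names payload and ready are purely illustrative) of release/acquire message passing with C++11 atomics. On x86 both atomic operations compile to ordinary moves, yet the orderings still constrain the compiler on every target:

    #include <atomic>
    #include <cassert>
    #include <thread>

    int payload = 0;                 // plain data, published via the flag
    std::atomic<bool> ready{false};

    void producer() {
        payload = 42;                                  // write the data first
        ready.store(true, std::memory_order_release);  // then publish it
    }

    void consumer() {
        while (!ready.load(std::memory_order_acquire)) // spin until published
            ;
        assert(payload == 42);  // acquire pairs with release: the write is visible
    }

    int main() {
        std::thread t1(producer), t2(consumer);
        t1.join();
        t2.join();
    }

Weaken both orderings to memory_order_relaxed and the assert is allowed to fire, and not only on ARM or POWER: even targeting x86, the compiler itself is free to reorder the stores.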


So your incomplete explanations are based on a single language model instead of a single processor model? Sorry, that's a distinction without a difference. It's no excuse for misleading people who might use other languages or processors. And stop with the implied insults about my level of knowledge, when you're the one clearly fixated on only one environment. That's both rude and stupid.

I've worked on many CPU architectures, on NUMA and COMA systems long before most people even knew such things existed, at many different levels from the first instruction after an exception (or restart) on up. I say that not to make my own appeal to authority, but to refute yours and to underscore that these "irrelevant" details are in fact highly relevant and important to some of us out here in the wider world.

I haven't been alone in any of those things. There are still many programmers working in "exotic" environments where these things matter, and we both rely on their work every day. You should at the very least qualify your statements to say that they're only true for application-level programmers like yourself.


> So your incomplete explanations are based on a single language model instead of a single processor model?

It's the language memory model incorporated by all major programming languages. That you are unaware of this tells me that you are a hardware engineer, not a software engineer.

> I've worked on many CPU architectures

And I've worked on computer architectures where it's not clear how to even translate "volatile" correctly, because the memory system is that weird. I didn't bring that up before because I'm not interested in appeals to authority until people start accusing me of being an idiot who doesn't know what they're talking about. I could bring up more bona fide credentials, but what's the point? You've already dismissed my expertise.

> You should at the very least qualify your statements to say that they're only true for application level programmers like yourself.

I did. Every single message, in fact:

> what most programmers would need to understand about memory ordering

> Furthermore, my focus is on a software memory model, not the hardware memory model.

> It's the C++ memory model

And actually, I would go further and suggest that kernel programmers might be better served by adopting the language memory model offered by their compiler rather than insisting on rolling their own and yelling at the compiler when they mess it up.
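
As a rough sketch of what that looks like (illustrative names, not actual kernel code), take a statistics counter, which hand-rolled code typically builds from volatile plus explicit barriers. With the language model you state exactly the ordering you need and let the compiler pick the cheapest correct instructions per target:

    #include <atomic>

    std::atomic<unsigned long> hits{0};

    // Incrementing a stats counter needs atomicity but no ordering,
    // so relaxed suffices; the compiler emits a single lock-prefixed
    // add on x86 and the appropriate sequence elsewhere.
    void count_hit() {
        hits.fetch_add(1, std::memory_order_relaxed);
    }

    unsigned long read_hits() {
        return hits.load(std::memory_order_relaxed);
    }

C11's <stdatomic.h> spells the same thing atomic_fetch_add_explicit(&hits, 1, memory_order_relaxed), so the point holds for C just as well.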


> You've already dismissed my expertise.

That's pretty rich, since you were the one who jumped in to dismiss others'.

> you are a hardware engineer, not a software engineer

Incorrect. I've merely worked close enough to the hardware to understand why these things matter. Someone has to support the abstraction on which people like you depend, which requires understanding both the abstraction and the domain where it doesn't exist yet. BTW, that's why kernel folks don't rely on your abstraction; that would be a circular dependency.

I suggest that this whole fracas could have been avoided if you weren't so prone to making assumptions (including that one about me), over-generalizing from your own experience, and meeting any disagreement with ever-increasing levels of condescension. I'm sure you're good at what you do, but try to accept that others are also good at what they do, and got that way by learning about things you consider irrelevant or exotic. All I was trying to do, before you decided to play "I'm smarter", was share some of that information for the next generation of system programmers.


> That's pretty rich, since you were the one who jumped in to dismiss others'.

Where did I do that? You were the one who jumped in with "you suppose the authors spent their time writing that for no reason?", to which I directly responded that no, I did not. My goal was to produce a smaller, more concise comment that could reasonably be offered up as something that "every programmer should know" (with particular emphasis on the word "every"), and which, by its very nature, ought to be incomplete.



