How does Computer Science teach the basics? Doesn't most of the CS curriculum assume a Turing machine with infinite tape?
At my university, most of the classes the GP is talking about (logic design, microprocessor implementation, OS-level programming) were in the Electrical Engineering curriculum. The lowest level that the CS curriculum got to was writing a parser for a programming language.
I learned about computer architecture in my CS curriculum. There seems to be great variance in these courses, so it’s hard to assume your experience holds everywhere. Mine had a good deal of practical application, prefaced by theory. I know people who went to school and didn’t learn some “basic” things simply because their curriculum didn’t prioritize them.
As time goes by, the topics to focus on also change (should we have Machine Learning courses now?), so that’s to be expected.
This sort of gatekeeping is a big part of why a lot of people are put off by engineers - "oh, you don't know X? You're clearly not a software engineer".
There's a reason abstraction layers exist; perhaps you should try to figure out why your mental model of software doesn't account for that.
In my experience, a substantial fraction of programmers learn the absolute minimum needed to get a job.
Then they either learn on the job or ask the same questions over and over again, never learning. The latter are the really annoying people to work with.
You can be a perfectly good web developer, for example, without knowing any of that. It would be good to know how it works, but you don’t need to go that deeply into it unless you’re interested in it. And if you are, or if you really do need to know it, this is a learning resource so you can do exactly that.
Some types of low-level programming require you to know about CPU architectures and opcodes, but certainly not all types of programming. You can be an excellent web dev without knowing how a CPU works, for example.
How is that relevant? A computer architecture class explains the high-level concepts of an ISA and micro-architecture, covering features from across the domain.
There is never a strong focus on x86, and rote memorization has no place in a higher-education institution.
Are you autistic or something? Serious question. You seem to have a hard time understanding other perspectives, and you missed obvious sarcasm/hyperbole there.
Often it's the case that those who learnt to program at college/university at the age of 20 are unsettled by peers who, at 30, have two decades under their belt, having spent their fertile years, from age 10 or earlier, programming. By 20 I'd already been writing code every week without fail for a decade. Now I'm at > 70%.