From what I've gathered (8-bit-era CPU and console hardware videos), the concept of programming has evolved almost beyond recognition. Nowadays you use general-purpose languages to interact with a system or a set of virtual devices (iOS, its subsystems, etc.), whereas in the Game Boy era the system was a set of concrete hardware chips connected by buses, and you used assembly mnemonics to orchestrate them with low-level byte writes. My conclusion is that you don't understand it because it's a very alien-looking framework. I'd also bet $5 that in 20 years the changes won't be as dramatic as the 80s-to-00s shift.
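To make the "orchestrate concrete chips over a bus" point concrete: on the Game Boy, the LCD control, scroll position, and palette were just bytes at fixed memory-mapped addresses (documented in the community Pan Docs). Below is a minimal sketch of that style of programming, written in C for readability rather than the original assembly; the register addresses are the documented ones, but the init_display routine and the specific values are only illustrative assumptions.

```c
#include <stdint.h>

/* Memory-mapped Game Boy I/O registers (addresses per the public Pan Docs).
   Original games did the same thing with LD instructions in assembly:
   "programming" meant writing particular bytes to particular chip registers. */
#define LCDC (*(volatile uint8_t *)0xFF40)  /* LCD control            */
#define SCY  (*(volatile uint8_t *)0xFF42)  /* background scroll Y    */
#define SCX  (*(volatile uint8_t *)0xFF43)  /* background scroll X    */
#define BGP  (*(volatile uint8_t *)0xFF47)  /* background palette     */

/* Hypothetical init routine: values are typical but illustrative only. */
void init_display(void)
{
    BGP  = 0xE4;   /* palette: shades from darkest to lightest          */
    SCY  = 0;      /* reset background scroll                           */
    SCX  = 0;
    LCDC = 0x91;   /* bit 7: LCD on, bit 4: tile-data select, bit 0: BG on */
}
```

The point is that there is no API or driver in between: the "message" to the video chip is literally a byte landing on the bus at a known address.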
> I'd also bet $5 that in 20 years the changes won't be as dramatic as the 80s-to-00s shift.
There's been a "dramatic change" between the 80s and the 00s? If so, I must have missed it. I look at, say, the IBM 1620 versus the machines designed in the 1980s, and that is change to me. By the 1980s, essentially everything we use now in personal-computing architecture had already been invented and put to use. By that point everything had been so homogenized and regularized that I really don't see anything new that has happened since.
Perhaps some change is going on now. AMD's hUMA seems to me to be the first major departure from the 1980s machine model in twenty-five years or so, but even then, I'm not sure how much of that is "new": IBM has had heterogeneous CPUs on its mainframes for quite some time, so conceptually it's just another idea making its way from the mainframe world to the PC world.