You could think of it as putting a low-pass filter on some of the good ideas from the ’60s and ’70s, as computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture ... So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.
The history of mass computing involves numerous 'bottleneck' events in which a jump in usage was bought at the cost of hardware and software quality. You had the first blast of cheap minis available outside military and academic environments. Then you had cheap home computers with limited BASIC implementations. That was followed by web apps scripted in half-assed '90s JavaScript and served from cramped mass-hosting servers. Then you had smartphones -- iOS using a somewhat spruced-up but constrained version of the decades-old NeXTSTEP, and Android using an outdated, pared-down version of Java.
This is pretty much what "Worse Is Better" is about. Cheap, readily available software that runs on cheap, readily available hardware is always going to have a huge head start.
Unix is another example: a worse reimplementation of an earlier system (Multics), scaled down to fit the hardware constraints of its day.
The thing is, though, the security model of Multics would be a much better fit for today's security needs, but we don't have it because the hardware that could run it was too expensive 40 years ago, which seems crazy when I think about it. Sometimes it feels like the industry as a whole is no longer ambitious, that building fundamentally better systems is no longer considered important. It's nice that you can run Unix on your phone, but I would like to run something better than Unix on my desktop. Where are the OSes ambitious enough to eventually turn into Scarlett Johansson in the movie 'Her'?
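To make the security-model contrast concrete, here is a minimal Python sketch, not Multics code and not a real kernel check: it compares classic Unix owner/group/other mode bits with a Multics-style per-principal access control list on a segment. All the names here (`UnixFile`, `AclSegment`, `can_read`, `can_access`) are invented for illustration, and real Multics ACLs carried richer principal identifiers and modes than this.

```python
from dataclasses import dataclass, field

# Classic Unix: one owner, one group, and a handful of mode bits for everyone else.
@dataclass
class UnixFile:
    owner: str
    group: str
    mode: int  # e.g. 0o640

    def can_read(self, user: str, groups: set[str]) -> bool:
        if user == self.owner:
            return bool(self.mode & 0o400)
        if self.group in groups:
            return bool(self.mode & 0o040)
        return bool(self.mode & 0o004)

# Multics-style (simplified): each segment carries an ACL naming individual
# principals and the access modes granted to each one.
@dataclass
class AclSegment:
    acl: dict[str, set[str]] = field(default_factory=dict)  # principal -> modes, e.g. {"r", "w"}

    def can_access(self, principal: str, mode: str) -> bool:
        return mode in self.acl.get(principal, set())

report = UnixFile(owner="alice", group="staff", mode=0o640)
print(report.can_read("bob", {"staff"}))   # True, but only via the coarse group bit

segment = AclSegment(acl={"alice": {"r", "w"}, "bob": {"r"}})
print(segment.can_access("bob", "w"))      # False: bob was granted read only
```

The point of the sketch is the granularity: the ACL version can say exactly which principal gets which mode on which object, while the Unix version has to squeeze every policy through owner/group/other, which is roughly the kind of fine-grained control "today's security needs" keep reinventing with ACL extensions, capabilities, and MAC layers bolted onto Unix.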
A Conversation with Alan Kay, ACM Queue, 2004, https://queue.acm.org/detail.cfm?id=1039523