walls of the kind that Apple is surrounding their products with today
When I learned to program in the 1980s, I taught myself Pascal from a book. Three years later I encountered my first Pascal compiler, through a summer program at a local university, because the sticker price for Apple Pascal was $495 when it came out in 1979 (in 2009 dollars, that would be over $1000) and I was a kid.
(The PC folks were luckier by the time I was coding... Turbo Pascal came out for $50 in 1984. That's only $100 in 2009 dollars. Too bad I couldn't afford to change hardware...)
Now I can pay Apple $99 per year (in 2009 dollars) and get the ability to write iPhone programs, using pro-level tools, and install any iPhone program I want on my personal phone and those of my friends. Or I can build a web app in JavaScript or Ruby or whatever and share it with all my friends, without asking anyone's permission, for $9 a month in hosting. (Or pay-as-you-go hosting on the cloud, if uptime is no concern.) Or, if I really want a machine that is completely open and ready to run Linux, I can buy one for pocket change. I've literally given away the equivalent of three Linux PCs in the last few months because I don't have enough space in the back of my closet.
(That last point is particularly important. One reason closed hardware is growing in popularity is that there is so much other cheap hardware out there. Back when a computer cost $2500, it was relatively important to make sure you could run whatever software you needed on that one computer. Now, if someone offers you a machine that's sealed against malware -- with the caveat that it's also sealed against anyone who won't pay an extra $99, including you -- it's a lot more tempting, because, hey, if Computer A won't run your software, we can just buy Computer B, C, or D, new, for a few hundred bucks. Or one can just walk down the street on trash day and pick up potential Linux PCs. I'm just not seeing signs of a Nerd Apocalypse.)
That's a good argument; one that I've seen raised in favor of web apps in the ChromeOS context multiple times. I have no doubt that Nerd Apocalypse in your sense is not on the radar. The question I'm trying to raise (and I'm really trying to raise a question; not defending one model or the other, at least not yet) is not essentially about the economics, though.
I'm wondering how likely it is for a child to get curious about what this "computing" thing is on an iPad, Chrome OS device, or similar future sandboxed information appliance: to discover the basic distinction between hardware and software, to wonder how one goes beyond the software that ships with the device by default, or how one makes a computer do something vaguely defined, random, outside the intended use scenarios, fun, or extremely specific. And how much more or less likely that is compared to a 386DX with a bulky CRT that runs DOS and expects the user to type a word by default, or an Amiga 500 that expects the user to insert a floppy by default. Getting curious in the first place is a prerequisite; it comes before one can consider paying Apple $99 a year for the tools with which to satisfy that curiosity.
My line of thinking is that we're witnessing not a huge shift in how personal computing is defined, but the gradual disappearance of what we've come to know as computing from the retail market, in favour of narrowly targeted information appliances with intentional limitations that aim to make "computing" invisible in the whole experience. It's true that these devices will not saturate or dominate the market any time soon. But the number of pockets that vote for them will bias the industry in a specific direction, and the culture they create in this decade will set the tone for the culture of the next era, just as our past experiences with Pascal, Windows 3.1, DOS, or what have you influence the way we evaluate today's technology.