Seriously, you guys are making me sad. Use some imagination -- there is a lot more to tinker with for your dollar in 2010 than there was in the good old days. For the price of your dad's ][e you can now buy multiple Linux boxes, a bucket of Arduinos or a NerdKit, robotic Legos, and -- if you are so inclined -- an iPad with a developer account on top of it.
At first I was wondering what the ulterior motive to all this drama was. I'm beginning to think it's just link-baiting.
The concern is not with whether there are enough machines to tinker with today, or whether there will be that many five years down the line. The concern is with what kind of computing culture is being instilled by the largest mobile computing device manufacturer in the world, which also happens to be one of the most inspiring connoisseurs of interaction design of the last two decades. And with where this will lead what we know as "computing" on the 20-year scale.
It's certain that people who already know that hacking and tinkering is what they want to do with computers will find enough suitably priced computers to tinker with in the next 20 years. Nobody is debating that. The concern is over whether the potential tinkerers of the years beyond will be able to discover hacking through the innate curiosity of childhood when they run into walls of the kind that Apple is surrounding their products with today.
> "walls of the kind that Apple is surrounding their products with today"
When I learned to program in the 1980s, I taught myself Pascal from a book. Three years later I encountered my first Pascal compiler, through a summer program at a local university, because the sticker price for Apple Pascal was $495 when it came out in 1979 (in 2009 dollars, that would be over $1000) and I was a kid.
(The PC folks were luckier by the time I was coding... Turbo Pascal came out for $50 in 1984. That's only $100 in 2009 dollars. Too bad I couldn't afford to change hardware...)
Now I can pay Apple $99 per year (in 2009 dollars) and get the ability to write iPhone programs, using pro-level tools, and install any iPhone program I want on my personal phone and those of my friends. Or I can build a web app in JavaScript or Ruby or whatever and share it with all my friends, without asking anyone's permission, for $9 a month in hosting. (Or pay-as-you-go hosting on the cloud, if uptime is no concern.) Or, if I really want a machine that is completely open and ready to run Linux on, I can buy one for pocket change. I've literally given away the equivalent of three Linux PCs in the last few months because I don't have enough space in the back of my closet.
(The latter is a particularly important point. One reason closed hardware is growing in popularity is that there is so much other cheap hardware out there. Back when a computer cost $2500 it was relatively important to make sure that you could run whatever software you needed on that one computer. Now, if someone offers you a machine that's sealed against malware -- with the caveat that it's also sealed against anyone who won't pay an extra $99, including you -- it's a lot more tempting, because, hey, if Computer A won't run your software we can just buy Computer B, C, or D, new, for a few hundred bucks. Or one can just walk down the street on trash day and pick up potential Linux PCs. I'm just not seeing these signs of Nerd Apocalypse.)
That's a good argument; one that I've seen raised in favor of web apps in the Chrome OS context multiple times. I have no doubt that Nerd Apocalypse in your sense is not on the radar. The question I'm trying to raise (and I'm really trying to raise a question; not defending one model or the other, at least not yet) is not essentially about the economics, though.
I'm wondering how likely it is, with an iPad, a Chrome OS device, or a similar future sandboxed information appliance, for a child to get curious about what this "computing" thing is: to discover the basic distinction between hardware and software, to wonder how one goes beyond the software that ships with the device by default, how one makes a computer do something vaguely defined, random, outside the intended use scenarios, fun, or extremely specific. How much more or less likely is it compared to a 386DX with a bulky CRT that runs DOS and expects the user to type a command by default, or an Amiga 500 that expects the user to insert a floppy by default? Getting curious in the first place is a prerequisite; it comes before one can consider paying Apple $99 a year for the tools with which to satisfy that curiosity.
My line of thinking is that we're witnessing not a huge shift in what personal computing is defined as, but the gradual but total disappearance of what we've come to know as computing from the retail market, in favour of narrowly targeted information appliances with intentional limitations that aim to make "computing" invisible in the whole experience. It's true that these devices will not saturate or dominate the market any time soon. But the number of pockets that vote for them will definitely bias the industry in a specific direction, and the culture they create in this decade will set the tone for the culture of the next era, just as our past experiences with Pascal, Windows 3.1, DOS, what have you, influence the way we evaluate today's technology.
I agree with you about opportunities to tinker/hack. Just check out Maker Faire, or Ramsey Electronics or Arduino stuff.
But I think there is a valid concern about the trend to legislate against tinkering -- restricting sales of radios that can receive cell phone frequencies, DVD CSS, DRM, the suppression of chemistry sets, evacuating a school over a misunderstood electronics experiment.
This is a question of perception. Faced with something that doesn't make sense, a geek will see it as a challenge, a puzzle, a worthy adversary. Geeks love challenges.
A lot of people don't. Many people have me on speed dial for when they feel like tossing their laptops out the window. It's for these people that the iPad is designed.
All of this "death of tinkering" shows me that none of you guys actually take the time to understand people other than yourselves, and this whole movement speaks to just how self centered the whole geek tribe is.
I'm not sure what you're disagreeing with. To the extent that the iPad makes computing more accessible to more people, that's a good thing. The problem is with the entirely separate issue of Apple actively putting roadblocks in the way of those of us who would like to customize it.
The point of the argument is that in the old days you didn't have to buy anything extra; you could tinker with your main computer. This drew in some opportunistic hackers.
> "didn't have to buy anything extra; you could tinker with your main computer"
You realize that mass adoption of the "home computer" is a very, very recent phenomenon, right? I grew up with a computer in the late 80s/early 90s because my father was in the field, but most of my neighbours did not start getting computers until the mid 90s. IMHO the image of the precocious youngster learning to hack on the machine his family just happened to have is a bit of selection bias.
People look back at the wonderful days of the 70s and 80s with rose-tinted glasses. The truth of the matter is that back then computers were exclusively hacking machines (i.e., you couldn't work them at all without some fairly in-depth knowledge); you did buy them just to hack on them.
A lot of people grew up in the 80s with computers used primarily for games. I was a kid in the glory days of 8-bit computers (Atari, Commodore, Sinclair). Many, many millions of those were sold.
So yes, many precocious youngsters of my generation learned to hack just because they happened to have a computer (bought for a non-hacking purpose) and it was tempting and possible to tinker with it.
People had these computers for reasons other than to tinker. The example given in TFS is about a dad using a machine as a word processor. I have an Arduino board because I grew up tinkering and the Arduino lets me relive the late nights I spent up learning, but it is not something that anybody who isn't already a tinkerer will buy.
But the Apple IIe cost $1400 in 1983 (over $3000 in today's dollars), so most people never had a "main computer" to begin with. Now vastly more people can own a computer at all. And the type of person who spent a month's wages to tinker with a computer in the 1980s is not going to buy an iPad as their only computer today.
> And the type of person who spent a month's wages to tinker with a computer in the 1980s is not going to buy an iPad as their only computer today.
So, the only people that bought an Apple IIe were tinkerers already? What about the example of the father that bought it for the word processor? Is that example entirely unreasonable or blatantly false?
Why are we talking about the Apple IIe? I'm sure a lot more people who learned to code and learned about the guts of the machine in the '80s did so on Speccies (£180 in 1984), C64s ($595 in '82), or Amigas ($699 in 1987) than on Apples/IBMs.
I was bought a Spectrum as a child (and saved up Christmas money for an Amiga). My parents weren't at all technical. I learnt a bit of BASIC but not much (I didn't have the patience to program much), but I did get a decent understanding of what went on inside. So I was happy to build my own computers.
I have a geothermal HVAC system, and it's not very well designed. If the loop temperatures drop below a certain point it becomes ineffective, and I need to manually switch to a secondary heat source.
In a single afternoon, I was able to hook some 1-Wire sensors into the loops, attach them to a widget that makes the sensor data available over IP, and write a daemon that monitors the temperature and automatically tells the IP-enabled thermostat to flip over to secondary heat if the loops drop below effective temperatures, and to flip back when they recover.
The next day, I made it so the loop temperatures are recorded and graphed, and so I receive an SMS alert if there's a complete HVAC failure (loops too cold + secondary heat failure).
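For the curious, the core of that daemon doesn't need to be more than a screenful of code. Here's a minimal sketch of the watchdog loop, assuming the sensor widget and the thermostat both expose simple HTTP endpoints; the URLs, JSON field names, thresholds, and the use of the `requests` library are all placeholders of mine, not the actual hardware's API:

    # Minimal sketch of a loop-temperature watchdog. Every URL, field name,
    # and threshold below is a placeholder, not a real widget/thermostat API.
    import time
    import requests  # assumes the third-party 'requests' HTTP library

    SENSOR_URL = "http://192.168.1.50/loops"      # hypothetical 1-Wire-to-IP widget
    THERMOSTAT_URL = "http://192.168.1.51/mode"   # hypothetical IP thermostat
    LOW_TEMP_F = 32.0        # loops are ineffective below this
    RECOVER_TEMP_F = 38.0    # hysteresis so we don't flap around the cutoff
    POLL_SECONDS = 60

    def coldest_loop_temp():
        """Return the coldest loop temperature reported by the sensor widget."""
        return min(requests.get(SENSOR_URL).json()["loop_temps_f"])

    def set_heat_source(source):
        """Tell the thermostat to use 'primary' (geothermal) or 'secondary' heat."""
        requests.post(THERMOSTAT_URL, json={"heat_source": source})

    def main():
        on_secondary = False
        while True:
            temp = coldest_loop_temp()
            if not on_secondary and temp < LOW_TEMP_F:
                set_heat_source("secondary")
                on_secondary = True
            elif on_secondary and temp > RECOVER_TEMP_F:
                set_heat_source("primary")
                on_secondary = False
            # Logging/graphing and the SMS alert for a total failure would
            # hang off this same loop.
            time.sleep(POLL_SECONDS)

    if __name__ == "__main__":
        main()

The gap between LOW_TEMP_F and RECOVER_TEMP_F is deliberate: switching back only a few degrees above the cutoff keeps the daemon from flapping between heat sources when the loops hover near the threshold.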