I like kids. No, scratch that - I love kids. Hell, not many people know this, but I actually used to be a kid. Infinitely curious, with an imagination that knows no bounds, children have a way of seeing the world that is uniquely their own. They are nothing less than little people, with their own sets of rules, societies and laws, and one of the main reasons so many people find it difficult or awkward to interact with kids is that they try to force children to step into our "real" (ahem), man-made world, instead of working their way towards being accepted into the grand societies that children have built.
I must have been around 6 or 7 years old when I got my first computer; the family's Coleco Adam. Unlike most kids who had the marvelous opportunity to be exposed to computing at such an early age, I did not go on to become a hardcore, Godlike programmer nerd. This may have been due, in no small part, to my computer's tendency to "generate a surge of electromagnetic energy on startup, which can erase the contents of any removable media left in or near the drive." However, it did serve as a critically important introduction to the logic of programming, user interfaces, gaming, and science fiction. Several hundred "goto line"s and "run 80"s later, the path to personal technocracy had been laid.
Naturally, upon realizing that my exposure to these themes was critical in shaping me into the strapping fellow you
read before you, I became a strong evangelist for the sort of exploration encouraged by these early computers. In his post “Tinkerer's Sunset”, Mark Pilgrim details his version of the shared experience so many kids of that era went through:
"As it happens, this computer came with the BASIC programming language pre-installed. You didn’t even need to boot a disk operating system. You could turn on the computer and press Ctrl-Reset and you’d get a prompt. And at this prompt, you could type in an entire program, and then type RUN, and it would motherfucking run.
I was 10. That was 27 years ago, but I still remember what it felt like when I realized that you — that I — could get this computer to do anything by typing the right words in the right order and telling it to RUN and it would motherfucking run.
That computer was an Apple ][e."
Whenever I try to explain why I choose to use Linux as my primary operating system, I tend to do a pretty piss-poor job of explaining why I think other people should too. I wax philosophical about vagaries like “open source”, “accessibility” and “community development”, but all I really need to say is “kids can understand how to make stuff work”. I'm a firm believer that a holistic understanding of how any system functions is critical to putting that system to its best use. In computing, the ideal – if not the only – way to achieve that is to make things accessible and open.
There is a mistaken assumption in society that “kids are better at computers than adults”. Anecdotal evidence suggests that this is not the case; read the comments on a torrent page or head to a gaming forum to realize that holistic understanding of computing and web technology is actually a bell curve, peaking somewhere around my age group. Older folks, in many cases, may feel that they have 'missed the boat' and remain fearful and wary of things such as shopping online, while, for the most part, teens and tweens might in fact have missed another boat – the one carrying all the opportunities to learn about the foundations upon which their Facebooks, Twitters, and IMs are built.
We are, it would seem, breeding a generation of ignorant Internet users who have no idea what lies below their points and their clicks. Why is this important? Because until now, it didn't take much for someone interested in programming to grasp the layer below (electronics). But with the extra layers we keep introducing – and, just as quickly, abstracting away – we are rapidly falling back to a time when kids really don't understand how things work any more than their grandparents do. And, no, a generic Computer Science degree at Generic State University is in no way a viable substitute for an early, ingrained passion for programming and networking. Those folks end up in generic programmer/analyst jobs in generic cubicles at generic companies, trying to hook up with the default hot secretary.
Essentially, devices like the iPad build technology toward what I refer to as “Digital Serfdom”. Much like the disconnect between serfs and landowners, the chasm between the brilliant Apple engineer and the lowly consumer who spends $499+ on a tablet PC is one of control, permission and a firmly established order. Apple very much dictates what you can and cannot do on your iPad – make no mistake, ceci n'est pas un ordinateur (this is not a computer). It is, however, a seamless, near-perfect, and beautiful appliance which will be used, overwhelmingly, to consume media. Not unlike – what were those called again? – televisions. While laptops, handhelds and other mobile devices keep getting massive increases in power and capability – further detracting from the silly, long-discredited notion that “cloud computing is the future” – Apple is releasing devices that are increasingly underpowered, expensive, feature-crippled, and dependent upon the power of (and access to) “the cloud”. The Internet itself is becoming less about computers and more about “appliances”: neatly-wrapped little gizmos that do one or two things really well, while abstracting away layer upon layer of functionality to hide what could essentially be a very powerful and useful piece of hardware.
Apple wouldn't have it any other way, of course. I can't help but imagine the amazing possibilities that would be afforded by a device like the iPad if Apple didn't actively forbid programmers from running their own code on their own device. The physical equivalent would be buying a shiny new Honda SUV from the dealer, only to find out that you can't install anything – tires, a radio, a remote starter, a ski rack – unless it's also made by Honda. To most people who work in tech or on the web today, this would seem like a huge step backwards from the read/write web: a device masquerading as a computer but with a fraction of a computer's ability to generate content. However, these same people will be lining up like lemmings to get one so that they can be the first to show it off to friends at the next tech conference their company flies them to.
I have no illusions about this: demand for devices like the iPad will climb higher and higher in the coming years, because our thirst for increasingly dumbed-down, easy-to-use, broadly-appealing Walmart-style fare will only grow. Eventually, the very notion of a computer with full access to itself will be a thing of the past, and we'll just be carrying little appliances with a single button that guesses what we want them to do whenever we press it.
I'm not upset, really. Perhaps I'm just a little baffled that so many folks – many of whom I call friends – who vaunt the value of open source and open access still hide behind the safety of their shiny little closed aluminum unibody laptops for little reason other than ego, fear and pride. No one wants to admit that they were wrong or made a bad decision, but they should seriously know better. Even sadder, though, is that fewer and fewer people seem to care about – or have the foresight to see – the vast, infinite worlds of wonder, imagination and possibility that will be lost forever to their children – and to mine – because they succumbed to the marketing and peer pressure imposed by the powers-that-be to erode our right to dream, create, break, and fix things.
Whether my kids choose to exercise it or not, I want them to rest assured that they can do all those things to their heart's content. Just not on an Apple computer.