Published: 2016-02-08

Kids these days

I recently commented in a discussion about young people and computers, and I wanted to republish my comment here and expand upon it a bit. My original comment is below. It was written in reply to an article claiming that kids don't know anything about computers, while seeming to imply that they did in the early 90s, or at least that interacting with a computer during that period was more likely to instill knowledge about the workings of the machine.

I used to work in “academic technology” (sort of IT meets education) at a university, and I definitely agree with the author that anyone who claims that “kids today” are a bunch of computer geniuses is badly mistaken.

The mistake, I believe, is one of confusing frequency of use with expertise. Young people tend to use mobile devices and computers regularly, and for more “things” than older people (due in part, perhaps, to device convergence). However, they largely fail to grasp even basic abstractions like files, or the difference between a web browser and the web site currently displayed.

Their computer use also tends to focus almost exclusively on communication and media consumption. They know how to open a web browser and get to Facebook and Netflix. But many have never used a computer to create anything more complicated than a poorly formatted Word document. Many others have used Photoshop or something else in the same vein, but only by rote, following specific instructions to generate specific output.

Their computers are constantly “slowing down” (becoming infected with malware) and they don't understand why. This is actually a massive impediment to BYOD policies at universities and to online learning in general: you can't assume that an arbitrary young person can keep a computer working properly for an entire semester. (This is, perhaps, more of a sad commentary on the state of the web and the commercial software ecosystem, but that's another comment.)

I disagree with the author, however, that things have gotten worse. The problem as I see it is that things haven't gotten any better. When I was a kid, I was basically the only person I knew who knew more about computers than how to get to the games. In fact, for a good part of my childhood, I was the only person I knew who had a computer at home.

Today, a small fraction of children have a Raspberry Pi at home, are learning to program, and actually end up with some grasp of how things work. So from where I'm sitting, it looks a lot like the situation hasn't changed much in 30+ years.

This is kind of sad from my perspective (I find computing and all its possibilities to be fascinating), but does it really matter? Do we need everyone to know how computers work? Should everyone be a programmer? Would anything change for the better if they were? What would we sacrifice in getting to that point?

People used telephones for decades with almost no knowledge of how they worked. We didn't all have to become phreakers for phones to change the world. But it's not a perfect analogy: phones aren't as powerful, and phreaking was always more or less illegal, or at least unsavory.

So I guess I’m just not sure.

In general, I find it plausible that today's gizmos don't lend themselves to building a deep understanding of computing. In some sense, that is because we, as software developers, have done our jobs well, perhaps a tiny bit too well. We have built many layers of abstraction atop the actual machine to mask the underlying complexity from end users.

This was seen as the most expedient way to make computers useful for average people. However, it had side effects. For one, people now expect computers to be dead simple to use, and they tend to balk at anything that seems “nerdy”. A corollary is that people become rather distraught when the abstractions break down or don't work smoothly.

This can be seen in new GNU/Linux users who refuse to use a command line even when it would be simpler, faster, and less mentally taxing than using a GUI.

But back to the point.

While I believe iPhones and tablets probably don't teach kids much about computers, I also have to wonder whether that really matters. Contrary to many in my industry, I really don't think everyone needs to be a computer programmer, and I don't think that making everyone become one would somehow save (or even improve) the world. There are many fields in which people can benefit from exposure to some basic programming concepts. However, there's no real reason that a biologist or a statistician needs a deep understanding of how computers work, the kind of understanding that kids in the 80s got because there was no other way.

To the extent that computer programming makes an excellent mental exercise, there is merit in introducing it to children. But many, many other activities make equally excellent mental exercises, and even programming can be done at a high enough level that a holistic understanding of the machine matters little.

To me, the most important goal should be to ensure that there are avenues available for children to actually learn about computing. The Raspberry Pi project is an excellent example. I also think that schools should introduce kids to programming, though not to the exclusion of other core subjects. There are certainly ways to teach other subjects that make use of computers and programming, and these should be identified and implemented.