Recently, a New York Times technology article on venture capitalists' use of the younger generation for market research remarked:
And people now in junior high and high school have spent their lives with technology. “This is the first generation for whom the computer is a native language,” said Jim Gauer, managing director of Palomar Ventures, a Los Angeles firm. “We’re all going to have to get re-educated and learn that language.”
In a similar vein, a recent video on teachertube.com, “Pay Attention,” hyped the same message with a barrage of statistics. Students will have logged x (where x is a very large integer) hours playing video games, sent and received x number of emails, and IMed x number of messages. The message was simple: contemporary students are the digital generation, and teachers had better get with the program and plan a lesson around a cellphone.
Where are these students? Who has them in class? The digital generation strikes me as the most unexamined assumption in contemporary business and education culture. I teach at a metro university in a relatively affluent area of the US. One would expect my students to be representative of the “raised-by-computers” generation. They are not. I'd estimate that over the past five years, at most 15 percent of any of my classes has been computer/information literate. True, most can do word processing, manage email, download music and images, and IM—but not very well. Beyond these basics, their skill set is pretty minimal. The digital generation, with few exceptions, does not exist.
No one is born digital, and only a few are raised digital. And absolutely no one has “the computer as a native language.” And who do these marketers think is teaching the rest how to become digital? I'm reminded of the fundamental scientific and computing advances made during the Cold War. It was not the Sputnik generation who made them—those raised on science-enriched high school programs—but the previous generation.