Millennials are not necessarily more skilled at basic computer tasks than older generations.
Marketers and social theorists love to talk about digital natives. This group is supposedly a generation of early adopters under the age of about 35, uniquely adept at technology compared to their older counterparts. But according to a recent editorial in Nature, these digital natives are a figment of our collective imagination—about as easy to find as “a yeti with a smartphone.”
The editorial points to a review paper published this June in the journal Teaching and Teacher Education, which concluded that “information-savvy digital natives do not exist.” Despite assertions that younger generations learn differently and require specialized, multimedia teaching strategies because they grew up with smartphones and the web, the authors say there is no evidence that digital natives are more tech-savvy or better at multitasking than older generations.
The idea of the digital native was born of a 2001 essay by educator Marc Prensky, who claimed that a new generation was especially skilled at processing multiple streams of information and using technology, reports Discover Magazine. Prensky argued that the world should adapt its teaching methods accordingly. But Paul Kirschner, co-author of the Teaching and Teacher Education review and a professor of educational psychology at the Open University in the Netherlands, argues that we hurt, rather than help, students when we assume they have unique technological skills. “We have to treat people as human, cognitive learners and stop considering one specific group to have special powers,” Kirschner tells Discover.
This conclusion is backed up by other research showing that millennials are not necessarily more skilled at basic computer tasks than older generations. Research has also shown that multitasking is neither a special talent of the young nor an effective strategy for virtually anyone.
If the idea of “digital natives” was just jargon that advertisers used to sell to the under-30 crowd, all this might not matter much. But the idea that digital natives are fundamentally different is influencing everything from the way curriculums are designed to the way companies shape their corporate work environments.
A better approach might be to rethink how we define generations. Digital natives have not developed unique intellectual abilities from their proximity to technology, because basic human cognition doesn’t change from generation to generation. But Jean Twenge, a professor of psychology at San Diego State University, argues that categorizing people into distinct generations can still be useful. She points out that millennials differ significantly from their older peers in workplace preferences, life goals, religious participation, alcohol and drug use, and trust in institutions. In other words, generations may have different habits and world views—but the way people learn won’t change so quickly.