Why we shouldn’t fear ‘digital natives’
By Jeannette Paule
There appears to be a growing fear in society of the group labeled “digital natives.” The fear stems from the belief that these young people are somehow superior and more knowledgeable about computer technology and social media simply because they grew up with it.
A digital native, as defined by Marc Prensky who coined the phrase, is a “native speaker of the digital language of computers, video games, and the Internet.” The rest of us are digital immigrants, as we “become fascinated by and adopt many or most aspects of the new technology.”
The problem we face is the interpretation of what a “native speaker” might mean. It is assumed that because the digital natives grew up with technology around them, they understand it all and have the necessary skills and maturity to function within, and implement, complex systems.
The reality is that just because you speak your native language doesn’t mean you can write the next great novel; and just because you can use a technology doesn’t mean you understand the implications of its power or how to use it effectively.
Forbes magazine reports comments such as “hire a digital native to do it” in regard to the staffing requirements for social media at a Fortune 500 company. This belief in an inherent ability threatens to become a divide within the workplace between the young and those who are still young but just not as ‘native.’ Unrealistic expectations are being placed on the digital native crowd to perform, while the more experienced fear their own obsolescence or are disappointed in their secondary role.
JISC InfoNet researcher Doug Belshaw describes digital literacy as “knowing how the web works, understanding how ideas spread through networks, and [being] able to use digital tools to work purposefully towards a pre-specified goal.” In all things, understanding comes through lessons combined with experience. The question then becomes: has the first group of digital natives been taught sufficiently, and do they have enough experience, to wear the all-encompassing label that is being placed on them?
The Columbia Spectator at Columbia University ran an article written by a student titled “Digital natives stuck in the Stone Age,” with the subheading “Too many Columbians browse websites without knowing how to make them.” The author, Alex Collazo, discusses the ignorance of basic Internet skills he perceives among students.
“Watching a Columbian student use a computer can be a painful experience and makes one question whether simply growing up on a computer is sufficient to instruct a person in its use,” says Collazo. He sees his fellow students unable to create a simple website, browsing without antivirus software, and unable to clear a browsing history of naughty websites, all of which should be basic tasks to a “digital native.”
This spring, the Guardian launched a digital literacy campaign to upgrade computer science and IT in UK schools. The campaign focused on having coding and ICT taught properly in schools. Basic skills – surfing the Internet, playing a game, word processing – are learned easily, but the deeper skills of understanding how computers work are necessary to compete in the economies of the future. These things must be taught.
The first group of digital natives now entering the workforce is expected to function at a very high level around technology, especially social media; a level of functioning that involves learned skill but is currently assumed to be inherent.
I often hear parents or grandparents say: “The kids are better at this than I am. They pick it up so fast.” While children are able to learn to operate technology very quickly, it is dangerous to believe they are “better” at it than an adult who understands the implications of using those technologies.
For example, at one and a half years old my baby figured out how to switch between programs on the family iPad using the ‘four finger swipe.’ We had to Google the function to figure out how this was happening. The baby knows how to mash buttons but doesn’t understand the power of this technology, how to limit its use, or what the content consumed actually means – especially when inappropriate content is accidentally viewed.
The same applies to school age children and teenagers. The implications of technology must be taught at an ever younger age to help prepare children for adulthood, especially those who have never known an existence without a computer in the home and instant connectivity to all parts of the globe. Monitoring, teaching and discussion are the only ways to turn these digital natives into good digital citizens.
The divide between the natives and the rest of us will only grow deeper and more destructive unless we abandon the belief that these young people possess magical knowledge of technology. Inaccurate and inappropriate pressure will not help them reach their potential in school, the workplace, or life; it will only frustrate them for failing to meet expectations. On the other side, teachers and employers will be disappointed with the results they receive from these same individuals.
Appropriate use of technology is not a natural-born ability, unless you’re a fan of Star Trek. Until my consciousness is downloaded into a new body, I’ll be accepting the technological abilities of my local “digital native” based on the person, not a label. Parents and employers would be well served to do the same.
The above article is reproduced from Kiwi Commons, a news and information weblog dedicated to providing readers with the most relevant and up-to-date resources available on Internet safety, cyberbullying, social media and digital legacy.