It took an offer to appear on a national TV show for Wade Warren to reluctantly give up what he calls his “technology” for a week.
That was the only way, his mother says, that he would ever pack his 2006 MacBook (with some recent upgrades, he’ll tell you), his iPad tablet computer and, most regretfully, his Nexus One smart phone into a cardboard box and watch them be hustled out the door of his room to a secret hiding place.
Wade, who’s 14 and heading into ninth grade, survived his seven days of technological withdrawal without updating his 136 Twitter followers about “wonky math tests” and “interesting fort escapades,” or posting on his photography product review blog, or texting his friends about … well, that’s private. But he has returned to his screens with a vengeance, making up for lost time.
Though he vows to reduce his screen time, “I haven’t really noticed a sharp drop in my computer usage,” he concedes in a phone interview, with the faint sound of computer keys clicking as he talks. The idea behind the show, called “Nick News with Linda Ellerbee: Middle School Unplugged,” was that time away from gadgets might cause young people like Wade to see the benefits of disengaging from their screens and connecting in person with friends and family.
But it seemed to have the opposite effect on Wade: “I sort of learned the magnitude of how [technology] helps me.” Not carrying a phone was a factor in his getting lost on his own in downtown San Francisco, near where he lives, an experience that troubled him.
Wade is a “digital native” whose world — half in cyberspace, half on terra firma — is breeding what might be called a new species of thinkers. The early 21st century may be a watershed moment in how humans learn and communicate, a change perhaps not equaled since the invention of the printing press nearly six centuries ago.
Today’s technology may be determining not just how we spend our time: It actually may be “rewiring” the way we think and how we experience the world around us.
Techno-Cassandras fret over what’s happening to our attention spans, our ability to think and read deeply, to enjoy time with our own thoughts or a good book.
Techno-enthusiasts scoff that those concerns are nothing new: Socrates, it’s pointed out, thought that writing itself would harm a person’s ability to internalize learning, the printed word acting as a substitute for true understanding. Technologies such as printing, and in recent decades television and the pocket calculator, have all served time as villains only to become innocuous, commonplace parts of modern life. Why should helpful new technologies from Facebook and Twitter to iPhones and laptops be any different?
Those caught in the middle are aware that something significant is happening but are wary about whether they or others are grasping the big picture. Is technology making us dumb and distracted or turning us into expert information finders and magnificent multitaskers? Is being connected online 24/7 good or bad? Is there even a good way to tell?
* * * * *
“I think it’s subtler than ‘Is [the Internet] making us smarter or making us stupid?’ ” says Nicholas Carr. “It’s how it’s making us smarter or how it’s making us stupider that’s interesting.”
Mr. Carr’s book, “The Shallows: What the Internet Is Doing to Our Brains,” is currently bearing the standard for the techno-worried. In it, he begins by telling of his own trouble in reading at length and thinking as deeply as he once could. After some research he concludes that too much time online is not only changing the way his brain works, but everyone else’s, too.
“The possibility that we’re altering some basic things about the way we think without carefully weighing the consequences is troubling,” he says. “However important it is to connect quickly with others and exchange messages, there is also a crucial role for solitary thought in our intellectual lives. And we seem to be rushing to dismiss the importance of solitary thought.”
His plaintive cry: I want my old brain back.
“As we practice these very busy modes of skimming and juggling tasks, we think we’re being productive and, you know, sometimes it can be quite entertaining and quite fulfilling,” he says in a Monitor interview. “But what I don’t think we fully realize is that we’re altering in a deep way our ability to pay attention, our ability to be contemplative, to be reflective — the things that we might be losing.”
Carr, a gifted writer admired for his ability to examine and explain the effects of technology on society, is hardly alone. Others, including scholars and scientists, are asking the same troubling questions, especially about the young “digital generation” whose members are growing up in their own screen-filled worlds.
“The brain of a child who is immersed in six to seven hours of digitally dominated media daily and reads only a little off-line will have differences from a child immersed only in books and who learns to attend, concentrate and think about what he or she reads,” writes Maryanne Wolf, a professor of child development who directs the Center for Reading and Language Research at Tufts University in Medford, Mass.
“The problem with much of our digital media is that they engage attention quickly and then engage again and again. Children are constantly moving to the next piece of information,” she says. “My worry is that children are becoming wonderfully engaged with the superficial levels of information but unaware of the need to probe and think for themselves.”
Nora Volkow, a brain researcher and director of the National Institute on Drug Abuse, agrees: “The technology is rewiring our brains.”
A two-class society may develop, with a mostly younger generation who are “the people of the screen” and a mostly older generation who are “the people of the book” — with two quite different ways of understanding the world, theorizes British neuroscientist Susan Greenfield.
“At the beginning of the 21st Century, we may be standing on the brink of a mind-makeover more cataclysmic than anything in our history,” she wrote in 2006. “The science and technology that is already becoming central to our lives will soon come to transform not just the way we spend each day, but the way we think and feel.”
Humor essayist Garrison Keillor recently summed up the generational difference this way. “[O]ur children are writing up a storm, often combining letters and numerals (U R 2 1derful), blogging like crazy, reading for hours off their little screens, surfing around from Henry James to Jesse James to the epistle of James to pajamas to Obama to Alabama to Alanon to non-sequiturs, sequins, penguins, penal institutions…,” he mused in a New York Times essay. A young mind today won’t stay focused on any one thing, “like a hummingbird in an endless meadow of flowers,” he writes.
Others say they just have an innate feeling that Carr and his ilk are on to something. John Miedema, who lives in Ottawa, says that he can tell the difference between reading online and in print. “The quality of the memories feels different” online, says Mr. Miedema, the author of the book “Slow Reading.” “The quality of the memories is less rich than it is when I read more slowly.”
His “aha” moment, Miedema says, was when he read Carr’s explanation of the difference between quick skimming and scanning on the Web, which lodges in the brain’s short-term memory and is quickly lost, and the long-term memories that a more thoughtful kind of slow reading provides. “I share Nicholas Carr’s feeling that my brain has been rewired,” he says.
Among the pet peeves of those critical of online reading are hyperlinks, those underlined words or phrases that when clicked on take the reader to another web page. “The web is almost built for distraction,” Miedema says. “The links are designed to take you away from what you are reading.” The evidence, he says, is clear. “People don’t really read on the web.” They skim, he says.
Some research shows that online browsing doesn’t result in learning that really sticks. “We’re often not learning when we’re multitasking; we’re just skimming the surface,” Dr. Wolf says.
Even common courtesy can be a victim of our obsession to stay online. In a widely quoted passage in Ken Auletta’s book “Googled: The End of the World as We Know it,” Google co-founder Larry Page is scheduled to meet with Barry Diller, a high-powered media mogul. But during their meeting, Mr. Page continues to stare into the screen of his mobile device. “[Diller] said to Larry, ‘Is this boring?’ ‘No. I’m interested. I always do this,’ Page said. ‘Well, you can’t do this,’ Diller said. ‘Choose.’ ‘I’ll do this,’ Page said matter-of-factly, not lifting his eyes from his hand-held device.”
Some polls and studies seem to back up the “Internet is rewiring brains” argument. Nearly 30 percent of Americans under the age of 45 say using devices like smart phones and PCs increases their feelings of stress and makes it more difficult to concentrate, a New York Times/CBS News poll found last month.
Other polls point to the pervasive allure of being “connected” online. One found that a third of women ages 18 to 34 check their Facebook accounts as soon as they wake up in the morning, even before they visit the bathroom or brush their teeth. And while some 54 percent of teens send text messages by phone to their friends daily, just 33 percent actually talk face to face with them, a poll from the Pew Internet & American Life Project found.
Americans are living more of their lives online. A Harris Interactive poll last winter found American adults surf the Net on average 13 hours per week, not counting e-mails. The number was just seven hours per week in 2002.
And while only 23 percent of adults think they personally spend too much time on their Internet-linked gadgets, according to a Rasmussen Reports survey earlier this year, 75 percent think young children spend too much time online and playing video games.
But plenty of high-powered intellects remain skeptical that time spent online is “rewiring our brains” or making us dumber.
“It’s indisputable that the Internet has made us smarter. … The range of things you can explore in a day is just fantastic compared to 20 years ago,” says David Weinberger, senior researcher at the Berkman Center for Internet and Society at Harvard University in Cambridge, Mass. “There’s no question that we feel the Internet has made us better researchers, better thinkers, better writers.”
Steven Pinker, a professor of psychology at Harvard, points out that one kind of deep thinking — scientific research — is flourishing today as the Internet allows unprecedented levels of collaboration and cooperation. “Discoveries are multiplying like fruit flies, and progress is dizzying,” he wrote last month in The New York Times.
Paul Saffo, a longtime Silicon Valley technology forecaster, says the engineering students he teaches at Stanford University in California show outstanding skills in what he calls “associative memory” — how to know what to look for. “They’re fast with [making] connections,” he says. “Yes, they’re probably less likely to read a 500-page book than their parents were. But … I can remember when I was in college, I didn’t exactly leap at the opportunity to spend a day reading a 500-page book either.”
The “rewiring our brains” argument could just as easily be blamed on watching too much television, “if it’s even really happening,” Mr. Saffo suggests. “I’ve had an e-mail account since 1984. And I’ve got two computers running in here. But the biggest problem in my office is tripping over all the books.”
What the Internet has done for him is “cut in on my time” to read books by giving him more choices and temptations, he says. “But it hasn’t made me become more shallow.”
Perhaps the printed book, revered by old-school scholars as the ideal vehicle for promoting deep thinking but bereft of hyperlinks, static and unchanging, is actually holding back our thinking process and intellectual endeavors, Mr. Weinberger argues.
Books “are not the shape of knowledge,” he says. “They’re a limitation on knowledge.” The idea of a single author presenting her ideas “was born of the limitations of paper publishing. It’s not necessarily the only way or the best way to think and to write.”
Paper requires a writer to divide topics and to “close them off,” Weinberger says. “All these are very unnatural things. The world does not consist of topics that begin on Page 1 and end on Page 256. The Internet has a better ability to reflect the structure of knowledge than books do.”
On the web, a writer who allows readers to comment can’t expect to command an argument without interruption. But his thinking may be stimulated by what others have to say. “It seems to me we’re better off for that,” Weinberger says. “It’s going to be distracting, sure,” he adds, but if they’re saying interesting things, “that’s also enriching. … Isn’t that better?”
The world, as Internet visionary Ted Nelson has written, is full of cross-connections among myriad topics that can’t be neatly divided up. Those chains of relationships map neatly onto hyperlinks and the “webby” online world. The discomfort being felt by those old enough to have known a world without the Internet may not persist, Weinberger says. “Now we have a generation coming up that hasn’t lived through the transition” from a print world to an online world, he says.
* * * * *
No one, including Carr and Wolf, argues that people in the 21st century can or should stop using the Internet and gadgets that link to it. And no one really knows what the right amount of online activity should be or how individuals can best manage it.
“It has to begin with people questioning [the use of technology] in their own lives,” offers Carr, who says he didn’t intend his book to provide answers so much as to examine the problem. “We’re all responsible for how we spend our time and the choices we make.”
People addicted to being online are not going to stop using the Internet altogether, “any more than a food addict is going to stop eating food,” says Kimberly Young, a psychologist in Bradford, Pa., who is founder and director of The Center for Internet Addiction Recovery.
For children, getting them involved in real-world activities is a start, she says.
“If young people are engaged in band, swimming, extracurricular things where they’re meeting other kids, I think they’re OK,” says Dr. Young, who notes that while Internet addiction has not been formally recognized as a mental problem in the United States, it is already being treated by professionals such as herself.
Wolf makes sure she stays off-line at specific times. “For a half hour before bedtime and a half hour in the morning I do nothing digital,” she says.
Then there’s the software solution. Freedom, a program developed by Fred Stutzman at the University of North Carolina, locks users’ computers out of Internet access for up to eight hours at a time.
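The core mechanism of a lockout tool like Freedom can be pictured simply: the user picks a duration, the program records when the block started, and access stays off until the window elapses. The sketch below is a hypothetical illustration of that timer logic only, not Freedom’s actual code; the function name and the eight-hour cap as a hard limit are assumptions for the example.

```python
from datetime import datetime, timedelta

MAX_LOCK = timedelta(hours=8)  # longest session the tool permits

def lockout_active(started_at: datetime, duration: timedelta, now: datetime) -> bool:
    """Return True while a self-imposed Internet lockout is in effect.

    The requested duration is clamped to the eight-hour maximum; once
    the window has elapsed, access is restored automatically.
    """
    duration = min(duration, MAX_LOCK)
    return started_at <= now < started_at + duration

# Example: a two-hour lockout starting at 9:00 a.m.
start = datetime(2010, 7, 1, 9, 0)
print(lockout_active(start, timedelta(hours=2), datetime(2010, 7, 1, 10, 30)))  # True
print(lockout_active(start, timedelta(hours=2), datetime(2010, 7, 1, 11, 30)))  # False
```

A real blocker would also have to enforce the window at the network layer (for example, by disabling interfaces or filtering traffic), which is where the engineering effort actually lies.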
Even if we’ve lost our ability to read deeply, we can regain it. “Our brains are very adaptable and flexible,” Carr adds. “If you change your habits, your brain is very happy to go along. The hard thing is to change your habits.”
Meanwhile, Wade Warren’s mom, Stephania Serena, is living on the front lines, trying to decide how to manage her son’s immersion in the digital world.
“I’m not the perfect role model necessarily for my kids. I work on the computer, I’m on a lot,” says Ms. Serena, who is a designer and photographer. “It’s crazy. I think we need to be more disciplined and it’s really hard.” She’s been known to keep working on her iPhone while fixing dinner.
“I think it’s hard enough for adults, but it’s a million times harder for kids,” she says.
She knows Wade is a child of the Internet. “One of his first sentences was ‘on, off, peto.’ ‘On, off, computer.’ He called it ‘peto.’ We have a little recording of it,” she says.
It may come down to personal responsibility. “You have to be in charge. You can’t let the computer be in charge,” she allows.
Wade did some cooking with her during the week his gadgets were hidden away, and his mom noticed a new level of attention to others. “Since then we’ve been making ice cream,” she says. “I wish he spent more time outdoors, but we’re getting there.”