Hooked on Gadgets, and Paying a Mental Price
When one of the most important e-mail messages of his life landed in his in-box a few years ago, Kord Campbell overlooked it.
Not just for a day or two, but 12 days. He finally saw it while sifting through old messages: a big company wanted to buy his Internet start-up.
“I stood up from my desk and said, ‘Oh my God, oh my God, oh my God,’ ” Mr. Campbell said. “It’s kind of hard to miss an e-mail like that, but I did.”
The message had slipped by him amid an electronic flood: two computer screens alive with e-mail, instant messages, online chats, a Web browser and the computer code he was writing.
While he managed to salvage the $1.3 million deal after apologizing to his suitor, Mr. Campbell continues to struggle with the effects of the deluge of data. Even after he unplugs, he craves the stimulation he gets from his electronic gadgets. He forgets things like dinner plans, and he has trouble focusing on his family.
His wife, Brenda, complains, “It seems like he can no longer be fully in the moment.”
This is your brain on computers.
Scientists say juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information.
These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive. In its absence, people feel bored.
The resulting distractions can have deadly consequences, as when cellphone-wielding drivers and train engineers cause wrecks. And for millions of people like Mr. Campbell, these urges can inflict nicks and cuts on creativity and deep thought, interrupting work and family life.
While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress.
And scientists are discovering that even after the multitasking ends, fractured thinking and lack of focus persist. In other words, this is also your brain off computers.
“The technology is rewiring our brains,” said Nora Volkow, director of the National Institute on Drug Abuse and one of the world’s leading brain scientists. She and other researchers compare the lure of digital stimulation less to that of drugs and alcohol than to food and sex, which are essential but counterproductive in excess.
Technology use can benefit the brain in some ways, researchers say. Imaging studies show the brains of Internet users become more efficient at finding information. And players of some video games develop better visual acuity.
More broadly, cellphones and computers have transformed life. They let people escape their cubicles and work anywhere. They shrink distances and handle countless mundane tasks, freeing up time for more exciting pursuits.
For better or worse, the consumption of media, as varied as e-mail and TV, has exploded. In 2008, people consumed three times as much information each day as they did in 1960. And they are constantly shifting their attention. Computer users at work change windows or check e-mail or other programs nearly 37 times an hour, new research shows.
The nonstop interactivity is one of the most significant shifts ever in the human environment, said Adam Gazzaley, a neuroscientist at the University of California, San Francisco.
“We are exposing our brains to an environment and asking them to do things we weren’t necessarily evolved to do,” he said. “We know already there are consequences.”
Mr. Campbell, 43, came of age with the personal computer, and he is a heavier user of technology than most. But researchers say the habits and struggles of Mr. Campbell and his family typify what many experience — and what many more will, if trends continue.
For him, the tensions feel increasingly acute, and the effects harder to shake.
The Campbells recently moved to California from Oklahoma to start a software venture. Mr. Campbell’s life revolves around computers.
He goes to sleep with a laptop or iPhone on his chest, and when he wakes, he goes online. He and Mrs. Campbell, 39, head to the tidy kitchen in their four-bedroom hillside rental in Orinda, an affluent suburb of San Francisco, where she makes breakfast and watches a TV news feed in the corner of the computer screen while he uses the rest of the monitor to check his e-mail.
Major spats have arisen because Mr. Campbell escapes into video games during tough emotional stretches. On family vacations, he has trouble putting down his devices. When he rides the subway to San Francisco, he knows he will be offline for 221 seconds as the train goes through a tunnel.
Their 16-year-old son, Connor, tall and polite like his father, recently received his first C’s, which his family blames on distraction from his gadgets. Their 8-year-old daughter, Lily, like her mother, playfully tells her father that he favors technology over family.
“I would love for him to totally unplug, to be totally engaged,” says Mrs. Campbell, who adds that he becomes “crotchety until he gets his fix.” But she would not try to force a change.
“He loves it. Technology is part of the fabric of who he is,” she says. “If I hated technology, I’d be hating him, and a part of who my son is too.”
Mr. Campbell, whose given name is Thomas, had an early start with technology in Oklahoma City. When he was in third grade, his parents bought him Pong, a video game. Then came a string of game consoles and PCs, which he learned to program.
In high school, he balanced computers, basketball and a romance with Brenda, a cheerleader with a gorgeous singing voice. He studied too, with focus, uninterrupted by e-mail. “I did my homework because I needed to get it done,” he said. “I didn’t have anything else to do.”
He left college to help with a family business, then set up a lawn mowing service. At night he would read, play video games, hang out with Brenda and, as she remembers it, “talk a lot more.”
In 1996, he started a successful Internet provider. Then he built the start-up that he sold for $1.3 million in 2003 to LookSmart, a search engine.
Mr. Campbell loves the rush of modern life and keeping up with the latest information. “I want to be the first to hear when the aliens land,” he said, laughing. But other times, he fantasizes about living in pioneer days when things moved more slowly: “I can’t keep everything in my head.”
No wonder. As he came of age, so did a new era of data and communication.
At home, people consume 12 hours of media a day on average, when an hour spent with, say, the Internet and TV simultaneously counts as two hours. That compares with five hours in 1960, say researchers at the University of California, San Diego. Computer users visit an average of 40 Web sites a day, according to research by RescueTime, which offers time-management tools.
As computers have changed, so has the understanding of the human brain. Until 15 years ago, scientists thought the brain stopped developing after childhood. Now they understand that its neural networks continue to develop, influenced by things like learning skills.
So not long after Eyal Ophir arrived at Stanford in 2004, he wondered whether heavy multitasking might be leading to changes in a characteristic of the brain long thought immutable: that humans can process only a single stream of information at a time.
Going back a half-century, tests had shown that the brain could barely process two streams, and could not simultaneously make decisions about them. But Mr. Ophir, a student-turned-researcher, thought multitaskers might be rewiring themselves to handle the load.
His passion was personal. He had spent seven years in Israeli intelligence after being weeded out of the air force — partly, he felt, because he was not a good multitasker. Could his brain be retrained?
Mr. Ophir, like others around the country studying how technology bent the brain, was startled by what he discovered.
The Myth of Multitasking
The test subjects were divided into two groups: those classified as heavy multitaskers based on their answers to questions about how they used technology, and those who were not.
In a test created by Mr. Ophir and his colleagues, subjects at a computer were briefly shown an image of red rectangles. Then they saw a similar image and were asked whether any of the rectangles had moved. It was a simple task until the addition of a twist: blue rectangles were added, and the subjects were told to ignore them.
The multitaskers then did a significantly worse job than the non-multitaskers at recognizing whether red rectangles had changed position. In other words, they had trouble filtering out the blue ones — the irrelevant information.
So, too, the multitaskers took longer than non-multitaskers to switch among tasks, like differentiating vowels from consonants and then odd from even numbers. The multitaskers were shown to be less efficient at juggling problems.
Other tests at Stanford, an important center for research in this fast-growing field, showed multitaskers tended to search for new information rather than accept a reward for putting older, more valuable information to work.
Researchers say these findings point to an interesting dynamic: multitaskers seem more sensitive than non-multitaskers to incoming information.
The results also illustrate an age-old conflict in the brain, one that technology may be intensifying. A portion of the brain acts as a control tower, helping a person focus and set priorities. More primitive parts of the brain, like those that process sight and sound, demand that it pay attention to new information, bombarding the control tower when they are stimulated.
Researchers say there is an evolutionary rationale for the pressure this barrage puts on the brain. The lower-brain functions alert humans to danger, like a nearby lion, overriding goals like building a hut. In the modern world, the chime of incoming e-mail can override the goal of writing a business plan or playing catch with the children.
“Throughout evolutionary history, a big surprise would get everyone’s brain thinking,” said Clifford Nass, a communications professor at Stanford. “But we’ve got a large and growing group of people who think the slightest hint that something interesting might be going on is like catnip. They can’t ignore it.”
Mr. Nass says the Stanford studies are important because they show multitasking’s lingering effects: “The scary part for guys like Kord is, they can’t shut off their multitasking tendencies when they’re not multitasking.”
Melina Uncapher, a neurobiologist on the Stanford team, said she and other researchers were unsure whether the muddied multitaskers were simply prone to distraction and would have had trouble focusing in any era. But she added that the idea that information overload causes distraction was supported by more and more research.
A study at the University of California, Irvine, found that people interrupted by e-mail reported significantly increased stress compared with those left to focus. Stress hormones have been shown to reduce short-term memory, said Gary Small, a psychiatrist at the University of California, Los Angeles.
Preliminary research shows some people can more easily juggle multiple information streams. These “supertaskers” represent less than 3 percent of the population, according to scientists at the University of Utah.
Other research shows computer use has neurological advantages. In imaging studies, Dr. Small observed that Internet users showed greater brain activity than nonusers, suggesting they were growing their neural circuitry.
At the University of Rochester, researchers found that players of some fast-paced video games can track the movement of a third more objects on a screen than nonplayers. They say the games can improve reaction and the ability to pick out details amid clutter.
“In a sense, those games have a very strong both rehabilitative and educational power,” said the lead researcher, Daphne Bavelier, who is working with others in the field to channel these changes into real-world benefits like safer driving.
There is a vibrant debate among scientists over whether technology’s influence on behavior and the brain is good or bad, and how significant it is.
“The bottom line is, the brain is wired to adapt,” said Steven Yantis, a professor of brain sciences at Johns Hopkins University. “There’s no question that rewiring goes on all the time,” he added. But he said it was too early to say whether the changes caused by technology were materially different from others in the past.
Mr. Ophir is loath to call the cognitive changes bad or good, though the impact on analysis and creativity worries him.
He is not just worried about other people. Shortly after he came to Stanford, a professor thanked him for being the one student in class paying full attention and not using a computer or phone. But he recently began using an iPhone and noticed a change; he felt its pull, even when playing with his daughter.
“The media is changing me,” he said. “I hear this internal ping that says: check e-mail and voice mail.”
“I have to work to suppress it.”
Kord Campbell does not bother to suppress it, or no longer can.
Interrupted by a Corpse
It is a Wednesday in April, and in 10 minutes, Mr. Campbell has an online conference call that could determine the fate of his new venture, called Loggly. It makes software that helps companies understand the clicking and buying patterns of their online customers.
Mr. Campbell and his colleagues, each working from a home office, are frantically trying to set up a program that will let them share images with executives at their prospective partner.
But at the moment when Mr. Campbell most needs to focus on that urgent task, something else competes for his attention: “Man Found Dead Inside His Business.”
That is the tweet that appears on the left-most of Mr. Campbell’s array of monitors, which he has expanded to three screens, at times adding a laptop and an iPad.
On the left screen, Mr. Campbell follows the tweets of 1,100 people, along with instant messages and group chats. The middle monitor displays a dark field filled with computer code, along with Skype, a service that allows Mr. Campbell to talk to his colleagues, sometimes using video. The monitor on the right keeps e-mail, a calendar, a Web browser and a music player.
Even with the meeting fast approaching, Mr. Campbell cannot resist the tweet about the corpse. He clicks on the link in it, glances at the article and dismisses it. “It’s some article about something somewhere,” he says, annoyed by the ads for jeans popping up.
The program gets fixed, and the meeting turns out to be fruitful: the partners are ready to do business. A colleague says via instant message: “YES.”