This Is Your Brain on Social Networks … Any Questions?

Have you ever fallen into a tech-hole?

You’re sitting at your computer, logged into your Facebook, Twitter and other social networking accounts, immersed in the links, videos, comments and other digital flotsam shooting down the info streams. Meanwhile, a real flesh-and-blood person walks into the room and wants your attention. You don’t hear his words; you mindlessly wave him away. You’re busy … with your virtual friends.

Perhaps that’s never happened to you. As for me, I’ve spent a serious number of hours in the tech-hole. According to a recent Marist poll, the number of Web users with social networking accounts, and thus the number of people potentially susceptible to this experience, is growing rapidly. This furious growth has led some to question whether spending so much time on Facebook, Twitter and their ilk could be harmful.

In the U.K., neuroscientist Susan Greenfield took her concerns about social networks to the House of Lords, suggesting that the use of the sites could affect the human brain — especially a child’s brain — in profound ways. One of her more frightening points was that using the sites could yield a generation of grown-ups with the emotional depth and cognitive abilities of big babies.  The social networks provide experiences that are “devoid of cohesive narrative and long-term significance,” said Greenfield.  “As a consequence, the mid-21st century mind might almost be infantilized, characterized by short attention spans, sensationalism, inability to empathize and a shaky sense of identity.”  Among other things, she called for an investigation into whether the overuse of screen technologies could be linked to a recent spike in diagnoses of attention-deficit hyperactivity disorder.  People who spend formative years surfing the Internet, an environment characterized by “fast action and reaction,” could come to expect similar instant gratification in the non-virtual world, said Greenfield.

Her concerns have probably resonated with Web skeptics because she’s homed in on recognizably annoying online behavior. For example, if you’ve ever been irritated when a friend updates his or her status message to broadcast a favorite kind of toothpaste – e.g., “[Person X] is contemplating the different colors of AquaFresh” — Greenfield sympathizes. “Now what does this say about how you see yourself?” she asks of those prone to posting personal trivia. “Does this say anything about how secure you feel about yourself? Is it not marginally reminiscent of a small child saying ‘Look at me, look at me mummy!  Now I’ve put my sock on. Now I’ve got my other sock on.'”

Not everyone is receptive to Greenfield’s concerns. Ben Goldacre, a British writer, broadcaster and doctor, and author of a Guardian column called Bad Science, says Greenfield is irresponsibly using her position as head of the Royal Institution of Great Britain — a body devoted to improving the public’s knowledge of science — because she doesn’t have any empirical evidence backing up her fears. If Greenfield wants to promote awareness of the scientific method, says Goldacre, she shouldn’t be spending so much time airing her qualms about untested hypotheses. Greenfield’s caveats that her purpose is to raise questions, not give answers, aren’t enough for Goldacre; he says she’s recklessly generating scary headlines that frighten a Web-loving populace. “It makes me quite sad,” he writes, “when the public’s understanding of science is in such a terrible state, that this is one of our most prominent and well funded champions.” In a heated BBC debate on the social networking controversy, you can see Goldacre square off against Dr. Aric Sigman, who warns that the time we spend in front of screens subtracts from the time we spend talking to people face to face.

Despite the squabbling, it’s probably safe to say that thinkers on both sides of the issue would agree that more research is needed. To that end, various studies and polls have been published on social networks in particular and on increased Web use in general. For example, the USC Annenberg Center for the Digital Future reported that households connected to the Internet were experiencing less “face-to-face family time, increased feelings of being ignored by family members using the Web, and growing concerns that children are spending too much time online.” On the other hand, a poll conducted by the Pew Internet & American Life Project suggests that use of cell phones and the Internet has not, generally speaking, contributed to social isolation (I urge you to view their conclusions for a much more precise explanation).

In the meantime, the tech-hole always beckons, so much so that Web addiction treatment centers have emerged to help people who can’t prioritize the real world over the virtual one.  While weighing in on the controversy, Maggie Jackson, the author of “Distracted: The Erosion of Attention and the Coming Dark Age,” offers this advice to Web users: “Going forward, we need to rediscover the value of digital gadgets as tools, rather than elevating them to social and cognitive panacea. Lady Greenfield is right: we need to grow up and take a more mature approach to our tech tools.” In other words, technology exists to support our relations with other human beings, not replace them.

In theory, it’s easy to remember that.  In practice, we might find ourselves sacrificing hours to the digital ether, convincing ourselves that we’re connected to everyone, but in reality being connected to no one.
