One could argue that digital technology has helped make us better multitaskers. These days, we can simultaneously check our e-mails, monitor our Twitter feeds and listen to a podcast, all while eating our breakfast. Wouldn’t it make sense that such a capacity for divided attention is making our brains stronger?
Unfortunately, that might not be the case. Experiments comparing heavy multitaskers – designated as such based on self-reports about their technology use – with non-multitaskers found that the non-multitaskers actually perform better on certain cognitive tasks. In a Stanford study, cited in a recent New York Times article, subjects took a test that required them to ignore extraneous inputs, a measure of their ability to filter out distractions. In another test, participants had to switch between tasks, showing their ability to adjust to new information and task demands on the fly. In both cases, the non-multitaskers outperformed the heavy multitaskers. Based on these and other experiments, the scientists surmised that multitaskers are more responsive to new incoming information. On the positive side, one might say the multitaskers are more alert to new stimuli; on the negative side, one could claim the multitaskers’ focus is more easily disrupted.
As with many scientific studies, the tests in this case might not truly reflect real-world situations. A cognitive test in a laboratory could fall short of replicating the experience of juggling computer applications. As always, more study is needed to examine, among other things, how different amounts of multitasking affect performance on cognitive tasks, and whether the recency of one’s immersion in technology affects the ability to direct attention. Nonetheless, it appears that heavy use of gadgets and computers is influencing our brain function.
On the plus side, there is also evidence that screen technology benefits certain cognitive skills. Laboratory studies have demonstrated that playing action video games improves visual attention in several ways: gamers can process more visual inputs than non-gamers, can process inputs across a greater field of view, and are better at processing inputs presented in rapid succession. Considering the deficits shown by people with disabilities and the demonstrated erosion of certain cognitive skills among the elderly, perhaps action video games – or programs that mimic them – could be used therapeutically.
Above all else, the experiments reveal the apparent power of technology to mold our brains, for better and for worse. The real question may be whether we can harness our gadgets’ power to maximize the benefits and minimize the harm.