Americans perceive technological advances to be a double-edged sword and are divided over whether technology makes us smarter or dumber, according to an Exclusive Point Taken-Marist Poll, commissioned by WGBH Boston for its new late-night, multi-platform PBS debate series Point Taken. Despite the perceived technological pitfalls, a slim majority of Americans consider the benefits of technology to outweigh the privacy risks associated with it.
The national survey was conducted by The Marist Poll in advance of this week's Point Taken episode, airing Tuesday, June 21st at 11pm ET (check local listings) and streaming on pbs.org/pointtaken. The series is hosted by Carlos Watson, Emmy Award-winning journalist and OZY Media co-founder and CEO.
When assessing the overall effect of technology, 49% of Americans report it makes people dumber while 46% say technology makes individuals smarter. A slim majority, 51%, says the benefits to society outweigh the privacy risks of technological advances.
On the question of intelligence, Millennials, 53%, and Gen X, 53%, are more likely than Baby Boomers, 48%, and the Silent-Greatest generation, 38%, to say technology makes us dumber.
Generational differences also exist when assessing the risks of technology. Gen X, 50%, is the most likely to report the risks outweigh the benefits while the Silent-Greatest generation is the least likely to stress the potential privacy implications of technology. Also noteworthy, men, 55%, and college graduates, 58%, are more likely than women, 46%, and those without a college degree, 47%, to say technology’s benefits trump the risks.
“If you think younger people are all in for technological revolution, think again,” says Dr. Lee M. Miringoff, Director of The Marist College Institute for Public Opinion. “This national survey shows surprising differences among generations and their appreciation for innovation.”
“Digital and other technologies are central and definitional aspects of contemporary life,” says Denise DiIanni, series creator and Executive-in-Charge. “This week our panelists — and audiences — debate whether tech makes us smarter — or dumber.”
What are the benefits of technology? Many Americans, 74%, say technology improves education, and a majority, 54%, believes it makes individuals more productive.
Differences based on education and generation exist here, too. College graduates, 80%, are more likely than those without a college degree, 70%, to think technology makes education better. Generationally, Millennials, 82%, and members of Gen X, 80%, are more likely than Baby Boomers, 65%, and those in the Silent-Greatest generation, 65%, to report technology improves education.
College graduates, 65%, are also more likely than those without a degree, 47%, to say technology makes us more productive. The Silent-Greatest generation, 64%, is more likely than other generations to say technology improves productivity. Americans who earn $50,000 or more, 58%, are more inclined than those who make less, 47%, to report technology aids productivity.
However, many Americans perceive technology to be detrimental to humanity and to personal relationships. Nationally, 71% of residents believe technology is making people less human, and 54% report it makes individuals less connected to family and friends.
Members of Generation X, 77%, and Baby Boomers, 74%, are the most likely to think technology is making people less human, while those in the Silent-Greatest generation, 60%, are the least likely to agree with this premise.
Those in the Silent-Greatest generation, 33%, are also the least likely of the generations to say technology enhances our connections to family and friends. Millennials, 48%, are the most likely to report technology keeps us connected, though still short of a majority. Looking at gender, men, 48%, are more likely than women, 39%, to believe technology makes people more connected to their loved ones. In fact, a majority of women, 58%, report technology is detrimental to personal relationships.
This survey of 622 adults was conducted March 29th through March 31st, 2016, by The Marist Poll, sponsored and funded in partnership with WGBH's Point Taken. Adults 18 years of age and older residing in the contiguous United States were contacted on landline or mobile numbers and interviewed in English by telephone using live interviewers. The margin of error is ±3.9 percentage points; it was not adjusted for sample weights and increases for cross-tabulations.
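As a rough sanity check, the reported ±3.9-point figure matches the textbook margin-of-error formula for a simple random sample of 622 at a 95% confidence level. This is a sketch under the worst-case assumption p = 0.5; it does not model the poll's weighting or design effects:

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Worst-case sampling margin of error for a simple random sample.

    p = 0.5 maximizes p * (1 - p), giving the most conservative estimate;
    z = 1.96 corresponds to a 95% confidence level.
    """
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(622)
print(f"±{moe * 100:.1f} percentage points")  # ±3.9 percentage points
```

The same formula also shows why the error margin "increases for cross-tabulations": subgroups (e.g., Millennials only) have a smaller n, so the square root term grows.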
One could argue that digital technology has helped make us better multitaskers. These days, we can simultaneously check our e-mails, monitor our Twitter feeds and listen to a podcast, all while eating our breakfast. Wouldn’t it make sense that such a capacity for divided attention is making our brains stronger?
Unfortunately, that might not be the case. Experiments comparing heavy multitaskers, designated as such based on self-reports about their technology use, with non-multitaskers found that the latter group actually performs better on certain cognitive tasks. In a Stanford study, cited in a recent New York Times article, subjects took a test that required them to ignore extraneous inputs, a measure of their ability to filter out distractions. In another test, participants had to switch between tasks, showing their ability to adjust to new information and task demands on the fly. In both cases, the non-multitaskers performed better than the heavy multitaskers. Based on these and other experiments, the scientists surmised that multitaskers are more responsive to new incoming information. On the positive side, one might say multitaskers are more alert to new stimuli; on the negative side, one could claim their focus is more easily disrupted.
As with many scientific studies, the tests in this case might not truly reflect real-world situations. A cognitive test in a laboratory could fall short of replicating the experience of juggling computer applications. As always, more study is needed to examine, among other things, how different amounts of multitasking affect performance on cognitive tasks, and whether the recency of one's immersion in technology affects the ability to direct attention. Nonetheless, it would appear that heavy use of gadgets and computers is influencing our brain function.
On the plus side, there is also evidence that screen technology benefits certain cognitive skills. It has been demonstrated in the laboratory that playing action video games improves visual attention in several ways. Gamers show the ability to process more visual inputs than non-gamers, to process inputs across a greater field of view, and to better process inputs presented in rapid succession. Considering the deficits shown by people with disabilities and the demonstrated erosion of certain cognitive skills among the elderly, perhaps action video games, or programs that mimic them, can be used therapeutically.
Above all else, the experiments reveal the apparent power of technology to mold our brains, for better and for worse. The question, however, may be whether we can harness our gadgets’ power to maximize the benefits and minimize the harm.