Former Congressman Anthony Weiner’s sexting scandal sent shock waves throughout the nation. But, can American Internet users relate to Mr. Weiner’s questionable online behavior?
Although 82% of Internet users nationally say they have never sent or said anything over the Internet that they regret, a notable 18% have.
Younger Internet users are more likely than older ones to have engaged in regrettable online actions: 24% of Internet users younger than 45 report this to be the case compared with 13% of those 45 and older. And, men who use the Internet — 21% — are slightly more likely than female Internet users — 15% — to have sent or said something online they wish they could take back.
In general, what kind of impact does social media like Facebook have on relationships? Half of Internet users nationwide — 50% — think social media does more harm than good. About one-third — 33% — report social media does more good than harm, and 17% are unsure. Similar proportions of adults overall share these views. 51% of residents think social media does more harm than good while nearly three in ten adults — 29% — think it has a positive impact. 20% are unsure.
When ill, a sizeable proportion of the American population turns to the Internet to uncover the cause of their illness. According to this national Marist Poll, 37% of adults nationally say they self-diagnose using the Internet at least sometimes. This includes 6% who always go online to self-diagnose, 7% who do so often, and nearly one in four — 24% — who act as “cyber doctor” some of the time. However, one in ten residents — 10% — seldom self-diagnose online, and a majority — 53% — never do.
“Younger Americans are more likely to think the Internet makes them medical experts. Those under 45 are also less likely than older Americans to get an annual checkup,” says Dr. Lee M. Miringoff, Director of The Marist College Institute for Public Opinion. “And, no surprise here, women are more likely than men to schedule an annual checkup.”
Overall, only 54% of Americans report they go to the doctor every year for a checkup. 36% visit their physician just when they are sick, and one in ten — 10% — say they never go to the doctor.
Jonathan Franzen, whose novel Freedom recently hit bookstore shelves, has an interesting idea about the role of novelists in the digital age.
In a video interview, the author discusses his earlier struggles with the question, “Why should I write?” and says he found at least one justification: a novel is a portal into a world in which the reader no longer feels alone. While a reader’s reality may be ruled by unjust people and dominated by forbidding customs, a book provides a connection to an author who might be equally appalled at the state of things. Reader and writer are united in their solitude, which is made much more bearable as a result. In this way, solitude serves an important purpose in that it compels a reader to read and a writer to write, requiring them to forge a connection that transcends the constraints of their lives.
Franzen goes on to say that in the digital age our opportunities for solitude are rapidly disappearing. Chained to our communication devices, we have nonstop access to our co-workers, friends and family. Certainly, this access diminishes our sense of aloneness. But, Franzen raises the point that the “beeping devices” in our lives may only provide superficial relief, leading us to endlessly check our messages as though we’re just one click away from a satisfying connection.
In that sense, Franzen articulates a battle that may have occurred in the minds of many writers who feel their craft is becoming obsolete. On one side, we have technology capturing all individuals in an expanding communication net, while on the other, we have practitioners of the “old” forms of media – novelists, playwrights, nonfiction writers, etc. – whose work isn’t easily integrated into this digital grid. Novelists, Franzen says, are tasked with enticing people away from their linked-up lives.
The entire interview is worth watching, but for Franzen’s comments on life in the digital age, fast-forward to the 9:06 mark:
Has the Internet made it more or less acceptable for a person to use and claim another person’s work as his or her own?
Most Americans believe the Internet has had an impact on the way they view the practice of plagiarism. According to the latest national Marist Poll, half of residents say the technology makes it less acceptable while 35% believe it makes it more acceptable. Just 8% think the Internet has had no impact on the acceptability of plagiarism, and 7% are unsure.
There is a slight generation gap on this question. A majority of Americans under the age of 45 — 52% — think the Internet makes it less acceptable to claim another person’s work as his or her own. 48% of those 45 and older agree.
Does the Internet “dumb” Americans down? That’s the question the Marist Poll asked in its latest national survey. And, the answer is, “No,” for more than two-thirds. 68% of residents think the Internet makes us smarter while just 23% believe it has made us less intelligent. 9% are unsure.
Perhaps, one of the most interesting findings is there is no generation gap on this question. 69% of Americans younger than 45 report the Internet makes us more intelligent, and the same proportion of those 45 and older agree.
But, women have a slightly better perception of the so-called Information Superhighway than do men. 71% of women think the Internet makes us smarter while 66% of men agree.
With the news from The Marist Poll that an overwhelming 68% of U.S. residents believe the Internet is making us smarter, I’m beginning to think I should just hop on the bandwagon and see where it takes me. Still, I can’t help asking why people are so optimistic.
The general argument linking smarts to the Web seems to go like this: Because of this vast online memory store, parts of our mind that would have been tied up storing facts in the dark days preceding the Web are freed to accomplish new tasks. With the Web harboring all the data we need, we know finding an answer is as simple as typing a query into a search engine, and this certainty alters our approach to any task that requires information we lack. Now, we don’t have to spend time and effort acquiring such knowledge; the Internet holds it for us, and we are more productive under this lightened load.
Some people characterize the Internet as an extension of our brains. In his Atlantic article “Get Smarter,” Jamais Cascio discusses the rise of computers and devices dubbed “exocortical technology,” which allow us to perform tasks we never dreamed of. He writes: “As the digital systems we rely upon become faster, more sophisticated, and (with the usual hiccups) more capable, we’re becoming more sophisticated and capable too.” The article is fascinating, and I encourage you to read it – among other things, it suggests that in addition to computers, drugs will be developed that help us perform cognitive tasks better.
But I can’t stop myself from protesting that the Web, one of these “sophisticated” systems, has spawned a certain amount of unpleasantness: paparazzi-fueled “news,” silly viral videos, a huge number of scams … the list goes on. While the Web can be seen as a tool to help us achieve things, it also appears to be able to distract us, sell us things we don’t need, and lead us down fruitless paths as we seek information. One could argue that the Web is still in its infancy, and guides will emerge to point us in the right directions. But one could also argue that powerful entities who see the medium as a piggy bank waiting to be smashed don’t want that to happen.
Nicholas Carr, whose article “Is Google Making Us Stupid?,” also in the Atlantic, created quite a buzz among tech pundits, points out that for all of the Internet’s innovative power, it could be altering something fundamental about the way we read. Carr writes: “In the quiet spaces opened up by the sustained, undistracted reading of a book … we make our own associations, draw our own inferences and analogies, foster our own ideas.” Such deep reading, he says, isn’t encouraged by the Web’s architecture, which is designed to accommodate shallow, fast processing: the more we click, the more some company stands to sell us something.
I doubt Carr was surprised when a survey from the Pew Internet & American Life Project revealed 81% of experts believe “Nicholas Carr was wrong: Google does not make us stupid.” He knows as much as anyone that the bandwagon is alluring and swift, with some authority figures at the wheel. So while the Web skeptics and evangelists will go back and forth (the evangelists enjoying the majority position), one thing is abundantly clear: most people trust the Web to propel them into the future. If that’s the case, then regulation, analysis, and organization are in order. Perhaps we need the skeptics to keep the bandwagon from tipping over.
Social networks such as Facebook, MySpace, and LinkedIn have yet to convince the majority of U.S. residents to sign up, but if a recent national Marist Poll on the topic is any indication, it’s only a matter of time before they do.
The survey found that 41% of U.S. residents have a profile on a social networking site, a 9 percentage point jump since Marist last asked about social networking in June.
Notably, the social networking generation gap may be shrinking. Although users under age 45 still outnumber older Americans who stay connected online, more Americans age 45 and older have discovered the interactive joys of trading witticisms, sharing photos, and swapping links. 23% of people in that age group now report having an account compared with 14% when Marist last asked this question in June.
Growth continues for people under 45, as well. 65% of residents under 45 years old say they have a social networking profile while 59% said the same in the last poll.
Americans who are employed are also more likely to appreciate the advantages social networking affords. Nearly half — 48% — of people with a job have a profile compared with only three in ten adults who are not working.
When it comes to having a social networking profile, women are more likely to connect with family, friends, and colleagues online. 45% of women report they have profiles, and 36% of men say the same.
Relationships and Social Networking
The substantial increase may be explained by the general perception that social networks are a good way to strengthen connections to friends and family. 68% of U.S. residents with profiles say the sites help their personal relationships while 12% say they hurt them. 20% are unsure.
Age is also a factor on this question. More younger Americans with a social networking profile think using this form of communication helps relationships compared with those who are older. 71% of those under age 45 think this is the case compared with 63% who are 45 and older.
Have you ever fallen into a tech-hole?
You’re sitting at your computer, logged into your Facebook, Twitter and other social networking accounts, immersed in the links, videos, comments and other digital flotsam shooting down the info streams. Meanwhile, a person, real flesh and blood, walks in the room and wants your attention. You don’t hear his words; you mindlessly wave him away. You’re busy … with your virtual friends.
Perhaps that’s never happened to you. As for me, I’ve spent a serious number of hours in the tech-hole. A recent Marist poll suggests the number of Web users with social networking accounts — people perhaps susceptible to this experience — is growing rapidly. This furious growth has led some to question whether the effects of spending so much time on Facebook, Twitter and their ilk could be harmful.
In the U.K., neuroscientist Susan Greenfield took her concerns about social networks to the House of Lords, suggesting that the use of the sites could affect the human brain — especially a child’s brain — in profound ways. One of her more frightening points was that using the sites could yield a generation of grown-ups with the emotional depth and cognitive abilities of big babies. The social networks provide experiences that are “devoid of cohesive narrative and long-term significance,” said Greenfield. “As a consequence, the mid-21st century mind might almost be infantilized, characterized by short attention spans, sensationalism, inability to empathize and a shaky sense of identity.” Among other things, she called for an investigation into whether the overuse of screen technologies could be linked to a recent spike in diagnoses of attention-deficit hyperactivity disorder. People who spend formative years surfing the Internet, an environment characterized by “fast action and reaction,” could come to expect similar instant gratification in the non-virtual world, said Greenfield.
Her concerns have probably resonated with Web skeptics because she’s homed in on recognizably annoying online behavior. For example, if you’ve ever been irritated when a friend updates his or her status message to broadcast a favorite kind of toothpaste – e.g., “[Person X] is contemplating the different colors of AquaFresh” — Greenfield sympathizes. “Now what does this say about how you see yourself?” she asks of those prone to posting personal trivia. “Does this say anything about how secure you feel about yourself? Is it not marginally reminiscent of a small child saying ‘Look at me, look at me mummy! Now I’ve put my sock on. Now I’ve got my other sock on.’”
Not everyone is receptive to Greenfield’s concerns. Ben Goldacre, a British writer, broadcaster and doctor, and author of a Guardian column called Bad Science, says Greenfield is irresponsibly using her position as head of the Royal Institution of Great Britain — a body devoted to improving the public’s knowledge of science — because she doesn’t have any empirical evidence backing up her fears. If Greenfield wants to promote awareness of the scientific method, says Goldacre, she shouldn’t be spending so much time airing her qualms about untested hypotheses. Greenfield’s caveats that her purpose is to raise questions, not give answers, aren’t enough for Goldacre; he says she’s recklessly generating scary headlines that frighten a Web-loving populace. “It makes me quite sad,” he writes, “when the public’s understanding of science is in such a terrible state, that this is one of our most prominent and well funded champions.” In a heated BBC debate on the social networking controversy, you can see Goldacre square off against Dr. Aric Sigman, who says we should be wary about the time we spend in front of screens subtracting from the time we spend talking to people.
Despite the squabbling, it’s probably safe to say that thinkers on both sides of the issue would agree that more research is needed. To that end, various studies and polls have been published on the social networks in particular and increased Web use in general. For example, the USC Annenberg Center for the Digital Future reported that households connected to the Internet were experiencing less “face-to-face family time, increased feelings of being ignored by family members using the Web, and growing concerns that children are spending too much time online.” On the other hand, a poll conducted by the Pew Internet & American Life Project suggests that use of cell phones and the Internet has not, generally speaking, contributed to social isolation (I urge you to view their conclusions for a much more precise explanation).
In the meantime, the tech-hole always beckons, so much so that Web addiction treatment centers have emerged to help people who can’t prioritize the real world over the virtual one. While weighing in on the controversy, Maggie Jackson, the author of “Distracted: The Erosion of Attention and the Coming Dark Age,” offers this advice to Web users: “Going forward, we need to rediscover the value of digital gadgets as tools, rather than elevating them to social and cognitive panacea. Lady Greenfield is right: we need to grow up and take a more mature approach to our tech tools.” In other words, technology exists to support our relations with other human beings, not replace them.
In theory, it’s easy to remember that. In practice, we might find ourselves sacrificing hours to the digital ether, convincing ourselves that we’re connected to everyone, but in reality being connected to no one.
Less than one-fifth of Americans say they are going wild with their own personal websites. Just 18% report they’ve created their own web page compared with 82% who have not.
So, just who is going digital? More men than women maintain their own websites — 21% to 14%, respectively. Not surprisingly, younger Americans and more affluent residents nationwide beat out their counterparts on this question. 21% of those younger than 45 have launched their own online page compared with 15% of those 45 and above. 22% of Americans with an annual household income of $50,000 or more are likely to have a website. This compares with 16% who earn less than $50,000. Region does come into play as well. More people in the Northeast and West have their own URL than do those in the Midwest and South.
If you were to ask me a year ago if I ever thought I’d be overseeing a website like The Marist Poll’s, honestly, I would have laughed in your face. You see, my feet were firmly planted in traditional broadcast news (whatever that means anymore), and although I would often talk about changing careers, I knew where my heart was. I had no real plans to make a move. Perhaps, I should have listened to John Lennon’s prophetic words, “Life is what happens to you while you’re busy making other plans.”
Please, don’t misunderstand me! I love Pebbles and Pundits. It allows me to flex my editorial muscle, and that was the challenge. My background is more editorial than technical. But we, at The Marist Poll, had a message to get out, and it was my job to figure out how to do so.
Now, if you’re one of the 82% of Americans who, The Marist Poll discovered, do not have a personal website, my guess is it’s not due to a lack of ideas or messages. My money is on a fear of crossing the digital divide. But, fear not. If I could spearhead a project like Pebbles and Pundits, you certainly can launch a website of your very own!
Don’t believe me? Well, here’s some advice that I hope will help. First, don’t go it alone. You’re intimidated, and that’s ok. Admit your limitations and get past them. The best way to do that is research. We chose WordPress as our publishing platform. WordPress is free and user-friendly. It also has a ton of widgets to help you add to your site as well as a community of users! I’m also not too proud to admit that I turned to WordPress for Dummies for advice. Now, this isn’t a commercial for WordPress. There are many other platforms out there and tons of books on the market to assist you in building a website. In fact, I typed the search string, “Creating a website,” into BarnesandNoble.com and discovered 712 books on the topic. When I did the same search on Amazon.com, I found 2,163 books. If you don’t want to invest the money in buying a book, there are online resources you can check out. Your local public library is also a good place to start. And, while you are researching, don’t forget: you will need a hosting platform. Explore all of your options. There are inexpensive ones out there.
Plus, keep an eye on the latest tech news. There are websites that often review companies’ services (many of which are free). Those sites can be extremely useful in your project. I, personally, like TechCrunch. It offers an RSS feed that helps keep you up to date even if you forget to go to the website!
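If you’re comfortable with a little scripting, you don’t even need a special app to keep up with a feed. Here’s a minimal Python sketch, purely illustrative — the sample feed below is made up, not a real site’s — showing how the headlines in any standard RSS 2.0 document can be pulled out with the standard library:

```python
import xml.etree.ElementTree as ET

# A tiny made-up RSS 2.0 document standing in for a real feed
# you might download from a tech-news site.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Tech News</title>
    <item><title>First headline</title><link>http://example.com/1</link></item>
    <item><title>Second headline</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def headlines(rss_xml):
    """Return the title of each <item> in an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    return [item.findtext("title") for item in root.iter("item")]

print(headlines(SAMPLE_RSS))  # ['First headline', 'Second headline']
```

In practice you’d fetch the feed’s XML from its URL first; the parsing step stays the same.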
Friends and family can also prove to be valuable sources of untapped knowledge. Maybe, your best friend’s uncle is a computer guru. Ask if you can shoot him an email with your questions or pick his brain over a cup of coffee. It couldn’t hurt. Surrounding yourself with good, intelligent people is a great strategy. Not only can they help you get your website up and running, but they’ll also be there when you need to troubleshoot problems that will inevitably arise. And, ask for comments and feedback on your site. It’s the most honest way to find out what needs fixing. Nothing is ever perfect, and there is a learning curve. Look at obstacles as learning experiences that will ultimately make your website even better!
Most importantly, though, have fun with the process, especially if your website is a personal project and not one for your company or organization. In either case, remember to be realistic in your goals. Rome wasn’t built in a day, and my gut tells me that if you’re a novice, your website won’t be either. Maybe you want to market your content to a certain audience, but the comments your visitors are leaving show that’s not who you’re attracting. That’s ok. The greatest thing about having your own site is that you can change it whenever you want.
The Internet, to a certain extent, has made formal publishers obsolete. It allows every person, young and old, to be his or her own publisher. The resources to assist you are available. So, don’t let your fear prevent you from communicating your message to the digital world.