olympic rings in a snowy town

Time Machine: Winter Olympics

Every four years, the Winter Olympics return with a blend of athletic spectacle, global politics, and national pride. But heading into Milan–Cortina 2026, something unusual is happening: enthusiasm is higher now than in the past few games, summer and winter. New MRI-Simmons data shows 54% of Americans say they’re fans of the Winter Games — a return to pre-pandemic levels of interest.

Our Time Machine starts in Calgary in 1988, Games remembered for extending the Winter Olympics to 16 days for the first time and introducing sports like curling and short-track speed skating as demonstration events. It was a period of athletic focus—bobsleigh, biathlon, figure skating, and the debut of freestyle skiing—even as the Winter Olympics continued a trend toward media spectacle that arguably began at Lake Placid in 1980, when the American “Miracle on Ice” hockey team beat the Soviets.

By the early 90s pollsters noticed something else: interest wasn’t guaranteed. After Albertville 1992, the Seton Hall Sports Poll recorded a 26% drop in viewer interest and a “20-plus percent” decline in TV ratings. Even then, Americans were already showing signs of fluctuating enthusiasm long before cord-cutting or social media reshaped viewership.

Two years later, as the Winter Games shifted to a schedule offset from the Summer Games, controversy dominated — and not for the first time. Just before Lillehammer in 1994, a poll found Americans opposed Tonya Harding skating in Norway by a two-to-one margin, even though the Games themselves later came to be remembered as among the most beloved ever.

Heading into Nagano in 1998, the Games reinvigorated some public interest with the introduction of snowboarding, the return of curling, and the debut of women’s ice hockey. Opening the men’s ice hockey tournament to the world’s professionals added new star power—a reminder that the Olympics have often adapted to win back audiences.

Salt Lake City in 2002 offered another twist. Before the Games, 45% of Utahns didn’t want the Olympics at all. But after the events concluded—widely celebrated as one of the most successful U.S. Games in history—public opinion flipped. The Olympics have often had a way of winning back hearts once the flame is lit.

Turin 2006 generated strong hometown feelings, with 81% of Turinese residents saying the Olympics would be “strategic for the city.” Yet nearly 20% still didn’t entirely understand what the Games would entail—a reminder that local opinion often starts with uncertainty and caution.

By Vancouver 2010, public sentiment centered on inclusion: 73% of Canadians wanted women’s ski jumping added to the Games, and when the Olympics finally arrived, 46% of newspaper editors, broadcast news directors, and editors at major news websites across the country named it the top news story of the year in Canadian Press surveys. Even in the digital age’s early acceleration, the Games could unify national attention.

Sochi 2014 was different. A Pew poll found Americans deeply split over whether it was a “good decision” to hold the Games in Russia. Age mattered: 49% of adults 18–29 said hosting the Games there was a good idea, compared with just 24% of those 50+. Concerns over terrorism and Russia’s policies toward LGBTQ+ people dominated the reasoning among those who opposed the location. The sports were on the ice; the debates were elsewhere.

In PyeongChang 2018, geopolitics again overshadowed athletics. An Ipsos global survey found that 50% of people worldwide were worried North Korea would behave provocatively, with U.S. concern—at 58%—among the highest. That anxiety loomed large even as Korean diplomacy attempted to calm tensions, and the Games came off without a geopolitical hitch.

Beijing 2022 arrived amid both diplomatic strain and public disinterest. Pew found that 91% of Americans had heard little or nothing about the U.S. diplomatic boycott, yet nearly half approved of it. Views of China were similarly icy: 54% described China as a competitor and 35% as an enemy. Perhaps the political noise helped depress pre-Games excitement. Ipsos found 42% of Americans were interested in the Games prior to their start — lower than interest ahead of many prior summer and winter Games.

And that brings us back to now and the run-up to Milan–Cortina 2026. According to MRI-Simmons, 22% of Americans say they plan to watch (remember, the same survey showed 54% say they’re fans), which, in a fractured media age, may prove to be a strong outcome. It is, after all, about 55 million people!

If past Winter Games were shaped by boycotts, geopolitical tension, ratings drops, or breakthrough sports moments, perhaps Milan–Cortina will be defined by something different: all the new ways fans (and non-fans) see and interact with the Games.

vaccine in arm

Time Machine: Flu Shots

For more than six decades, the flu shot has been a key tool in protecting public health in the United States. What started as a limited campaign to vaccinate high-risk populations has grown into a yearly ritual for millions of Americans. Behind every injection is a story of shifting public opinion, policy decisions, and medical advances that together have shaped how Americans view and use flu vaccines.

In 1957, during the Asian Flu pandemic, Americans became more aware of the dangers of influenza. A Gallup Poll from that year showed widespread concern about the flu and support for vaccination.

Two decades later, in 1976, a swine flu scare prompted a national vaccination campaign. Early public opinion polls, such as a Gallup Poll from August, indicated that about half (53%) of Americans planned to get the shot.

By 1979, awareness of flu vaccinations had grown, but uptake remained limited. According to an Opinion Research Corporation survey from that year, only 15% of Americans said they planned to get a flu shot, despite widespread public health campaigns encouraging vaccination for high-risk groups like older adults. Surveys from this period revealed mixed attitudes: while many recognized the importance of flu shots for preventing illness, others expressed skepticism about their effectiveness or concerns about potential side effects.

The 1980s and 1990s marked a period of expanding access. Employers, schools, and healthcare providers offered flu shots more widely, and federal public health campaigns emphasized the benefits of annual vaccination for some groups.

By 1999, polls indicated that most Americans viewed flu shots positively, and vaccine coverage was steadily increasing, though differences persisted by age and region. For instance, data from this era show that coverage among those 65 and over peaked around 65% and hovered there until 2013.

The early 2000s brought new challenges and awareness. Following 9/11, public health messaging emphasized pandemic preparedness, which boosted interest in flu vaccination. Surveys from 2001 to 2003 showed that Americans largely supported flu shots, particularly for children and other vulnerable groups. Coverage remained uneven, but the overall trend reflected growing trust in the vaccine’s role in preventing serious illness. Indeed, an October 2001 American Society of Health-System Pharmacists survey found 58% of Americans said at least one person in their family had gotten or would be getting a flu shot that year.

By the 2010s, public health campaigns targeted children, older adults, and people with chronic conditions. A 2010 Commonwealth Survey reported 46% of American adults had gotten or would get a flu shot. Coverage for adults continued in the 40-50% range, while pediatric rates remained higher. Polling during this period showed strong support for routine vaccination every year. In 2012, 45% of U.S. adults said they had gotten a flu shot in the last 12 months, and in 2018, 54% of adults planned to get one in preparation for the upcoming flu season.

The COVID-19 pandemic disrupted routine vaccinations, and flu shot coverage declined in the early 2020s. CDC data from the 2023–2024 season shows that overall influenza vaccination rates dropped, particularly among children. Researchers note that rates also vary across counties, with lower coverage in low-income areas. Public opinion polls reflect this shift: some Americans continue to see flu shots as essential for protecting themselves and others, while others are hesitant, citing safety, necessity, or mistrust of public health guidance.

Together, these moments show how public opinion, scientific research, and government action have influenced flu shot use over decades. From early pandemic campaigns to modern concerns about access and equity, the story of flu vaccination in America illustrates the ongoing interplay between policy, medicine, and the public.

This post was written by Marist Poll Media Team member Julia Tartaglia.

View of a group of workers in safety vests and hard hats from the back.

Time Machine: Labor Unions

In any political debate, you’ll hear candidates express support for unions, labeling them as the cornerstone of American labor and manufacturing. But how important are unions nowadays and how many union members are left?

If fewer than 10% of Americans were members of a union in 2024, then why are unions still such a contentious point of discussion? Perhaps it’s the long history of organized labor in America.

In July 1936, just one year after the passage of the Wagner Act, which guaranteed workers the right to unionize, Gallup asked Americans if they supported labor unions. 76% said yes. Around this time, as a result of New Deal policies and Depression-era frustrations, membership surged. Union membership rose from about 13% of American workers in 1935 to 29% in 1939.

The passage of the Taft-Hartley Act in 1947 sought to amend and limit union power, prohibiting practices such as closed shops (businesses that required union membership for employment). This did not hinder overall support for unions, however. According to the Department of the Treasury, labor unions would reach their peak membership in the 1950s, with one-third of American workers claiming membership.

Through the post-war decades, union support continued to hold strong. Gallup, in 1957, found that 75% of Americans approved of labor unions, an all-time high. Trouble was on the way, however, as media reports of corruption within unions such as the Teamsters began to shift public opinion. According to Gallup, by 1979, approval had dropped to 55%. Ronald Reagan’s firing of striking air traffic controllers in 1981 seemed to continue eroding support. That year, Gallup found only 14% of Americans thought labor leaders had high ethical standards, and, by 1983, just 20% of American workers belonged to a union.

In the latter decades of the 20th century, union membership declined rapidly. This can largely be attributed to the loss of U.S. manufacturing plants and factories. Industries which were traditionally union strongholds, such as the steel, auto, and manufacturing industries, saw factories close or move overseas. As blue-collar jobs disappeared, union membership dropped.

But what about the public perception of organized labor? According to Pew in 1999, 70% of Americans still said that labor unions were necessary to protect working people.

Coming out of the 2008 recession, the early 2010s saw a surge in union support from new demographics. As income inequality grew, professions such as teaching became hotbeds for activism, materializing in strikes and walkouts.

Nonetheless, membership continued to decline even as support continued to rebound. Then came the pandemic. Workers in essential roles, such as healthcare, food service, and delivery, began demanding more compensation for their efforts. Gallup, in 2022, found that 71% of Americans approved of unions, the highest percentage since 1965.

Post-pandemic, unions continued to hold strong public support, achieving high-profile victories such as the United Auto Workers (UAW) winning concessions from automakers. A 2023 CNN/SSRS poll showed 76% of voters supported striking auto workers.

Today, though union membership only hovers around 10% according to the Bureau of Labor Statistics, their cultural presence as a tool for social equity has not faded. As income inequality continues to grow, these unions show big corporations that people — workers — still have at least some of the power.

Night Skyline with streaks of car lights on a highway

Time Machine: Auto Industry

For over a century, the American auto industry has shaped how people live, travel, and connect. But, behind the freedom of the open road is a story of change driven by safety concerns, environmental goals, and public demand. From seat belts to electric vehicles, what Americans expect from their cars has evolved.

This transformation is even seen in pop culture. In Cars 3, Lightning McQueen is challenged by a new generation of racers designed for speed and efficiency, reflecting real-world shifts in how vehicles are built and used. Over time, government policy, public opinion, and technology have worked together to build a safer and cleaner road ahead.

In 1956, President Eisenhower signed the Federal-Aid Highway Act into law, launching a $25 billion (nearly $300 billion today) project to build 41,000 miles of interstate highways. At the time, it was the largest public works project in U.S. history and, over the ensuing decades, radically changed American life. That same year, a Gallup poll found that 76% of Americans supported building more express or superhighways between large cities, showing strong public backing for the large project even before construction began.

But, with new, faster highways came concerns about auto safety.

In 1966, passage of the Highway Safety Act and the National Traffic and Motor Vehicle Safety Act authorized the federal government to create and enforce more safety rules. This led to requirements such as seat belts and head rests, energy-absorbing steering wheels, and shatter-resistant windshields. A Gallup poll from the same year showed that 54% of Americans who had seat belts in their car used them “some of the time” while 32% used them “always.”

Two decades later, airbags became widely available after years of research and limited production runs. In 1986, Mercedes-Benz became the first automaker to make airbags standard in its U.S. vehicles. Two years earlier, a 1984 Cambridge Reports National Omnibus Survey had shown that 37% supported requiring passive restraints, such as airbags, while 42% favored mandatory seat belt laws, highlighting public support for more government vehicle safety rules.

By 1995, 49 states had seat belt laws. Airbags saved 475 lives in 1995, and seat belts cut fatal injuries to front-seat occupants by 45%, proving their life-saving impact. A 1999 Roper Starch Worldwide/Insurance Research Council Poll found that among those who had bought or leased a car in the past three years, 61% said safety features like airbags and anti-lock brakes were the most important factor influencing their decision.

About 20 years after airbags became the “new” auto safety standard, Electronic Stability Control (ESC) became mandatory in all new passenger vehicles sold in the U.S. This 2012 requirement marked another major step forward in auto safety — this time by focusing on preventing crashes in the first place. That same year, the federal government finalized a rule requiring backup cameras in all new cars by 2018 to address rising concerns over backing accidents. By 2017, consumer demand for advanced driver assistance was growing, with 30% of drivers saying they would want blind spot awareness and 27% selecting automatic emergency braking (among those who did not already have each feature), according to an AARP/GfK poll.

Another huge change in auto culture came in response to the 1973-74 oil embargo. Congress passed the Energy Policy and Conservation Act of 1975, establishing the first-ever Corporate Average Fuel Economy (CAFE) standards. These rules required automakers to dramatically improve fuel efficiency from 15.2 mpg to 27.5 mpg by 1985. At the time, public opinion strongly supported improvement in fuel efficiency. A 1979 Opinion Research Corporation poll found that 89% of Americans favored providing incentives to automakers to improve fuel efficiency.

In 2007, the Energy Independence and Security Act was signed into law, aiming to reduce U.S. dependence on foreign oil and cut emissions by improving energy efficiency. One of its most impactful provisions raised fuel economy standards to 35 miles per gallon by the year 2020. A 2008 Rockefeller Foundation/Time Magazine poll found that 79% of Americans supported higher fuel economy standards, with 59% “strongly” in favor.

Reducing emissions has gone hand in hand with efficiency standards, but, in 2021, President Biden signed Executive Order 14037 setting a national goal for 50% of all new passenger cars and light trucks sold by 2030 to be zero emission vehicles. The order also directed federal agencies to begin developing new emissions and fuel economy standards for vehicles starting with model year 2027. President Donald Trump rescinded the rules in 2025. But, by that time, roughly 7% of all new car sales were electric vehicles (EVs), so the impact may be muted.

From interstate highways to electric vehicles, the auto industry has changed alongside public opinion, government policy, and advancing technology. Safety features, fuel standards, and clean energy goals show how cars have evolved to meet the needs of a changing society. The road forward continues to be shaped by the push for a safer, more efficient future.

This post was written by Marist Poll Media Team member Tommy Rogers.

Woman lighting a cigarette

Time Machine: Smoking in Public? Yuck!

In the summer of 1939, America was on the brink of war abroad and inhaling deeply at home. That August, a survey conducted by Roper for Fortune Magazine revealed that 41% of Americans smoked cigarettes. Smoking wasn’t just a habit, it was a cultural phenomenon. Movie stars lit up on screen, doctors appeared in cigarette ads, and you could find an ashtray in most living rooms, cars, and public spaces.

But, as war raged overseas, something curious happened on the home front. By July 1942, Gallup reported a striking shift: 63% of Americans said they smoked, up from just a few years prior. Cigarettes became intertwined with identity, stress relief, and patriotism. Packs were even included in soldiers’ rations.

As the postwar decades rolled on, smoking remained popular… with some cracks in its image. From 1960 to 1966, the U.S. government banned advertising claims that low tar and nicotine cigarettes were “less harmful” and then, through the Federal Cigarette Labeling and Advertising Act, required warning labels on all cigarette packaging. In 1964, the big blow came when the Surgeon General released a report that linked smoking to cancer and other serious diseases.

Public perception shifted almost overnight. By January 1965, a Louis Harris & Associates poll showed that 62% of Americans supported warning labels on cigarette packages, while 29% opposed them. It was the first time many Americans confronted the darker side of smoking.

Fast forward to September 1974 and Americans were still deeply engaged with tobacco. Roper found that, in just the prior week, 41% had bought cigarettes, and 36% had smoked at least half a pack. The numbers were still high, but, by 1976, the tide was turning. That year, Roper found that 51% of Americans supported banning smoking in public places, while 44% still opposed the regulation.

As for compromises, Americans got creative. In 1977, Gallup found 68% supported designated smoking areas as a solution to balance public health with personal choice.

But, by 1982, the pendulum swung back. Roper found that 54% actually opposed a full public smoking ban, while 41% supported it. Resistance remained strong… until it didn’t.

By 1987, Gallup recorded a significant shift: 55% now favored a full ban on public smoking, and only 43% were against it. Americans were growing tired of the foggy haze.

The 1990s brought stronger regulation and even more debate. In March 1994, NBC News/Wall Street Journal/Hart-Teeter Research Co. found that 33% of Americans supported maintaining the smoking ban, another 33% favored increased restrictions, 22% said restrictions should stay the same, while only 8% thought there should be fewer restrictions. But here’s the kicker: a Gallup/CNN/USA Today poll from the same month found that 70% of smokers said they weren’t actually smoking less, despite new regulations.

Fast-forward to the modern era, and America’s once-smoky image is now mostly cleared up. By July 2019, Gallup found that 62% believed smoking should be completely illegal in public spaces, with just 37% disagreeing. And by November 2024, attitudes got even more personal, with Pew Research/SSRS finding that 49% of Americans claimed smoking around other people is never acceptable, and another 28% said it rarely is.

Even when we look at e-cigarette usage, only 7% of Americans told Gallup in July 2024 that they smoked an e-cig or “vaped” in the past week, while 93% had not.

From normalization to rejection, the story of cigarettes in America is one of dramatic transformation. Once glamorized, smoking is now more likely to be met with side-eyes than celebration.

This post was written by Marist Poll Media Team member Hunter Petro.

supermarket aisle

Time Machine: Food Labels

Ever wonder why your granola bar boasts about “whole” grains? Behind every word on a food label is a battle between public demand, industry influence, and government action. These labels didn’t just appear overnight. They reflect decades-long conversations about health risks, food access, and more. What started as quiet concern grew into a national conversation, one that changed the way we eat, shop, and think about what’s in our food.

In a 1950 Gallup Poll, only 1% of Americans said public health was the most important issue in the upcoming election. Concerns such as war, inflation, and the cost of living dominated public attention. Eight years later, Congress passed the Food Additives Amendment, which required industry to prove the safety of food additives before FDA approval. The law aimed to protect public health while also encouraging innovation in food technology. Two years later, in 1960, Congress enacted the Color Additive Amendment, requiring manufacturers to prove that color additives in food, drugs, and cosmetics were safe. Under the Delaney Clause, it also banned any additive found to cause cancer.

In 1969, President Nixon held the first White House Conference on Food, Nutrition, and Health. He called hunger a national problem and promised big changes. The event helped make food and nutrition a bigger priority for the government and helped drive future changes to food programs. That growing national focus on nutrition was reflected in public opinion. By 1970, Americans had started to care more about what is in their food. A General Electric Quarterly Survey found that, of the roughly four in ten Americans who believed the government should require industries to provide more information to consumers, 32% wanted more information about ingredients, preservatives, and additives on product labels. Another 20% said they wanted warnings about anything that could be harmful or addictive.

The Environmental Protection Agency (EPA) was created in 1970, leading to a sweeping 1972 overhaul of the Federal Insecticide, Fungicide, and Rodenticide Act. Despite this, an Opinion Research poll from 1974 found that 57% of Americans said pesticides should be safer — more than any other product listed in the question.

In the late ’70s, the Senate Select Committee on Nutrition and Human Needs, led by Senator George McGovern, introduced federal dietary goals linking diet to chronic disease. It recommended Americans eat less saturated fat and red meat. After backlash from the meat industry, the committee revised the guidelines, but the work still shaped future national nutrition policy. By 1980, the first official Dietary Guidelines for Americans was released by the USDA and HHS. Americans appeared to be paying attention: a 1982 survey from the American Medical Association found that 67% of Americans believed cutting back on fat, red meat, and dairy was a healthy choice.

In 1990, the FDA proposed broad reforms to food labeling, including mandatory nutrition facts, updated serving sizes, and new nutrient reference values. Congress passed the Nutrition Labeling and Education Act, which gave the FDA power to require nutrition labels on food products. Reflecting the impact of these changes, a 1991 Roper poll found that 88% of shoppers who read labels said nutrition information had at least some influence on what they buy.

The 1997 Food and Drug Administration Modernization Act made it easier for food companies to add health claims to labels, as long as the claims are backed by trusted government science groups. The claims may appear on products 120 days after the FDA is notified. Reading food labels was a common behavior by the late 1990s. A 1997 Wirthlin Worldwide survey found that 68% of Americans said they were reading food labels more often than they had just two or three years earlier.

Starting in 2006, the FDA required that trans fats be listed on the nutrition facts label. This labeling change led many food companies to reformulate their products to remove trans fats, since consumers could now identify and avoid them. A Roper poll that same year found that among Americans who checked nutrition labels, fat content (including saturated and trans fat) was the number one thing they looked for at 26%, followed closely by calories (25%), sugars (10%), and sodium (8%).

The 2010 Dietary Guidelines for Americans emphasized maintaining a healthy weight through calorie balance, physical activity, and choosing nutrient dense foods over those high in sodium, added sugars, and solid fats. It was released during rising national concern over obesity and chronic disease and the guidelines aimed to support healthier eating patterns across diverse U.S. populations. In early 2011, a survey by the Public Policy Institute of California found that 75% of Californians said obesity was a “very serious” public health problem. Another 21% said it was “somewhat serious.”

In 2014, the FDA proposed new food labels to better match how people really eat, like counting a 20 ounce soda as one serving. The labels also made calories easier to see and added a line for added sugars to help people make healthier decisions. Speaking of easier to see, that same year, a GfK Knowledge Networks poll found that 66% of Americans supported requiring labels for genetically modified ingredients (GMOs).

Next up were restaurants. Starting in 2018, chain restaurants with 20 or more locations were required to display calorie counts on menus and provide full nutritional information upon request. The same year the labeling rule took effect, a Kaiser Family Foundation poll found that 74% of Americans supported the new rules.

In 2022, the Biden Administration hosted the White House Conference on Hunger, Nutrition, and Health, launching a national strategy to end hunger and reduce diet-related diseases by 2030. Public concern about hunger and unequal access to healthy food was high at this time. According to a 2022 AP-NORC poll, 14% of Americans named poverty, hunger, and homelessness among the most important problems they wanted the government to address in 2023.

Most recently, in December 2024, the FDA finalized new rules for using the “healthy” label on food packages. The updated definition limits added sugars, sodium, and saturated fat. It allows more nutrient-rich foods like whole grains, nuts, and canned vegetables to qualify.

Together, these moments show how public concern, scientific research, and policy can shape what ends up on our plates and how we make decisions about the food we eat.

This post was written by Marist Poll Media Team member Tommy Rogers.

handgun leaning on table

Time Machine: Gun Laws in the U.S.

One minute it’s a normal day at school, the next it’s national news. Americans have seen it happen far too many times. Tragedy strikes and many Americans ask if it could have been prevented. Our relationship with guns is an endless debate. Support for stricter gun laws has shifted over time, with more support following tragedies. Yet…the last major federal law significantly restricting legal access to firearms was passed in 1994!

The National Firearms Act of 1934 was the first major gun law passed in the U.S. It made buying certain weapons, like machine guns, harder in an effort to reduce gang violence. However, firearms in the hands of any civilian can be dangerous. In 1949, a World War II veteran named Howard Unruh carried out one of the first mass shootings in the U.S., killing 13 people with a Luger pistol. Despite the tragedy, there were no major revisions to gun laws until 1968, when the Gun Control Act was passed. The law made it harder for felons and certain other groups to buy guns and gave the government more control over gun sales.

In 1968, when asked how to reduce crime, a Gallup Poll showed that only 1% of U.S. adults felt the need for stricter gun laws. The same poll showed that only 1% named gun control as the most important national problem, while 9% were more concerned about crime.

By the 1970s, the debate started to grow. A 1976 Gallup Poll found that 35% of U.S. adults named gun control as an important voting issue. By the late 1980s, a majority of Americans supported stronger gun control laws. A Yankelovich/Time Magazine/CNN Poll found that 65% of adults supported stricter gun laws with 28% opposing. However, the opinions of gun owners were different. The same poll found that 63% of gun owners thought that stricter gun laws would not reduce violence while 31% believed they would. The other 6% were unsure.

In 1993, support for stricter gun control was rising. According to a Time/CNN/Yankelovich Partners Poll, 70% of Americans favored stricter gun laws and only 27% opposed them. Congress passed the Brady Bill that same year. The bill required a five-day waiting period and a background check prior to handgun purchases. A Gallup Poll from 1999 showed that 89% of Americans were still in favor of the law.

In 1994, Congress passed the Federal Assault Weapons Ban, which banned the sale and manufacture of specific semi-automatic firearms. After the law took effect, crimes with those banned weapons went down. But overall gun violence did not drop much. According to Christopher Koper, Professor at George Mason University, who studied the ban for the Justice Department, the law’s effects were limited because many older guns and magazines were still legal and widely available. The ban expired in 2004 and was not renewed, not because it was proven to fail, but because Republicans controlled Congress at the time.

Immediately following the 1999 Columbine school shooting, a Princeton Survey Research Associates/Newsweek Poll found that only 11% of adults thought that stricter gun laws would be the best solution. A plurality (49%) thought that there should be a larger focus on kids who are anti-social. Also, 21% believed that schools should increase security, and a few (14%) claimed that reducing violence in entertainment would be the solution.

Public support for stricter gun laws was lower in the early 2000s than in 1993. A Time/CNN/Yankelovich Partners Poll found that 59% of Americans favored, and 35% opposed, stricter gun laws. This decline was short-lived as terrorist attacks became a growing concern.

A few years after 9/11, the 2006 NORC General Social Survey found that 77% of Americans thought gun laws should be stricter because of terrorist attacks. The increase from the early 2000s suggests that, as threats to public safety grow, more Americans favor stricter gun control. The trend continued into 2007. Following the mass shooting at Virginia Tech University, a CBS News/New York Times Poll found that 32% of Americans believed stricter gun laws could have done a lot to prevent the incident, and another 21% believed they would have helped at least a little. As terrorist attacks and mass shootings increased, so did support for stricter gun laws.

A major turning point in the gun debate came not from public opinion but from the Supreme Court. In the 2008 case District of Columbia v. Heller, a special police officer, Dick Heller, challenged city laws that banned handgun registration and required firearms in the home to be kept unloaded and disassembled or trigger-locked. He argued the rules violated the Second Amendment because guns at home should be available for personal protection. The Court agreed, ruling that the Second Amendment protects the right to keep an operational gun at home for self-defense. The ruling strengthened individual gun rights even as many Americans supported stricter gun laws.

In the early 2010s, mass shootings were happening more frequently, particularly in schools. After the 2012 shooting at Sandy Hook Elementary School in Newtown, Connecticut, Americans were left with the same question: would stricter gun laws have helped? A CBS News Poll found the public divided, with 26% saying stricter gun laws would have helped a lot, 16% a little, and 51% not at all.

In the wake of the mass shootings at a 2017 concert in Las Vegas and at the 2018 Parkland school shooting, a Marist Poll/NPR/PBS NewsHour survey found that 59% believed the country needs stricter gun laws. Only 25% believed more people need to carry guns.

In April 2023, a Beacon Research/Shaw & Co. Research/Fox News Poll found that 43% of registered voters believed stricter gun laws would make the country safer, while 25% believed they would make the country less safe. Many Americans today believe gun control can make the country safer, but a large number still disagree or feel it won't make a difference.

Today, the debate over gun control versus gun rights remains deeply divisive. Support for stricter laws often rises after mass shootings, but new laws rarely follow. Court rulings, politics, and differing views on safety and gun rights all play a role. As shootings continue, many are left asking: how can the U.S. reduce gun violence while still protecting individual rights?

This post was written by Marist Poll Media Team member Tommy Rogers.

robot looking at camera

Time Machine: Concern About AI

Artificial intelligence once felt like something out of a movie, like C-3PO from Star Wars stepping off the screen and into real life. Now that future is here. People use AI in almost every part of life, even if they don't realize it. What was once a futuristic idea has become a key part of everyday life. How did we get here, and how do Americans feel about AI? Let's start by looking at how public opinion about AI has changed over time.

Back in 1965, a Louis Harris & Associates Poll asked Americans if they felt threatened by automation at work. Only 8% said yes, and more than three in four said it wouldn't make a difference. Most people didn't worry about machines or robots taking their jobs; technology was seen more as a tool to help with work than as a replacement for human workers.

That view has shifted as the technology has radically improved.

By 2008, awareness of and concern about automation had increased. The NORC General Social Survey found that, among full-time workers aged 25-62 who used a computer at work, 16% had heard of jobs at their own workplace being replaced by computers or some form of automation in the past three years. Technology kept advancing, but most people had not yet seen the effects of automation in their own jobs.

In 2011, public awareness of artificial intelligence grew rapidly. A Pew Research Center for the People & the Press/Princeton Survey Research Associates International poll found that 22% of Americans had heard a lot about IBM's Watson computer winning Jeopardy!, and 35% had heard a little. More than half of Americans, in other words, had heard about AI and were starting to pay attention.

In 2017, as AI tools became more a part of everyday life, another Pew poll (Abt Associates/Pew Research Center) found that 46% of Americans used voice-controlled digital assistants like Apple’s Siri, Amazon Alexa, Google Assistant, or Microsoft Cortana. That meant almost half the population was now interacting with AI at least occasionally.

Now, in the “Age of AI,” what do people think? Earlier this year, a Quinnipiac University Poll found that 56% of Americans believe AI will decrease job opportunities. Only 13% said AI would help create more jobs, and another 24% believed AI wouldn't change the number of jobs.

But what about the benefits of AI? The same poll showed that many people already use AI tools in everyday life: writing emails, creating images, analyzing data, even holding conversations with chatbots. But there's plenty of concern, too. Americans are asking who is responsible for AI and who makes the rules.

That same Quinnipiac poll found that 69% of Americans said the government wasn't doing enough to regulate AI. Another concern was trust: when asked whether AI is being developed by people or companies who have the public's best interests at heart, the majority said no or that they didn't know enough to say. Only 5% thought AI developers were on their side.

Over time, Americans have come to use AI more in daily life. Polls show that even as use increases, many remain unsure about how AI works and who controls it, and most believe the government is not doing enough to regulate it. AI is advancing fast, but the public still has many questions and concerns.

This post was written by Marist Poll Media Team member Tommy Rogers.

signs pointing in opposite directions saying Justice or Cruelty

Time Machine: Life or Death?

How has time changed Americans’ perspective on the death penalty? This week we’re taking a look back at the evolution of public opinion on capital punishment from the early 20th century to the present day.

Capital punishment has been a part of American history since the country's inception; the first recorded death sentence in the American colonies was issued in 1608. Fast forward to 2025, and the situation is dramatically different: 144 countries have abolished the death penalty, leaving the U.S. in the minority. Currently, 27 states continue to carry out capital punishment, along with the federal government and the military.

Public opinion on capital punishment has fluctuated over the decades. In one of the earliest polls on the topic, in 1936, Gallup asked respondents if they supported the death penalty for murder: 61% were in support while 39% opposed. Gallup has continued to ask this question over the decades, and by 1956 support had grown a bit, to 64%, with 34% opposed. But in the 1960s, attitudes appeared to shift: in 1966, 47% of Americans said they opposed the death penalty for murder while 42% approved, the lowest level of support Gallup has ever found over decades of polling on the topic.

Indeed, the very next year, 1967, approval for the death penalty in murder cases shot back up to 54% as unrest related to civil rights and the Vietnam War began to dominate headlines. Whether the two were related is unclear.

As the 1970s unfolded, support kept climbing and didn't stop for several decades. Gallup found 66% support in 1976, 72% in the 1980s, and 80% in 1994, the highest number Gallup has yet seen supporting the death penalty.

However, in the 2000s, the tide began to turn. When Gallup polled in 2008, 64% of respondents still supported the death penalty for murder convictions, but 30% were opposed, a noticeable rise in opposition compared with past decades. By 2017, support stood at 55% and opposition at 40%, the narrowest gap between support and opposition on this topic since 1972.

The conversation surrounding the death penalty has continued to evolve, especially as multiple states have abolished capital punishment.

Over the past couple of years, public opinion has remained divided, with a small majority supporting the death penalty. Most recently, in 2023, Gallup polled on capital punishment for murder convictions: 53% of respondents backed the penalty while 44% opposed it.

The death penalty debate in America is no longer simply about guilt or innocence; it has grown into a complex issue of fairness, ethics, and justice. In some ways, the data mirrors the slow unraveling of support for the practice and its end in many states.

What does the future of capital punishment look like in the United States? While support may be steadily decreasing and many states have now banned the practice, other states — and the second Trump Administration’s Department of Justice — continue to sentence people to death and have, in some cases, moved to expand its use. Will public opinion follow?

This post was written by Marist Poll Media Team members Hunter Petro and AJ Gambino.

a vaccine and a vial

Time Machine: Trust in Vaccines

When a vaccine for polio was first introduced, humans had been grappling with the devastating effects of the disease for centuries. So, on the cusp of a cure, a February 1954 Gallup/National Foundation for Infantile Paralysis poll asked Americans, “If polio is finally licked, what would you say are the things which brought about the defeat of this disease?”

A majority, 63%, pointed to “research, science, experimentation, and doctors,” while 29% said people’s financial contributions made the difference. Just 13% credited vaccines or medicine.

But that didn’t mean the vaccine wasn’t popular. One year later, in April 1955, 84% of Americans told Gallup that they planned to have their child vaccinated with the new polio vaccine.

Vaccines were broadly seen as the future of medicine.

In fact, 30 years later, a 1985 Opinion Research poll showed concern about the safety of vaccines barely registering: only 3% of Americans listed them as a top-three health risk.

Public support for vaccines has often been strongest when outbreaks occur. In February 1991, a German measles outbreak spread among religious communities that refused vaccination, raising the question: Should the government step in?

A Lifetime Television/Princeton Survey Research Associates poll at that time found that 61% of Americans said yes, while 26% believed the government should stay out of it. When asked if vaccinations should be mandated for all children, support was even stronger, with 81% saying yes, while only 14% opposed.

A decade later, Gallup found that 73% of Americans had heard “a great deal” or a “fair amount” about the advantages of childhood vaccinations. Almost everyone, 98%, agreed that vaccinating children was “extremely, very, or somewhat important.”

But vaccine hesitancy has always been an undercurrent. In 2002, the Harvard School of Public Health & International Communication Research asked about the smallpox vaccine's effectiveness, and Americans were evenly split: 46% believed it would provide serious protection, while 48% were doubtful.

Then came COVID-19.

In August 2020, 46% of Americans told the Associated Press/AP-NORC they planned to get the COVID-19 vaccine. A quarter of respondents refused outright, and 29% weren’t sure.

By December 2020, when ABC News/Ipsos asked whether states should make the COVID-19 vaccination mandatory for people returning to work, 39% supported the idea, while 61% were against it.

Concerns about the vaccine were widespread. In January 2021, 37% of Americans told KFF/SSRS that fear of side effects was their main hesitation. Another 11% cited the speed of development and a lack of adequate testing.

By April 2021, Fox News/Beacon Research/Shaw & Co. Research found that 28% of registered voters were still worried about how quickly the vaccine had been developed, and 16% didn’t believe it would work.

Debates over vaccine requirements extended to “vaccine passports.” In April 2021, Quinnipiac found Americans split: 49% thought requiring proof of vaccination was a good idea, while 45% disagreed.

And now, with COVID still circulating but public attention waning, vaccine skepticism has reached new levels of acceptance.

A new measles outbreak is spreading in Texas, infecting 200-300 people so far. The United States has withdrawn from the World Health Organization and fired hundreds of “disease detectives” from the CDC. And the new head of Health and Human Services, Robert F. Kennedy Jr., has a long history of making dubious claims about vaccine risks.

It’s been a long time since Jonas Salk and his polio vaccine were heralded as ringing in the age of modern medicine.

This post was written by Marist Poll Media Team member Hunter Petro.