skyline of Tehran

Time Machine: Iran from the American Perspective

For most of the last century of American history, public opinion on Iran has been shaped by moments of crisis, conflict, and perceived threat. Today, with the United States at war with Iran, those long-standing attitudes are once again being measured, but the roots of American skepticism stretch back decades.

In the early years of U.S.–Iran relations, Iran was not seen as a central concern for most Americans. In 1952, as the threat of communism consumed American foreign policy — and prior to the CIA-backed coup that restored the Shah to power — only 35% of Americans said it would matter a “great deal” if communists took control of Iran, according to NORC. Even more than two decades later, in 1976, public appetite for involvement remained limited, with just 23% of Americans saying the U.S. should send military aid to Iran under the Shah, based on a Time Magazine/Yankelovich, Skelly & White survey. At that point, Iran was still relatively peripheral in the American public’s view of global priorities.

That changed dramatically with the 1979 Iran Hostage Crisis, a defining moment that cemented Iran as a primary adversary in the American imagination. In the midst of the crisis, 66% of Americans said the U.S. should attack Iran if American hostages were harmed, according to an NBC News/Associated Press survey, while 69% opposed sending the Shah back to Iran for trial, per an ABC News/Louis Harris & Associates poll. Through the 1980s and early 1990s, distrust remained deeply entrenched, with Gallup finding in periodic polling that only 5% to 13% of Americans viewed Iran favorably.

Following the September 11th attacks, perceptions of Iran became even more tied to national security concerns. In 2004, Gallup found just 17% of Americans held a favorable view of the country, while 77% viewed it unfavorably. In the same polling, 58% said Iran posed a long-term threat to the United States, 72% believed a nuclear-armed Iran would attack Israel, 66% feared attacks on the U.S. or Europe, and 82% thought Iran might provide nuclear weapons to terrorist groups. By 2006, Pew found Iran topping the list of countries representing the “greatest danger” to the U.S. with 27% of Americans naming the country, a sharp increase from 9% the year before.

Despite these concerns, support for direct military conflict has consistently remained fairly low. In 2010, a CBS News/60 Minutes/Vanity Fair Poll found Americans divided on what would justify war: 25% said only an attack on U.S. soil, another 25% pointed to an attack on U.S. naval forces, while smaller shares cited a nuclear test (11%) or an attack on Israel (10%).

By the mid-2010s, negative perceptions of Iran reached some of their highest levels. In 2015, 84% of Americans held an unfavorable view, according to Gallup. Opinions were split on the Obama administration’s nuclear agreement — with about 38% approving and 48% disapproving — but most Americans said they were skeptical that Iran would uphold its commitments, according to a 2015 Pew Research Center poll.

Since then, tensions have continued to shape opinion. After the U.S. withdrew from the nuclear deal under President Trump, 62% of Americans believed Iran had violated the agreement, according to CNN/SSRS polling. The 2020 killing of Iranian General Qasem Soleimani further escalated tensions, and public opinion remained overwhelmingly negative in the years that followed. From 2021 through 2024, Gallup consistently found that between 13% and 17% of Americans viewed Iran favorably.

Now, amid renewed conflict, the public remains wary of deeper military involvement. This past summer a Washington Post poll found that 45% of Americans opposed U.S. airstrikes on Iran, compared with 25% who supported them, while 30% were unsure. Little has changed now that we are at war. Notably, no major polling since the outbreak of the current conflict has shown majority support for the war.

Americans have long viewed Iran through a lens of caution and concern, shaped by decades of geopolitical tension. But even as negative perceptions have remained consistent, support for military action has also been consistently limited. The result is a public that sees Iran as a threat, yet remains hesitant about the costs and consequences of war, a divide that continues to define opinion today.

Time Machine labeled image of row of houses against a blue sky with the text "American Dream"

Time Machine: The American Dream

The American Dream has long been a guiding idea in the United States, the belief that hard work can lead to a better life. People have imagined owning a home, having a steady job, and helping their children do better than they did. But, how have attitudes toward the Dream changed over time? Let’s step through the decades and find out.

In the 1950s and 1960s, many Americans were confident the Dream was within reach. People believed that owning a home, having a steady job, and helping their children do better than they had were realistic goals. At the same time, Americans trusted that the system around them was stable. In 1958, about 73% said they trusted the federal government to do what is right “just about always” or “most of the time.” This didn’t mean the government would give people success; it meant that rules, institutions, and the economy were stable, creating opportunities for those willing to work hard.

By the 1970s, rising inflation and economic uncertainty made some Americans question how success was achieved. Education was becoming more important for moving up in life. In April 1978, a CBS News/New York Times survey asked whether “in order to get ahead in life these days, it’s necessary to get a college education.” Americans were nearly evenly split: 49% said yes and 47% said no. This showed that the Dream was starting to feel conditional, dependent not just on hard work but also on access to education and opportunity.

Through the late 20th century, Americans largely kept their confidence in opportunity. Gallup data from 1998 shows that 81% of respondents felt there was “plenty of opportunity to get ahead.” Even as optimism remained high, new challenges emerged: rising inequality, debates over access to education, and changes in the job market suggested the Dream might not be equally attainable for everyone.

The Great Recession made Americans rethink economic mobility. By 2019, Gallup reported that roughly 70% of Americans believed the Dream was achievable, though a growing minority doubted whether it was within reach for most people. Hard work was still valued, but confidence in whether the system was fair and opportunities were truly available had become shakier.

Today, Americans are closely divided on the Dream’s attainability. In 2024, Pew Research Center found that 53% believe the Dream remains possible, while 41% say it was once attainable but is no longer. People also define the Dream differently: in 2025, Gallup reported that about half of respondents (51%) said the American Dream is more about opportunity, while the other half (49%) said it is more about stability. Across all groups, one idea is clear: hard work matters, but opportunity is not guaranteed. Education, fairness, and economic stability now play a big role in whether the Dream feels achievable.

Over time, the American Dream has not disappeared. Instead, it has become conditional, nuanced, and connected to both effort and opportunity. Americans still believe in its promise, but their faith now reflects the realities of the modern economy and society.

This post was written by Marist Poll Media Team member Molly Goodger.

graduating students in an outdoor hallway in caps and gowns

Time Machine: Higher Ed

For much of modern American history, college has been more than an educational path. It has been a symbol of opportunity, mobility, and the American Dream. But heading into the mid-2020s, something striking is happening: confidence in the value of higher education is no longer a given. Where a college degree was once widely seen as the clearest route to success, Americans today are increasingly weighing the benefits versus the rising costs.

In the decades following World War II, higher education became a cornerstone of middle-class life as the GI Bill sent millions of veterans to college. Public universities expanded rapidly, and a degree came to represent not just learning but economic security. For generations, Americans largely agreed: college was worth it, and then some.

By the end of the 1970s, that belief was gaining ground. According to a 1978 Gallup Poll, 36% of Americans said a college education was very important. By 1985, that share had jumped to 64%, reflecting a growing sense that a diploma mattered more than ever.

Yet, even during this period of rising confidence, concerns were emerging. In 1985, 39% of Americans said most people could afford a college education. By 1991, that share had fallen to just 25%, based on Pew Research Center data, an early warning sign that cost was beginning to erode the universal promise of higher education.

Despite cost concerns, the importance of a degree was still ingrained in many Americans. Just after the Great Recession reshaped economic expectations, a 2011 Pew Research Center Social & Demographic Trends survey reported 74% of Americans said that, in order to get ahead in life, it was necessary to get a college education. At that moment, college still stood as the central gateway to opportunity.

By the early 2010s, public sentiment remained overwhelmingly positive. In 2010, roughly 75% of Americans said a college education was very important, according to Gallup, reflecting a now long-held belief that a degree was essential for landing a good job and achieving upward mobility.

But, by the middle of the 2010’s, cracks began to appear. As tuition rose faster than wages and student loan debt became a defining issue for young adults, Americans started to question the value proposition. The belief that college was indispensable no longer went unchallenged. Even as enrollment remained high, skepticism was quietly growing beneath the surface.

A 2015 Gallup–Purdue Index study found that only 38% of recent graduates strongly agreed their degree was worth what they paid, a notably lower share than among older alumni. Even among those who had already gone to college, satisfaction was slipping. The promise of college was still there, but it no longer felt guaranteed. By 2019, only 51% of Americans saw a college education as very important.

The 2020s have made that divide even clearer. According to a 2024 Pew Research Center survey, 49% of Americans now say a college degree is less important today than it was 20 years ago for getting a good job, while just 32% say it’s more important. When cost enters the conversation, the picture sharpens further: the same Pew survey found that only 22% believe college is worth it if it requires taking out loans. A much larger share (47%) say it’s only worth it without debt, and 29% say it’s not worth the cost at all.

By last year, the shift in attitudes was unmistakable. A 2025 Gallup Poll showed that only about one-third (35%) of Americans now say a college education is very important, down sharply from 75% in 2010, based on earlier Gallup tracking data. Confidence in higher education has also fallen: according to a 2024 Gallup survey, just 36% of Americans express a great deal or quite a lot of confidence in colleges and universities, compared with 57% in 2015.

Americans have not abandoned higher education, but they are clearly rethinking it. The college dream hasn’t disappeared, but it has become more conditional, more cautious, and more closely tied to affordability and outcomes.

This post was written by Marist Poll Media Team member A.J. Gambino.

olympic rings in a snowy town

Time Machine: Winter Olympics

Every four years, the Winter Olympics return with a blend of athletic spectacle, global politics, and national pride. But heading into Milan–Cortina 2026, something unusual is happening: enthusiasm is higher now than in the past few games, summer and winter. New MRI-Simmons data shows 54% of Americans say they’re fans of the Winter Games — a return to pre-pandemic levels of interest.

Our Time Machine starts in Calgary in 1988, a Games remembered for extending the Winter Olympics to 16 days for the first time and for introducing curling and short-track speed skating as demonstration sports. It was a period of athletic focus—bobsleigh, biathlon, figure skating, and the debut of freestyle skiing—even as the Winter Olympics continued a trend toward media spectacle that arguably began in Lake Placid in 1980, when the American “Miracle on Ice” hockey team beat the Soviets.

By the early 90s pollsters noticed something else: interest wasn’t guaranteed. After Albertville 1992, the Seton Hall Sports Poll recorded a 26% drop in viewer interest and a “20-plus percent” decline in TV ratings. Even then, Americans were already showing signs of fluctuating enthusiasm long before cord-cutting or social media reshaped viewership.

Two years later, as the Winter Games shifted to a schedule offset from the Summer Games, controversy dominated — and not for the first time. Just before Lillehammer in 1994, a poll found Americans opposed Tonya Harding skating in Norway by a two-to-one margin, even as the Games themselves were later remembered as one of the most beloved ever.

Heading into Nagano in 1998, the Games reinvigorated some public interest with the introduction of snowboarding, the return of curling, and the debut of women’s ice hockey. Opening the men’s ice hockey tournament to the world’s professionals added new star power—a reminder that the Olympics have often adapted to win back audiences.

Salt Lake City in 2002 offered another twist. Before the Games, 45% of Utahans didn’t want the Olympics at all. But after the events concluded—widely celebrated as one of the most successful U.S. Games in history—public opinion flipped. The Olympics have often had a way of winning back hearts once the flame is lit.

Turin 2006 generated strong hometown feelings, with 81% of Turinese residents saying the Olympics would be “strategic for the city.” Yet nearly 20% still didn’t entirely understand what the Games would entail—a reminder that local opinion often starts with uncertainty and caution.

By Vancouver 2010, public sentiment centered on inclusion: 73% of Canadians wanted women’s ski jumping added to the Games, and when the Olympics finally arrived, 46% of newspaper editors, news directors, and editors at the major news websites across the country named it the top news story of the year in Canadian Press surveys. Even in the digital age’s early acceleration, the Games could unify national attention.

Sochi 2014 was different. A Pew poll found Americans deeply split over whether it was a “good decision” to hold the Games in Russia. Age mattered: 49% of adults 18–29 said hosting the Games there was a good idea, compared with just 24% of those 50+. Concerns over terrorism and Russia’s policies toward LGBTQ+ people dominated the reasoning among those who opposed the location. The sports were on the ice; the debates were elsewhere.

In PyeongChang 2018, geopolitics again overshadowed athletics. An Ipsos global survey found that 50% of people worldwide were worried North Korea would behave provocatively, with U.S. concern—at 58%—among the highest. That anxiety loomed large even as Korean diplomacy attempted to calm tensions, and the games came off without a geopolitical hitch.

Beijing 2022 arrived amid both diplomatic strain and public disinterest. Pew found that 91% of Americans had heard little or nothing about the U.S. diplomatic boycott, yet nearly half approved of it. Views of China were similarly icy: 54% described China as a competitor and 35% as an enemy. Perhaps the political noise helped depress pre-game excitement. Ipsos found 42% of Americans were interested in the Games prior to their start — lower than interest ahead of many prior summer and winter games.

And that brings us back to now and the run-up to Milan 2026. According to MRI-Simmons, 22% of Americans say they plan to watch (remember, the same survey showed 54% say they’re fans), which, in a fractured media age, may prove to be a strong outcome. It is, after all, about 55 million people!

If past Winter Games were shaped by boycotts, geopolitical tension, ratings drops, or breakthrough sports moments, perhaps Milan–Cortina will be defined by something different: all the new ways fans (and non-fans) see and interact with the Games.

vaccine in arm

Time Machine: Flu Shots

For more than six decades, the flu shot has been a key tool in protecting public health in the United States. What started as a limited campaign to vaccinate high-risk populations has grown into a yearly ritual for millions of Americans. Behind every injection is a story of shifting public opinion, policy decisions, and medical advances that together have shaped how Americans view and use flu vaccines.

In 1957, during the Asian Flu pandemic, Americans became more aware of the dangers of influenza. A Gallup Poll from that year showed widespread concern about the flu and support for vaccination.

Two decades later, in 1976, a swine flu scare prompted a national vaccination campaign. Early public opinion polls, such as a Gallup Poll from August, indicated that about half (53%) of Americans planned to get the jab.

By 1979, awareness of flu vaccinations had grown, but uptake remained limited. According to an Opinion Research Corporation survey from that year, only 15% of Americans said they planned to get a flu shot, despite widespread public health campaigns encouraging vaccination for high-risk groups like older adults. Surveys from this period revealed mixed attitudes: while many recognized the importance of flu shots for preventing illness, others expressed skepticism about their effectiveness or concerns about potential side effects.

The 1980s and 1990s marked a period of expanding access. Employers, schools, and healthcare providers offered flu shots more widely, and federal public health campaigns emphasized the benefits of annual vaccination for some groups.

By 1999, polls indicated that most Americans viewed flu shots positively, and vaccine coverage was steadily increasing, though differences persisted by age and region. For instance, data from this era shows that coverage among those 65 and over peaked around 65% and hovered there until 2013.

The early 2000s brought new challenges and awareness. Following 9/11, public health messaging emphasized pandemic preparedness, which boosted interest in flu vaccination. Surveys from 2001 to 2003 showed that Americans largely supported flu shots, particularly for children and other vulnerable groups. Coverage remained uneven, but the overall trend reflected growing trust in the vaccine’s role in preventing serious illness. Indeed, an October 2001 American Society of Health-System Pharmacists survey found that 58% of Americans said at least one person in their family had or would be getting a flu shot that year.

By the 2010s, public health campaigns targeted children, older adults, and people with chronic conditions. A 2010 Commonwealth Survey reported 46% of American adults had or would get a flu shot. Coverage for adults continued in the 40-50% range, while pediatric rates remained higher. Polling during this period showed strong support for routine annual vaccination: in 2012, 45% of U.S. adults said they had gotten a flu shot in the last 12 months, and, in 2018, 54% of adults planned to get one ahead of the upcoming flu season.

The COVID-19 pandemic disrupted routine vaccinations, and flu shot coverage declined in the early 2020s. CDC data from the 2023–2024 season shows that overall influenza vaccination rates dropped, particularly among children. Researchers note that rates also vary across counties, with lower coverage in low-income areas. Public opinion polls reflect this shift: some Americans continue to see flu shots as essential for protecting themselves and others, while others are hesitant, citing safety, necessity, or mistrust of public health guidance.

Together, these moments show how public opinion, scientific research, and government action have influenced flu shot use over decades. From early pandemic campaigns to modern concerns about access and equity, the story of flu vaccination in America illustrates the ongoing interplay between policy, medicine, and the public.

This post was written by Marist Poll Media Team member Julia Tartaglia.

View of a group of workers in safety vests and hard hats from the back.

Time Machine: Labor Unions

In any political debate, you’ll hear candidates express support for unions, labeling them as the cornerstone of American labor and manufacturing. But how important are unions nowadays and how many union members are left?

If fewer than 10% of Americans were members of a union in 2024, then why are unions still such a contentious point of discussion? Perhaps it’s the long history of organized labor in America.

In July 1936, just one year after the passage of the Wagner Act, which guaranteed workers the right to unionize, Gallup asked Americans if they supported labor unions; 76% said yes. Around this time, as a result of New Deal policies and Depression-era frustrations, membership surged. Union membership rose from about 13% of American workers in 1935 to 29% in 1939.

The passage of the Taft-Hartley Act in 1947 sought to amend and limit union power. It prohibited practices such as closed shops (businesses that required union membership for employment). This did not hinder overall support for unions, however. According to the Department of the Treasury, labor unions would reach their peak membership in the 1950s, with one-third of American workers claiming membership.

Through the post-war decades, union support continued to hold strong. Gallup, in 1957, found that 75% of Americans approved of labor unions, an all-time high. Trouble was on the way, however, as media reports of corruption within unions such as the Teamsters began to shift public opinion. According to Gallup, by 1979, approval had dropped to 55%. Ronald Reagan’s firing of striking air traffic controllers in 1981 seemed to continue eroding support. That year, Gallup found only 14% of Americans thought labor leaders had high ethical standards, and, by 1983, just 20% of American workers belonged to a union.

In the latter decades of the 20th century, union membership declined rapidly. This can largely be attributed to the loss of U.S. manufacturing plants and factories. Industries which were traditionally union strongholds, such as the steel, auto, and manufacturing industries, saw factories close or move overseas. As blue-collar jobs disappeared, union membership dropped.

But what about the public perception of organized labor? According to Pew in 1999, 70% of Americans still said that labor unions were necessary to protect working people.

Coming out of the 2008 recession, the early 2010s saw a surge in union support from new demographics. As income inequality grew, professions such as teaching became hotbeds for activism, materializing in strikes and walkouts.

Nonetheless, membership continued to decline even as support continued to rebound. Then came the pandemic. Workers in essential roles, such as healthcare, food service, and delivery, began demanding more compensation for their efforts. Gallup, in 2022, found that 71% of Americans approved of unions, the highest percentage since 1965.

Post-pandemic, unions continued to hold strong public support, achieving high-profile victories such as the United Auto Workers (UAW) winning concessions from automakers. A 2023 CNN/SSRS poll showed 76% of voters supported striking auto workers.

Today, though union membership only hovers around 10% according to the Bureau of Labor Statistics, their cultural presence as a tool for social equity has not faded. As income inequality continues to grow, these unions show big corporations that people — workers — still have at least some of the power.

Night Skyline with streaks of car lights on a highway

Time Machine: Auto Industry

For over a century, the American auto industry has shaped how people live, travel, and connect. But, behind the freedom of the open road is a story of change driven by safety concerns, environmental goals, and public demand. From seat belts to electric vehicles, what Americans expect from their cars has evolved.

This transformation is even seen in pop culture. In Cars 3, Lightning McQueen is challenged by a new generation of racers designed for speed and efficiency, reflecting real-world shifts in how vehicles are built and used. Over time, government policy, public opinion, and technology have worked together to build a safer and cleaner road ahead.

In 1956, President Eisenhower signed the Federal-Aid Highway Act into law, launching a $25 billion (nearly $300 billion today) project to build 41,000 miles of interstate highways. At the time, it was the largest public works project in U.S. history and, over the ensuing decades, radically changed American life. That same year, a Gallup poll found that 76% of Americans supported building more express or superhighways between large cities, showing strong public backing for the large project even before construction began.

But, with new, faster highways came concerns about auto safety.

In 1966, passage of the Highway Safety Act and the National Traffic and Motor Vehicle Safety Act authorized the federal government to create and enforce more safety rules. This led to requirements such as seat belts and head restraints, energy-absorbing steering wheels, and shatter-resistant windshields. A Gallup poll from the same year showed that 54% of Americans who had seat belts in their car used them “some of the time,” while 32% used them “always.”

Two decades later, airbags became widely available after years of research and limited production runs. In 1986, Mercedes-Benz became the first automaker to make airbags standard in its U.S. vehicles. In 1984, a Cambridge Reports National Omnibus Survey showed that 37% supported requiring passive restraints, such as airbags, while 42% favored mandatory seat belt laws, highlighting public support for more government vehicle safety rules.

By 1995, 49 states had seat belt laws. Airbags saved 475 lives in 1995, and seat belts cut fatal injuries to front-seat occupants by 45%, proving their life-saving impact. A 1999 Roper Starch Worldwide/Insurance Research Council Poll found that, among those who had bought or leased a car in the past three years, 61% said safety features like airbags and anti-lock brakes were the most important factor influencing their decision.

About 20 years after airbags became the “new” auto safety standard, Electronic Stability Control (ESC) became mandatory in all new passenger vehicles sold in the U.S. This 2012 requirement marked another major step forward in auto safety — this time by focusing on preventing crashes in the first place. That same year, the federal government finalized a rule requiring backup cameras in all new cars by 2018 to address rising concerns over backing accidents. By 2017, consumer demand for advanced driver assistance was growing: among drivers who did not already have each feature, 30% said they would want blind spot awareness and 27% selected automatic emergency braking, according to an AARP/GfK poll.

Another huge change in auto culture came in response to the 1973-74 oil embargo. Congress passed the Energy Policy and Conservation Act of 1975, establishing the first-ever Corporate Average Fuel Economy (CAFE) standards. These rules required automakers to dramatically improve fuel efficiency from 15.2 mpg to 27.5 mpg by 1985. At the time, public opinion strongly supported improvement in fuel efficiency. A 1979 Opinion Research Corporation poll found that 89% of Americans favored providing incentives to automakers to improve fuel efficiency.

In 2007, the Energy Independence and Security Act was signed into law, aiming to reduce U.S. dependence on foreign oil and cut emissions by improving energy efficiency. One of its most impactful provisions raised fuel economy standards to 35 miles per gallon by the year 2020. A 2008 Rockefeller Foundation/Time Magazine poll found that 79% of Americans supported higher fuel economy standards, with 59% “strongly” in favor.

Reducing emissions has gone hand in hand with efficiency standards, and, in 2021, President Biden signed Executive Order 14037, setting a national goal for 50% of all new passenger cars and light trucks sold by 2030 to be zero-emission vehicles. The order also directed federal agencies to begin developing new emissions and fuel economy standards for vehicles starting with model year 2027. President Donald Trump rescinded the rules in 2025. But, by that time, roughly 7% of all new car sales were electric vehicles (EVs), so the impact may be muted.

From interstate highways to electric vehicles, the auto industry has changed alongside public opinion, government policy, and advancing technology. Safety features, fuel standards, and clean energy goals show how cars have evolved to meet the needs of a changing society. The road forward continues to be shaped by the push for a safer, more efficient future.

This post was written by Marist Poll Media Team member Tommy Rogers.

Woman lighting a cigarette

Time Machine: Smoking in Public? Yuck!

In the summer of 1939, America was on the brink of war abroad and inhaling deeply at home. That August, a survey conducted by Roper for Fortune Magazine revealed that 41% of Americans smoked cigarettes. Smoking wasn’t just a habit, it was a cultural phenomenon. Movie stars lit up on screen, doctors appeared in cigarette ads, and you could find an ashtray in most living rooms, cars, and public spaces.

But, as war raged overseas, something curious happened on the home front. By July 1942, Gallup reported a striking shift: 63% of Americans said they smoked, up from just a few years prior. Cigarettes became intertwined with identity, stress relief, and patriotism. Packs were even included in soldiers’ rations.

As the postwar decades rolled on, smoking remained popular… with some cracks in its image. From 1960 to 1966, the U.S. government banned advertising claims that low-tar and low-nicotine cigarettes were “less harmful” and then, through the Federal Cigarette Labeling and Advertising Act, required warning labels on all cigarette packaging. In 1964, the big blow came when the Surgeon General released a report that linked smoking to cancer and other serious diseases.

Public perception shifted almost overnight. By January 1965, a Louis Harris & Associates poll showed that 62% of Americans supported warning labels on cigarette packages, while 29% opposed them. It was the first time many Americans confronted the darker side of smoking.

Fast forward to September 1974 and Americans were still deeply engaged with tobacco. Roper found that, in just the prior week, 41% had bought cigarettes, and 36% had smoked at least half a pack. The numbers were still high, but, by 1976, the tide was turning. That year, Roper found that 51% of Americans supported banning smoking in public places, while 44% still opposed the regulation.

As for compromises, Americans got creative. In 1977, Gallup found 68% supported designated smoking areas as a solution to balance public health with personal choice.

But, by 1982, the pendulum swung back. Roper found that 54% actually opposed a full public smoking ban, while 41% supported it. Resistance remained strong… until it didn’t.

By 1987, Gallup recorded a significant shift: 55% now favored a full ban on public smoking, and only 43% were against it. Americans were growing tired of the foggy haze.

The 1990s brought stronger regulation and even more debate. In March 1994, NBC News/Wall Street Journal/Hart-Teeter Research Co. found that 33% of Americans supported maintaining the smoking ban, another 33% favored increased restrictions, 22% said restrictions should stay the same, while only 8% thought there should be fewer restrictions. But here’s the kicker: a Gallup/CNN/USA Today poll from the same month found that 70% of smokers said they weren’t actually smoking less, despite new regulations.

Fast-forward to the modern era, and America’s once-smoky image is now mostly cleared up. By July 2019, Gallup found that 62% believed smoking should be completely illegal in public spaces, with just 37% disagreeing. And by November 2024, attitudes got even more personal, with Pew Research/SSRS finding that 49% of Americans claimed smoking around other people is never acceptable, and another 28% said it rarely is.

Even when we look at e-cigarette usage, only 7% of Americans told Gallup in July 2024 that they smoked an e-cig or “vaped” in the past week, while 93% had not.

From normalization to rejection, the story of cigarettes in America is one of dramatic transformation. Once glamorized, smoking is now more likely to be met with side-eyes than celebration.

This post was written by Marist Poll Media Team member Hunter Petro.

supermarket aisle

Time Machine: Food Labels

Ever wonder why your granola bar boasts about “whole” grains? Behind every word on a food label is a battle between public demand, industry influence, and government action. These labels didn’t just appear overnight. They reflect decades-long conversations about health risks, food access, and more. What started as quiet concern grew into a national conversation, one that changed the way we eat, shop, and think about what’s in our food.

In a 1950 Gallup Poll, only 1% of Americans said public health was the most important issue in the upcoming election. Concerns such as war, inflation, and the cost of living dominated public attention. Eight years later, Congress passed the Food Additives Amendment, which required industry to prove the safety of food additives before FDA approval. The law aimed to protect public health while also encouraging innovation in food technology. Two years later, in 1960, Congress enacted the Color Additive Amendment, requiring manufacturers to prove color additives in food, drugs, and cosmetics are safe. Under the Delaney Clause, it also banned any additive found to cause cancer in humans or animals.

In 1969, President Nixon held the first White House Conference on Food, Nutrition, and Health. He called hunger a national problem and promised big changes. The event helped make food and nutrition a bigger priority for the government and helped drive future changes to food programs. That growing national focus on nutrition was reflected in public opinion. By 1970, Americans had started to care more about what was in their food. A General Electric Quarterly Survey found that, of the roughly four in ten Americans who believed the government should require industries to provide more information to consumers, 32% wanted more information about ingredients, preservatives, and additives on product labels. Another 20% said they wanted warnings about anything that could be harmful or addictive.

The Environmental Protection Agency (EPA) was created in 1970, leading to the 1972 Federal Insecticide, Fungicide, and Rodenticide Act. Despite this, an Opinion Research poll from 1974 found that 57% of Americans said pesticides should be safer — more than any other product listed in the question.

In the late 70’s, the Senate Select Committee on Nutrition and Human Needs, led by Senator George McGovern, introduced federal dietary goals linking diet to chronic disease. It recommended Americans eat less saturated fat and red meat. After backlash from the meat industry, the committee revised the guidelines, but the effort still shaped future national nutrition policy. By 1980, the first official Dietary Guidelines for Americans were released by the USDA and HHS. Americans appeared to be paying attention: a 1982 survey from the American Medical Association found that 67% of Americans believed cutting back on fat, red meat, and dairy was a healthy choice.

In 1990, the FDA proposed broad reforms to food labeling, including mandatory nutrition facts, updated serving sizes, and new nutrient reference values. Congress passed the Nutrition Labeling and Education Act, which gave the FDA power to require nutrition labels on food products. Reflecting the impact of these changes, a 1991 Roper poll found that 88% of shoppers who read labels said nutrition information had at least some influence on what they buy.

The 1997 Food and Drug Administration Modernization Act made it easier for food companies to add health claims to labels, as long as the claims were backed by trusted government science groups. Such claims could appear on products 120 days after the FDA was notified. Reading food labels was a common behavior by the late 1990s. A 1997 Wirthlin Worldwide survey found that 68% of Americans said they were reading food labels more often than they had just two or three years earlier.

Starting in 2006, the FDA required that trans fats be listed on the nutrition facts label. This labeling change led many food companies to reformulate their products to remove trans fats, since consumers could now identify and avoid them. A Roper poll that same year found that among Americans who checked nutrition labels, fat content (including saturated and trans fat) was the number one thing they looked for at 26%, followed closely by calories (25%), sugars (10%), and sodium (8%).

The 2010 Dietary Guidelines for Americans emphasized maintaining a healthy weight through calorie balance, physical activity, and choosing nutrient-dense foods over those high in sodium, added sugars, and solid fats. Released during rising national concern over obesity and chronic disease, the guidelines aimed to support healthier eating patterns across diverse U.S. populations. In early 2011, a survey by the Public Policy Institute of California found that 75% of Californians said obesity was a “very serious” public health problem. Another 21% said it was “somewhat serious.”

In 2014, the FDA proposed new food labels to better match how people really eat, like counting a 20 ounce soda as one serving. The labels also made calories easier to see and added a line for added sugars to help people make healthier decisions. Speaking of easier to see, that same year, a GfK Knowledge Networks poll found that 66% of Americans supported requiring labels for genetically modified ingredients (GMOs).

Next up were restaurants. Starting in 2018, chain restaurants with 20 or more locations were required to display calorie counts on menus and provide full nutritional information upon request. The same year the labeling rule took effect, a Kaiser Family Foundation poll found that 74% of Americans supported these new rules.

In 2022, the Biden Administration hosted the White House Conference on Hunger, Nutrition, and Health, launching a national strategy to end hunger and reduce diet-related diseases by 2030. Public concern about hunger and unequal access to healthy food was high at this time. According to a 2022 AP-NORC poll, 14% of Americans named poverty, hunger, or homelessness among the most important problems they wanted the government to address in 2023.

Most recently, in December 2024, the FDA finalized new rules for using the “healthy” label on food packages. The updated definition limits added sugars, sodium, and saturated fat, and it allows more nutrient-rich foods like whole grains, nuts, and canned vegetables to qualify.

Together, these moments show how public concern, scientific research, and policy can shape what ends up on our plates and how we make decisions about the food we eat.

This post was written by Marist Poll Media Team member Tommy Rogers.

handgun leaning on table

Time Machine: Gun Laws in the U.S.

One minute it’s a normal day at school, the next it’s national news. Americans have seen it happen far too many times. Tragedy strikes and many Americans ask if it could have been prevented. Our relationship with guns is an endless debate. Support for stricter gun laws has shifted over time, with more support following tragedies. Yet…the last major federal law significantly restricting legal access to firearms was passed in 1994!

The National Firearms Act of 1934 was the first major gun law passed in the U.S. It made buying certain weapons, like machine guns, harder in an effort to reduce gang violence. However, firearms in the hands of any civilian can be dangerous. In 1949, a World War II veteran named Howard Unruh carried out one of the first mass shootings in the U.S., killing 13 people with a Luger pistol. Despite the tragedy, there were no major revisions to gun laws until 1968, when the Gun Control Act was passed. The law made it harder for criminals and certain other groups to buy guns and gave the government more control over gun sales.

In 1968, when asked how to reduce crime, a Gallup Poll showed that only 1% of U.S. adults felt the need for stricter gun laws. The same poll showed that only 1% named gun control as the most important national problem, while 9% were more concerned about crime.

By the 1970’s, the debate started to grow. A 1976 Gallup Poll found that 35% of U.S. adults named gun control as an important voting issue. By the late 1980’s, a majority of Americans supported stronger gun control laws. A Yankelovich/Time Magazine/CNN Poll found that 65% of adults supported stricter gun laws, with 28% opposing. However, the opinions of gun owners were different. The same poll found that 63% of gun owners thought that stricter gun laws would not reduce violence, while 31% believed they would. The other 6% were unsure.

In 1993, support for stricter gun control was rising. According to a Time/CNN/Yankelovich Partners Poll, 70% of Americans favored stricter gun laws and only 27% opposed. Congress passed the Brady Bill that same year. The bill required a five-day waiting period, during which background checks could be run, before the purchase of a handgun. A Gallup Poll from 1999 showed that 89% of Americans were still in favor of the law.

In 1994, Congress passed the Federal Assault Weapons Ban, which banned the sale and manufacture of specific semi-automatic firearms. After the law took effect, crimes with those banned weapons went down, but overall gun violence did not drop much. According to Christopher Koper, a George Mason University professor who studied the ban for the Justice Department, the law’s effects were limited because many older guns and magazines were still legal and widely available. The ban expired in 2004 and was not renewed, not because it was proven to fail, but because Republicans controlled Congress at the time.

Immediately following the 1999 Columbine school shooting, a Princeton Survey Research Associates/Newsweek Poll found that only 11% of adults thought that stricter gun laws would be the best solution. A plurality (49%) thought there should be a greater focus on kids who are anti-social, 21% believed that schools should increase security, and 14% said that reducing violence in entertainment would be the solution.

Public support for stricter gun laws was lower in the early 2000’s than in 1993. A Time/CNN/Yankelovich Partners Poll found that 59% of Americans favored, and 35% opposed, stricter gun laws. This decline was short-lived, however, as terrorist attacks became a growing concern.

A few years after 9/11, a 2006 NORC General Social Survey found that 77% of Americans thought that gun laws should be stricter because of terrorist attacks. This increase from the early 2000’s showed that, as threats to public safety grew, more Americans favored stricter gun control laws. The trend continued into 2007. Following the mass shooting at Virginia Tech, a CBS News/New York Times Poll found that 32% of Americans believed that stricter gun laws could have prevented the incident, and another 21% believed they would have helped at least a little. At the time, as terrorist attacks and mass shootings increased, so did support for stricter gun laws.

A major turning point in the gun debate came not from public opinion but from the Supreme Court. The 2008 case District of Columbia v. Heller involved a special police officer, Dick Heller, who challenged the city’s laws that effectively banned handguns and required that firearms be kept unloaded and disassembled or trigger-locked. He argued the laws violated the Second Amendment because guns at home should be available for personal protection. The Court agreed, ruling that the Second Amendment protects the right to keep an operational gun at home for self-defense. The ruling strengthened individual gun rights even as many Americans supported stricter gun laws.

In the early 2010’s, mass shootings were happening more frequently, particularly in schools. After the 2012 shooting at Sandy Hook Elementary School in Newtown, Connecticut, Americans were left with the same question: would stricter gun laws have helped? A CBS News Poll found that the public was divided, with 26% thinking stricter gun laws would have helped a lot, 16% a little, and 51% saying they would have had no effect.

Also, in the wake of the 2017 mass shooting at a Las Vegas concert and the 2018 Parkland school shooting, a Marist Poll/NPR/PBS NewsHour survey found that 59% believed this country needs stricter gun laws. Only 25% believed that more people need to carry guns.

In April 2023, a Beacon Research/Shaw & Co. Research/Fox News Poll found that 43% of registered voters believed that stricter gun laws would make the country safer, while 25% believed they would make the country less safe. Many Americans today believe gun control can make the country safer, but a sizable share disagrees or feels it won’t make a difference.

Today, the debate over gun control versus gun rights remains deeply divisive. While support for stricter laws often rises after mass shootings, new laws don’t follow. Court rulings, politics, and different views on safety and gun rights all play a role. As shootings continue, many are left asking: how can the U.S. reduce gun violence while still protecting individual rights?

This post was written by Marist Poll Media Team member Tommy Rogers.