In 2009, former governor Sarah Palin took a provision in the Affordable Care Act (ACA) allowing Medicare to pay doctors for time spent discussing end-of-life issues with patients, sprinkled it with pixie dust, and turned it into the idea that there were “death panels” in Obamacare. It was like pouring gasoline on the already fiery partisan battles over the law. Republican politicians started to echo the myth, and death panels were quickly all over the news. In an entirely well-intentioned effort to play umpire for the American people, the news media sought to provide the facts, including, and especially, by debunking the lie about death panels. Pundits, reporters, and experts on television news and in newspaper stories provided the truth: there were no death panels in the law. Some of the facts may have reached the public, but what people mostly “heard” was a constant refrain: DEATH PANELS. In the end, the lie was elevated, not debunked, and the news agenda was hijacked by the misinformation it was trying to correct.

For Palin’s misinformation gambit to succeed, it had to tap into something real (there is usually a shred of belief or fact underlying misinformation that people can grab onto). In this case, it was the dislike the far right had for President Obama and for the federal government. And it was true that doctors were to have conversations with seniors about the end of life, something almost everyone in health care was calling for, although these conversations were about honoring patients’ wishes, and notably, the provision authorizing Medicare payment was not included in the final law.

One way we know about the amplification of the death panel myth is from our polling (the kind of polling we are replicating now in our health misinformation tracking polls). In 2010, a remarkable 41% of the public said they believed there were death panels in the ACA. And the lie persisted: in 2014, the same number said there were death panels in the law. In 2019, the number was still 38%. By 2023, as the law became more popular and Obama faded from the scene, the number who believed the lie fell to 8%. But the myth still had some staying power: 70% still said they weren’t sure if there were death panels in the ACA.

The Palin death panels example is now eerily familiar. Most of the misinformation in health today is initially generated by a small number of actors and, despite the impression that it is everywhere, is viewed by a relatively small number of people on social media. An even smaller number actively engage with it by posting about it or sharing it with others. It’s when misinformation then gets mixed into politics and news media coverage and amplified that it can take off, reach significant numbers of people, and have a larger national effect.

The big, if somewhat singular, example of this multiplier effect was the COVID-19 vaccine. Former president Trump, some Republican governors, and conservative media outlets made the vaccine a symbol of resistance to the heavy hand of the federal government, and made not getting the shot an affirmation of personal freedom, sharply dividing the country along partisan lines on the COVID-19 vaccine (Understanding the US failure on coronavirus—an essay by Drew Altman, The BMJ). The result: in our monthly Vaccine Monitor surveys throughout the pandemic, party affiliation was the strongest predictor of almost every position we asked about on COVID. Most cases of health misinformation, however, are not fueled by a president and do not capture the attention of the entire nation. Vaccines are also a somewhat special case, because there has long been a well-organized anti-vaccine movement.

Take, as one more example, the “Meet Baby Olivia” video. Baby Olivia was a video on fetal development posted on Facebook in 2021 by the anti-abortion group Live Action. The American College of Obstetricians and Gynecologists said the video was “designed to manipulate the emotions of viewers rather than to share evidence-based, scientific information about embryonic and fetal development.” At its high point, in June 2022, the Baby Olivia video generated 4,700 comments on Facebook and, as a guesstimate based on similar social media posts, maybe three or four times that much overall engagement in the form of likes, shares, or comments. That’s a lot of people at a town hall or campaign rally, but a minuscule number both on Facebook on any given day and in terms of potential impact on the public. But then, like a smaller-scale version of the dynamic we saw play out with death panels, Baby Olivia became a political issue. A legislative proposal was made in North Dakota, and then in nine more states, requiring that schools show Baby Olivia or a similar video to students. Media coverage of the controversy surrounding the legislation mushroomed. Baby Olivia escaped its initial niche on Facebook and became a much bigger phenomenon.

It may be that we are exaggerating the impact of a lot of sensationalistic, untrue, and ideologically motivated misinformation on social media, partly because it can be so outrageous, when it actually reaches small numbers of already like-minded people who seek it out. What may matter far more is how it sometimes spreads outward from social media to politics, finds prominent political surrogates, and garners general media attention. That’s when it reaches much larger populations who may be uncertain about what’s true and what is not and can be persuaded by it. Media segmentation along partisan lines and the chase for clicks create a perverse incentive that can amplify misinformation further: the more outrageous and sensational the misinformation, the more visibility it and its purveyors are likely to get. Political and news media amplification then leads to renewed social media attention, creating an unvirtuous cycle of misinformation.

The best solution is to prevent misinformation, and those who peddle it, from gaining a foothold on social media in the first place. However, policing misinformation on social media is largely the job of the social media platform companies, which, under cover of the tech swoon, have been backing away from self-regulation. The government does not have the authority to regulate misinformation on the platforms in the U.S., although a recent Supreme Court decision gave the government license to continue to communicate with platform companies about misinformation, at least for now.

The news media will and should cover compelling and timely health politics stories, such as anti-Obamacare rallies or state laws requiring anti-abortion videos in schools. As the death panel experience shows, it can sometimes be tricky to do so without inadvertently elevating misinformation and those who spread it. However, the primary focus of reporters and editors is on their beats and stories, not on confronting misinformation. Fact-checking in news organizations is organized as a separate function and product, and its purpose is narrower; it is mostly about holding candidates and elected officials accountable for wrong numbers and statements. A few journalists have made misinformation a beat or a regular focus, and very few have done so in health (including several in our newsroom).

And while it’s not health misinformation, in the aftermath of the attempted assassination of former president Trump, you can see the media struggling to walk the line between calling out extreme misinformation on social media and elevating it.

More fundamentally, the news media generally sees itself as being in the business of covering the news, not educating the public or addressing deficits in public knowledge. How to navigate the misinformation minefield is one of the issues we plan to take up working with the journalism community across the country in our new Health Misinformation and Trust Initiative.

