VOLUME 3

This is Irving Washington and Hagere Yilma. We direct KFF’s Health Misinformation and Trust Initiative, and on behalf of all of our colleagues across KFF who work on misinformation and trust, we are pleased to bring you this edition of our bi-weekly Monitor.

This edition focuses on intentionally false or misleading information online and its potential impact on public trust in health care. We share a recent report that exposed a covert U.S. military social media disinformation campaign in the Philippines that may have undermined public confidence in vaccines. We also examine how false claims about sunscreen and non-FDA-approved “miracle cures” may be discouraging people from taking important preventative measures and seeking legitimate medical treatment. Finally, we explore the rise of counterfeit diabetes and weight-loss drugs like Ozempic and the potential impact on trust in the pharmaceutical supply chain.


Recent Developments

US Military’s Covert Anti-Vaccine Disinformation Campaign in the Philippines


In June, a Reuters investigation revealed that the U.S. military conducted a covert anti-vaccine disinformation campaign in the Philippines between 2020 and 2021. As part of the campaign, U.S. military officials used fake social media accounts to spread false information to people in the Philippines about COVID-19 equipment from China, including masks, test kits, and the Chinese Sinovac vaccine, which was the first COVID-19 vaccine available in the Philippines. Health experts said these actions put lives at risk and undermined confidence in public health systems. The Philippines, a long-time ally of the U.S., had one of the highest COVID-19 death rates in the region and some of the lowest vaccination rates. Reuters was unable to determine the reach and impact of the disinformation campaign, but it likely hindered vaccination efforts in the Philippines. Studies suggest that any anti-vaccine campaign can potentially undermine overall public trust in vaccines.

Sunscreen Misinformation is Spreading on TikTok


A recent rise in false or misleading social media posts claiming that sunscreen causes cancer is discouraging people from using it. These claims, often linked to past sunscreen recalls involving benzene, have confused consumers about the safety of oxybenzone, a common sunscreen ingredient. Unlike benzene, oxybenzone does not cause cancer, experts say; instead, it helps protect against the UV rays that do.

Sunscreen is an important tool for protecting people from harmful UV rays and skin cancers like melanoma. Despite sunscreen’s proven benefits, these claims reflect concerns that young adults have about its safety. According to a nationally representative survey conducted for the Orlando Health Cancer Institute, one in seven (14%) adults under 35 mistakenly believe that daily sunscreen use is more damaging to the skin than direct sun exposure. Health professionals can work to debunk these claims, as skin cancer is the most common form of cancer in the U.S. and consistent sunscreen use remains one of the most effective ways to prevent it.

Unproven Health Remedies and Supplements Promoted on Social Media


Social media users have seen an increase in “miracle cures” and detoxes for cancer, but many of these remedies lack scientific support and may even be harmful. Memorial Sloan Kettering Cancer Center has debunked several of these unproven treatments, such as cannabis oil, which is often marketed as a cancer treatment despite the lack of clinical trials supporting its efficacy. Similarly, Sloan Kettering debunked the effectiveness of detox diets, explaining that the body naturally detoxifies itself and that these diets can lead to nutrient deficiencies and other health problems.

Social media has also fueled the popularity of unproven supplements marketed to manage menopause symptoms. Concerns and misconceptions about hormone therapy have led some women to seek natural alternatives to relieve hot flashes, night sweats, and other disruptive symptoms. However, experts caution that these menopause supplements are not FDA-approved and usually contain unregulated herbal blends that may be ineffective or even dangerous. This trend highlights both the gaps in public knowledge about menopause and the prevalence of misinformation about hormone therapy, which is a safe and effective treatment for moderate-to-severe symptoms.

Polling Insights:

The KFF Health Misinformation Tracking Poll Pilot (June 2023) found that about one in four adults (24%) say they use social media at least once a week to find health information and advice, including larger shares of Hispanic (49%) and Black adults (35%) compared to White adults (15%), and younger adults compared to older adults (32% of those ages 18-29 compared to 14% of those ages 65+).

The poll also found that social media use is correlated with both exposure to health misinformation and the inclination to believe it. For example, a majority of those who use social media for health information and advice at least weekly say they have heard at least one of the false COVID-19 or vaccine claims tested in the survey and think it is definitely or probably true, compared to four in ten of those who don’t use social media for health advice (Figure 1).


Emerging Misinformation Narratives

WHO Issues Warning on Rise of Counterfeit Semaglutides Like Ozempic


Fake weight-loss drugs are on the rise, potentially eroding trust in legitimate health care providers and pharmaceutical supply chains. In June, the World Health Organization (WHO) issued a warning about a rise in counterfeit versions of Ozempic and other semaglutides, a class of GLP-1 drugs intended for diabetes management and weight loss.

The FDA had warned about counterfeits in the U.S. last year, but this is the WHO’s first formal warning. Coverage of the new warning has resurfaced a false claim that Ozempic contains lizard venom. This claim stems from a misunderstanding of Ozempic’s origins. In the 1990s, researchers studied a hormone in the venom of the Gila monster that helps the lizard regulate its blood sugar during hibernation. These researchers then created a synthetic version of the hormone, ushering in a new class of diabetes drugs, including Ozempic. The Gila monster’s venom is not present in those drugs.

One conspiracy news article promoting the false lizard venom claims was originally published on June 9 and later republished on another conspiracy news website on June 23. The author wrote, “...by labeling reptile venom peptides as ‘FDA-approved medications,’ the western medicine system has quite literally figured out a way to get people to voluntarily maim or kill themselves, even while paying for the privilege of doing so.” 

Conversation about the WHO’s warning on Ozempic represents only a slight bump in overall online conversations about the medication over the past two months, accounting for 4.5% of Ozempic-related posts between June 19 and June 26. It’s expected that preexisting false claims about the safety and effectiveness of Ozempic may continue to reemerge—however, these false claims won’t necessarily gain traction.

Polling Insights:

The May 2024 KFF Health Tracking Poll finds that 12% of adults say they have ever used GLP-1 drugs, including 6% who say they are currently using them. The share who report ever taking these drugs rises to about four in ten (43%) among adults who have been told by a doctor that they have diabetes, a quarter (26%) of adults who have been told they have heart disease, and one in five (22%) adults who have been told by a doctor that they are overweight or obese in the past five years (some of whom also have diabetes or heart disease; Figure 2).

While most adults who have ever taken a GLP-1 drug say they got a prescription from their primary care doctor or specialist (79%), some report getting them from an online provider or website (11%), a medical spa or aesthetic medical center (10%), or from some other source (2%; Figure 3).

Health Discussions to Watch

COVID-19 Vaccine Safety: On June 17, the Kansas attorney general filed a lawsuit against Pfizer for allegedly covering up COVID-19 vaccine risks, specifically pregnancy complications and heart inflammation. Social media posts in both English and Spanish used the allegations in the suit to falsely claim that COVID-19 vaccines are unsafe and ineffective. A video of the Kansas attorney general announcing the lawsuit was shared by several social media accounts on X. The single most popular post sharing the video received 782,500 views, 26,000 likes, 5,000 shares, and 459 comments as of June 26. Most comments on the post express support for the lawsuit and distrust in COVID-19 vaccine safety.

Bird Flu: As H5N1 bird flu continues infecting animals in the U.S., false claims about the origins of the outbreak are appearing on social media platforms, primarily on X. Some recent viral posts on X falsely claim that the bird flu outbreak is a “plandemic” created by the government to sway election results, increase profits for pharmaceutical companies, and harm farmers. One social media post on X shared a news clip about a bird flu outbreak on an Australian farm. The text of the post read, “More sinister Globalist tactics to attack farmers, reduce food production & bring back more experimental vaccines.” That post received 319,800 views, 7,200 likes, 4,700 shares, and 1,400 comments as of June 26.


Research Updates

Fact-Checking and Media Literacy Efforts Reduce Misinformation Belief, but Heighten Skepticism Toward Accurate Information

A study published in Nature Human Behaviour looked at how fact-checking and media literacy interventions affect people in the U.S., Poland, and Hong Kong. The study found that while these efforts were effective in reducing belief in false information, they also increased skepticism about accurate information. This unintended consequence highlights the challenge of maintaining trust in reliable health information amid efforts to combat misinformation. Health professionals may need to adopt nuanced strategies that minimize distrust while effectively addressing health misinformation in public communication.

Source: Hoes, E., Aitken, B., Zhang, J., Gackowski, T., & Wojcieszak, M. (2024). Prominent misinformation interventions reduce misperceptions but increase scepticism. Nature Human Behaviour, 1-9.

Impact of Misinformation Concerns on News Choices

A study from the Harvard Kennedy School Misinformation Review looked at how concerns about misinformation affect people’s news choices. The study found that people who are more worried about false information tend to choose news that matches their political views. For example, liberals who are worried about misinformation tended to favor sources like MSNBC or the Huffington Post, and Democrats were especially likely to favor news that matched their political views. Feeling confident about spotting false information did not change this behavior. These findings show how concerns about misinformation can lead people to read only news that aligns with their beliefs, which can deepen political divisions and make it more challenging to address false or misleading information in public discussions.

Source: Harris, E. A., DeMora, S. L., & Albarracín, D. (2024). The consequences of misinformation concern on media consumption. Harvard Kennedy School (HKS) Misinformation Review. https://doi.org/10.37016/mr-2020-149


AI and Emerging Technologies


Google’s AI Overviews Feature

In May, Google unveiled its AI Overviews feature, which is designed to consolidate and streamline searches for users. Now, when you search for something on Google, a short answer summarizing results from reliable sources may appear at the top of the search results. However, ABC notes concerns that these summaries may prioritize convenience over accuracy. The system pulls from a variety of web sources, including blogs and satirical sites, and has had some early challenges with inaccurate answers that were difficult for users to verify. To address this, Google is limiting how frequently AI Overviews draws its information from unreliable sources and will begin providing links to the original sources. The potential for AI Overviews to provide users with accurate and reliable health information depends on its ability to adapt to evolving disinformation tactics and improve transparency around information sources.

About The Health Misinformation and Trust Initiative: The Health Misinformation and Trust Initiative is a new KFF program aimed at tracking health misinformation in the U.S., analyzing its impact on the American people, and mobilizing media to address the problem. Our goal is to be of service to everyone in the health misinformation and trust field and to help efforts to counter misinformation, build trust, and be more effective.




Support for the Misinformation and Trust initiative is provided by the Robert Wood Johnson Foundation (RWJF). The views expressed do not necessarily reflect the views of RWJF and KFF maintains full editorial control over all of its policy analysis, polling, and journalism activities. The Public Goods Project (PGP) provides media monitoring data KFF uses in producing the Monitor.


KFF Headquarters: 185 Berry St., Suite 2000, San Francisco, CA 94107 | Phone 650-854-9400
Washington Offices and Barbara Jordan Conference Center: 1330 G Street, NW, Washington, DC 20005 | Phone 202-347-5270

www.kff.org | Email Alerts: kff.org/email | facebook.com/KFF | twitter.com/kff

The independent source for health policy research, polling, and news, KFF is a nonprofit organization based in San Francisco, California.