VOLUME 12
This is Irving Washington and Hagere Yilma. We direct KFF’s Health Misinformation and Trust Initiative, and on behalf of all of our colleagues across KFF who work on misinformation and trust, we are pleased to bring you this edition of our bi-weekly Monitor.
This volume examines the role of news influencers in spreading misinformation to younger and polarized audiences, as well as misleading narratives about harm reduction amid disparities in opioid overdose rates affecting Black and Indigenous Americans. It also explores biases in artificial intelligence (AI) in healthcare and considerations for improving engagement with health information among marginalized communities and older adults.
A Pew Research Center poll found a shift in how Americans consume news, with 21% regularly turning to social media influencers for updates. These news influencers are primarily independent content creators with large followings, some of whom are journalists, who post about current events and civic issues on social media platforms. However, this shift toward influencer-based news may raise serious questions about the quality and accuracy of health information reaching the public. While some influencers provide credible insights, others spread false or misleading health claims. Even some influencers with medical credentials included in Pew’s study sample have spread false information about health topics like COVID-19. Other news influencers have contributed to the spread of health misinformation as well. For example, Joe Rogan, included in Pew’s sample of news influencers, has been criticized for using his platform to amplify false or misleading health information, including AIDS denialism and false claims about vaccine safety.
News influencers’ social media profiles can also be tied to their political ideologies, further complicating the landscape of health information. The Pew analysis found that most news influencers identify with certain political leanings, with right-leaning influencers outnumbering left-leaning influencers on most platforms. Among those who get news from influencers, conservative Republicans and liberal Democrats are more likely than moderate Republicans and Democrats to feel that the content they receive from influencers aligns with their own views. This dynamic can reinforce echo chambers, in which followers consume information that supports their existing beliefs. Political figures who become influencers further complicate the landscape by blending their personal brands with their political roles and using social media to promote their ideologies. Pew’s sample of news influencers included Mike Huckabee, former Arkansas Governor and nominee for U.S. Ambassador to Israel under the new Trump administration. Huckabee has used his social media platform to comment on health policy, including sharing misleading claims, such as equating abortions later in pregnancy with infanticide.
Although opioid deaths declined in 2023, a KFF analysis found disparities across race and ethnicity, with American Indian/Alaska Native (AIAN) and Black communities experiencing the highest rates of drug overdose deaths. Even as death rates began to decline across several demographic groups, the heightened vulnerability of AIAN and Black populations persists, reflecting broader systemic challenges, including limited access to treatment and harm reduction resources. Before naloxone, a lifesaving drug that reverses opioid overdoses, became more widely available, research suggested that Black people had more limited access to it than White people. Misinformation and stigma about naloxone may contribute to this disparity, such as the incorrect belief that naloxone leads to increased substance use. Fear of police harassment for carrying naloxone, or of being accused of drug possession, can also deter people from accessing or using the drug, especially given the intersection of criminalization and substance use. Some public health campaigns aim to raise awareness about naloxone, but it is unclear whether these efforts are reaching or resonating with Black and Indigenous communities. Addressing these issues, along with affordability concerns, may help close the gap in equitable access to naloxone.
False claims that harm reduction strategies promote drug use continue to surface in response to news about these programs, even when the news is positive. Harm reduction for opioids can involve various strategies to mitigate risks among people who use drugs, such as distributing sterile syringes and opioid overdose reversal medications. For example, on November 14, NPR published an article about vending machines that provide free harm reduction tools like naloxone to anyone, and sterile syringes to harm reduction clinic clients with a special code, explaining how these tools can help prevent overdose deaths and reduce the spread of infectious diseases. The article also prebunked false claims that harm reduction strategies may increase drug use and crime. Despite this encouraging evidence, several Facebook users responded to NPR’s article with the false claim that harm reduction strategies promote drug use.
Recent misleading claims about harm reduction strategies have also come from public health officials. On October 16, two former top officials at the Department of Health and Human Services published an op-ed on Fox News claiming that the current administration’s harm-reduction strategies have “produced more harm, not less.” The authors claimed that harm reduction strategies promote “acceptance of a drug-using lifestyle.” On the day the article was published, Robert F. Kennedy Jr. shared it on X, stating, “Agreed. To end the opioid crisis we need common sense solutions not ‘harm reduction.’” Most comments agreed with the op-ed, with some falsely claiming that harm reduction strategies provide people with illegal drugs. The article continued to spread on social media platforms, where many posts and comments advocated against harm reduction, falsely claiming that harm reduction can “promote” drug use and discourage treatment.
Despite research showing that harm reduction strategies can prevent overdose deaths and reduce the spread of infectious diseases without increasing drug use or crime, the false claim that harm reduction promotes drug use persists. This may reveal public confusion about the goals and functions of harm reduction principles and programs, which aim to reduce the negative effects of drug use and link people to treatment if and when they are ready. Stigmatizing attitudes toward people who use drugs also suggest that the public may not understand that substance use disorder (SUD) is a chronic health condition. Although some people are more at risk than others, SUD is fairly common: nearly three in 10 U.S. adults say they or someone in their family has experienced opioid addiction.
Artificial intelligence (AI) is increasingly used in U.S. healthcare, but algorithms often embed racial and ethnic biases that perpetuate disparities in health outcomes. For example, algorithms used to predict health outcomes may inaccurately assess risks for minority groups due to underrepresentation in the training data. Research shows that AI systems often rely on data that overlooks factors such as a patient’s socioeconomic conditions (e.g., access to transportation or healthy food), leading to inaccurate treatment recommendations or misunderstandings about a patient’s ability to follow medical advice. Studies have also found that AI chatbots can share false medical myths about patients of color. Biases in AI systems are further exacerbated by historical inequities and a lack of transparency. Researchers suggest that reforms should focus on increasing diversity in clinical trials, incorporating socio-environmental factors into health care, and improving equity in algorithm development.
Polling Insight:
When it comes to health information, the public is not yet convinced that AI chatbots can provide accurate information. KFF’s June Health Misinformation Tracking Poll found that just one in three adults say they are “very confident” (5%) or “somewhat confident” (31%) that the health information and advice they may come across on AI chatbot platforms is accurate (Figure 1). About six in ten adults say they are “not too confident” or “not at all confident.”
Notably, adults under age 50 and Black and Hispanic adults are somewhat more likely than those over age 50 and White adults, respectively, to say they have confidence in the accuracy of health information from AI chatbots, though about half or more across age and racial and ethnic groups say they are not confident.
A study published last year in Science Communication conducted focus groups with Black and Latino adults in Boston, MA to understand how historically excluded and marginalized populations perceive science-related issues and engage with misinformation. Participants expressed distrust of authority figures, celebrity endorsements, and fact-checking efforts. Many cited challenges in accessing accurate information due to institutional biases and systemic inequalities, such as discriminatory media algorithms and a lack of representation in traditional newsrooms. Health and wellness topics resonated most with participants, especially messages that used storytelling, cultural figures, or social media as a guide. The findings suggest a need for culturally sensitive strategies, including leveraging trusted local leaders and improving access to multilingual, community-specific information to combat misinformation effectively.
Source: Amazeen, M. A., Vasquez, R. A., Krishna, A., Ji, Y. G., Su, C. C., & Cummings, J. J. (2024). Missing voices: examining how misinformation-susceptible individuals from underrepresented communities engage, perceive, and combat science misinformation. Science Communication, 46(1), 3-35.
A study in Frontiers in Psychology examined how algorithm-driven content may contribute to information overload and avoidance in older adults. Because algorithms often prioritize similar or excessive information, older adults may become fatigued and avoid engaging with health content. The study suggests that mindfulness, the practice of staying focused and non-judgmental, may help mitigate this effect. While mindfulness does not directly reduce fatigue from information overload, it helps individuals engage more effectively with health information. The findings highlight the importance of providing diverse, accessible health content and incorporating mindfulness techniques to improve how older adults process and respond to health-related information.
Source: Zhong, F., & Gu, C. (2024). The impact of health information echo chambers on older adults avoidance behavior: the mediating role of information fatigue and the moderating role of trait mindfulness. Frontiers in Psychology, 15, 1412515.
About the Health Information and Trust Initiative: The Health Information and Trust Initiative is a KFF program aimed at tracking health misinformation in the U.S., analyzing its impact on the American people, and mobilizing media to address the problem. Our goal is to be of service to everyone working on health misinformation, strengthen efforts to counter misinformation, and build trust.
Support for the Health Information and Trust Initiative is provided by the Robert Wood Johnson Foundation (RWJF). The views expressed do not necessarily reflect the views of RWJF, and KFF maintains full editorial control over all of its policy analysis, polling, and journalism activities. The Public Good Projects (PGP) provides media monitoring data KFF uses in producing the Monitor.