AI-generated deepfake videos are being used at scale to impersonate real doctors and public health experts, pushing unproven supplements and false medical claims across major social media platforms. A fact-checking investigation shows this is not a one-off scam but a growing tactic that undermines trust in evidence-based healthcare.
How the deepfake scheme works
British fact-checking charity Full Fact identified hundreds of AI‑generated clips that reuse or manipulate genuine conference and broadcast footage of clinicians to make it appear they endorse specific supplements. In one case, footage of Professor David Taylor‑Robinson from a 2017 Public Health England event was altered to make him describe “thermometer leg,” a fabricated menopause symptom, before steering viewers to a supplements site.
These videos collectively gained hundreds of thousands of views on TikTok alone, with one clip surpassing 365,000 views and thousands of likes and bookmarks, showing how quickly such fakes can spread before detection.
Platforms and company responses
Initial reports from the University of Liverpool did not prompt TikTok to act; the platform at first judged the content compliant, later restricted its visibility, and deleted the account only after further complaints from the professor and his family. TikTok subsequently acknowledged a moderation error and said the content breached its rules on harmful misinformation and deceptive impersonation. Meta and YouTube said they remove or label health misinformation and synthetic media, but neither detailed the enforcement actions taken in this case.
The clips frequently funneled viewers to buy probiotics and Himalayan shilajit linked to U.S. supplements firm Wellness Nest, which denied direct involvement and said the accounts were unaffiliated third parties using what appeared to be affiliate marketing links.
Why this is a public health threat
Other well‑known figures, including former Public Health England chief Duncan Selbie and the late television doctor Michael Mosley, have also been deepfaked in health scam content, illustrating how scammers exploit trusted faces to lend fake authority to risky claims. Stanford pain specialist Dr Sean Mackey, who was also impersonated, warned that such deepfakes can push people toward unproven products and away from established treatments, turning misleading ads into a genuine public health risk.
Viewers can reduce their risk by treating miracle‑sounding cures and heavily promoted supplements with caution, checking whether the supposed doctor actually appears on reputable institutional sites, and confirming any treatment changes with a qualified healthcare professional rather than relying on viral videos.