One of the most dangerous shifts I’ve seen in my lifetime isn’t technological—it’s epistemological.
Somewhere along the way, we stopped caring how we know what we know.
Medical misinformation didn’t start with social media, but social media gave it jet fuel. Suddenly, anecdote became equivalent to data. Confidence replaced credentials. And the phrase “do your own research” stopped meaning “evaluate evidence” and started meaning “agree with me.”
During the COVID-19 pandemic, this exploded in real time.
Unqualified influencers promoted cures that had no empirical support. Some were harmlessly useless. Others were actively dangerous. The common thread wasn’t malice—it was certainty. Absolute, unwavering certainty delivered with a ring light and a follower count.
Medicine doesn’t work that way.
Real medical knowledge is slow. It’s iterative. It involves being wrong, correcting course, and sometimes admitting that we don’t know yet. That humility is a feature, not a flaw—but it doesn’t play well on social media.
What the public often doesn’t see is that evidence isn’t one study or one headline. It’s convergence. Reproducibility. Peer review. Boring consistency across time and populations. None of that fits neatly into a 60-second video.
So when someone says, “Doctors don’t want you to know this,” that should be your first red flag. Medicine is not a conspiracy—it’s a bureaucracy. If something worked, we’d be using it, billing for it, and arguing over the guidelines.
Instead, what we’re left with is erosion. Patients arrive already diagnosed by the internet. Trust is fractured. Outcomes suffer—not because medicine has failed, but because expertise is being drowned out by volume.
Opinions aren’t evidence.
And confidence is not competence.
Those distinctions matter. Lives depend on them.
