Separating Snake Oil From Science

When the lines between expert and huckster are increasingly blurry, how do we determine what’s authoritative?

Illustration by Simone Altamura/The Dispatch.

Recently, during a negotiation with my 10-year-old about how many stuffed animals are too many, he told me that a messy room was a “sign of creativity.” I was skeptical, but he explained that he had learned this from a YouTuber and—this was crucial—the YouTuber had shown a research study.  

I attempted to explain that not all research studies are created equal, and that some are better than others, but it fell on deaf ears. Mom, it’s science.

This interaction is not far from many of the interactions I have every day with people online, many of whom are trying to determine the accuracy of claims about child-rearing and parenting, often related to medical topics. And, very often, the claims they are seeing are, ostensibly, based on research studies. On TikTok, on YouTube, on Instagram—doctors or other seemingly credentialed individuals are posting carousels with screenshots of scientific studies or green-screening videos of themselves in front of research papers.

Some of these people are true experts in their fields, and some of these scientific papers are excellent, and some of the findings in the papers are accurate. But some of these online voices—even some people with medical degrees or other credentials—aren’t reading the literature correctly, and the papers they are showing are based on poor methods or biased data and do not support the claims that are being made. But for consumers of this content, it can be extremely difficult, basically impossible, to separate truth from fiction. 

The core problem: We have what economists call a “pooling equilibrium,” a concept derived from the study of labor markets. Imagine that workers come in two types: hard-working and lazy. The employer wants to hire only hard-working people, but there is no way to tell who’s hard-working and who is not from just looking at them. Now, imagine there is a credential job-seekers can get—say, a college degree. This takes work, which means this credential is easier for the hard-working group to get than the lazy group. If the degree is hard enough to get, we end up with a separating equilibrium. The hard-working people get the degree, the lazy ones do not, and the employer can use that degree to tell whom they want to hire.

This breaks down if the credential becomes easier to get. If college magically becomes easy (say, because of widespread grade inflation), then both the hard-working job-seekers and the lazy ones will get the credential, and now the employer is back to being unable to tell who is who using that marker. Then, we have a pooling equilibrium.
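
To make the logic concrete, here is a minimal sketch of the condition behind this story, in the notation of a standard Spence-style signaling model (the symbols are mine, not from the article): suppose a degree earns its holder a wage premium $w$, and costs a hard-working person $c_H$ in effort and a lazy person $c_L$, with $c_H < c_L$.

$$c_H < w < c_L \quad\Rightarrow\quad \text{separating equilibrium: only the hard-working find the degree worth earning}$$

$$w > c_L \ \text{ or } \ w < c_H \quad\Rightarrow\quad \text{pooling equilibrium: both types make the same choice, so the degree reveals nothing}$$

If the credential becomes cheap for everyone, so that $c_L$ falls toward $c_H$, the middle range shrinks and the market slides from the first case into the second.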

Health communication—especially across social media—has become that pooling equilibrium. 

It used to be the case that when the results of scientific studies were disseminated, the messaging came from your doctor or national organizations like the American Academy of Pediatrics (AAP) or the Centers for Disease Control and Prevention (CDC). These organizations are by no means perfect, but this messaging was developed by people trained in interpreting medical data and putting it in a broader context. Their messaging was often based on thorough reviews of many studies and expert consensus. Part of what dictated this structure was the fact that access to these studies was limited and excluded the general public, creating a barrier that generated this separating equilibrium.

Over time, access to scientific research has become more democratized. Open access to scientific journals has become more widespread, and there has been a greater push to try to help the general public get access to and understand research findings. This has a lot of benefits, but it has made it far easier for non-experts to appear like experts when they want to.

Two other factors have made this even more of an issue. The first is that social media algorithms, which reward views and engagement, have incentivized true experts to produce content that looks very similar to what non-experts produce. The second is that we are increasingly seeing disagreement among official sources that normally agree. Robert F. Kennedy Jr., the secretary of the Department of Health and Human Services, has suggested autism is caused by vaccines, while the AAP and the CDC say it is not. This type of disagreement among official voices causes confusion and engenders distrust of institutions.

Why is this necessarily a problem? Many people would say that allowing broad access to information is all to the good, and people can see the information and make decisions for themselves. The problem is that, in many cases, the non-expert messaging is not providing a complete picture. 

Here’s an example. Recently, a large Instagram account—run by a doctor—posted a study on ADHD and diet, with the claim that 78 percent of children experienced significant improvement in their ADHD symptoms from dietary changes. The implication, in the caption, was that by cutting out a few offending food ingredients (gluten! food dyes!), ADHD might be cured.

The problem is that this post was woefully incomplete. There was a study (from 15 years ago) about diet and ADHD, but the intervention was not the removal of food dyes but instead a complete elimination diet in which children were allowed to eat only a small number of foods (venison or lamb, rice, some vegetables, and pears). Some children’s ADHD symptoms did improve in the short term on this diet, but such a restricted diet is nearly impossible for families to maintain. Other studies that focus more directly on items like sugar and food dyes have not shown the same impacts. An expert-driven analysis of the possible links between ADHD and diet would make all of these points, which would provide enough context for families to talk to their doctor about whether dietary changes are a reasonable option. 

Instead, parents hear that their child’s ADHD might be fixed if they just eliminate popsicles and Froot Loops, a claim that is both false and likely to cause people to delay going to the doctor to get treatment that might be effective.

If we accept that cherry-picking research in this way is a problem, the natural question is: How do we return to a separating equilibrium? For individuals, this boils down to asking whether there are any signals you can reliably use to figure out who is providing good information and who is, basically, a quack. The fact that someone cites scientific studies doesn’t necessarily mean much, and seemingly legitimate credentials like having “Dr.” in a title are not always reliable indicators.

There is no perfect answer to this, which is part of why I spend my days debunking online misinformation about topics like expensive prenatal vitamins and plastic utensils. But there is a key differentiator to watch for: whether the messaging suggests an easy fix. Much of this online content focuses on hard problems in parenting or health—the rise of autism or ADHD, how to improve your blood pressure or heart health, how to lose weight. 

None of these difficult issues has an easy fix or a simple answer. But the essence of snake oil is that the person selling it promises a full solution with very little work. Cut out sugar, and your child will be cured. Buy my supplement and your health will improve. All you need for your inflammation is my $85.99 parasite cleanse. We will know the cause of autism by September. The “one simple trick” is inherent in the sales pitch, so these messengers cannot give this up, even though it makes it clear they are not true experts. 

In the end, the way to separate at least some of this misinformation from authoritative knowledge may be very simple: If something seems too good to be true, it probably is. 

In addition to being a professor of economics at Brown University and contributing writer at The Dispatch, Emily Oster is the founder and CEO of ParentData, a data-driven guide to pregnancy, parenting, and beyond. She is also a New York Times best-selling author, whose books include Expecting Better, Cribsheet, The Family Firm and The Unexpected.
