“The Study Shows” or Does It? How to Spot Misused Research on Social Media
How often have you come across nutrition advice on social media wrapped in phrases like “this is backed by science”, “studies show”, or “science shows that…”?
Many people say it, but how do you know if they’re actually using scientific research properly to support their claims?
Many influencers misuse or misinterpret studies, and it’s easy to be misled if you haven’t had the training, usually gained through a health or science degree, to read and dissect a research paper properly.
When someone’s aim is to create viral videos rather than to provide useful, accurate nutrition advice, it’s unlikely they’re reading every study they cite. This becomes obvious when you notice how often influencers cite a study that says the complete opposite of what they’re claiming.
Sometimes it’s not enough to just check whether someone is citing research; you need to understand how they’re using it. Here are some key ways to tell if an influencer or expert is using scientific research appropriately or misleadingly.
1. Using studies isn’t always enough
Our guides on spotting misinformation say, “Check if they cite a reputable scientific study.” While this is a good start, the truth is that anyone can cite a study—whether it supports their point or not. The real question is how they’re using that research.
Some influencers cherry-pick studies or twist the findings to suit their narrative; some might not even look past the title. Research on a topic can also have mixed results, and if an influencer presents only one side of the evidence, or builds a big claim on a single study, they may be ignoring research that contradicts their point. So don’t be swayed just because you see a reference to a scientific paper; take a closer look.
For example, we recently fact-checked a post by Candi Frazier, where she claimed that grains are "depression foods," causing inflammation and, ultimately, depression. At first glance, her video seems convincing, especially when she presents a study that supposedly supports what she’s saying. However, when we took a closer look, the study actually examined a potential link between gluten and mood disorders in people with a gluten-related disorder. It did not look at the effect of grains on inflammation and depression in the general population.
Easy Step: A quick Google search can help you find other studies that show different results. If the influencer ignores or downplays contradictory findings, that’s a red flag.
2. Animal studies: mice are not humans
One common red flag is when influencers use animal studies to support nutrition claims about human health. Animal studies are often an early step in biomedical research, helping scientists explore human health and disease, but their findings frequently fail to translate to people. Claims based on animal studies alone are not strong enough evidence to apply to humans.
For example, Paul Saladino has used animal studies to support his claims, likely because there are no robust human trials backing what he says. In one video, he claims cruciferous vegetables like broccoli can be bad for health (blaming them for anything from stomach issues to thyroid, skin, and autoimmune problems) and cites a pig study as support. On the Joe Rogan podcast, he claimed that “LDL helps protect us against infection because it’s part of the immune system”, citing a mouse study; Bio Layne debunks this on his blog, explaining that “trying to equate a rodent knockout model with real human metabolism is ridiculous”.
How to Spot It: Animal studies are usually easy to identify from the paper title or abstract. If someone’s making a claim based on a study done in mice or other animals, be cautious—this doesn’t mean the same results will apply to humans.
3. Weak study designs: they don’t prove cause and effect
Some influencers use studies that are designed to observe patterns but can’t prove a direct cause-and-effect relationship. These weaker study designs might show correlations (things that happen together) but don’t show whether one thing causes another.
A common error is when influencers imply that a study showing correlation means causation. For example, if a study looking at nut consumption and blood pressure shows that people who eat more nuts also have lower blood pressure, that doesn’t necessarily mean eating nuts causes lower blood pressure—other factors, like exercise or genetics, could be involved.
Here are a few types of weaker study designs:
- Observational studies: These look at how variables are associated with each other (like how eating vegetables is linked to lower heart disease), but they can’t prove that one causes the other. They are useful for researchers, but they can’t give definitive answers about cause and effect.
- Case reports: These describe individual cases or a small group of people, but they don’t give broad, reliable conclusions for larger populations.
Claims based on these kinds of studies should be treated with caution, especially if they imply definitive outcomes (like “X food will make you lose weight”). If someone says, “This food causes X,” ask yourself: is this just an association? Correlation doesn’t mean one thing caused the other, yet influencers often conflate the two to make their claims sound stronger.
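To see how this plays out, here is a minimal simulation sketch of the nuts-and-blood-pressure example above, written in Python with numpy and using entirely invented numbers. In this toy model, nuts have no effect on blood pressure at all; a hidden confounder (health-consciousness, which drives both nut intake and exercise) still produces a convincing-looking correlation:

```python
# A toy illustration of confounding, not a real analysis.
# Assumes numpy; every number here is made up.
import numpy as np

rng = np.random.default_rng(seed=42)
n = 1_000  # simulated participants

# Hidden confounder: how health-conscious each person is.
health_consciousness = rng.normal(0, 1, n)

# Health-conscious people eat more nuts AND exercise more,
# but nuts are given NO direct effect on blood pressure below.
nut_servings_per_week = 5 + 2 * health_consciousness + rng.normal(0, 1, n)
exercise_hours = 3 + 1.5 * health_consciousness + rng.normal(0, 1, n)

# Blood pressure depends only on exercise (plus noise) in this model.
blood_pressure = 130 - 4 * exercise_hours + rng.normal(0, 5, n)

# Nuts and blood pressure still correlate, because both trace back
# to the same confounder.
r = np.corrcoef(nut_servings_per_week, blood_pressure)[0, 1]
print(f"Correlation between nut intake and blood pressure: {r:.2f}")
```

An observational study of this simulated population would “find” that nut eaters have lower blood pressure, even though the only real driver is exercise. That is exactly the trap of reading causation into correlation.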
4. Stronger study designs: what you can trust more
On the flip side, some studies are much more reliable because they use rigorous methods designed to show cause and effect. If an influencer or expert cites one of these study types, the information is more likely to be trustworthy:
- Randomized Controlled Trials (RCTs): Considered the gold standard, these studies randomly assign participants to different groups and compare outcomes. This helps isolate the effect of a specific diet or supplement, making it easier to determine if it actually works.
- Systematic Reviews and Meta-Analyses: These review multiple studies on a specific topic, combining the results for a more comprehensive understanding. They’re often used to give a clearer picture of the overall evidence.
Also, consider the sample size. A reliable study should have a large enough sample size to make its findings meaningful. If an influencer is citing a study with only a handful of participants (for example, fewer than 30 people), this can be a red flag. Small studies are more prone to random chance influencing the results, and they often aren’t representative of the general population.
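A similar quick sketch, again in Python with numpy and invented numbers, shows why small samples are so vulnerable to chance. It simulates a diet with zero true effect on weight and counts how often a tiny study still appears to show a meaningful average loss:

```python
# A toy illustration of small-sample noise, not a real analysis.
# Assumes numpy; the diet's true effect is deliberately set to zero.
import numpy as np

rng = np.random.default_rng(seed=7)
trials = 10_000  # number of simulated studies per sample size

for n in (10, 300):
    # Each simulated study: n participants whose weight change is
    # pure noise (true mean 0 kg, standard deviation 3 kg).
    study_means = rng.normal(0, 3, size=(trials, n)).mean(axis=1)
    # How often does this no-effect diet look like it caused more
    # than 1 kg of average weight loss?
    false_wins = (study_means < -1).mean()
    print(f"n={n:>3}: spurious 'over 1 kg lost' in {false_wins:.0%} of studies")
```

In this toy setup, roughly one in seven of the 10-person studies shows a spurious average loss of more than 1 kg, while the 300-person studies almost never do. The diet does nothing in either case; the small samples simply leave far more room for chance.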
If someone references these types of studies, it’s a good sign the claim is backed by solid science.
5. Peer-reviewed research: what does that mean?
Peer review is an essential part of the scientific process. Before a study is published in a reputable journal, it goes through peer review, where other experts in the field evaluate the research. This process helps ensure that the study’s methods, analysis, and conclusions are sound.
While peer-reviewed studies are more trustworthy, it’s still important to consider the study’s quality (e.g., sample size, methodology) and whether it has been replicated by other researchers.
Red Flag: A study that hasn’t been peer-reviewed, such as a “pre-print”, hasn’t had its methods and conclusions vetted by other experts. Always check the source of the study to see if it went through the peer-review process.
6. Be sceptical of product-specific research
When someone is selling a product—especially supplements—they may claim it’s “backed by science.” But more often than not, the product itself hasn’t been tested. Instead, they may be citing studies on individual ingredients and then extrapolating the results to suggest their product works.
For example, Jessie Inchauspé, known as the ‘Glucose Goddess,’ promotes her anti-spike formula with bold claims like “Reduce your meal’s glucose spike by up to 40%” and “Lower fasting glucose by 8 mg/dL.” However, these claims are not based on robust clinical trials of the formula itself. Instead, they likely rely on a mix of testing on individuals—similar to a ‘case report’—combined with studies on the individual ingredients, which her website states are backed by “gold-standard, double-blind clinical trials.” This distinction is important, as these trials apply only to the ingredients, not to her specific product.
How to Check: Look at whether the product itself has been tested in clinical trials, not just the ingredients. Also, check if the research was funded by the company selling the product, which can lead to bias in how the results are presented.
7. Consider who funded the study
Sometimes, research is funded by companies with a vested interest in the results. For example, a supplement company may fund a study on the effectiveness of its own product. This doesn’t automatically mean the study is wrong, but it could introduce bias into how the results are reported or interpreted. Marion Nestle (no relation to the Nestlé company) discusses this extensively in her book ‘Unsavory Truth’ and on her website, Food Politics.
Easy Step: Check the funding source. Studies funded by companies selling the product or diet being promoted warrant extra caution, as there may be a conflict of interest, though industry funding alone doesn’t make a study useless.
8. Simplified conclusions from complex data
Many scientific studies are nuanced and don’t have clear, black-and-white conclusions. However, influencers might oversimplify complex findings to make them more appealing. They could say things like "Studies prove this food is bad for you," when the research may show a small effect under specific circumstances. Simplified or absolute conclusions can be a red flag that the research is being misused.
Easy Step: Look out for words like "always," "never," or "proven." Science rarely deals in absolutes, so these phrases can be an indicator that the influencer is oversimplifying the findings.
9. Was the study conducted on a population similar to you?
Some studies are conducted on very specific groups of people, like athletes, people with specific health conditions, or a certain gender or age group. If an influencer cites a study but applies it to everyone, this can be misleading. For example, if a study was conducted on elite athletes, its findings may not apply to the average person.
Again, we can look at the example of Candi Frazier and her “grains cause depression” claim. She generalised this claim to her whole audience, yet the study only found an association for people who have a gluten-related disorder, and that was still only an association rather than identifying a cause-and-effect relationship.
Easy Step: Look at the population studied. If the study participants are very different from you, the results may not apply to your situation.
10. Are the findings too good to be true?
Extraordinary claims require extraordinary evidence. If an influencer is making claims that seem too good to be true—like a product that will “melt away fat in days” or a diet that “cures” chronic diseases—then the research they’re citing may not hold up to scrutiny.
Easy Step: Be sceptical of big promises or miracle solutions. If the claim seems extreme, it’s worth looking more closely at the research behind it.