Alarming headlines like that swept across news outlets in mid-October. The articles reported on a study released this October by the organization Healthy Babies Bright Futures.
Pushback on the study arose almost immediately after the story took off in the news. Critics raised questions about the groups funding the study and cast doubt on it because it had not been published in a peer-reviewed journal.
So what does this all mean? Is this study trustworthy? Should we be worried about what’s in our baby food?
Navigating scientific reports and analyzing them for credibility is no small task. A better understanding of how these studies work, however, can certainly help. VERIFY has broken down how to interpret scientific studies so you can approach them a little more easily.
First of all, it’s important to keep in mind that scientific studies don’t actually determine facts. Science is all about developing and testing theories. NASA’s explanation of the scientific method lays it out: scientists form a hypothesis, or testable question, as their first step, then test and observe, interpret the results, draw their conclusions and publish their findings.
Even if a scientist tests a theory and the results support it, that doesn’t make it a fact. Even something we treat as an absolute, like gravity, is still a theory, although good theories like gravity have a lot of evidence and studies to back them up.
This is especially important to keep in mind in fields that are constantly changing. Dietary studies are a great example. One study can find that something, say salt, is bad for you, while another study years later can find that the same thing is good for you.
Neither study is necessarily wrong. Over time, our understanding of the human body and our dietary needs has changed, and that happens all the time. So a single study’s results aren’t fact: they could be true, false or some combination of both.
Oftentimes, these studies get reported as fact, including dubious ones that make for good headlines. A journalist named John Bohannon conducted a study deliberately riddled with problems to see if media outlets would run with it. They did.
His study seemed to provide evidence that chocolate could help people lose weight. But the study used a small group of people (allowing outliers to have an outsized impact on the results) and tested a broad range of variables (a method, called p-hacking, that lets researchers cherry-pick the conclusions they want).
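To see why those two flaws matter, here is a minimal sketch in Python. It is not the study’s actual data or method; the group size, the number of variables and the use of simple t-tests are illustrative assumptions. It simulates measuring many unrelated outcomes on a tiny sample and shows how easily a “significant” result can appear by pure chance.

```python
# A minimal, illustrative sketch (not the actual study's data). The group size
# and number of variables are assumptions chosen to mirror the problem:
# measure enough unrelated outcomes on a tiny sample and something will look
# "significant" by chance alone.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(seed=0)

n_per_group = 8    # tiny sample per group (assumed for illustration)
n_variables = 18   # many measured outcomes: weight, cholesterol, sleep, ...

false_positives = []
for variable in range(n_variables):
    # Both groups are drawn from the SAME distribution, so any "effect" of the
    # chocolate group over the control group is pure noise.
    chocolate_group = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    control_group = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    _, p_value = ttest_ind(chocolate_group, control_group)
    if p_value < 0.05:
        false_positives.append((variable, round(p_value, 3)))

print(f"Spurious 'significant' findings: {len(false_positives)} of {n_variables}")
```

With 18 independent tests and a 5 percent false-positive threshold on each, the chance of at least one spurious “discovery” is roughly 60 percent, which is how an experiment with no real effect can still generate a headline.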
Furthermore, there were red flags from the beginning. He was listed as the lead researcher, except as “Johannes Bohannon” within the study, an identity a quick Google search would have called into question. Additionally, the study was credited to the Institute of Diet and Health, an organization that was nothing more than a recently registered website.
Bohannon wrote about his deception and the problems it illustrates in Gizmodo, but the gist is that it’s easy for dubious studies to gain traction. He and his collaborators did conduct a real study; they just used methods that should rightly have called it into question.
So how can you better VERIFY whether a study should be taken seriously? Well, Bohannon highlighted a few good places to start.
A sound study should use a fairly large sample size; scientists generally don’t take sample sizes under 30 seriously anymore. The study should also analyze a few narrow variables that bear directly on the initial hypothesis.
If that sounds daunting, there are easier things to look for.
Is it peer-reviewed?
Most credible scientific studies are peer-reviewed by other scientists, which serves as evidence of thorough research methods.
Who is funding the research?
If the parties conducting and/or funding the research have an interest in the result, that interest may skew the findings. For example, a study funded by tobacco industry groups claiming that cigarettes aren’t as harmful as we thought should be viewed skeptically.
Is the hypothesis clearly defined?
A study should have a hypothesis you can clearly identify, and the results should directly address that hypothesis.
Is there government research to corroborate this?
While government research isn’t always perfect and can even get things wrong, government agencies have multiple layers of oversight and are held to higher standards than most. You shouldn’t dismiss a study just because it doesn’t line up with government research, but agreement with that research should give you a bit more confidence in the study.
So what about the study that set all of this off? Are there toxic chemicals in baby food? That can’t be said for sure yet. There are some red flags: the study isn’t peer-reviewed; the advocacy organization self-funded the report and hasn’t had it validated by a neutral source; the methodology for choosing which foods and brands to test is unclear; and the majority of the levels found in this study did not exceed levels the FDA has deemed safe (heavy metals are naturally occurring elements found in all kinds of foods).
However, that isn’t reason enough to dismiss it entirely. In fact, the FDA told us this study shows the need for more research. So if you want something definitive, it may take some time.