A relatively common “problem” I face with industry colleagues is the habit of quoting a particular sentence or paragraph from an article (be it a web article, something in the news, an academic paper, or anything they chanced upon) and using it as the “absolute truth” in support of their argument. Many a time, these individuals will also claim such quotes to be “evidence-based”, hoping that the use of another buzzword will add weight to their argument.
Any attempt on my end to explain what evidence-based literature or medicine really means, or to examine the cited quote (often given without a source) in the right context and perspective, results in either the “deer in the headlights” or the “spooked puppy” look, after which they repeat their quote and argument, hoping that I will finally have a eureka moment and thank them for their insights.
On a few occasions, I even made the mistake of trying to explain the hierarchy of evidence (my point being that not all published material is equal, for some may be biased or contain systematic error), and the results were consistent: looks of disbelief. The myth that “if it’s published, it must be true” seems to be deeply entrenched in many people.
These days, I pick my battles wisely (very few one-off debates are truly meaningful; understanding the context is the key): I just smile, acknowledge their claims (often telling them it is “interesting”), and let it slide. Afterwards (if time permits), I google a relevant article that examines the issue more objectively and send it to the relevant parties.
Earlier today, I chanced upon a fantastic article from Asian Scientist Magazine that illustrates to the layman why some research is more worthy than others, and my immediate reaction was to share it with you folks. The original article is available at http://www.asianscientist.com/2015/06/columns/science-eat-this/.
AsianScientist (Jun. 24, 2015) – Reading about science is particularly enjoyable when it also happens to vindicate your lifestyle choices.
Three to five cups of coffee a day is considered a “moderate amount” and is associated with the lowest risk of heart disease? I’m only at one measly cup, but I will be more than happy to make up for lost time.
The higher your nut consumption (preferably at least seven times a week), the less likely you are to die from heart disease, cancer, and respiratory diseases? I just stopped feeling guilty about my peanut butter habit—hand over the jar and spoon please.
Sometimes, though, this warm fuzzy feeling can be entirely unwarranted. A few months ago, the widely-reported “A glass of red wine is equivalent to an hour at the gym” story vindicated everyone’s inner couch potato by miraculously transforming the hedonistic into the healthy. This despite the fact that neither the actual study nor its press release made anything even remotely close to that claim.
What we eat affects our health, and unlike many other aspects of life, we have considerable control over our food choices. So it’s no surprise that research into nutrition generates a great deal of press and public interest. Unfortunately for those seeking clear direction, however, studies are often inconclusive or conflicting, and the field is notoriously rife with bad science and bad reporting.
The chocolate sting
Recently, researchers at the Institute of Diet and Health found that people who followed a low-carbohydrate diet and ate a bar of dark chocolate daily lost weight faster than those on the low-carb diet alone. In late March, they published the results in the journal International Archives of Medicine. Thanks in part to a craftily-written press release from the Institute, the story received print, online, and television coverage in almost twenty countries.
There was one problem—the study turned out to be a sting operation, designed to show how easily junk science can be turned into diet advice for the masses. The perpetrators: a science reporter, and two documentary makers working on a program about junk science and the diet industry. The Institute of Diet and Health? It only exists on the Internet.
The study really was carried out, but had deliberately been set up to contain serious flaws. A mere fifteen people were each assigned to follow one of three diets: low-carb, low-carb plus chocolate, and control (no change to their usual diet). The “researchers” tracked their subjects over three weeks, measuring 18 different parameters, including weight, cholesterol, sodium levels, and sleep quality.
As the exposé article explains, measuring a large number of parameters in just a handful of people is almost guaranteed to produce a statistically significant result—in this case, a ten percent faster weight loss for the low-carb plus chocolate group over low-carb alone. Other flaws: the groups were not balanced for gender or age, and no one cared what the control group ate.
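The statistical trap described above (many outcome measures, a tiny sample) is easy to demonstrate with a quick simulation. The sketch below is a hypothetical illustration, not a re-analysis of the actual study: it generates pure-noise data for two groups of five people across 18 "parameters", tests each parameter with a simple permutation test, and counts how often a study finds at least one "statistically significant" difference at p < 0.05 even though no real effect exists.

```python
import random

random.seed(42)

def perm_test(a, b, n_perm=500):
    """Two-sided permutation test on the difference in group means.

    Returns the fraction of random relabelings whose mean difference
    is at least as extreme as the observed one (an estimated p-value).
    """
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    extreme = 0
    for _ in range(n_perm):
        random.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            extreme += 1
    return extreme / n_perm

n_trials = 100   # number of simulated "studies"
n_params = 18    # parameters measured per study, all pure noise
false_positive_runs = 0

for _ in range(n_trials):
    for _ in range(n_params):
        # two groups of 5 subjects; both drawn from the same distribution,
        # so any "significant" difference is a false positive
        group_a = [random.gauss(0, 1) for _ in range(5)]
        group_b = [random.gauss(0, 1) for _ in range(5)]
        if perm_test(group_a, group_b) < 0.05:
            false_positive_runs += 1
            break  # one "finding" is enough to write the press release

print(f"Studies with at least one 'significant' finding: "
      f"{false_positive_runs}/{n_trials}")
```

With 18 independent tests each run at the 5 percent level, the chance that at least one comes out "significant" is roughly 1 − 0.95¹⁸ ≈ 60 percent, so well over half of these noise-only studies produce a headline-ready result. That is the mechanism the sting exploited.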
The results were completely meaningless. But for a fee, the International Archives of Medicine rapidly published them, without the need for peer review. It is certainly not the only journal that would have done this—such “pay-for-play” publications have proliferated in recent years.
Media and misinformation
It’s true that any reporter who had bothered to read the actual journal article, instead of just the press release, should have had serious doubts about covering it. But while the aim of the sting was to bait and shame the media, it also showed that researchers, journals, and press offices of research institutions can be equally culpable for the enormous amount of nutritional misinformation out there. And it’s ironic that the end result of this sting was to add to it.
But let’s end on a happier note, just in case the phrase “low-carb diet” has sunk you into the depths of despair. More rigorous studies (that don’t just involve fifteen people) have indeed associated chocolate eating with a reduced risk of heart disease, stroke and diabetes. I’m already stuffed from all that coffee and peanut butter, but if science insists…