Yesterday, the internet exploded (as it does regularly) over an infographic about what a can of Coke does to the human body after consumption. The original post was created by the website www.therenegadepharmacist.com. Unsurprisingly, few people fact-checked the post before sharing it, and before long it was trending on Facebook and other outlets.
While many of us have become cautious and look into the facts of a random story before sharing it on Facebook or retweeting it on Twitter, we need to be aware of a new issue creeping up on the internet: science pages using headlines as click-bait. Many pages, including the extremely popular www.iflscience.com, will share a story's headline to attract readers without completely reviewing the issue. At worst, these articles report false or over-hyped science, and many readers who have come to trust the source fail to critically evaluate the article.

While reading stories about this can of Coke infographic, I found a decent article that brought in outside experts to discuss it. The varied perspectives the writer included were easy to understand and trust, until the third paragraph from the end. That paragraph stated that a study published in 2010 found that those who drank 2 or more sodas a week had an 87% increased risk of developing pancreatic cancer. The cited study came from a reputable journal, and after perusing the article, the data seems to be accurately reported. At this point, it would be easy to accept this study as fact and move on with our lives thinking that by drinking 2 sodas a week, we will likely develop pancreatic cancer.
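Part of what makes a headline number like "87% increase" alarming is that it reports a relative change, not an absolute one. As a back-of-the-envelope sketch (using a made-up baseline risk purely for illustration, since the headline doesn't quote one), here is what that arithmetic actually looks like:

```python
# Illustrative only: the baseline risk below is a hypothetical round
# number chosen for the example, not a figure from the cited study.
baseline_risk = 0.015        # assumed baseline risk (1.5%)
relative_increase = 0.87     # the "87% increase" from the headline

# An 87% *relative* increase multiplies the baseline by 1.87.
elevated_risk = baseline_risk * (1 + relative_increase)
absolute_increase = elevated_risk - baseline_risk

print(f"baseline risk:     {baseline_risk:.3%}")
print(f"elevated risk:     {elevated_risk:.3%}")
print(f"absolute increase: {absolute_increase:.3%}")
```

Under this assumed baseline, the "87% increase" moves the risk from 1.5% to roughly 2.8%, an absolute change of about 1.3 percentage points. A large relative jump on a small baseline still leaves a small absolute risk, which is worth keeping in mind during the gut check below.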
However, these reports highlight how important it is to view even science writers and science pages with a critical eye rather than taking them at face value. Graduate students are commonly taught to start reading a published study by saying "No" or "Prove it": to read every paper, no matter how prestigious the journal, with doubt. This is a practice that any consumer of science should adopt.
Let’s use the statement above about pancreatic cancer risk to practice this critical evaluation of data:
- The first step is an initial gut check: does this statement make sense given what we know? If there was a strong correlation between pancreatic cancer and soda, wouldn’t we be more aware of it?
- The article on Yahoo Health cites the study's year and journal name. With a small amount of internet searching, we can locate the original paper and check that its authors were cited accurately. Depending on the paper's complexity, we could also examine the data presented and decide whether we agree with the authors' interpretation; however, as laypeople in the subject, let's assume the authors have drawn accurate conclusions.
- Most journals and databases let you see which later articles have cited a paper. Given that the original article was published in 2010, there have been several years for research to either confirm or refute its findings. In this case we hit the jackpot: a paper that cited the original study performed a meta-analysis. A meta-analysis combines multiple studies (sometimes even unpublished data) into a larger data set, providing stronger evidence for or against a hypothesis. Here, the meta-analysis did not support the 2010 paper's conclusion.
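To make the meta-analysis idea concrete, here is a minimal sketch of one common pooling method, fixed-effect inverse-variance weighting: each study's effect estimate is weighted by the inverse of its variance, so larger, more precise studies count for more. The study numbers below are invented for illustration and are not taken from any of the papers discussed.

```python
def pool_fixed_effect(effects, variances):
    """Inverse-variance weighted average of study effect sizes
    (a simple fixed-effect meta-analysis pooling step)."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Hypothetical effect sizes (e.g. log relative risks) from three studies:
# one small study with a striking positive effect, two larger studies
# with near-null results and smaller variances.
effects = [0.63, 0.05, -0.10]
variances = [0.20, 0.04, 0.05]

pooled = pool_fixed_effect(effects, variances)
print(f"pooled effect: {pooled:.3f}")
```

In this toy example the pooled effect lands near zero (about 0.048): the small outlier study is diluted by the larger, more precise ones. That is exactly how a meta-analysis can fail to support a single striking result.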
Given this information, we can either stop researching the subject here or keep looking at other publications to build a fuller understanding of it.
While this may seem like a lot of work, the search above took under 10 minutes, and step #1 is the most critical step anyway. Think about these claims. Do they make sense to you? Do they fit with what you know? If they don't, or if you can't verify that the information is correct, there is only one thing to do: don't share!