The danger of disembodied "facts"
My friend Alex Zwissler, who heads up the Chabot Space and Science Center, published an interesting blog post today offering guidance on how to separate fact from fiction on scientific topics, with the promise of more to come. He focused today on determining the source of any claim, assessing whether the source is truly an authority, and asking what its underlying motivations might be. Here are the first two paragraphs:
In one of my recent posts I had some fun with the topic “…the six things I want our kids to know about science” … of course there are more, but it was a good start. Among the comments I received on the post, one from a friend posed a troubling question, which could be restated as, “OK wise guy, how DO we help our kids figure out what to believe with all this seemingly conflicting and confusing sciency stuff?” In re-reading my rants on this subject I realize that while I have done a passable job of laying out the challenge of figuring out how to decide what to believe, I’ve done a crap job in providing any answers to the question how. This led to some rapid self-reflection, asking myself the question, “OK wise guy, how do YOU figure out what to believe with all this seemingly conflicting and confusing sciency stuff?” …fie on self-reflection. But the effort did allow me to see that I do have a bit of method to my madness, outlined here.
My first step in assessing the validity of a claim is pretty much always the same… I take a really, really close look at the source. I feel this is the best place to start, and while taking a bit of time and effort it can often yield immediate results. If the source does not pass the smell test, then move on. This exercise breaks down into two broad categories, qualification and motivation.
I’ve written about some aspects of this topic in several places in Turning Numbers into Knowledge, including Chapter 11, which focuses on applying critical thinking to assessing arguments. Alex’s post got me thinking about the practical complexities that often arise, even for researchers in a specific field, and I saw a good example of such complexities in a news report in Science Daily that was posted just today.
This report summarized a peer-reviewed article that appeared in a well-regarded journal (Environmental Science and Technology). Even better, the report gave the actual citation with a link to the article’s DOI (digital object identifier, which is a record locator for scholarly papers). It also states at the end that it is a summary of materials supplied by the American Chemical Society, so it’s not original reporting by Science Daily. So far, so good.
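For readers who haven’t worked with DOIs before, the mechanics are simple: prefixing any DOI with the doi.org resolver yields a stable link that redirects to the paper’s current home, even if the publisher’s website reorganizes. A minimal sketch in Python (the example identifier, 10.1000/182, is a commonly cited illustration; substitute the DOI from any citation you want to check):

```python
def doi_url(doi: str) -> str:
    """Turn a bare DOI into a resolvable link via the doi.org resolver.

    The resolver redirects to the publisher's current landing page for
    the paper, so the link keeps working even if the journal's site moves.
    """
    return "https://doi.org/" + doi

# Example with an illustrative DOI:
print(doi_url("10.1000/182"))  # https://doi.org/10.1000/182
```

Pasting the resulting link into a browser takes you straight to the article itself, which, as the rest of this post argues, is where claim-checking really has to happen.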
Now it gets interesting–here’s the first paragraph of the article:
Researchers from the Centre for Energy-Efficient Telecommunications (CEET) and Bell Labs explain that the information communications and technology (ICT) industry, which delivers Internet, video, voice and other cloud services, produces more than 830 million tons of carbon dioxide (CO2), the main greenhouse gas, annually. That’s about 2 percent of global CO2 emissions – the same proportion as the aviation industry produces. Projections suggest that the ICT sector’s share is expected to double by 2020. The team notes that controlling those emissions requires more accurate but still feasible models, which take into account the data traffic, energy use and CO2 production in networks and other elements of the ICT industry. Existing assessment models are inaccurate, so they set out to develop new approaches that better account for variations in equipment and other factors in the ICT industry.
A reader might reasonably conclude that the research article added up carbon dioxide emissions and showed that the ICT industry emits about the same amount of greenhouse gases as global aviation, roughly 2 percent of global emissions. When you read the article itself, however, you realize that the authors were simply summarizing the results of six other studies, three peer-reviewed and three not, with the two key sources dating to 2007 and 2008, respectively.
To really determine whether this claim is true, you’d need to go back to those sources and read them all. If you did, you’d realize that the 2 percent estimate comes from non-peer-reviewed reports published 4-5 years ago, and that the cited research article was simply reproducing those figures as context for its own conclusions. In essence, the “factoid” that 2% of the world’s carbon dioxide emissions come from ICT has become disembodied from its original source, making it difficult and time-consuming for people unfamiliar with this literature to determine whether it’s true.
None of this should discourage the lay reader from following Alex’s advice and assessing the credibility and motives of any information source, but it does highlight the importance of actually reading the original published source for any particular claim. Summaries of other people’s results almost invariably create disembodied statistics and other confusions, so it’s incumbent on anyone who wants to use information for an important decision to go back to the original source. That’s the only way to make sure you’ve really gotten it right.
In the Epilogue to the second edition of Turning Numbers into Knowledge I summarize a related example, in which some rather wild claims about Internet electricity use required detailed debunking. Email me if you’d like a copy of the Epilogue and a few related articles–it’s a terrific illustration of disembodied statistics run amok.
Finally, I highly recommend William Hughes’ book titled Critical Thinking: An Introduction to the Basic Skills. The book is a marvelous introduction to critical thinking, and it discusses in some detail how to evaluate whether an authority is credible. I have the 1997 edition, which was written by Hughes alone, but there seem to be two later editions coauthored by Jonathan Lavery and William Hughes, and used copies seem to be reasonably priced. Here’s the link to the 2008 edition on Amazon US. Amazon Canada has the 2008 edition new for about $48 Canadian.