Has Anyone Checked the Numbers?

In publishing circles today, there’s a premium on numbers. Editors, writers, and journalists often seek to crystallize or legitimize a story with an eye-popping statistic that will become the sound bite or “takeaway” of the piece. Yet most of these professionals have an aversion to examining the underlying data developed by expert authors or sources. Sometimes proudly asserting “I’m a word person,” they soil their hands with numbers just enough to make the story work but refuse to learn what the data really mean. In many cases, they don’t even bother to check the accuracy. No, I don’t have a statistic on this phenomenon to make you gasp. My evidence is anecdotal, but for those in publishing willing to be honest with themselves, I think it will ring true.

Sins of Omission

Consider a hypothetical example similar to many I’ve encountered. An expert author, knowing that it’s difficult to use argument and expertise alone to persuade readers nowadays, conducts “research” to prove his point. Let’s say, for the sake of simplicity, he interviews 25 people and asserts that 94% gave thus-and-such surprising answer to an important question. An observant editor asks the author whether he really means 92% (23 out of 25) or 96% (24 out of 25), given that 94% is not possible, and offers to look over the data for him. The author casually replies, “Let’s just go with 96%.” The editor never reviews the original data to check whether 96% — or any of the author’s other figures — are accurate.

Another author asserts that in a survey she conducted, half of respondents said they never do something that you’d expect they do every day — a “Wow!” for the reader. Unwilling to “open a can of worms” or “get bogged down in details” (phrases I hear often in these scenarios), the in-house editors don’t ask the author whether there’s any contradictory evidence in what the other 50% of respondents said. The findings from one half of the survey takers are presented as if they tell the whole story.

To be sure, you shouldn’t weigh down an article written for a non-expert audience with extraneous contextual data. It’s obviously appropriate to make judgment calls about how much information the reader actually needs. But if no one at the publisher does the basic background work of checking the legitimacy of the numbers, how can the quality of what the reader is getting be assured? A colleague once said to me, “We’re simply not qualified to interpret the data,” straightforward as the data in question were. Such excuses for omission are, in my experience, rarely viewed as a problem in publishing. But I’ve found them to be insidious and rampant.

Sins of Ignorance

Much more widely discussed are the sins of data ignorance, of which there are many varieties. Take, as one example, a news article that reports that a particular lifestyle behavior increases the risk for a disease by, say, 75% — a staggering figure at face value. A check of the actual research reveals that the number of people who get the disease is so small that a 75% increase amounts to just a few more cases. Readers mistakenly assume that huge numbers of people who engage in the behavior are at risk for the disease. It’s possible that this was a sin of omission (the journalist didn’t bother to check the original data), but if you talk to the people who report on these types of stories, you’ll more often find that they just didn’t know how to interpret the statistics in the research. Either way, the reader is ill served.

Then there’s the ignorance of how the data were developed. If, for instance, 65% of workers at a particular site were deemed to be “inadequately trained,” what precisely were the criteria for adequacy? And how was the evidence that a worker did or did not fulfill those criteria collected? That’s the type of information that readers really must have to understand the context, yet frequently even the editors never ask to see it, thereby guaranteeing that the readers won’t get access to it either.

The Devil in the Unverified Details

Publishers of many stripes have become intoxicated with the reporting of numbers, in part because consumers have shown that they have a taste for that elixir. The compelling, sometimes shocking statistic dances seductively in the headline, the subtitle, or the callout — and the reader succumbs to its charms. A number, potent and seemingly unassailable, is worth a thousand words. As long as readers and producers of content alike are addicted to the allure of statistics but simultaneously allergic to the task of understanding what’s behind them, the premium on a well-placed, poorly vetted percentage will remain very high indeed.

About Steven DeMaio
Steven DeMaio teaches English and math at the Community Learning Center in Cambridge, Massachusetts, and the Somerville Center for Adult Learning Experiences in Somerville, Massachusetts. He also works as a freelance editor and writer. This is a continuation of his blog that ran for 10 months in 2009 on HBR.org.

2 Responses to Has Anyone Checked the Numbers?

  1. Rachel says:

    Like my uncle always said his economics teacher always said… figures lie and liars figure!

  2. Thank you for a valuable insight that really rings true for me. It also explains why people use graphs that do nothing to really explain the numbers but look “pretty.”
