What A Tangled Web We Weave
I’ve just turned over the last page of How to Read Numbers; I’m vacillating between writing a cheerful blog post and a doomsday one.
On one hand, that was a delightful read. Tom Chivers and David Chivers explain a complicated topic in a very approachable way. The sprinkling of dry humor and clever examples didn’t hurt either.
On the other hand, after reading this, I am never going to believe any statistic I read ever again! I knew statistics were tricky and easy to manipulate, but I had no idea there were so many ways to go wrong. To make matters worse, it seems that some of these misleading treatments of numbers can even occur without the author (journalist, unsuspecting doctoral student, etc.) being aware.
I began reflecting on the numbers that I encounter in life and work. The first thing that came to mind is the bi-monthly metrics that we fill out for our organization. As Chivers and Chivers say, “Metrics are necessary. But there’s a trade-off.” We document things like how many gospel conversations occur, how many intentional discipleship relationships we’re engaged in, and, of course, how many people took a first-time step of following Jesus. Goodhart’s law can easily take over here. In an effort to hit certain targets, the behavior that we want to measure can be altered, perhaps becoming less natural or authentic and certainly becoming less effective. There’s also just the simple fact that, as you can imagine, different missionaries interpret these questions differently, so how reliable are the numbers in the end?
Some more numbers I’ve been digging into recently come out of the Fuller Youth Institute. For example, director Kara Powell presents some of the Institute’s research and explains that roughly 50% of children growing up in Christian families leave the faith after high school. She goes on to say that about half of those return later in adulthood, especially as they become parents themselves. Since we have two things going on here – kids leaving the faith and those same kids eventually finding their way back – I wonder if Bayes’ theorem might come into play. I also have questions about exactly how the researchers came up with their sample. How did they avoid sampling bias? It’s difficult to know how alarming these statistics really are without answers to these questions.
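Out of curiosity, the two rates quoted above can be chained together as conditional probabilities. Here is a minimal sketch, assuming the podcast’s round numbers (50% leave, about half of those return) are the only inputs; the variable names and figures are illustrative, not drawn from the Institute’s study itself:

```python
# Illustrative conditional-probability arithmetic using the round
# numbers quoted in the podcast (assumptions, not the study's raw data).
p_leave = 0.50               # P(child leaves the faith after high school)
p_return_given_left = 0.50   # P(returns in adulthood | left)

# Chance a given child ends up away from the faith long-term:
p_away_as_adult = p_leave * (1 - p_return_given_left)

# Chance they are in the faith as an adult (never left, or left and returned):
p_in_faith_as_adult = (1 - p_leave) + p_leave * p_return_given_left

print(p_away_as_adult)      # 0.25
print(p_in_faith_as_adult)  # 0.75
```

If those rates hold, only a quarter of children raised in Christian homes would be away from the faith long-term – a rather different headline than “50% leave.”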
Also related to my NPO research, I’ve been collecting studies on parenting styles and specifically corporal punishment. This area is a good example of what Chivers and Chivers describe as an “average result” emerging as reliable over time. Some studies show clear links between corporal punishment and a child later developing all sorts of mental health and behavioral problems. Other studies are less conclusive. These studies also raise the question of causality. So many factors play into a person’s mental health; can a study really trace it back to whether someone was spanked years earlier? It’s probably a lot more complicated than that. (For the record, I’m personally opposed to any form of corporal punishment of children, but I live in a context where spanking is accepted, hence my interest in studying the topic in depth.)
There is one more aspect worth mentioning regarding studies on corporal punishment: there seems to be little discussion of absolute vs. relative risk. Presumably, every child has some baseline risk of becoming mentally ill as an adult. The studies pointing to an increase in mental illness, at least the ones I have seen, make no effort to put the relative risk attributed to spanking into absolute terms.
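To make that distinction concrete, here is a toy calculation with invented numbers (no spanking study I’ve seen reports figures like these): a scary-sounding “50% higher risk” can amount to a small absolute change.

```python
# Hypothetical figures illustrating absolute vs. relative risk
# (made up for the example, not drawn from any real study).
baseline_risk_pct = 4.0   # assume 4% of all children develop the outcome
relative_risk = 1.5       # a reported "50% higher risk" among those spanked

exposed_risk_pct = baseline_risk_pct * relative_risk          # 6.0%
absolute_increase_pct = exposed_risk_pct - baseline_risk_pct  # 2.0 percentage points

# In a group of 10,000 children, that relative jump translates to:
extra_cases = 10_000 * absolute_increase_pct / 100

print(exposed_risk_pct)       # 6.0
print(absolute_increase_pct)  # 2.0
print(extra_cases)            # 200.0
```

Reported only as relative risk, “50% higher” sounds dramatic; reported as absolute risk, it is two extra cases per hundred children, which readers can weigh for themselves.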
“I wonder, but I don’t have the answers.” That seems to be the conclusion in many of Chivers and Chivers’ examples. We just don’t know. Sometimes it’s a matter of digging a little deeper and looking at the original studies. But sometimes those original studies aren’t available; even if they are, it’s unlikely that I am competent enough in the given scientific field to evaluate the soundness of a study. The authors of How to Read Numbers attempt to land their reflections by giving journalists eleven helpful tips for reporting statistics ethically. It’s true that if there were higher standards for reporting statistics, we could be more confident in what we read. That said, we can’t just wait around for somebody else, let alone the whole system, to change. In the meantime, I will definitely be more skeptical of what I read. From now on, I’ll be taking every statistic with a grain of salt.
Chivers, Tom, and David Chivers. How to Read Numbers: A Guide to Statistics in the News (and Knowing When to Trust Them). Paperback edition. London: Weidenfeld & Nicolson, 2022, 161.
“The Holy Post, Episode 171: ‘Sticky Faith’ with Guest Kara Powell!” Accessed November 15, 2022. https://thephilvischerpodcast.libsyn.com/episode-171-sticky-faith-with-guest-kara-powell.
Chivers and Chivers, How to Read Numbers, 98.
Ibid., 77.
Ibid., 166–171.
9 responses to “What A Tangled Web We Weave”
I too am appalled by how easily I am misled. Apparently I am Mr. Anecdotal. Sigh. You mentioned…”The authors of How to Read Numbers attempt to land their reflections by giving journalists 11 helpful tips to report statistics ethically.”
I glommed onto that in my way forward. I found 5 things I had not thought about enough and will plow ahead into the statistics that confront me in my Immigration NPO.
Thanks for your comments…Shalom…Russ
Kim, I am glad to see you also were looking at the metrics your team uses. I am curious whether you see any difference between the data we use for reporting findings from studies and the data we use for gauging our performance in a team setting. I am working on my own thoughts on this and would like to hear yours.
Great post linking the authors’ work to some of the research you are already digging into. Your first point is particularly interesting to me, as I have had conversations with full-time Christian workers who are ‘sent out’ and supported by friends, family, and church…and then feel ‘pressure’ to produce ‘results’ (in the form of statistics) to justify their supporters’ investment. What a tough way to live! Not only that, I must confess that I can tend to roll my eyes a bit at some of the spectacular statistics certain big international para-church organizations trumpet on glossy brochures for their supporters in North America. Does anyone check those stats? More scary: does anyone WANT to check those stats? Because, frankly, those nice big numbers help fuel the mission/machine. It seems to me that it’s not just scientists wanting to get published in prestigious journals who will ‘bend’ the results…it’s Christians as well. May God help us all live with integrity!
Like you, I am left with suspicion of every statistic I now hear! I don’t feel like I have the bandwidth or, sometimes, the intellect to look into the studies and figure out the whole truth. I am particularly suspicious of the “gaming” of metrics found in any kind of church survey. I know in my particular context some refer to our church as a 1,000-member congregation. Ten years ago we probably were a 1,000-member congregation, but not anymore. We simply haven’t purged the rolls in way too long. Our online service is said to have over 500 viewers. My question has always been, “Yeah, but how many of those ‘views’ are an accidental click on Facebook?” We sound great when we “game” the metrics, but the reality is never quite as shiny.
You mentioned the bi-monthly data your organization tracks. It got me thinking about what kinds of information I collect and why. Questions I am asking of myself are: What is the purpose behind collecting this data? What am I going to do with the information? How will the information shape my NPO? I want to make sure I am counting and tracking the right things for the right reasons. I don’t want to waste my time! I think it could be very easy to get lost in all the data. I am keeping it simple but hopefully on target.
Kim, I’ve found Christian organizations playing fast and loose with numbers when they want to highlight a crisis.
I’m not suggesting we may not have critical issues at hand (the decline of rising-generation Christians or the rise of the nones), but, like you, I am now questioning more deeply than ever statistics that are given in service to an agenda (whether the agenda is good or bad).
Really thought provoking post, and I appreciate your tie in to your NPO. Thanks.
Good work on starting to connect with your NPO. That’s inspiring for me.
I too found this to be an enjoyable read – not necessarily a barn-burning page turner, but not a dud either. And yet, I too, came to the last page with that sense of “Dear Lord, what are we supposed to believe? Is anything accurate and true?” I posed the question to Russell if this brings him to a place of “cynicism” – I know I struggle with that. How about you? Are you able to, as you say, “take everything with a grain of salt?” Or does it go deeper into your soul in some way?
Hey Kim, you’re such a deep thinker and yet humility bleeds throughout your post! So much caught my attention but there is one thing that kept coming back to me. In regard to the bi-monthly metrics that you fill out in your organization, you said, “There’s also just the simple fact that, as you can imagine, different missionaries interpret these questions differently, so how reliable are the numbers in the end?”
You recognized that no matter how simple or good the questions are, it is possible to interpret them differently. My first thought was: in this case, how in the world do I ask a question to make sure my audience is interpreting it correctly? What are the factors I need to keep in mind to make sure everyone is on the same page? I must keep in mind that, because of past experience, not everyone will interpret the question the way I meant it.
All of this makes my work a little more challenging, but in the end it will be more fruitful…at least that’s what I’m telling myself! Thanks for a thought-provoking blog!
Thank you for always delving deep into the topic at hand. The book does give us a greater understanding of keeping a critical eye open to fallacy. I listened to a TEDx Talk on statistics by Alan Smith, “Why You Should Love Statistics,” where he stated that the science of statistics is truly a “science of us.” Instead of just seeing it as a science of uncertainty and inaccuracy, it reflects humanity. His thought is that we should not fear this field but become better statisticians so we can get a more accurate view of the state or community we live in. His talk inspired me not to be discouraged regarding statistics but to apply myself and be more vigilant as I press into my research.
Thanks again for causing us all to think on deeper levels.