Narrative Risks and the Curse of the “10%” Answer
Storylines and narratives are powerful tools for conveying analytics results. However, they also carry the risk that analysts and audiences fixate on conclusions too early and disregard new information to the contrary.
While a ‘narrative’ attached to data analysis can be powerful and persuasive, it can also turn into a counterproductive runaway freight train. This article resonated strongly with me; it considers both sides of the argument on when to use a storyline to get results across. While it covers the analytics field as a whole, I see it as especially relevant in the context of people analytics.
In short, the article considers storylines and narratives that provide an intuitive explanation of the data. These can be critical to understanding, giving a client or other information consumer a memorable mental model that drives home the point of the analysis.
While this phenomenon is broadly understood, the article also covers the other side of the equation: the idea that a convincing storyline may be very difficult to let go of, even if the analysis subsequently proves it incorrect. Crucially, this risk is not limited to the analyst: the audience and intermediate managers are also prone to fixating on a narrative too early.
A Particular Vulnerability for People Analytics
To bring the conversation to people analytics specifically: I am especially wary of the danger of premature narratives in this space. Two characteristics of people analytics strike me as increasing the risk:
- The nature of the data: Given the challenges around collection and interpretation, people analytics datasets are likely to be subject to even more caveats and limitations than usual. Because of both these limitations and the inherent complexity of the data, analyses will rarely resolve neatly into a simple, clean, interpretable solution. Analysts are therefore tasked with connecting many disparate dots, which typically requires constructing a clean narrative.
- Intuitive conclusions: I see people analytics as more at risk of intuition-driven bias. In some fields, analysts are unlikely to have strong prior hypotheses about what is driving the results (e.g., handwritten digit recognition). Conversely, people analytics often deals with very human issues and challenges that both the analyst and the audience have likely experienced firsthand. The temptation to use personal anecdotal evidence to construct a unifying narrative is extremely strong, even compared to other data analytics fields.
Taken together, these factors make this risk especially acute in people analytics. The risks clearly do not obviate the need for storylines and narratives to drive impact and change. However, I believe storytelling is a tool to be used with caution, and without undue haste.
Implications for Consulting and the “10%” Answer
One of the reasons this article hit home for me was my pre-HBS experience in consulting. One of the hallmarks of consulting is an evolving storyline: a Day 1 theory of the case that, at least in principle, evolves over time with new data and analysis. While this can be a valuable tool for orienting clients, it is also possible for internal or external stakeholders to fixate on a particular narrative, irrespective of what new information might show.
In my experience, this risk holds double for analytics-centric cases. Given the norms of the industry, clients often seek a “10%” answer (an early, directional read on where the analysis is likely to land) very close to the start of the project, when the analyst may still be in the early stages of the extract/transform/load and data-validation process. Despite the pressure to deliver something, the preferable approach may be to avoid wrapping extremely early data analysis in a convenient narrative.
The article provides a cogent example of this: the author produced a clear and compelling narrative that generated a huge amount of interest and passed unchallenged through multiple rounds of inspection, yet ultimately turned out to be based on entirely false assumptions. While this happened in a harmless weekend-long hackathon, care should be taken to avoid the same problem in higher-stakes arenas.
Reference: “The Dangerous Allure of the Narrative,” Towards Data Science. https://towardsdatascience.com/the-dangerous-allure-of-the-narrative-bae44b38cfde
I think you’ve identified an important tension! I’m curious how we could structurally manage this in People Analytics. For example, in our LPA class we have a “best practice” of developing a hypothesis (aka narrative) before we run a regression, which pushes us toward a storyline before we’ve even seen the results.
Confirmation bias: easy to identify, difficult to overcome! (I speak from experience.) Laura, I think you’re on to something that we could take from LPA and apply going forward. If we’re involved in people analytics work (or really, any data-driven work), we could ask the question “what do we know to be true, and HOW do we know that?” Going through those discussions and surfacing assumptions could be a great first step toward overcoming bias. I think data transparency helps here, too: the more people can see the data and follow the analysis, the more likely we are to gather cool ideas and challenge any prevailing views. It puts more onus on us as analytically minded leaders to teach and lead through the data. If we hear signals of narrative bias (“the data should show…”, “this is how it’s always done…”, or “we know the answer already…”), we can respond with this trick: if we truly know the answer, why are we even asking the question and wasting everyone’s time?
Guilty as charged. As a fellow consultant, I’ve fallen into this trap multiple times. Given that time is always in short supply, I think the cure involves two things: (1) approaching project decision-making in an aggressively speedy but incremental manner that allows for course corrections, and (2) some level of intuition, developed over time and imparted by senior leaders to junior team members, about how wide the range of potential conclusions can be. Fixating on a narrative when the plausible outcomes fall within a narrow band is less of a problem than when potential outcomes vary by a huge factor.
I think storytelling in general is seen as such an asset for anyone who moves between data and leadership teams that there is a strong incentive to craft a narrative and stick with it, even if the data starts to point another way. When it comes to people-related data, I totally agree with your point about data scientists having experience, as humans, with many of these situations and thus making assumptions or jumping to conclusions in their stories. Perhaps one approach would be to frame things more explicitly as hypotheses rather than stories along the way. Hypotheses naturally invoke the idea of needing to be proven or disproven, whereas stories show up as explainers.
This really hits home for me. As someone who hates being wrong, I often struggle to change my mind once I’ve established a hypothesis or started building out a storyline, even as we learn more information. This reminds me of a book I recently read by Adam Grant called Think Again. He talks a lot about the difficulty of changing people’s minds and helping them understand what they “don’t know,” even if they think they already know the answer. I wonder if, in practice, there is a way to celebrate the moments where we break away and disprove hypotheses. If we can normalize changing answers and evolving storylines in the workplace, perhaps we can lessen some of the unconscious tension driving us to go full speed ahead with only early, half-baked analysis.
Very interesting article, Tom! And it perfectly echoes my perception that data analysts should be extremely mindful of potential confirmation bias, and as you point out, they may be more vulnerable to these types of biases in the early stages of analysis.
What safeguards can we implement to keep analysts from jumping to conclusions too quickly? Should we go so far as to say that no third-party pressure should be applied during early analysis? I can definitely see that being a challenge when working with demanding clients, just as in consulting.