Data Storytelling as a Moral Practice
We live in an age that worships data but starves for wisdom. Every click, comment, and keystroke becomes another offering to the algorithmic gods, who promise certainty and deliver spreadsheets. We have mistaken collection for comprehension. But the future of analytics will not belong to those who count the most. It will belong to those who listen best.
Because data is not truth. It is evidence. And like all evidence, it needs context, conscience, and care.
When organizations talk about data-driven decision-making, they often forget that driving without steering still gets you lost. Metrics can describe the world but never define it. What matters is how we interpret what we see and what we choose to do with that interpretation.
MIT Sloan researchers call this the difference between descriptive analytics, which describes what happened, and moral analytics, which asks why it matters. The real challenge is not gathering the numbers. It is giving them meaning.
Ruha Benjamin, in Race After Technology, warns that data systems often replicate the very biases they claim to solve. Safiya Noble’s Algorithms of Oppression makes the same case. Search engines do not reflect society neutrally. They amplify its prejudices. When you tell a story with data, you are shaping power, and power always has a moral direction.
That is why data storytelling is not just communication. It is responsibility.
McKinsey’s State of People Analytics (2023) found that fewer than 35 percent of organizations have ethical review boards for analytics projects, even as they use data to influence hiring, promotion, and pay. Without moral infrastructure, analytics teams operate like emotional accountants, tracking what is measurable while ignoring what is meaningful.
But here is the truth every People leader should remember. Data about humans is human. It carries our histories, our inequities, and our potential. Treating it like a neutral dataset is not only naïve. It is negligent.
The moral practice of data storytelling begins with three disciplines: empathy, transparency, and context.
Empathy demands understanding the lives behind the numbers. When an engagement score drops, it is not just a metric. It is a message. People are telling you something. The role of analytics is to translate, not to judge.
Transparency requires showing how stories are built. Who gathered the data? Who was left out? What assumptions shaped the model? The OECD’s research on Digital Trust (2023) found that transparency directly correlates with data adoption. People are far more likely to engage with systems they understand.
Context reminds us that data never floats free. Every chart sits inside a story, whether social, economic, or cultural. The question is whether that story honors the truth or manipulates it.
So how do we move from extraction to interpretation? By designing analytics processes that center humanity from the start. This means forming cross-functional ethics committees that review algorithms before deployment. It means pairing data scientists with behavioral researchers, DEI experts, and frontline employees. It means teaching leaders to ask: What story does this data tell, and who gets to tell it?
In 2024, Accenture’s Responsible AI Framework called for human-in-the-loop storytelling: analytics that treat humans not as variables but as narrators. The organizations that adopt this mindset will lead the next era of evidence-based empathy.
Because the best stories do not just report reality. They reimagine it.
The moral dimension of data storytelling is not a constraint. It is a compass. It ensures that insights are not weaponized but humanized. It ensures that analytics becomes an act of understanding, not surveillance.
Maybe that is the work ahead for leaders and strategists. To transform data from something we extract into something we exchange. To use information not to predict behavior, but to build belonging. To let numbers remind us that people were always the point.
Because data without empathy is noise. But data with soul. That is wisdom.
Key Takeaway
The future of analytics isn’t artificial; it’s ethical. Numbers will tell us what’s happening, but only conscience can tell us what’s right.
Practical Tool for Leaders: The E.T.H.I.C.S. Data Review Framework
A People First Strategies practice for ethical data storytelling
Ethical data storytelling does not begin with dashboards. It begins with questions.
At People First Strategies, we use the E.T.H.I.C.S. framework as a pause point before high-impact decisions. Not to slow organizations down unnecessarily, but to slow them down intentionally, in the moments where speed can do harm.
E is for Evidence, not conclusion.
Before asking what action the data justifies, leaders ask what the data actually shows. What patterns are visible? What is uncertain? What cannot be inferred? This protects against the common failure of treating correlation as truth and numbers as verdicts.
T is for Those represented, and those missing.
Every dataset reflects choices. Who was counted? Who was excluded? Whose experience was flattened into averages? Ethical review requires leaders to ask whose voices are absent and how that absence might distort the story being told.
H is for Human impact.
Leaders pause to consider how this interpretation will land in real lives. Not abstract employees, but actual people. Careers. Dignity. Trust. This is where analytics becomes leadership, and where responsibility replaces detachment.
I is for Intent.
Why is this data being used now? To learn. To improve. To justify. To control. Research consistently shows that people can sense intent, even when leaders believe they are being neutral. Clarity of purpose is an ethical safeguard.
C is for Context.
Data never exists in isolation. Economic pressure, organizational change, social conditions, and power dynamics all shape outcomes. Ethical storytelling requires leaders to locate numbers inside the larger system that produced them.
S is for Stewardship.
Finally, leaders ask how this data will be held going forward. Who has access? How long will it be kept? What safeguards protect against misuse? Stewardship recognizes that data about people creates an ongoing obligation, not a one-time insight.
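For teams that want to operationalize the pause point, the six questions above can be sketched as a simple pre-decision checklist. This is an illustrative sketch only: the key names, prompts' wording, and helper function are assumptions layered on top of the framework, not part of it.

```python
# Illustrative sketch of the E.T.H.I.C.S. review as a pre-decision checklist.
# The dictionary keys and helper below are assumptions, not the framework itself.
PROMPTS = {
    "evidence": "What does the data actually show, and what cannot be inferred?",
    "those_represented": "Who was counted, who was excluded, whose voice is missing?",
    "human_impact": "How will this interpretation land in real people's lives?",
    "intent": "Why is this data being used now: to learn, improve, justify, or control?",
    "context": "What conditions and power dynamics shaped these numbers?",
    "stewardship": "Who has access, how long is it kept, what guards against misuse?",
}

def open_questions(answers: dict) -> list:
    """Return the prompts that still lack a substantive written answer."""
    return [q for key, q in PROMPTS.items() if not answers.get(key, "").strip()]

# A partially completed review surfaces the questions still owed an answer.
review = {"evidence": "Engagement fell six points; causes remain uncertain.",
          "intent": "To learn, not to justify a decision already made."}
for prompt in open_questions(review):
    print("Unanswered:", prompt)
```

The point of the sketch is the discipline it encodes: a decision does not proceed while `open_questions` is non-empty, regardless of what the dashboard says.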
The E.T.H.I.C.S. framework is not a compliance tool. It is a discipline of attention. It reminds leaders that every analytic story carries weight, and that weight must be carried deliberately.
Used well, it does something quietly powerful. It turns analytics from an instrument of authority into an act of care. In a time when trust is fragile, that shift matters more than any metric.
📚 Further Reading on Data Ethics, Storytelling, and Human-Centered Analytics
MIT Sloan Management Review. (2023). The Human Side of Data Analytics. https://sloanreview.mit.edu/article/the-human-side-of-data-analytics/
🌱 Explores how organizations can balance analytics capability with ethical judgment.
Ruha Benjamin. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Polity Press.
🌱 Examines how data and algorithms reproduce racial inequity — and how to resist.
Safiya Umoja Noble. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press.
🌱 Reveals how algorithmic systems distort representation and deepen bias.
OECD. (2023). Building Digital Trust for a Human-Centered Economy. https://www.oecd.org/digital/
🌱 Links transparency and ethics to trust in digital and analytics systems.
Accenture. (2024). Responsible AI Framework. https://www.accenture.com/responsible-ai-framework
🌱 Recommends integrating human oversight and ethical storytelling into AI design.
© Susanne Muñoz Welch, Praxa Strategies LLC. All rights reserved.